libraries node modules android streams io iterator-protocol

Node Roundup: node-android, typed-morph, stdio

Posted on .


node-android (GitHub: InstantWebP2P / node-android, License: MIT) is a Node port for Android. It uses libuvpp and libuv-java, and is mostly compatible with Node 0.10.x. The authors haven't yet implemented the crypto modules, but it supports most of the other core modules as far as I can tell.

To use it you need to open the source in the Android Developer Tools plugin for Eclipse.

The people behind this project are based in Shanghai, and have also created some interesting peer-to-peer modules for Node. I don't know what kind of commercial work they do, but there's a lot of activity on the InstantWebP2P GitHub organisation.


typed-morph (GitHub: pjsteam / typed-morph, License: MIT, npm: typed-morph) by Damian Schenkelman is a set of iterators implemented using the iterator protocol so you can map, reduce, and filter typed arrays without using intermediate arrays.

Evaluation is deferred until the results are consumed, so chained calls are lazy:

var wrap = require('typed-morph').wrap; // assuming wrap is typed-morph's exported entry point

var elements = new Uint16Array([1, 4, 7, 10]);
var iter = wrap(elements)
  .map(function(e) { return e + 1; })
  .filter(function(e) { return e % 2 === 0; });

// at this point no processing has taken place;
// reduce() consumes the iterator, so the callbacks only run now: (2 + 8) = 10
iter.reduce(function(value, current) { return value + current; }, 0);



stdio (GitHub: sgmonda / stdio, License: MIT, npm: stdio) by Sergio García is a module for general standard input/output management. You can use it for things like supporting command-line options, reading input by line, and prompting for input.

Command-line options are specified using a nice JavaScript object format:

var stdio = require('stdio');
var ops = stdio.getopt({
  check: { key: 'c', args: 2, description: 'What this option means' },
  map: { key: 'm', description: 'Another description', mandatory: true },
  kaka: { key: 'k', args: 2, mandatory: true },
  ooo: { key: 'o' }
});

Options can also have a multiple flag, so you can support expressions like -f a.txt -f b.txt -f c.txt, as in the sketch below.
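
Here's what that might look like; the file option name, the args value, and the shape of the parsed result are my assumptions rather than anything from the stdio docs:

var stdio = require('stdio');

var ops = stdio.getopt({
  file: { key: 'f', args: 1, multiple: true, description: 'Input file (repeatable)' }
});

// invoked as: node app.js -f a.txt -f b.txt -f c.txt
// ops.file should then collect all three values
console.log(ops.file);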

stdio can also automatically generate help, and it comes with Jasmine tests and documentation in the readme.


node modules streams biology

Node Roundup: 0.10.22, genome.js, Bellhop

Posted on .

Node 0.10.22

Node 0.10.22 was released this week. This version has fixes for process, the debugger, and the repl, plus a fix for a memory leak on closed handles.

There's also a fix for Mac OS 10.9: apparently "Not Responding" was displayed in Activity Monitor for Node processes.



genome.js is an interesting project that proclaims "Welcome to the OpenDNA movement":

genome.js is a fully open source platform built on Node.js that utilizes streams for high-performance analysis of DNA SNPs

There are currently several related repositories for the overall project on GitHub.

Why is this useful? Perhaps you've had your genome sequenced by a site like 23andMe, and want to do something with the data. Apparently the cutting edge in web-based DNA browsing isn't particularly great, so there may be room for innovation.


Bellhop (GitHub: mscdex / bellhop, License: MIT, npm: bellhop) by Brian White is a stream for Pub/Sub and RPC. It can serialize data types that aren't supported by JSON. Bellhop streams should work over any transport, from HTTP to TCP sockets.

The project has tests, and the readme includes API examples.


node modules fs streams

Node Roundup: 0.10.15, IntervalStream, StreamToMongo, fileswap-stream

Posted on .

You can send in your Node projects for review through our contact form.


Node 0.10.15 was released last week, which quickly followed 0.10.14. The newer release adds a fix for process.getuid, to address an issue on Mac OS X:

This commit should unbreak npm on OS X - it's hitting the new 'uid must be an unsigned int' check when installing as e.g. user 'nobody' (which has an UID of -2 in /etc/passwd or 4294967294 when cast to an uid_t.)

Version 0.10.14 fixed bugs in os and url, and upgraded npm and uv.


node-interval-stream (GitHub: czzarr / node-interval-stream, License: MIT, npm: interval-stream) by Stanislas Marion is a simple module providing a Transform stream that emits data at a set interval. The author's example combines IntervalStream with request to display the results of a large download every 2 seconds:

var request = require('request');
var IntervalStream = require('interval-stream');
var is = new IntervalStream(2000); // emit every 2 seconds

// placeholder URL: pipe a large download through the interval stream
request('http://example.com/large-file').pipe(is).pipe(process.stdout);



StreamToMongo (GitHub: czzarr / node-stream-to-mongo, License: MIT, npm: stream-to-mongo), also by Stanislas Marion, allows data to be streamed to MongoDB. This could be used to stream JSON data directly into a database. The example in the readme uses the npm registry, effectively allowing you to create a structured local cache in Mongo of all the module metadata on npm.


Finally, fileswap-stream (GitHub: bpostlethwaite / fileswap-stream, License: MIT, npm: fileswap-stream) by Ben Postlethwaite allows underlying file resources to be swapped. This might be useful if you're streaming data to log files, and want to split the files:

Write to a writable file-stream that swaps out its underlying file resources according to swapper and naming functions. This can be used for a persistent log or data stream - just stream to it 24/7 and let it swap out to new files whenever you trigger it to.


node modules network security grunt streams

Node Roundup: 0.10.5, Node Task, cap

Posted on .

You can send in your Node projects for review through our contact form.

Node 0.10.5

Node 0.10.5 is out. Apparently it now builds under Visual Studio 2012.

One small change I noticed, added by Ryan Doenges, makes the assert module put the generated information into the message property:

4716dc6 made assert.equal() and related functions work better by generating a better toString() from the expected, actual, and operator values passed to fail(). Unfortunately, this was accomplished by putting the generated message into the error's name property. When you passed in a custom error message, the error would put the custom error into name and message, resulting in helpful string representations like "AssertionError: Oh no: Oh no".

The pull request for this is nice to read (apparently Ryan is only 17, so he got his dad to sign the Contributor License Agreement document).

Node Task

Node Task, sent in by Khalid Khan, is a specification for a promise-based API that wraps around JavaScript tasks. The idea is that tasks used with projects like Grunt should be compatible with each other, so they can be processed through an arbitrary pipeline:

Eventually, it is hoped that popular JS libraries will maintain their own node-task modules (think jshint, stylus, handlebars, etc). If/when this happens, it will be trivial to pass files through an arbitrary pipeline of interactions and transformations utilizing libraries across the entire npm ecosystem.

After reading through each specification, I think it's an interesting attempt to standardise Grunt-like tasks. The API seems streams-inspired, as it's based around EventEmitter2 with various additional methods that are left for implementors to fill in.


Brian White sent in his cross-platform packet capturing library, "cap" (GitHub: mscdex / cap, License: MIT, npm: cap). It's built on WinPcap for Windows, and on libpcap and libpcap-dev for Unix-like operating systems.
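
Here's a rough sketch of a capture session based on my reading of the readme -- the device IP and filter string are placeholders:

var Cap = require('cap').Cap;

var c = new Cap();
var device = Cap.findDevice('192.168.0.10'); // placeholder: an IP bound to a local interface
var buffer = new Buffer(65535);

// open the device with a pcap filter string
c.open(device, 'tcp and port 80', 10 * 1024 * 1024, buffer);

c.on('packet', function(nbytes, truncated) {
  console.log('packet: ' + nbytes + ' bytes' + (truncated ? ' (truncated)' : ''));
});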

It's time to write your vulnerability scanning tools with Node!

Brian also sent in "dicer" (GitHub: mscdex / dicer, License: MIT, npm: dicer), which is a streaming multipart parser. It uses the streams2 base classes and readable-stream for Node 0.8 support.
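
As a minimal sketch, assuming a multipart body with a known boundary arrives on stdin (both are placeholders here):

var Dicer = require('dicer');

var d = new Dicer({ boundary: 'MyBoundary' }); // placeholder boundary

d.on('part', function(part) {
  part.on('header', function(header) {
    console.log('part header:', header);
  });
  part.on('data', function(data) {
    console.log('part data: ' + data.length + ' bytes');
  });
  part.on('end', function() {
    console.log('part end');
  });
});

d.on('finish', function() {
  console.log('finished parsing');
});

// pipe any multipart source here; stdin is just for illustration
process.stdin.pipe(d);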


node streams streams2 5min

Five Minute Guide to Streams2

Posted on .

Node 0.10 is the latest stable branch of Node. It's the branch you should be using for Real Work™. The most significant API changes can be found in the stream module. This is a quick guide to streams2 to get you up to speed.

The Base Classes

There are now five base classes for creating your own streams: Readable, Writable, Duplex, Transform, and PassThrough. These base classes inherit from EventEmitter so you can attach listeners and emit events as you normally would. It's perfectly acceptable to emit custom events -- this might make sense, for example, if you're writing a streaming parser. The parser could emit events like 'headers' to indicate the headers have been parsed, perhaps for a CSV file.

To make your own Readable stream class, inherit from stream.Readable and implement the _read(size) method. The size argument is "advisory" -- a lot of Readable implementations can safely ignore it. Once your _read method has collected data from an underlying I/O source, it can send it by calling this.push(chunk) -- internally data will be placed into a queue so "clients" of your class can deal with it when they're ready.
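
As a quick sketch -- the CounterStream name and behaviour are made up for illustration:

var stream = require('stream');
var util = require('util');

// a toy Readable that produces the numbers 1 to 5, then ends
function CounterStream(options) {
  stream.Readable.call(this, options);
  this.current = 0;
}

util.inherits(CounterStream, stream.Readable);

CounterStream.prototype._read = function(size) {
  // size is advisory, so we just push one chunk per call
  this.current++;
  if (this.current > 5) {
    this.push(null); // push(null) signals the end of the stream
  } else {
    this.push(this.current + '\n');
  }
};

new CounterStream().pipe(process.stdout);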

To make your own Writable stream, inherit from stream.Writable and implement a _write(chunk, encoding, callback) method. Once you've written data to the underlying I/O source, call callback, passing an error if one occurred.

The Duplex class is like a Readable and Writable stream in one -- it allows data sources that transmit and receive data to be modelled. This makes sense when you think about it -- TCP network sockets transmit and receive data. To implement a Duplex stream, inherit from stream.Duplex and implement both the _read and _write methods.

The Transform class is useful for implementing parsers, like the CSV example I mentioned earlier. In general, streams that change data in some way should be implemented using stream.Transform. Although Transform sounds a bit like a Duplex stream, this time you'll need to implement a _transform(chunk, encoding, callback) method. I've noticed several projects in the wild that use Duplex streams with a stubbed _read method, and I wondered if these would be better served by using a Transform class instead.
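
For example, a hypothetical UpperCaseStream only needs a few lines as a Transform:

var stream = require('stream');
var util = require('util');

function UpperCaseStream(options) {
  stream.Transform.call(this, options);
}

util.inherits(UpperCaseStream, stream.Transform);

UpperCaseStream.prototype._transform = function(chunk, encoding, callback) {
  // push the transformed data, then signal that this chunk is done
  this.push(chunk.toString().toUpperCase());
  callback();
};

process.stdin.pipe(new UpperCaseStream()).pipe(process.stdout);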

Finally, the PassThrough stream inherits from Transform to do... nothing. It relays the input to the output. That makes it ideal for sitting inside a pipe chain to spy on streams, and people have been using this to write tests or instrument streams in some way.
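
For instance, a PassThrough can report on traffic without disturbing the pipeline:

var PassThrough = require('stream').PassThrough;

var spy = new PassThrough();
spy.on('data', function(chunk) {
  // log to stderr so the data flowing to stdout is unaffected
  console.error('saw ' + chunk.length + ' bytes');
});

process.stdin.pipe(spy).pipe(process.stdout);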


Pipes must follow this pattern: readable.pipe(writable). As Duplex and Transform streams can both read and write, they can be placed in either position in the chain. For example, I've been using process.stdin.pipe(csvParser).pipe(process.stdout) where csvParser is a Transform stream.


The general pattern for inheriting from the base classes is as follows:

  1. Create a constructor function that calls the base class using baseClass.call(this, options)
  2. Correctly inherit from the base class using Object.create or util.inherits
  3. Implement the required underscored method, whether it's _read, _write, or _transform

Here's a quick stream.Writable example:

var stream = require('stream');

function GreenStream(options) {
  stream.Writable.call(this, options);
}

GreenStream.prototype = Object.create(stream.Writable.prototype, {
  constructor: { value: GreenStream }
});

GreenStream.prototype._write = function(chunk, encoding, callback) {
  process.stdout.write('\u001b[32m' + chunk + '\u001b[39m');
  callback(); // signal that the chunk has been handled
};

process.stdin.pipe(new GreenStream());

Forwards Compatibility

If you want to use streams2 with Node 0.8 projects, then readable-stream provides access to the newer APIs in an npm-installable module. Since the stream core module is implemented in JavaScript, it makes sense that the newer API can be used in Node 0.8.

Some open source module authors are including readable-stream as a dependency and then conditionally loading it:

var PassThrough = require('stream').PassThrough;

if (!PassThrough) {
  PassThrough = require('readable-stream/passthrough');
}

This example is taken from until-stream.

Streams2 in the Wild

There are some interesting open source projects that use the new streaming API that I've been collecting on GitHub. multiparser by Jesse Tane is a stream.Writable HTML form parser. until-stream by Evan Oxfeld will pause a stream when a certain signature is reached.

Hiccup by naomik uses the new streams API to simulate sporadic throughput, and the same author has also released bun which can help combine pipes into composable units, and Burro which can package objects into length-prefixed JSON byte streams. Conrad Pankoff used Burro to write Pillion, which is an RPC system for object streams.

There are also less esoteric modules, like csv-streamify, a CSV parser.