Notch and WebGL, npm_lazy

06 Dec 2013 | By Alex Young | Comments | Tags webgl npm games node

Notch, WebGL, Dart

Brandon Jones wrote a summary of Notch’s WebGL- and Dart-related activity: Notch, WebGL, Dart, and ramping up quickly:

I can’t tell you how many times I see hobby developers saying “I’m building a game!” and what they actually have to show for it is a really complicated system for loading meshes and shaders. It’s all well and good to think about long term goals and engine structure and such, but if you’re going to build a game then please build a freaking game! Don’t build an engine that you will someday build a game on top of, because you will never get past step one.

Dart seems to appeal to Notch, maybe because of his Java background. It’s cool seeing his work come together on Twitter, from the initial ideas to working code with screenshots.


npm_lazy

Sometimes npm goes down (which is why you should donate to npm). But there’s a solution: mirror it! If you’re too lazy to mirror the whole thing, how about just caching the modules you need to deploy your projects? That’s where npm_lazy by Mikito Takada comes in.

This is a local cache for npm. It has a configuration file that allows you to tailor it to your needs – you can set the cache lifespan, HTTP timeout, and so on. Once you’ve set that up, you can run it as a server. Then you just use npm --registry to set the server as your npm registry.

It has some caching logic – anything that isn’t local will be fetched, and metadata is updated when a module is requested the first time after a restart.

Multiprocess Firefox

05 Dec 2013 | By Alex Young | Comments | Tags firefox browsers

Multiprocess Firefox

I know a lot of DailyJS readers who use Chrome and Safari as their main browsers, partly due to iOS and Android’s popularity, and partly because Chrome’s initial performance gains enticed them away from Firefox. The big issue over the last few years has been that browsers are switching to multiple processes, the idea being that resources can be shared more efficiently and the fallout from crashes can be contained.

If Firefox isn’t your main browser, you probably use it for testing or just try it out every few months to see what’s been going on. The thing most of us have been looking for is Chrome (and apparently IE)-style multi-process support. Bill McCloskey has written a post about this very topic: Multiprocess Firefox. Bill is a programmer at Mozilla, and you may remember his post about incremental GC in Firefox.

Although the work so far sounds promising, there are some major technical hurdles. These partly relate to the nature of how JavaScript interacts with the DOM, and how Firefox handles add-ons:

JavaScript execution and layout happen on the main thread, and they block the event loop. Running these components on a separate thread is difficult because they access data, like the DOM, that are not thread-safe. As an alternative, we’ve considered allowing the event loop to run in the middle of JavaScript execution, but doing so would break a lot of assumptions made by other parts of Firefox (not to mention add-ons).

Like the threaded approach, Firefox is able to run its event loop while JavaScript and layout are running in a content process. But unlike threading, the UI code has no access to content DOM or other content data structures, so there is no need for locking or thread-safety. The downside, of course, is that any code in the Firefox UI process that needs to access content data must do so explicitly through message passing.

You might not realise it, but Firefox itself uses a lot of JavaScript:

Content scripts. IPDL takes care of passing messages in C++, but much of Firefox is actually written in JavaScript. Instead of using IPDL directly, JavaScript code relies on the message manager to communicate between processes.

We decided to do the message passing in JavaScript instead, since it’s easier and faster to prototype things there. Rather than change every docshell-using accessor to test if we’re using multiprocess browsing, we decided to create a new XBL binding that applies only to remote <browser> elements. It is called remote-browser.xml, and it extends the existing browser.xml binding.

If you’re an add-on author, you’ll be pleased to hear add-ons are being taken seriously. However, Mozilla may need your help in the future:

We realize that add-ons are extremely important to Firefox users, and we have no intention of abandoning or disrupting add-ons. At the same time, we feel strongly that users will appreciate the security and responsiveness benefits of multiprocess Firefox, so we’re willing to work very hard to get add-ons on board. We’re very interested in working with add-on developers to ensure that their add-ons work well in multiprocess Firefox.

It’s hard to imagine Firefox OS not using multiple processes, and Bill mentions this early on in the post:

Firefox OS relies heavily on the multiprocessing and IPC code introduced during Electrolysis.

Electrolysis was a project to use multiple processes, but the focus was tighter than changing the desktop browser. Firefox’s layout engine, Gecko, supports multiple threads, and the “Gecko platform” supports multiple processes. But, as the Electrolysis wiki page points out, the Firefox frontend does not currently use multiple processes.

Will we see a browser share increase when Firefox is updated to support multiple processes? I don’t know, but as a front-end developer I’m excited about seeing this feature released sooner rather than later.

Node Roundup: Ben Noordhuis, Nodelike, Infect.js

04 Dec 2013 | By Alex Young | Comments | Tags node modules objective-c di

Ben Noordhuis

Every time I write about a new Node release, I notice how much work Ben Noordhuis has done. He’s been an important contributor to Node and libuv, and has always seemed patient and polite on the mailing list.

Ben Noordhuis decided to leave Node and libuv. If you’re not familiar with his work, take a look at the Node ChangeLog – Ben’s commits go back to summer 2010. Ben will be missed!

Nodelike

Nodelike (GitHub: node-app / Interpreter, License: MIT) is a project to bring Node’s API to JavaScriptCore. The aim is a drop-in replacement that’s compatible with Node’s master branch, and to reuse the JavaScript in Node’s lib/ directory (the core modules).

It needs the latest iOS or Mac OS X, and if you check out the source you’ll need to get the submodules (git submodule update --init --recursive).

The list of currently working modules includes partial fs support, util, url, events, path, stream, querystring, and assert. The process object is also supported.

I’ve only had a brief look at the source in node-app / Nodelike, but it looks like they’re writing Objective-C to add the necessary libuv bindings. This is the code you’ll find in src/ in joyent / node.


Infect.js

Infect.js (GitHub: amwmedia / infect.js, License: MIT, npm: infect) by Andrew Worcester is a dependency injection module. It’s not specifically for Node, but I was intrigued by the idea of bringing AngularJS-style DI to Node projects.

Registering a dependency. A simple call to infect.set() with the name you want to use, and the mutable object you’d like to register will do the trick. In the example below we are using a function, but you can register any type of mutable value (Functions, Arrays, Objects, etc).

infect.set('Logger', function(str) {
  // prepend a time to every log line
  console.log((new Date()).toLocaleTimeString() + ' ==> ' + str);
});

It supports function injection (infect.func) and class injection (infect.func with a constructor function). Andrew has included jsFiddle examples in the readme, so you can play around with the code.
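At its core, this style of DI is a name-to-value registry. A hand-rolled sketch of the idea (an illustration of the pattern, not infect.js’s actual implementation) looks like this:

```javascript
// Minimal registry in the spirit of infect.set/infect.get -- this is an
// illustration only, not infect.js's real code.
var registry = {};

function set(name, value) {
  registry[name] = value;
}

function get(name) {
  if (!(name in registry)) {
    throw new Error('Unknown dependency: ' + name);
  }
  return registry[name];
}

// register a dependency, then look it up elsewhere by name
set('Logger', function(str) {
  return '==> ' + str;
});

var log = get('Logger');
log('hello');  // '==> hello'
```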

AngularJS D3 Charts, Yo three.js, TinyCore.js

03 Dec 2013 | By Alex Young | Comments | Tags libraries webgl architecture graphics yeoman

AngularJS D3 Charts

Chinmay sent in Angular-charts (GitHub: chinmaymk / angular-charts, License: MIT, bower: angular-charts), a set of AngularJS directives for graphs that use D3. To use it, include angular-charts.min.js and then inject the dependency with angular.module('yourApp', ['angularCharts']).

Configuration options for graphs can be included using directives, or passed as options in JavaScript. There are also events for mouseover, mouseout, and click. The charts have animations, tooltips, and the values will be adapted to the graph’s size as necessary.

A Yeoman Generator for three.js

If you’re looking for a friendly way to get started with three.js, then Timmy Willison’s Yeoman generator (GitHub: timmywil / generator-threejs, License: MIT, npm: generator-threejs) may be what you’re looking for.

The template it outputs renders a red cube, and it includes the usual Yeoman stuff like a Grunt build script and a web server for development.


TinyCore.js

TinyCore.js (GitHub: mawrkus / tinycore, License: MIT) by Marc Mignonsin is a library for organising projects around modules:

We use dependency injection to provide the modules the tools they need to perform their job. Instead of having a single sandbox object with a lot of methods, a module defines explicitly the tools it needs. The mediator, that provides a way of communication for the modules, is one of the default tools that has already been implemented (located in the “tools/mediator” folder).

Modules have an extensible, event-based API. There’s also a factory class, called “Toolbox”:

In order to provide the modules the tools they need to perform their job, TinyCore uses a tools factory, TinyCore.Toolbox. A tool can be registered at any time for later use. Whenever a module is instantiated, the tools specified in the module definition will be requested and injected as parameters of the creator function.

TinyCore is written with testing in mind, and has an extension for Jasmine.

connect-cache-manifest, pushnot

02 Dec 2013 | By Alex Young | Comments | Tags express node apps


connect-cache-manifest (GitHub: dai-shi / connect-cache-manifest, License: BSD, npm: connect-cache-manifest) by Daishi Kato is Express middleware for generating an HTML5 cache manifest file. Manifests list the files needed by the application when it’s offline, so any essential client-side assets can be cached by the browser.

Daishi’s middleware takes an object and then generates a suitable manifest file. It can recurse through directories so including lists of JavaScript, CSS, and images is easier.

// Assuming the module exports a single middleware factory:
var cacheManifest = require('connect-cache-manifest');

app.use(cacheManifest({
  manifestPath: '/application.manifest',
  files: [{
    file: __dirname + '/public/js/foo.js',
    path: '/js/foo.js'
  }, {
    dir: __dirname + '/public/css',
    prefix: '/css/'
  }, {
    dir: __dirname + '/views',
    prefix: '/html/',
    ignore: function(x) { return /\.bak$/.test(x); },
    replace: function(x) { return x.replace(/\.jade$/, '.html'); }
  }],
  networks: ['*'],
  fallbacks: []
}));



pushnot (GitHub: dtinth / pushnot, License: MIT) by Thai Pangsakulyanont is a push notification server based on ØMQ, Express, and Zephyros. It supports notification encryption and can be hooked up to Growl.

Pushnot consists of three major components: the server, which clients send notifications to and subscribers subscribe to; the client, which is any application that wants to send a notification to the user; and the subscriber, which waits for the server to push notifications and notifies the user.

It’s got an interesting mix of technologies, if you’re looking for an Express application that uses pub/sub, and it has a command-line interface as well.

List.js, _part_

29 Nov 2013 | By Alex Young | Comments | Tags libraries functional



Version 1.0 of List.js (GitHub: javve / list.js, Bower: javve/list.js, License: MIT) by Jonny Strömberg has been released. It’s a small library for making tables searchable, sortable, and filterable. It also works on unordered lists and divs, and supports templating so you can style it fairly easily. It doesn’t have any dependencies, and the API is straightforward JavaScript:

listObj.add({ name: 'Jonny', city: 'Stockholm' });

// add() also accepts an array of items
listObj.add([
  { name: 'Gustaf', city: 'Sundsvall' },
  { name: 'Jonas', city: 'Berlin' }
]);

There are also plugins for List.js. The project includes tests and can be installed with Bower or Component.

The _part_ Library

_part_ (GitHub: AutoSponge / part, License: MIT) by Paul Grenier is a small library for making native methods available as partially applied functions.

In part, you use typical OO functions (like the ones in native prototypes) to create two functional counterparts, “left-part” and “right-part”, which partially apply the receiver or parameters respectively.
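A hand-rolled sketch of those two counterparts – plain JavaScript to show the idea, not _part_’s actual API – might look like this:

```javascript
// "left-part": partially apply the receiver (`this`), leaving the
// parameters for later. An illustration of the concept, not _part_ itself.
function leftPart(fn, receiver) {
  return function() {
    return fn.apply(receiver, arguments);
  };
}

// "right-part": partially apply the parameters, leaving the receiver
// to be supplied later.
function rightPart(fn) {
  var args = [].slice.call(arguments, 1);
  return function(receiver) {
    return fn.apply(receiver, args);
  };
}

var double = function(n) { return n * 2; };

// bind the receiver of Array#map now, pass the callback later
var mapOneTwoThree = leftPart([].map, [1, 2, 3]);
var doubled = mapOneTwoThree(double);  // [2, 4, 6]

// bind the callback now, pass the receiver later
var mapDouble = rightPart([].map, double);
var doubledAgain = mapDouble([1, 2, 3]);  // [2, 4, 6]
```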

While I was reading about this library I noticed the author has several other interesting posts on his blog:

JavaScript Developer Survey 2013

28 Nov 2013 | By Alex Young | Comments | Tags community surveys

Here is the JavaScript Developer Survey for 2013! You have two weeks from now to complete the survey (it closes on the 12th of December).

I asked for help with the questions last week, and the response was incredible! I really appreciate the suggestions, and I’ve made a list of the pull requests that I accepted. Any that weren’t accepted were either due to a clash with another suggestion, or lack of time on my part.

Scale npm

27 Nov 2013 | By Alex Young | Comments | Tags node modules npm


The official Node blog has a post about issues scaling npm: Keeping The npm Registry Awesome. It explains some of the recent downtime and outlines plans to improve the situation.

The root cause of these downtimes was insufficient resources: both hardware and human. This is a full post-mortem where we will look at how the registry works, what went wrong, how we changed the previous architecture of The npm Registry to fix it, as well as the next steps we are taking to prevent this from happening again.

The post is relatively lengthy and buried at the end is a plea for funding:

But we need your help! All of these necessary improvements require more servers, more time from Nodejitsu staff and an overall increase to what we spend maintaining the public npm registry as a free service for the Node.js community.

Please take a minute now to donate at!

By burying the funding plea at the end the author was presumably trying to avoid making the post sound spammy, but seeing as most people don’t read anything on the Internet I thought I’d reiterate the point because I’m not scared of sounding spammy: DONATE TO NPM! SAVE FERRIS!

If you pledge $75 you’ll get benefits, like $25 credit on Nodejitsu or Iriscouch. Sounds good to me!

GitHub Avatar Chrome Extension, AMDClean

26 Nov 2013 | By Alex Young | Comments | Tags jquery chrome amd build

GitHub Avatar Chrome Extension

GitHub Avatar Chrome Extension

Writing Firefox add-ons or Chrome extensions can be off-putting for those of us who are good at JavaScript but not so great at browser plugin APIs. Anas Nakawa sent in chrome-github-avatars (GitHub: anasnakawa / chrome-github-avatars, License: MIT) which is a Chrome extension for displaying GitHub avatars on the news feed page.

It might seem like a modest extension, but the reason I liked it is that it was built with a Yeoman generator. Anas’ project includes all the stuff I’m familiar with, like Bower and jQuery, but also things that I’m not too familiar with, like Chrome’s manifest.json. It seems cool that you can use tools popular in the JavaScript community to create browser plugins.


AMDClean (GitHub: gfranko / amdclean, License: MIT) by Greg Franko is a build tool for converting AMD code into standard JavaScript that works with RequireJS’s optimiser.

By incorporating amdclean.js into the build process, there is no need for Require or Almond.

Since AMDclean rewrites your source code into standard JavaScript, it is a great fit for JavaScript library authors who want a tiny download in one file after using the RequireJS Optimizer.

So, you get great code cleanliness with AMD, reduced file sizes, improved code readability, and easy integration with other developers who may not use AMD.

Greg notes that it also supports Grunt, so it should be easy to drop into your existing projects.

ResponsiveComments, jQuery Evergreen

25 Nov 2013 | By Alex Young | Comments | Tags responsive design jquery es6


ResponsiveComments (GitHub: chambaz / ResponsiveComments, License: MIT) by Adam Chambers is designed to support conditional loading using HTML comments:

Through the use of HTML comments, markup can be introduced to progressively enhance an experience as various media queries or feature detections evaluate to true.

Data attributes are used with valid media queries to conditionally display HTML. For example:

<div data-responsive-comment-media="(min-width: 769px)">
  <!-- <div><p>Any content can go in here</p></div> -->
</div>

IE 9 and below support requires the matchMedia.js polyfill, but otherwise browser support is pretty good.

jQuery Evergreen

What would jQuery look like if it was written for modern browsers with ES6 modules? jQuery Evergreen (GitHub: webpro / jquery-evergreen, License: MIT, Bower: jquery-evergreen) by Lars Kappert is an attempt at answering that question.

jQuery Evergreen works with modern browsers. It has the same familiar API as jQuery, and is lean and mean with the following, optional modules: selector, class, DOM, event, attr and html. The source is written in the ES6 Modules format, and transpiled to an AMD version, and a “browser global” version using the ES6 Module Transpiler.

It’ll work with current versions of most browsers thanks to transpilation and an IE9 polyfill for classList.

You can even create custom builds with Grunt, like this:

grunt --exclude=attr,class,dom,event,html,mode,selector

WebGL Hobbit, Zombies, Debugging and Profiling Tools

22 Nov 2013 | By Alex Young | Comments | Tags games webgl

WebGL Hobbit, Zombies

There’s a Chrome Experiment called The Hobbit: The Desolation of Smaug that has some pretty fancy effects. While I was playing with it I wondered what open source WebGL stuff people had been making, which is when I found this simple zombie game.

This ain't The Walking Dead, but where's your zombie game?

The source is here: Goobuzz / NavMesh-Project, and there’s a reddit thread which I think the author started.

WebGL Debugging and Profiling Tools

WebGL Debugging and Profiling Tools by Patrick Cozzi has a whole load of resources for working with WebGL. He covers a Firefox WebGL shader editor, WebGL Inspector, Chrome Canvas Inspector, Google Web Tracing Framework, and more.

He even includes useful performance tips:

Depending on how many frames the GPU is behind, a better practice would be to do all the texSubImage2D calls, followed by all the reprojection draw calls, or even move the reprojection draw calls to the end of the frame with the scene draw calls. The idea here is to ensure that the texture upload is complete by the time the reprojection draw call is executed. This trades the latency of completing any one for the throughput of computing many. I have not tried it in this case so I can’t say for certain if the driver lagging behind isn’t already enough time to cover the upload.

And he’s glad to see browsers including developer tools for WebGL:

Building WebGL tools, such as the Firefox Shader Editor and Chrome Canvas Inspector, directly into the browser developer tools is the right direction. It makes the barrier to entry low, especially for projects with limited time or developers. It helps more developers use the tools and encourages using them more often, for the same reason that unit tests that run in the blink of an eye are then used frequently.

JavaScript Developer Survey 2013: RFC

21 Nov 2013 | By Alex Young | Comments | Tags community surveys

Every year I like to run a survey for the readers of DailyJS. It helps me figure out what I should write about, but I also share the results with the community so you can use the data however you wish.

This year I’ve decided to change the approach. A draft of the survey questions can be found on GitHub, here: alexyoung / dailyjs-survey. You can fork it and send pull requests for questions you’d like to add or change.

After a week or so I’ll compile the changes into a Google Drive form and announce the survey has gone live so people can submit their responses.

I’d really appreciate input on the survey before publishing it, because it helps us get a better idea about what’s going on in the world of client-side and server-side JavaScript development.

Node Roundup: Fowl, grunt-ec2, connect-body-rewrite

20 Nov 2013 | By Alex Young | Comments | Tags node modules grunt amazon foundationdb express


Fowl (GitHub: OptimalBits / fowl, License: MIT, npm: fowl) by Manuel Astudillo is a document and query layer for FoundationDB. It provides a similar API to NoSQL databases like MongoDB, but has support for multidocument transactions:

Transaction support is an incredibly powerful feature that simplifies server logic and helps avoiding difficult to solve race conditions.

Fowl provides a low level API based on keypaths for describing documents and its properties following CRUD semantics.

It includes tests and each API method is documented in the readme file. Basic usage looks like this:

var fowl = require('fowl');

// Open a FoundationDB database
fowl.open();

// Create a document (if _id is not specified, a GUID will be generated)
var john = fowl.create('people', {
  _id: 'john',
  name: 'John',
  lastname: 'Smith',
  balance: 100
});

// Use transactions to transfer money from one account to another
var tr = fowl.transaction();

tr.get(['people', 'john', 'balance']).then(function(johnBalance) {
  tr.put(['people', 'john', 'balance'], johnBalance - 10);
});


grunt-ec2 (GitHub: bevacqua / grunt-ec2, License: MIT, npm: grunt-ec2) by Nicolas Bevacqua is a set of Grunt tasks for creating, terminating, and deploying Node applications to AWS EC2 instances.

The deployed Node applications are served from behind an Nginx proxy. The task reference explains what each task does – there are quite a few.

It supports most of the things you want to do when setting up Node applications, including SSL, SSH keys for each instance, rsync support for fast and painless uploads, and hot code swaps.


There are times when the logic of my Node web applications has seemed to need the response body to be rewritten in middleware, rather than in the main route logic. connect-body-rewrite (GitHub: rubenv / connect-body-rewrite, License: MIT, npm: connect-body-rewrite) by Ruben Vermeersch makes this possible. The examples use regular expressions to replace text, based on the request headers:

app.use(require('connect-body-rewrite')({
  accept: function (res) {
    return res.getHeader('content-type').match(/text\/html/);
  },
  rewrite: function (body) {
    return body.replace(/<\/body>/, "Copyright 2013 </body>");
  }
}));

I like the way it’s designed to use an accept callback, because it makes it easy to see what the rewriter actually does by keeping the logic close together.

Chained, Doors

19 Nov 2013 | By Alex Young | Comments | Tags jquery es6 components libraries


Chained (GitHub: vzaccaria / chained, License: MIT) is another ES6 experiment. It allows APIs that return promises to be mixed with functions that take parameters and return mutated objects. In Vittorio’s example he mixes jQuery’s network methods with Underscore.js to download JSON and then filter it:

getUser = (user) ->
    .filter(-> /package/.test(arguments[0]))
    .map(-> "{arguments[0]}")


In this CoffeeScript example, methods that use promises (get) are mixed with functions that take objects as the first argument (filter, map), using a consistent chainable API. To make this work, Vittorio has used ES6’s introspection features.

The project has detailed notes in the readme about how this works. He mentions that the library came about after trying to create DSLs with JavaScript.


Olivier Wietrich sent in Doors (GitHub: bredele / doors, License: MIT, component: bredele/doors), a module that gates events behind named locks: a door’s open event is only triggered once every one of its locks has been unlocked.

[State machines and promises] both have something missing. A transition occurs when one condition is triggered. Things are not so simple in real life. You will probably have more than one condition to do something, but one condition is sufficient to not do it. Think about a door with multiple locks: you can’t open the door until all locks are unlocked.

Looking at the documentation, it seems like the author wants to use it to restrict access to an API until certain authentication preconditions are met. There’s a simple HTML example that uses a graphical door, and two locks. You can toggle locks and add more.

Moonjs, jQuery Removes Sourcemap Comments

18 Nov 2013 | By Alex Young | Comments | Tags jquery space simulation


Moonjs (GitHub: siravan / moonjs, License: GPL) by Shahriar Iravanian is a port of the Apollo Guidance Computer using Emscripten.

AGC was the main computer system of the Apollo program that successfully landed 12 astronauts on Moon. There was one AGC on each of the Apollo Command Modules and another one on each Lunar Module. There was also a second backup computer system called Abort Guidance System (AGS) on the Lunar Modules, which is simulated by Virtual AGC, but not the current version of Moonjs.

Recent advances in the JavaScript language - such as optimized engines, ahead-of-time (AOT) compilation, and asm.js - make it possible to write computationally extensive applications in JavaScript. My previous experience with online JavaScript-based simulation (svtsim and hemosim) was very positive and convinced me of the suitability of the HTML5/JavaScript combination in writing portable, easy-to-use simulators.

I was going to try figuring it out, but it reminded me of Kerbal Space Program and I got distracted…

jQuery 1.11.0/2.1.0

jQuery 1.11.0/2.1.0 Beta 2 were released last week. The beta includes AMD support, which is still the headline feature.

Something that I found interesting was the removal of the sourcemap comment:

One of the changes we’ve made in this beta is to remove the sourcemap comment. Sourcemaps have proven to be a very problematic and puzzling thing to developers, generating scores of confused questions on forums like StackOverflow and causing users to think jQuery itself was broken.

We’ll still be generating and distributing sourcemaps, but you will need to add the appropriate sourcemap comment at the end of the minified file if the browser does not support manually associating map files (currently, none do). If you generate your own jQuery file using the custom build process, the sourcemap comment will be present in the minified file and the map is generated; you can either leave it in and use sourcemaps or edit it out and ignore the map file entirely.
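For reference, the comment being removed is a single line at the end of the minified file; the map filename varies by build, and the older //@ prefix was also still in circulation at the time:

```javascript
//# sourceMappingURL=jquery.min.map
```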

The fact that sourcemaps generate so much confusion is worth thinking about, because they’re one of those things people cite as making compile-to-JavaScript languages easier to work with.

Negative Array Indexes

15 Nov 2013 | By Alex Young | Comments | Tags node modules es6 code-review

Sindre Sorhus sent in negative-array, a module for supporting negative array indexes. It’s built using ES6’s Proxy.

Proxies allow you to intercept fundamental operations on an object, such as property reads and writes, which means things like profilers become easier to implement. They’re created with var proxy = Proxy(target, handler), where target is an object that will be wrapped with the proxy, and handler is an object that implements the proxy API.

The handler can include methods like has, defineProperty, getPrototypeOf, and more, for controlling access to an object. For more details on how this works, see the Direct Proxies page on the ECMAScript Harmony Wiki.
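Using today’s standardized form of the API (new Proxy, rather than the direct-proxies Proxy(target, handler) call described above), a get trap looks like this:

```javascript
// Log every property read by intercepting it with a `get` trap.
var logged = [];
var target = { name: 'dailyjs' };

var proxy = new Proxy(target, {
  get: function(t, prop) {
    logged.push(prop);  // record the property name on each read
    return t[prop];
  }
});

proxy.name;  // 'dailyjs' -- and 'name' is now recorded in `logged`
```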

Sindre’s module allows you to do this:

var negativeArray = require('negative-array');

// adds negative array index support to any passed array
var unicorn = negativeArray(['pony', 'cake', 'rainbow']);

// get the last item by using a negative index
unicorn[-1];  // 'rainbow'
It’ll work in Node 0.8+ with the --harmony flag, and Chrome with Harmony enabled. Visit chrome://flags/#enable-javascript-harmony to set it up.

The implementation is what will probably become a classic pattern: Proxy is used to wrap the array instance with get and set methods that dynamically map the requested array index to something native JavaScript can handle.

Proxy(arr, {
  get: function (target, name) {
    var i = +name;
    return target[i < 0 ? target.length + i : i];
  },
  set: function (target, name, val) {
    var i = +name;
    return target[i < 0 ? target.length + i : i] = val;
  }
});

I like this example because it adds new functionality that feels like a language feature without changing built-in prototypes. It’s clean and fairly easy to understand once you know what Proxy does. If you wanted to learn about proxies but couldn’t find any good examples, then check out the source on GitHub.

Assertion Counting in Mocha

14 Nov 2013 | By Alex Young | Comments | Tags node libraries testing mocha

A few weeks ago I wrote about node-tap, in Why Don’t You Use Tap?. I usually use Mocha for my tests, and one thing I liked about node-tap was the idea of test plans:

test('Check out my plan', function(t) {
  t.plan(1);
  t.ok(true, "It's ok to plan, and also end.  Watch.");
});

Test plans help in situations where you want to put assertions inside asynchronous events. For example, if you’re testing a web application and it makes HTTP requests in the test cases, but you also intercept events that indicate various lifecycle events. These could be things like ensuring a notification email was sent when a user signs up, and also checking that their account was saved to the database.

In Mocha, such a test might look like this:

describe('Account creation', function() {
  it('should allow users to sign up', function(done) {
    app.on('notify:accounts:create', function(account) {
      assert(account, 'expected a user account object');
      done();
    });

    request(app).post('/accounts').send(userDetails).expect(200, done);
  });
});

That’s OK, but it has some problems: done will be called twice – once when the web request finishes, and again when the email event is sent (notify:accounts:create).

We could fix this by counting how many assertions have been called:

describe('Account creation', function() {
  it('should allow users to sign up', function(done) {
    var expected = 2;

    function checkDone() {
      expected--;
      if (expected === 0) {
        done();
      }
    }

    app.on('notify:accounts:create', function(account) {
      assert(account, 'expected a user account object');
      checkDone();
    });

    request(app).post('/accounts').send(userDetails).expect(200, checkDone);
  });
});

Seeing checkDone all over my tests made me create something more generic. In the following example I use instances of an object called Plan that allows assertions to be counted, and done to only get called once the specified number of assertions have passed. This example can be run with the mocha command-line script.

var assert = require('assert');

function Plan(count, done) {
  this.done = done;
  this.count = count;
}

Plan.prototype.ok = function(expression) {
  assert(expression);

  if (this.count === 0) {
    assert(false, 'Too many assertions called');
  } else {
    this.count--;
  }

  if (this.count === 0) {
    this.done();
  }
};

describe('Asynchronous example', function() {
  it('should run two asynchronous methods', function(done) {
    var plan = new Plan(2, done);

    setTimeout(function() {
      plan.ok(true);
    }, 50);

    setTimeout(function() {
      plan.ok(true);
    }, 25);
  });
});

This code could be expanded to tie in with Mocha’s timeouts to display the number of missed assertions, if any. Otherwise it does the job: if any of the asynchronous callbacks don’t fire, Mocha will raise a timeout error. It also protects against calling assertions too many times, which can actually happen if you’re making your own asynchronous APIs: I’ve had cases where I’ve triggered callbacks twice by mistake. And, it ensures done is only called when needed.

The question of ensuring assertions were actually called was brought up in this issue for Chai.js: Asserting that assertions were made. And, the Mocha wiki has this assertion counting snippet: Assertion counting.

Outside of Mocha, I found QUnit has asyncTest, which allows assertions to be planned like TAP-style tests. With this approach we don't need to broker calls to done, because it uses start() instead.

I’ve never quite found the perfect solution to this problem, however. How do you ensure assertions are triggered in Mocha, and how do you handle calling done in tests where there are multiple asynchronous operations?

Node Roundup: 0.10.22, genome.js, Bellhop

13 Nov 2013 | By Alex Young | Comments | Tags node modules biology streams

Node 0.10.22

Node 0.10.22 was released this week. This version has fixes for process, the debugger and the repl, and a memory leak on closed handles.

There’s also a fix for Mac OS 10.9: apparently “Not Responding” was displayed in Activity Monitor for Node processes.


DNA card

genome.js is an interesting project that proclaims “Welcome to the OpenDNA movement”:

genome.js is a fully open source platform built on Node.js that utilizes streams for high-performance analysis of DNA SNPs

There are currently several related repositories for the overall project on GitHub.

Why is this useful? Perhaps you’ve had your genome sequenced by a site like 23andMe, and want to do something with the data. Apparently the cutting edge in web-based DNA browsing isn’t particularly great, so there may be room for innovation.


Bellhop (GitHub: mscdex / bellhop, License: MIT, npm: bellhop) by Brian White is a stream for Pub/Sub and RPC. It can serialize data types that aren’t supported by JSON. Bellhop streams should work over any transport, from HTTP to TCP sockets.

The project has tests, and the readme includes API examples.

Recreating core.async with ES6 Generators, JSON Mask

12 Nov 2013 | By Alex Young | Comments | Tags tutorials node json es6 generators clojure

Recreating core.async with ES6 Generators

Recreating core.async with ES6 Generators is a post by Andrey Popp inspired by Clojure’s core.async. He uses Browserify with a transpiler module that allows ES6 generators to be used in browsers:

function *listen(el, evType) {
  while (true)
    yield function(cb) {
      var fire = function(ev) {
        el.removeEventListener(evType, fire);
        cb(null, ev);
      };
      el.addEventListener(evType, fire);
    };
}

The tutorial goes on to use JSONP to fetch data from the Wikipedia API. The full source is available as a gist: andreypopp / index.html.
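The core trick is a driver that pumps such generators, calling each yielded function with a Node-style callback and resuming the generator with the result. Here's a minimal sketch of that idea; the run helper is my own name for it, not the tutorial's implementation:

```javascript
// Drive a generator whose yielded values are functions taking
// a Node-style callback(err, value).
function run(gen) {
  var it = gen();
  function step(err, value) {
    // Resume the generator, feeding errors in via throw().
    var result = err ? it.throw(err) : it.next(value);
    if (result.done) return;
    // Each yielded value is a function expecting a callback.
    result.value(step);
  }
  step(null);
}

// Usage sketch: wait for two delayed values in straight-line style.
run(function *() {
  var a = yield function(cb) { setTimeout(function() { cb(null, 'first'); }, 10); };
  var b = yield function(cb) { setTimeout(function() { cb(null, 'second'); }, 10); };
  console.log(a, b);
});
```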


JSON Mask (GitHub: nemtsov / json-mask, License: MIT, npm: json-mask) by Yuriy Nemtsov is a module that provides a DSL for filtering JSON. It works in browsers, includes unit tests, and there’s also Express middleware that can help you filter responses in your Express applications.

One use-case is when you have an HTTP service that returns {"name": "mary", "age": 25} (for example), and you have two clients: one that needs all of that information, and another that just needs the name.

Now, if the server uses JSON Mask, it would accept a ?fields= query-string. The second client would then add the query-string ?fields=name to the request, after which the server would respond with {"name": "mary"}, filtering out the rest of the information.

This saves bandwidth and improves performance; especially on large objects.

Although trivial in the simplest cases, the task of filtering objects quickly becomes difficult. This is why JSON Mask is actually a tiny (and highly optimized) language, loosely based on XPath, with support for filtering parts of objects, arrays, wild-card filtering, and combinations of the three.
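To illustrate the idea (this is a toy, not JSON Mask's implementation or its mask grammar), here's a recursive filter that keeps only the fields named in a pre-parsed mask tree, where an empty object means "keep the whole value":

```javascript
// fields is a mask tree, e.g. { p: { a: {} }, z: {} } keeps p/a and z.
function filterFields(obj, fields) {
  var out = {};
  Object.keys(fields).forEach(function(key) {
    if (!(key in obj)) return;
    var sub = fields[key];
    // An empty sub-mask keeps the value as-is; otherwise recurse.
    out[key] = Object.keys(sub).length === 0 ? obj[key] : filterFields(obj[key], sub);
  });
  return out;
}

var result = filterFields({ p: { a: 1, b: 2 }, z: 1 }, { p: { a: {} }, z: {} });
// result is { p: { a: 1 }, z: 1 } -- note how p/b is filtered out
```

JSON Mask itself also handles arrays, wild-cards, and parsing the compact ?fields= syntax into a tree like the one above.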

Getting Started with Hoodie

11 Nov 2013 | By Alex Young | Comments | Tags tutorials node couchdb
Fast ship?

Hoodie (GitHub: hoodiehq / hoodie.js, License: Apache 2.0, npm: hoodie) is a noBackend framework that uses Node, with a focus on client-side development.

noBackend is an approach to decouple apps from backends, by abstracting backend tasks with frontend code. This allows frontend developers to focus on user experience and gives backend developers more flexibility on the implementation side.

Hoodie applications are based around documents, and are backed by CouchDB. It embraces event-based APIs, JSON across the whole stack, and uses npm for plugins. It’s also designed to be easy to deploy to Nodejitsu.

Applications are instances of Hoodie objects. This provides the entry point to most functionality, including user accounts – Hoodie comes with a baked-in account system. Data is stored per-user through hoodie.store, which is accessible from client-side code. In this respect it’s reminiscent of Meteor.

Data lifecycle stages trigger events, so you can easily see when data is changed in some way:

hoodie.store.on('add:task', function(event, changedObject) {
  // Update the view with the changedObject
});
The API is chainable and will remind client-side developers of jQuery.

As a teaser, here are some of the account handling method calls:

hoodie.account.signUp('joe@example.com', 'secret');
hoodie.account.changeUsername('currentpassword', 'newusername');

To start your own project you’ll need CouchDB installed locally. Then you can npm install -g hoodie-cli, which will allow you to run hoodie new to create a new project. The process is straightforward if you’re already using Node and don’t mind running CouchDB, and it’s fully documented on Hoodie’s website.

The documentation for Hoodie is solid, and I’ve found it easy to follow so far. Although I only learned about it at the Great British Node Conference, the developers have been working on it for over a year. It has sponsors, and is looking for more. It’s a well-presented, open source project, with momentum behind it.

Talking to Sven Lito at the Great British Node Conference I got the impression that the team are treating it as a commercial project, while believing in the open source model. This made me want to get on board and use it myself, so I suggest you give it a try.