DailyJS

The JavaScript blog.


Tag: code-review
Featured

node modules code-review ES6

Negative Array Indexes

Posted on .

Sindre Sorhus sent in negative-array, a module for supporting negative array indexes. It's built using ES6's Proxy.

Proxies allow you to intercept operations on an object, like property lookups and assignments, which means things like profilers become easier to implement. They're created with var proxy = Proxy(target, handler), where target is the object that will be wrapped by the proxy, and handler is an object that implements the proxy API.

The handler can include methods like has, defineProperty, getPrototypeOf, and more, for controlling access to an object. For more details on how this works, see the Direct Proxies page on the ECMAScript Harmony Wiki.

Sindre's module allows you to do this:

var negativeArray = require('negative-array');

// adds negative array index support to any passed array
var unicorn = negativeArray(['pony', 'cake', 'rainbow']);

// get the last item by using a negative index
console.log(unicorn[-1]);  

It'll work in Node 0.8+ with the --harmony flag, and Chrome with Harmony enabled. Visit chrome://flags/#enable-javascript-harmony to set it up.

The implementation is what will probably become a classic pattern: Proxy is used to wrap the array instance with get and set methods that dynamically map the requested array index to something native JavaScript can handle.

Proxy(arr, {  
  get: function (target, name) {
    var i = +name;
    return target[i < 0 ? target.length + i : i];
  },
  set: function (target, name, val) {
    var i = +name;
    return target[i < 0 ? target.length + i : i] = val;
  }
});

I like this example because it adds new functionality that feels like a language feature without changing built-in prototypes. It's clean and fairly easy to understand once you know what Proxy does. If you wanted to learn about proxies but couldn't find any good examples, then check out the source on GitHub.

Featured

databases node modules code-review

NeDB: SQLite for Node

Posted on .

NeDB (GitHub: louischatriot / nedb, License: MIT, npm: nedb) by Louis Chatriot is a datastore for Node that implements a subset of MongoDB's API. It has very modest dependencies (async, underscore, binary-search-tree, mkdirp), and can even be used in browsers:

As of v0.8.0, you can use NeDB in the browser! You can find it and its minified version in the repository, in the browser-version/out directory. You only need to require nedb.js or nedb.min.js in your HTML file and the global object NeDB can be used right away, with the same API as the server version.

The author has been enthusiastically benchmarking the project and is rather confident about its performance. And it really does look like MongoDB:

db.find({ satellites: { $lt: 'Amos' } }, function(err, docs) {  
  // docs is empty since Phobos and Deimos are after Amos in lexicographical order
});

db.update({ system: 'solar' }, { $set: { system: 'solar system' } }, { multi: true }, function(err, numReplaced) {  
  // numReplaced = 3
  // Field 'system' on Mars, Earth, Jupiter now has value 'solar system'
});

// Using a unique constraint with the index
db.ensureIndex({ fieldName: 'somefield', unique: true }, function(err) {  
});

The persistence layer isn't required: databases can be in-memory if desired. To understand how it all works, first take a look at how each database operation is implemented. For example, Datastore.prototype._insert on lines 265 to 268 in datastore.js calls persistence.persistNewState, which is called for anything that changes data (insert, update, remove). The persistNewState method returns early if inMemoryOnly has been set; otherwise it appends UTF-8 encoded serialized models to the file that backs the database.
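
That approach can be sketched roughly like this; it isn't NeDB's actual code, just an illustration of the append-only idea with a simplified signature:

var fs = require('fs');

// Sketch of append-only persistence: every document that changes
// is serialized and appended as one line to the file backing the database
function persistNewState(filename, inMemoryOnly, newDocs, callback) {
  // Nothing to persist for purely in-memory databases
  if (inMemoryOnly) { return callback(null); }

  var toPersist = newDocs.map(function (doc) {
    return JSON.stringify(doc);
  }).join('\n') + '\n';

  fs.appendFile(filename, toPersist, 'utf8', callback);
}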

Models are serialized using serialize in model.js. This uses JSON.stringify with a callback that maps undefined values to null, and checks that keys are valid using similar rules to MongoDB (you can't have keys that start with a dollar sign or contain a full stop).
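
A simplified version of that idea, as a sketch rather than NeDB's exact implementation, looks like this:

// Sketch of serialization: undefined becomes null, and keys starting with '$'
// or containing '.' are rejected, roughly following MongoDB's rules
function serialize(obj) {
  return JSON.stringify(obj, function (key, value) {
    if (key[0] === '$') { throw new Error('Field names cannot begin with the $ character'); }
    if (key.indexOf('.') !== -1) { throw new Error('Field names cannot contain a .'); }
    return value === undefined ? null : value;
  });
}

console.log(serialize({ planet: 'Mars', satellites: undefined }));
// {"planet":"Mars","satellites":null}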

When data is retrieved from disk, async.waterfall from Caolan McMahon's popular async module is used:

Runs an array of functions in series, each passing their results to the next in the array. However, if any of the functions pass an error to the callback, the next function is not executed and the main callback is immediately called with the error.
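
As a generic illustration of waterfall (this isn't NeDB's own loading code, and the file format is simplified):

var fs = require('fs');
var async = require('async');

// Each step passes its results to the next; an error short-circuits
// straight to the final callback
async.waterfall([
  function (cb) {
    fs.readFile('data.db', 'utf8', cb);
  },
  function (rawData, cb) {
    var docs = rawData.split('\n').filter(Boolean).map(function (line) {
      return JSON.parse(line);
    });
    cb(null, docs);
  }
], function (err, docs) {
  if (err) { throw err; }
  console.log('Loaded %d documents', docs.length);
});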

Elsewhere the async module is used to queue commands, which ensures they're executed in the desired order. All commands are passed through an instance of the Executor class, including the loading of data from disk when the persistence layer is initialised.
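
The same idea can be sketched with async.queue, using a concurrency of one so that each command only runs after the previous one has finished (the task format here is illustrative rather than NeDB's exact internals):

var async = require('async');

// A hypothetical asynchronous database operation
function doInsert(doc, cb) {
  console.log('inserting', doc);
  process.nextTick(cb);
}

// Concurrency of one: queued commands run strictly in the order they arrive
var executor = async.queue(function (task, cb) {
  task.fn.apply(null, task.args.concat(cb));
}, 1);

executor.push({ fn: doInsert, args: [{ planet: 'Mars' }] });
executor.push({ fn: doInsert, args: [{ planet: 'Earth' }] });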

By combining async and the native JSON encoder and parser, Louis has made a convincing yet lightweight MongoDB implementation. It ties together a lot of the skills required to work as a Node developer -- asynchronous I/O and event-based programming. NeDB currently has 886 stars on GitHub, so it's clearly a popular project: I suggest taking a look at the source if you're interested in how people use modules like async.

Featured

code-review observer

How Does Watch.js Work?

Posted on .

Last week we noticed a lot of interest in Watch.js (License: MIT) by Gil Lopes Bueno, so I thought it would be interesting to take a look at how it works. It allows changes to properties on objects to be observed -- whenever a change is made, a callback receives the new and old values, as well as the property name:

var ex1 = {  
  attr1: 'initial value of attr1'
, attr2: 'initial value of attr2'
};

ex1.watch('attr1', function() {  
  alert('attr1 changed');
});

The watch method can also accept an array of property names, and omitting it will cause the callback to run whenever any property changes. The unwatch method will remove a watcher.
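
Based on the documented API, the other call forms look something like this (the exact unwatch signature may vary between versions):

function onChange() {
  alert('something changed');
}

// Watch several properties at once
ex1.watch(['attr1', 'attr2'], onChange);

// Watch every property on the object
ex1.watch(onChange);

// Remove the watcher again
ex1.unwatch(['attr1', 'attr2'], onChange);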

First, let's get the elephant out of the room: this implementation modifies Object.prototype -- that's where the watch method comes from in the previous example. The author is planning to change the API to avoid modifying Object.prototype.

Second, this method is polymorphic in that it behaves differently based on the supplied arguments. This is quite common in client-side code. It's implemented in watch by checking the number and type of the arguments, without requiring too much type checking:

WatchJS.defineProp(Object.prototype, "watch", function() {

    if (arguments.length == 1) 
        this.watchAll.apply(this, arguments);
    else if (WatchJS.isArray(arguments[0])) 
        this.watchMany.apply(this, arguments);
    else
        this.watchOne.apply(this, arguments);

});

You're probably wondering what WatchJS.defineProp is. It's actually a convenience method to use ES5's Object.defineProperty in browsers that support it:

defineProp: function(obj, propName, value){  
    try{
        Object.defineProperty(obj, propName, {
                enumerable: false
            , configurable: true
            , writable: false
            , value: value
            });
    }catch(error){
        obj[propName] = value;
    }
}

The watchMany method uses a utility method, WatchJS.isArray, to determine how to loop over the supplied arguments, calling watchOne on each in turn. The watchAll method calls watchMany, so there's a lot of internal code reuse.

Most of the work gets carried out by watchOne. This calls WatchJS.defineGetAndSet(obj, prop, getter, setter) with a custom getter and setter to wrap around values so they can be watched. However, watching values change has a few complications.

For one thing, arrays have mutator methods like push and pop. Therefore, watchFunctions is called to wrap each of these methods with a suitable call to WatchJS.defineProp. Also, when a property is set to a new value, all of these wrapped methods will be lost. To get around this, the custom setter calls obj.watchFunctions(prop) again.
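
The general idea of wrapping an array's mutator methods so watchers get notified can be sketched like this (a simplified illustration rather than Watch.js's exact code):

// Wrap an array's mutator methods so a callback runs after each mutation
function watchMutators(arr, onChange) {
  ['pop', 'push', 'reverse', 'shift', 'sort', 'splice', 'unshift'].forEach(function (method) {
    var original = arr[method];
    arr[method] = function () {
      var result = original.apply(this, arguments);
      onChange(method);   // notify watchers after the array has changed
      return result;
    };
  });
}

var list = ['pony', 'cake'];
watchMutators(list, function (method) { console.log(method + ' called'); });
list.push('rainbow');   // logs "push called"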

When a value has changed, callWatchers is called. An internal list of watcher callbacks indexed on property names is maintained and called by a for-in loop. It's important to note that these callbacks only run if the values are actually different. This is tested by calling JSON.stringify(oldval) != JSON.stringify(newval) -- presumably the author used this approach because it's an easy way to compare the value of objects. I'd consider benchmarking this against other solutions.

Finally, we get to WatchJS.defineGetAndSet, which attempts to use Object.defineProperty if available, otherwise Object.prototype.__defineGetter__.call and Object.prototype.__defineSetter__.call are used.

Conclusion

Rather than using exceptions to track browser support, it might be better to use capability testing to check for Object.defineProperty and Object.prototype.__defineGetter__.call when the first call is made, then cache the result. As noted in the project's issues, Object.observe should provide a more efficient approach in the future -- this is an accepted Harmony proposal.
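
Such a capability test could run once and cache the result, something along these lines (an illustrative sketch, not code from the project):

var defineGetAndSet = (function () {
  // Run the capability test once; IE8's Object.defineProperty only works on
  // DOM nodes, which is why a plain object is used for the test
  var hasDefineProperty = (function () {
    try {
      Object.defineProperty({}, 'x', { value: 1 });
      return true;
    } catch (e) {
      return false;
    }
  })();

  return hasDefineProperty
    ? function (obj, prop, getter, setter) {
        Object.defineProperty(obj, prop, {
          get: getter,
          set: setter,
          enumerable: true,
          configurable: true
        });
      }
    : function (obj, prop, getter, setter) {
        obj.__defineGetter__(prop, getter);
        obj.__defineSetter__(prop, setter);
      };
})();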

To really mature this project, changing the API to avoid modifying Object.prototype would be a good idea, and adding benchmarks and unit tests would be useful as well.

This article was based on watch.js commit dc9aac6a6e.

Featured

tutorials mvc backbone.js code-review

Backbone.js: Internals Summary

Posted on .

Over the last month we've looked at Backbone's internals in detail. This post summarises these findings.

To recap, here's a list of all the parts in Backbone.js: Hacker's Guide:

  • Part 1: Setup, Events, Models
  • Part 2: Constructor, Inheritance, Collections, Chainable API
  • Part 3: Router, History, Views
  • Part 4: Inheritance, Sync

Leverage Events

Backbone's classes are designed to be inherited from. Every single one of these classes inherits from Backbone.Events:

  • Backbone.Model
  • Backbone.Collection
  • Backbone.Router
  • Backbone.History
  • Backbone.View

That means when designing applications built with Backbone, events are a key architectural component. Events are the standard way to deal with user interface actions, through the declarative event bindings on views, and also with model and collection changes. You can also easily add your own custom events.
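
Mixing Backbone.Events into any object gives it on and trigger, so custom events can serve as a simple application-level message bus. For example:

// Create a plain object that can emit and listen to events
var dispatcher = _.extend({}, Backbone.Events);

dispatcher.on('upload:finished', function (fileName) {
  console.log('Finished uploading', fileName);
});

dispatcher.trigger('upload:finished', 'report.pdf');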

When learning Backbone it's important to get a feel for the built-in event names. Incorrectly binding a collection reset event, for example, could cause your application to render more often than it should. Mastering events is one of the quickest ways to become more productive with Backbone.

Underscore.js

Since Backbone depends on Underscore, it's worth keeping this in mind when dealing with any kind of arrays or collections of data. Also, familiarity with Underscore's methods will help you work with Backbone.Collection effectively.
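
Many Underscore methods are proxied directly onto collections, so code like this is common (here collection is assumed to be a Backbone.Collection of models with name and price attributes):

// Underscore methods are available directly on the collection
var names = collection.pluck('name');

var expensive = collection.filter(function (model) {
  return model.get('price') > 100;
});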

Views

It's easy to slip into using $, but avoid this where possible. Backbone caches a view's element, so use this.$el instead. Design views based on the single responsibility principle.

It might be tempting to let a "container" view render HTML directly by using $().html(), but resisting the temptation and creating a hierarchy of views will make it much easier to debug your code and write automated tests.
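
For example, a container view can instantiate child views and append their elements rather than building HTML strings itself (a rough sketch; ItemView and this.collection are assumed to exist):

var ListView = Backbone.View.extend({
  render: function () {
    this.collection.each(function (model) {
      var item = new ItemView({ model: model });
      this.$el.append(item.render().el);
    }, this);
    return this;
  }
});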

Interestingly, Backbone doesn't have a lot of code dedicated to templates, but views work well with a template method, typically built with Underscore's _.template. I use this with RequireJS text file dependencies to load remote templates during development, then I use the RequireJS build script to generate something suitable for deployment. This makes code easy to test and fast to load.
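
With the RequireJS text plugin that workflow looks roughly like this (the paths and file names are just examples):

define(['backbone', 'underscore', 'text!templates/user.html'],
  function (Backbone, _, userTemplate) {

  return Backbone.View.extend({
    // Compile the remote template once, when the module is defined
    template: _.template(userTemplate),

    render: function () {
      this.$el.html(this.template(this.model.toJSON()));
      return this;
    }
  });
});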

API Style

Backbone's API is thankfully very consistent. Even the history API accepts a silent option, which is used throughout the library to stop events from firing when they're not required.

Backbone's collections have Underscore's chainable API, which can be handy, but care must be taken to use this correctly.
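
Since chain, filter, map, and friends are proxied from Underscore, queries can be written like this; the important detail is remembering to call value() at the end to unwrap the result:

var adultNames = collection.chain()
  .filter(function (model) { return model.get('age') >= 18; })
  .map(function (model) { return model.get('name'); })
  .value();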

Testing Backbone

So far I've been reviewing Backbone's code to demystify the framework as a whole. However, it's worth noting that other technologies work very well with Backbone and Underscore. RequireJS and AMD modules can be a great way to break up projects.

However, one area that Backbone doesn't address is testing. This is unfortunate, because testing Backbone projects definitely isn't obvious. In Unit Testing Backbone.js Apps With QUnit And SinonJS, Addy Osmani describes one method in detail.

I have the following rules for testing Backbone projects:

  1. The full application should be running during testing
  2. Tests shouldn't depend on any markup in the test harness HTML file (or as little as possible)
  3. Tests shouldn't touch the network for data

The second rule in particular is aided by using templates loaded by RequireJS and avoiding those pesky calls to $() in views.
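
For the third rule, Sinon's fake server is one way to keep tests off the network entirely (the URL, fixture, and Users collection below are made up for illustration):

// Queue a canned response instead of letting fetch() hit the network
var server = sinon.fakeServer.create();
server.respondWith('GET', '/users', [
  200,
  { 'Content-Type': 'application/json' },
  JSON.stringify([{ id: 1, name: 'Alex' }])
]);

var users = new Users();
users.fetch();
server.respond();           // flush the queued fake response

console.log(users.length);  // 1, without any real HTTP traffic

server.restore();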

Featured

tutorials mvc backbone.js code-review

Backbone.js: Hacker's Guide Part 4

Posted on .

Last week we looked at Backbone's History and View APIs. We're coming to the end of this detailed look at Backbone's internals, but there are still a few interesting things left:

  • Backbone's inheritance implementation
  • Backbone.sync

Backbone's inheritance Implementation

The comments indicate that the inherits function is inspired by goog.inherits. Google's implementation is from the Closure Library, but Backbone's API accepts two objects (incorrectly referred to as a hash) containing "instance" and "static" methods. Each of Backbone's objects has an extend method:

Model.extend = Collection.extend = Router.extend = View.extend = extend;  

Most development with Backbone is based around inheriting from these objects, and they're designed to mimic a classical object-oriented implementation.

Backbone uses Underscore's extend method:

each(slice.call(arguments, 1), function(source) {  
  for (var prop in source) {
    obj[prop] = source[prop];
  }
});
return obj;  

This isn't the same as ES5's Object.create: it's actually copying properties (methods and values) from one object to another. Since this isn't enough to support Backbone's inheritance and class model, the following steps are performed (a condensed sketch follows the list):

  1. The instance methods are checked to see if there's a constructor property. If so, the class's constructor is used, otherwise the parent's constructor is used (for example, Backbone.Model)
  2. Underscore's extend method is called to add the parent class's methods to the new child class
  3. The prototype property of a blank constructor function is assigned with the parent's prototype, and a new instance of this is set to the child's prototype property
  4. Underscore's extend method is called twice to add the static and instance methods to the child class
  5. The child's prototype's constructor and a __super__ property are assigned
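
Condensed, the sequence looks something like this (a simplified sketch of the pattern rather than Backbone's exact source):

var extend = function (protoProps, staticProps) {
  var parent = this;
  var child;

  // 1. Use a supplied constructor if there is one, otherwise defer to the parent's
  if (protoProps && protoProps.hasOwnProperty('constructor')) {
    child = protoProps.constructor;
  } else {
    child = function () { return parent.apply(this, arguments); };
  }

  // 2. Copy static properties from the parent (and staticProps) onto the child
  _.extend(child, parent, staticProps);

  // 3. Link the prototype chain through a blank surrogate constructor
  var Surrogate = function () { this.constructor = child; };
  Surrogate.prototype = parent.prototype;
  child.prototype = new Surrogate();

  // 4. Add the instance methods to the child's prototype
  if (protoProps) _.extend(child.prototype, protoProps);

  // 5. Keep a reference to the parent's prototype for "super" calls
  child.__super__ = parent.prototype;

  return child;
};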

This pattern is also used for classes in CoffeeScript, so Backbone classes are compatible with CoffeeScript classes.

Update: Jeremy Ashkenas clarified this process on Twitter:

... it's just your basic prototype chain, plus one extra goodie: any constructor properties (static) are copied over as well.

Backbone's Sync API

The Backbone.sync method is intended to be overridden to support other backends. The built-in method is tailored to a certain breed of RESTful JSON APIs -- Backbone was originally extracted from a Ruby on Rails application, which uses HTTP methods like PUT in the same way.

The way this works is the model and collection classes have a sync method that calls Backbone.sync. Both will call this.sync internally when fetching, saving, or deleting items.

The sync method is called with three parameters:

  • method: One of create, update, delete, read
  • model: The Backbone model object
  • options: May include success and error methods

Implementing a new sync method can use the following pattern:

Backbone.sync = function(method, model, options) {
  options || (options = {});

  var requestContent = {};

  function success(result) {
    // Handle results from MyAPI
    if (options.success) {
      options.success(result);
    }
  }

  function error(result) {
    // Handle results from MyAPI
    if (options.error) {
      options.error(result);
    }
  }

  switch (method) {
    case 'create':
      requestContent['resource'] = model.toJSON();
      return MyAPI.create(model, success, error);

    case 'update':
      requestContent['resource'] = model.toJSON();
      return MyAPI.update(model, success, error);

    case 'delete':
      return MyAPI.destroy(model, success, error);

    case 'read':
      if (model.attributes[model.idAttribute]) {
        return MyAPI.find(model, success, error);
      } else {
        return MyAPI.findAll(model, success, error);
      }
  }
};

This pattern delegates API calls to a new object, which could be a Backbone-style class that supports events. This can be safely tested separately, and potentially used with libraries other than Backbone.

There are quite a few sync implementations out there: