AngularJS: Form Validation

06 Jun 2013 | By Alex Young | Comments | Tags angularjs angularfeeds mvc bower

This week we’re going to look at form validation with AngularJS. Angular has several directives that support form field validation, and they’re based on the HTML5 form validators. You can specify that a field is required, a certain size, a certain type, and should match a given pattern.

URL Validation

Chrome's validation message

This tutorial series is about a feed reader, so it’s lucky that one of the standard HTML5 validators is for checking URLs. It can be used by adding the type="url" attribute to an input. Angular supports this through the input url directive. It takes various options, of which we’re interested in required and ng-model.

The ng-model directive allows the input to be linked to a model, but any Angular expression can be used. The form directive allows forms to be managed with Angular, and bound to controllers.

Just adding a form and an input with type="url" results in some basic validation support (in app/views/main.html):

<form name="newFeed">
  URL: <input size="80" name="url" ng-model="newFeed.url" type="url" required>
  <button ng-click="addFeed(newFeed)">Add Feed</button>
</form>

However, this won’t quite work with the controller code that I wrote in the previous parts because addFeed isn’t set up to check validation.

Checking Validation State

In a controller, a bound value can be interrogated for the validation status by checking the $valid property. The previous addFeed, in app/scripts/controllers/main.js, can be changed as follows:

$scope.addFeed = function(feed) {
  if (feed.$valid) {
    // Copy this feed instance and reset the URL in the form
    $scope.feeds.push(feed);
    $scope.newFeed.url = {};
  }
};

This should work, but it does one thing wrong: $scope.newFeed.url can’t be reset by assigning an object literal to it, because newFeed is now decorated with internal properties to support validation. Instead, copy the new object, and reset the values in newFeed:

$scope.addFeed = function(feed) {
  if (feed.$valid) {
    // Copy this feed instance and reset the URL in the form
    var newFeed = angular.copy(feed);
    $scope.feeds.push(newFeed);
    $scope.fetchFeed(newFeed);
    $scope.newFeed.url = '';
  }
};

Fighting with HTML5

We should probably add error messages that are cross-browser compatible. To do that, you can use the ng-show directive:

<form name="newFeed" novalidate>
  URL: <input size="80" name="url" ng-model="newFeed.url" type="url" required>
  <button ng-click="addFeed(newFeed)">Add Feed</button>
  <span class="error" ng-show="newFeed.$error.required">Required!</span>
  <span class="error" ng-show="newFeed.$error.url">Invalid URL format!</span>
</form>

The ngShow directive can conditionally show part of the DOM based on an Angular expression – in this case the validation results are checked. Incidentally, validation results can be found in the $error property of the model.

Also notice that I added the novalidate attribute to the form; if you don’t do this, HTML5 validation will still kick in, which causes confusing behaviour.

Disabling the Button

Another nice touch is to use ng-disabled to disable the button when an invalid URL has been entered. The ngDisabled directive takes an Angular expression, like the previous directives discussed here:

<form name="newFeed" novalidate>
  URL: <input size="80" name="url" ng-model="newFeed.url" type="url" required>
  <button ng-disabled="!newFeed.$valid" ng-click="addFeed(newFeed)">Add Feed</button>
  <span class="error" ng-show="newFeed.$error.required">Required!</span>
  <span class="error" ng-show="newFeed.$error.url">Invalid URL format!</span>
</form>

The difference here is I’ve used ! to negate the expression: !newFeed.$valid. Yes, it’s really that easy!

Conclusion

There’s more to expressions than simple model-based truth tests – you can do pretty much anything short of control flow statements. For more, see Angular Developer Guide, Expressions.

The latest commit for this project was 0dcc996.

Node Roundup: 0.8.24, 0.10.10, speakingurl, node-xmljson

05 Jun 2013 | By Alex Young | Comments | Tags node modules web urls xml json
You can send in your Node projects for review through our contact form.

Node 0.8.24 and 0.10.10

Node 0.8.24 and Node 0.10.10 have been released. The 0.8 (maintenance) release gets an updated npm, and some fixes for the url and http core modules.

Meanwhile, 0.10.10 has a new version of the internal uv library, and unshift('') now behaves like a noop.

speakingurl

Sascha Droste sent in speakingurl (GitHub: pid / speakingurl, License: BSD, npm: speakingurl), a module for generating clean URL slugs:

var getSlug = require('speakingurl');

var slug = getSlug('Apple & Pear!');
console.log(slug);
// Output: apple-and-pear

slug = getSlug('Foo ♥ Bar');
console.log(slug);
// Output: foo-love-bar

It has tests, localisation support, and works in browsers.

node-xmljson

node-xmljson (GitHub: ExactTarget / node-xmljson, License: MIT, npm: xmljson) from Adam Alexander and Benjamin Dean of ExactTarget was just released, providing quick and simple bi-directional translation between XML and JSON formats.

XML to JSON:

// Load the module
var to_json = require('xmljson').to_json;

// An XML string
var xml = '' +
    '<data>' +
        '<prop1>val1</prop1>' +
        '<prop2>val2</prop2>' +
        '<prop3>val3</prop3>' +
    '</data>';

to_json(xml, function (error, data) {
    // Module returns a JS object
    console.log(data);
    // -> { prop1: 'val1', prop2: 'val2', prop3: 'val3' }

    // Format as a JSON string
    console.log(JSON.stringify(data));
    // -> {"prop1":"val1","prop2":"val2","prop3":"val3"}
});

JSON to XML:

// Load the module
var to_xml = require('xmljson').to_xml;

// A JSON string
var json = '' +
    '{' +
        '"prop1":"val1",' +
        '"prop2":"val2",' +
        '"prop3":"val3"' +
    '}';

to_xml(json, function (error, xml) {
    // Module returns an XML string
    console.log(xml);
    // -> <data><prop1>val1</prop1><prop2>val2</prop2><prop3>val3</prop3></data>
});

ExactTarget has also released Fuel UX (GitHub: ExactTarget / fuelux, License: MIT), a lightweight web UI library that extends Twitter Bootstrap with additional JavaScript controls.

jQuery Roundup: 1.10.1, 2.0.2, ikSelect, photoWall.js

04 Jun 2013 | By Alex Young | Comments | Tags jquery plugins select galleries images
Note: You can send your plugins and articles in for review through our contact form.

jQuery 1.10.1 and 2.0.2

“A new release already? It’s only been a week! Yes, because you deserve it. We’re greatly encouraged by all the people who upgraded and found our well-hidden ‘we completely hosed relative animations’ easter egg,” writes Dave Methvin, about the release of jQuery 1.10.1 and 2.0.2.

The full background to this bug was documented in ticket #13939. Another animation-related bug was fixed, along with an IE selector/iframe issue.

ikSelect

ikSelect (GitHub: Igor10k / ikSelect) by “Igor10k” is another select replacement plugin! This one supports custom markup and inline-block, optgroup, adding and removing options, callbacks, and event triggers.

The API is clean and idiomatic jQuery. There’s a single entry point, $(selector).ikSelect which can be used to apply the plugin to a select, or issue commands to an instance of ikSelect. Various options are supported, like setting the width automatically and adding search support (known as filtering in this case).
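
To get a feel for the entry point described above, here’s a minimal usage sketch; the options object is left empty because the plugin’s option names aren’t listed in this post.

// Apply ikSelect to every select element on the page
$('select').ikSelect({});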

photoWall.js

photoWall.js (GitHub: jeremyjcpaul / photowall, License: MIT, jQuery: photowall) by Jeremy JC Paul creates a photo gallery in a similar style to Google+/Picasa, where clicking on an image opens a panel that displays additional metadata.

The markup allows the photo’s title and description to be specified like this:

<div class="photowall">
  <div class="pw-slide">
    <img class="pw-image" src="images/image-filename.jpg" />
    <div class="pw-image-desc">
      <!-- Any HTML content can go in here. -->
    </div>
  </div>
</div>

The plugin is invoked using $(selector).photoWall(), and supported options include event handlers and animation speed.
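
Putting the markup and the entry point together, a minimal setup looks something like this; no options are passed because their names aren’t given in this post.

// Turn every .photowall container into a gallery once the DOM is ready
$(function() {
  $('.photowall').photoWall();
});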

Angular Smart Table, TurtleScript

03 Jun 2013 | By Alex Young | Comments | Tags angular rust angularjs libraries education

Smart Table

Smart Table

Smart Table (GitHub: lorenzofox3 / Smart-Table, License: MIT) by Laurent Renard helps quickly render data as tables in AngularJS projects. It provides the smart-table directive which will render a rowCollection – an array that contains objects for each row. It also supports layouts by specifying the columns with columnCollection, data formatting, and sorting.

Smart Table has some more advanced features as well, like styling and inline editing. Laurent has included API documentation and unit tests.

TurtleScript

TurtleScript (GitHub: cscott / TurtleScript, License: GPLv2) by C. Scott Ananian from One Laptop per Child aims to provide a Logo-like environment for teaching programming. TurtleScript itself is based on JavaScript, and uses a bytecode compiler/interpreter.

The TurtleScript documentation has a lot more background that explains what it does and how it works. Meanwhile, Scott has been working on rusty-turtle (GitHub: cscott / rusty-turtle, License: GPLv2). This is a TurtleScript implementation written in Rust. If you’re interested in the Rust language and want to see what a JavaScript parser in Rust might look like, check it out!

Rusty-turtle is a “native” bytecode interpreter, so it runs the TurtleScript parser and compiler in order to generate bytecode for it to run.

Generators and Suspend

31 May 2013 | By Alex Young | Comments | Tags node modules es6

ECMAScript 6 generators are at the draft stage, and available in Node 0.11 when node is run with --harmony or --harmony-generators. Generators are “first-class coroutines” – think functions that can be suspended and resumed.

Generators are denoted with function*, and produce values with the yield keyword. The value isn’t really returned: yield can be placed inside a loop, and generator.next() is called to fetch each yielded value. The generator is said to be an iterator – it can be provided as the expression to an iteration statement like for...of:

function* generator() {
  var someValue = 0;
  for (;;) {
    yield someValue++;
  }
}

for (var value of generator()) {
  // Do something with `value`,
  // then `break` when enough values have been yielded
  if (value >= 10) break;
}

The ECMAScript 6 wiki has a Fibonacci sequence example, but generators don’t really hit their conceptual stride until you start hooking generators up to other generators. The classic example of this is consumer-producer relationships: generators that produce values, and then consumers that use them. The two generators are said to be symmetric – a continuous evaluation where coroutines yield to each other, rather than two functions that call each other.
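
To get a feel for that, here’s a small sketch of a producer generator feeding a consumer generator by hand. It isn’t true symmetric coroutines, but it shows values flowing in both directions through yield and next():

function* producer() {
  var n = 0;
  for (;;) {
    yield n++;
  }
}

function* consumer() {
  for (;;) {
    var received = yield;
    console.log('consumed', received);
  }
}

var p = producer();
var c = consumer();
c.next(); // run the consumer up to its first yield so it's ready to receive

for (var i = 0; i < 3; i++) {
  c.next(p.next().value); // pull a value from the producer, push it into the consumer
}
// -> consumed 0, consumed 1, consumed 2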

Jeremy Martin sent in a small but novel module based on generators called suspend (GitHub: jmar777 / suspend, License: MIT, npm: suspend). As it needs Node 0.11 and for Node to be run with --harmony, let’s just say it’s academically interesting for now.

You can think of suspend as an early example of generators that feature an idiomatic Node API:

// async without suspend
async.map(['file1','file2','file3'], fs.stat, function(err, results) {
  // results is now an array of stats for each file
});

// async with suspend
var res = yield async.map(['file1','file2','file3'], fs.stat, resume);

Here the async module has been modified to use suspend, resulting in more concise code.

suspend is “red light, green light” for asynchronous code execution. yield means stop, and resume means go.

If this sounds familiar, that’s because it’s not semantically too different to node-fibers. The node-fibers documentation includes a comparison between the ES6 generators example and its own syntax.

This is the entire source to suspend:

var suspend = module.exports = function suspend(generator, opts) {
  opts || (opts = {});

  return function start() {
    Array.prototype.unshift.call(arguments, function resume(err) {
      if (opts.throw) {
        if (err) return iterator.throw(err);
        iterator.send(Array.prototype.slice.call(arguments, 1));
      } else {
        iterator.send(Array.prototype.slice.call(arguments));
      }
    });
    var iterator = generator.apply(this, arguments);
    iterator.next();
  };
};

The suspend function accepts a generator and returns a function. The generator supplied to suspend is passed the resume function, which accepts an error argument to fit Node’s callback API style. The generator can then yield a call to an asynchronous function, passing resume as its callback, allowing Node’s core modules (or any other asynchronous methods) to be used in a synchronous style:

suspend(function* (resume) {
  var data = yield fs.readFile(__filename, resume);
})();

I liked this twist on generators, and I think modules like this will start to become more important in the JavaScript community over the next few years.

AngularJS: Adding Dependencies

30 May 2013 | By Alex Young | Comments | Tags angularjs mvc angularfeeds bower grunt

Adding Dependencies with Bower

This tutorial is really about Yeoman, Bower, and Grunt, because I still feel like it’s worth exploring the build system that I introduced for this AngularJS project. I appreciate that the number of files installed by Yeoman is a little bit bewildering, so we’re going to take a step back from AngularJS and look at how dependencies work and how to add new dependencies to a project.

Although Yeoman helps get a new project off the ground, it takes a fair amount of digging to figure out how everything is laid out. For example: let’s say we want to add sass-bootstrap to djsreader – how exactly do we do this?

Yeoman uses Bower for managing dependencies, and Bower uses component.json (or bower.json by default in newer versions). To add sass-bootstrap to the project, open component.json and add "sass-bootstrap": "2.3.x" to the dependencies property:

{
  "name": "djsreader",
  "version": "0.0.0",
  "dependencies": {
    "angular": "~1.0.5",
    "json3": "~3.2.4",
    "es5-shim": "~2.0.8",
    "angular-resource": "~1.0.5",
    "angular-cookies": "~1.0.5",
    "angular-sanitize": "~1.0.5",
    "sass-bootstrap": "2.3.x"
  },
  "devDependencies": {
    "angular-mocks": "~1.0.5",
    "angular-scenario": "~1.0.5"
  }
}

Next run bower install to install the dependencies to app/components. If you look inside app/components you should see sass-bootstrap in there.

Now the package is installed, how do we actually use it with our project? The easiest way is to create a suitable Grunt task.

Grunt

Grunt runs the djsreader development server and compiles production builds that can be dropped onto a web server. Gruntfile.js is mostly configuration – it has the various settings needed to drive Grunt tasks so they can build our project. One task is compass – if you search the file for compass you should see a property that defines some options for compiling Sass files.

The convention for Grunt task configuration is taskName: { target: options }. We want to add a new target to the compass task for building the Bootstrap Sass files. We know the files are in app/components/sass-bootstrap, so we just need to tell it to compile the files in there.

Add a new property to compass called bootstrap. It should be on line 143:

compass: {
  // options/dist/server
  bootstrap: {
    options: {
      sassDir: '<%= yeoman.app %>/components/sass-bootstrap/lib',
      cssDir: '.tmp/styles'
    }
  }
}

Near the bottom of the file, add an entry for compass:bootstrap to the grunt.registerTask('server', [...]) and grunt.registerTask('build', [...]) task lists:

grunt.registerTask('server', [
  'clean:server',
  'coffee:dist',
  'compass:server',
  'compass:bootstrap', /* This one! */
  'livereload-start',
  'connect:livereload',
  'open',
  'watch'
]);

This causes the Bootstrap .scss files to be compiled whenever a server is started.

Now open app/index.html and add styles/bootstrap.css:

<link rel="stylesheet" href="styles/bootstrap.css">
<link rel="stylesheet" href="styles/main.css">

Conclusion

Angular/Bootstrap

The settings files Yeoman created for us make managing dependencies easy – there’s a world of cool things you can find with bower search and try out.

This week’s code is in commit 005d1be.

Node Roundup: 0.10.8, msfnode, vnc-over-gif

29 May 2013 | By Alex Young | Comments | Tags node modules security gif vnc
You can send in your Node projects for review through our contact form.

Node 0.10.8

Node 0.10.8 was released last week. v8, uv, and npm were all upgraded, and there are fixes and improvements to be found in the http, buffer, and crypto modules. This is the third stable release so far in May.

msfnode

Metasploit

msfnode (GitHub: eviltik / msfnode, License: GPL 3, npm: msfnode) by Michel Soisson is a Metasploit API client for Node. Metasploit is a hugely popular penetration testing framework. This module allows you to use Node to script Metasploit. The Metasploit API supports things like managing jobs, loading plugins, and interacting with open sessions to compromised systems.

The module provides a metasploitClient constructor, which can be passed an object that contains the Metasploit server’s details, including login and password. The client is event-based, and the project’s readme has an example of how to get a login token and make a request against a server.

vnc-over-gif

Andrey Sidorov sent in vnc-over-gif (GitHub: sidorares / vnc-over-gif, License: MIT, npm: vnc-over-gif), a VNC viewer that uses animated gifs as the data transport. It currently has no client-side JavaScript, so it acts purely as a means of viewing a VNC session. The author is interested in expanding it further with an Ajax-based UI. There’s a good background to the project in the vnc-over-gif FAQ.

Although it isn’t interactive, it’s a great hack – the code is currently only around 50 lines, which Andrey says took just 30 minutes to write.

jQuery Roundup: 1.10.0, 2.0.1, AopJS, Backbone.Cache and Backbone.Cleanup

28 May 2013 | By Alex Young | Comments | Tags jquery plugins backbone.js aspect-oriented
Note: You can send your plugins and articles in for review through our contact form.

1.10.0 and 2.0.1

jQuery 1.10.0 and 2.0.1 were released last week:

Our main goal with these two releases is to synchronize the features and behavior of the 1.x and 2.x lines, as we pledged a year ago when jQuery 2.0 was announced. Going forward, we’ll try to keep the two in sync so that 1.11 and 2.1 are feature-equivalent for example.

Even though these newer jQuery releases have shed the legacy IE support baggage, there are still IE-specific fixes: IE9 focus of death.

AopJS

AopJS (GitHub: victorcastroamigo / aopjs, License: MIT, jQuery: aop) by Víctor Castro Amigo is a minimal aspect-oriented library for JavaScript, with a jQuery plugin. It has a chainable API that can be used to define advice, with various types: before, after, afterReturning, afterThrowing, and around.

The author has included unit tests, and the readme has plenty of examples. The jQuery portion of the project doesn’t add any specific aspect-oriented enhancements to jQuery itself, it just binds to $.aop.aspect.

Backbone.Cache and Backbone.Cleanup

Naor Ye sent in some of his Backbone.js plugins. Backbone.Cache allows you to define a cache object that models and collections can use. You’ll need to make your models and collections inherit from the right classes: Backbone.CachedModel and Backbone.CachedCollection provide a cacheObject property that can be used to point to a suitable object to use as a cache.

Backbone.Cleanup offers parent classes for views and Backbone.Router to help you clean and reuse nested views. A method called markCurrentView is used to set the current view, so when the view is no longer active its cleanup method will be triggered.

Cytoscape.js

27 May 2013 | By Alex Young | Comments | Tags libraries node maths

Cytoscape

Cytoscape.js (GitHub: cytoscape / cytoscape.js, License: LGPL, npm: cytoscape), developed at the Donnelly Centre at the University of Toronto by Max Franz, is a graph library that works with Node and browsers. This library is for working with “graphs” in the mathematical sense – interconnected sets of nodes connected by edges.

The API uses lots of sensible JavaScript idioms: it’s event-based, functions return objects so calls can be chained, JSON definitions of elements can be used, and nodes can be selected with selectors that are modelled on CSS selectors and jQuery’s API. That means you can query a graph with something like this: cy.elements('node:locked, edge:selected').
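
Here’s a rough sketch of what that API looks like in practice. It assumes the constructor-based, headless initialisation and may differ in details from the version available at the time of writing:

var cytoscape = require('cytoscape');

var cy = cytoscape({
  elements: [
    { data: { id: 'a' } },                           // a node
    { data: { id: 'b' } },                           // another node
    { data: { id: 'ab', source: 'a', target: 'b' } } // an edge from a to b
  ]
});

// Event-based: react whenever a node is selected
cy.on('select', 'node', function() {
  console.log('a node was selected');
});

// Chainable, jQuery-like calls and CSS-style selectors
cy.$('#a').select();
console.log(cy.elements('node:selected').length); // -> 1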

Styling graphs is also handled in a natural manner:

Graph style and data should be separate, and the library should provide core functionality with extensions adding functionality on top of the library.

Max and several contributors have been working on the project for two years now, so it’s quite mature at this point. The project comes with detailed documentation, a build script, and a test suite written with QUnit.

Node Hosting with Modulus

25 May 2013 | By Alex Young | Comments | Tags node hosting

Modulus

Modulus.io is a new hosting platform dedicated to Node. Why “platform”? Well, Modulus provides a complete stack for web application development: MongoDB is used for the database and file storage, and WebSockets are supported out of the box. Applications running on the Modulus stack get metrics – requests are logged and analysed in real-time. Horizontal scaling is supported by running multiple instances of your application.

Pricing is determined by the number of instances (servos) that you run, and storage used. The Modulus pricing page has some sliders, allowing you to see how much it’ll cost to run your application per-month.

I asked Modulus about using different versions of Node, as Heroku supports 0.4 to 0.10. However, at the time of writing only Node 0.8.15 is supported. Ghuffran Ali from Modulus said that they’re working on supporting multiple Node versions as soon as Monday (27th May), so keep an eye on the Modulus blog for details on that.

It’s easy to get started with Modulus – there’s a sample project, plus you can sign in with GitHub so it doesn’t take too much effort to get a basic application running. They’re also offering $15 free credit, so you could run something more substantial there to see how everything works.

Modulus uses a web-based interface for managing projects that allows various settings to be changed, like environmental variables, and a global SSL redirect. There’s also a command-line client – if you sign in with GitHub make sure you use modulus login with -g so you can sign in with your GitHub account.

On a related note, IrisCouch has joined Nodejitsu. That means CouchDB and Redis are now both supported by Nodejitsu:

This means that our users will be able to deploy their applications and databases from the same set of tools all backed by node.js. If you’re an existing IrisCouch user you will be notified and given ample time to migrate your IrisCouch account into a Nodejitsu account.

It’s impressive to see so much innovation in the Node hosting/PaaS space!

AngularJS: More on Dependency Injection

23 May 2013 | By Alex Young | Comments | Tags angularjs mvc angularfeeds

In the AngularJS tutorials I’ve been writing, you might have noticed the use of dependency injection. In this article I’m going to explain how dependency injection works, and how it relates to the small tutorial project we’ve created.

Dependency injection is a software design pattern. The motivation for using it in Angular is to make it easier to transparently load mocked objects in tests. The $http module is a great example of this: when writing tests you don’t want to make real network calls, but defer the work to a fake object that responds with fixture data.

The earlier tutorials used dependency injection for this exact use case: in the main controller, MainCtrl is set up to load the $http service, which can then be transparently replaced during testing.

angular.module('djsreaderApp')
  .controller('MainCtrl', function($scope, $http, $timeout) {

Now forget everything I just said about dependency injection, and look at the callback that has been passed to .controller in the previous example. The $http and $timeout arguments have been added by me because I want to use the $http and $timeout services. These are built-in “services” (an Angular term), but they’re not standard arguments. In fact, I could have specified these arguments in any order:

angular.module('djsreaderApp')
  .controller('MainCtrl', function($scope, $timeout, $http) {

This is possible because Angular looks at the function argument names to load dependencies. Before you run away screaming about magic, it’s important to realise that this is just one way to load dependencies in Angular projects. For example, this is equivalent:

angular.module('djsreaderApp')
  .controller('MainCtrl', ['$scope', '$http', '$timeout', function($scope, $http, $timeout) {

The array-based style is more like AMD, and requires a little bit of syntactical overhead. I call the first style “introspective dependency injection”. The array-based syntax allows us to use different names for the dependencies, which can be useful sometimes.
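
For example, because the strings determine what gets injected, the function’s parameter names can be anything:

angular.module('djsreaderApp')
  .controller('MainCtrl', ['$scope', '$http', '$timeout', function(scope, http, timeout) {
    // scope is $scope, http is $http, timeout is $timeout
  }]);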

This raises the question: how does introspective dependency injection cope with minimisers, where variables are renamed to shorter values? Well, it doesn’t cope with it at all. In fact, minimisers need help to translate the first style to the second.

Yeoman and ngmin

One reason I built the tutorial series with Yeoman was because the Angular generator includes grunt-ngmin. This is a Grunt task that uses ngmin – an Angular-aware “pre-minifier”. It allows you to use the shorter, introspective dependency injection syntax, while still generating valid minimised production builds.

Therefore, building a production version of djsreader with grunt build will correctly generate a deployable version of the project.

Why is it that almost all of Angular’s documentation and tutorials include the potentially dangerous introspective dependency injection syntax? I’m not sure, and I haven’t looked into it. I’d be happier if the only valid solution was the array-based approach, which looks more like AMD – something most of us are already comfortable with anyway.

Just to prove I’m not making things up, here is the minimised source for djsreader:

"use strict";angular.module("djsreaderApp",[]).config(["$routeProvider",function(e){e.when("/",{templateUrl:"views/main.html",controller:"MainCtrl"}).otherwise({redirectTo:"/"})}]),angular.module("djsreaderApp").controller("MainCtrl",["$scope","$http","$timeout",function(e,r,t){e.refreshInterval=60,e.feeds=[{url:"http://dailyjs.com/atom.xml"}],e.fetchFeed=function(n){n.items=[];var o="http://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20xml%20where%20url%3D'";o+=encodeURIComponent(n.url),o+="'%20and%20itemPath%3D'feed.entry'&format=json&diagnostics=true&callback=JSON_CALLBACK",r.jsonp(o).success(function(e){e.query.results&&(n.items=e.query.results.entry)}).error(function(e){console.error("Error fetching feed:",e)}),t(function(){e.fetchFeed(n)},1e3*e.refreshInterval)},e.addFeed=function(r){e.feeds.push(r),e.fetchFeed(r),e.newFeed={}},e.deleteFeed=function(r){e.feeds.splice(e.feeds.indexOf(r),1)},e.fetchFeed(e.feeds[0])}]);

The demangled version shows that we’re using the array-based syntax, thanks to ngmin:

angular.module("djsreaderApp").controller("MainCtrl", ["$scope", "$http", "$timeout",

Internals

If you’re wondering how the introspective dependency injection style works, look no further than annotate(fn). This function uses Function.prototype.toString to extract the argument names from the JavaScript source code. The results are effectively cached, so even though this sounds horrible it doesn’t perform as badly as it could.
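
Here’s a toy version of that trick – not Angular’s actual annotate() implementation, just a sketch of the idea: stringify the function, strip comments, and pull the parameter names out with a regular expression.

var FN_ARGS = /^function\s*[^(]*\(\s*([^)]*)\)/m;
var STRIP_COMMENTS = /(\/\/.*$)|(\/\*[\s\S]*?\*\/)/mg;

function annotate(fn) {
  var source = fn.toString().replace(STRIP_COMMENTS, '');
  var args = source.match(FN_ARGS)[1];
  return args.split(',').map(function(arg) {
    return arg.trim();
  }).filter(function(arg) {
    return arg.length;
  });
}

console.log(annotate(function($scope, $http, $timeout) {}));
// -> [ '$scope', '$http', '$timeout' ]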

Conclusion

Nothing I’ve said here is new – while researching this post I found The Magic Behind Dependency Injection by Alex Rothenberg, which covers the same topic, and the Angular Dependency Injection documentation outlines the issues caused by the introspective approach and suggests that it should only be used for pretotyping.

However, I felt like it was worth writing an overview of the matter, because although Yeoman is great for a quick start to a project, you really need to understand what’s going on behind the scenes!

Node Roundup: 0.10.7, JSON Editor, puid, node-mac

22 May 2013 | By Alex Young | Comments | Tags node modules mac windows json uuid
You can send in your Node projects for review through our contact form.

Node 0.10.7

Node 0.10.7 was released last week. This version includes fixes for the buffer and crypto modules, and timers. The buffer/crypto fix relates to encoding issues that could crash Node: #5482.

JSON Editor Online

JSON Editor Online

JSON Editor Online (GitHub: josdejong / jsoneditor, License: Apache 2.0, npm: jsoneditor, bower: jsoneditor) by Jos de Jong is a web-based JSON editor. It uses Node for building the project, but it’s actually 100% web-based. It uses the Ace editor, and includes features for searching and sorting JSON.

It’s installable with Bower, so you could technically use it as a component and embed it into another project.

english-time

Azer Koçulu sent in a bunch of new modules again, and one I picked out this time was english-time (GitHub: azer / english-time, License: BSD, npm: english-time). He’s using it with some of the CLI tools he’s written, so rather than specifying a date in an ISO format users can express durations in English.

The module currently supports milliseconds, seconds, minutes, hours, days, weeks, and shortened expressions based on combinations of these. For example, 3 weeks, 5d 6h would work.

puid

puid (GitHub: pid / puid, License: MIT, npm: puid) by Sascha Droste can generate unique IDs suitable for use in a distributed system. The IDs are based on time, machine, and process, and can be 24, 14, or 12 characters long.

Each ID is composed of an encoded timestamp, machine ID, process ID, and a counter. The counter is based on nanoseconds, and the machine ID is based on the network interface ID or the machine’s hostname.
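
A usage sketch is below; note that the constructor and method names here are assumptions based on the description above rather than quoted from the puid readme.

// Assumed API: require the module, create an instance, ask it for IDs
var Puid = require('puid');

var puid = new Puid();
console.log(puid.generate()); // e.g. a 24-character ID built from timestamp, machine, process, and counter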

node-mac

node-windows provides integration for Windows-specific services, like creating daemons and writing to eventlog. The creator of node-windows, Corey Butler, has also released node-mac (GitHub: coreybutler / node-mac, License: MIT, npm: node-mac). This supports Mac-friendly daemonisation and logging.

Services can be created using an event-based API:

var Service = require('node-mac').Service;

// Create a new service object
var svc = new Service({
  name: 'Hello World',
  description: 'The nodejs.org example web server.',
  script: '/path/to/helloworld.js'
});

// Listen for the "install" event, which indicates the
// process is available as a service.
svc.on('install', function() {
  svc.start();
});

svc.install();

It also supports service removal, and event logging.

jQuery Roundup: Anchorify.js, Minimalect

21 May 2013 | By Alex Young | Comments | Tags jquery plugins select
Note: You can send your plugins and articles in for review through our contact form.

Anchorify.js

Anchorify.js (GitHub: willdurand / anchorify.js, License: MIT) by William Durand automatically inserts unique anchored headings. The default markup is an anchor with a pilcrow sign, but this can be overridden if desired.

Even though the plugin is relatively simple, William has included QUnit tests and put the project up on jQuery’s new plugin site.

Minimalect

Minimalect (GitHub: groenroos / minimalect, License: MIT) by Oskari Groenroos is a select element replacement that supports optgroups, searching, keyboard navigation, and themes. It comes with two themes that are intentionally simple, allowing you to easily customise them using CSS, and no images are required by default.

Options include placeholder text, a message when no search results are found, class name overrides, and lifecycle callbacks.

Terminology: Modules

20 May 2013 | By Alex Young | Comments | Tags modules commonjs amd terminology basics js101

Learning modern modular frameworks like Backbone.js and AngularJS involves mastering a large amount of terminology, even just to understand a Hello, World application. With that in mind, I wanted to take a break from higher-level libraries to answer the question: what is a module?

The Background Story

Client-side development has always been rife with techniques for patching missing behaviour in browsers. Even the humble <script> tag has been cajoled and beaten into submission to give us alternative ways to load scripts.

It all started with concatenation. Rather than loading many scripts on a page, they are instead joined together to form a single file, and perhaps minimised. One school of thought was that this is more efficient, because a single large HTTP request will ultimately perform better than many smaller requests.

That makes a lot of sense when loading libraries – things that you want to be globally available. However, when writing your own code it somehow feels wrong to place objects and functions at the top level (the global scope).

If you’re working with jQuery, you might organise your own code like this:

$(function() {
  function MyConstructor() {
  }

  MyConstructor.prototype = {
    myMethod: function() {
    }
  };

  var instance = new MyConstructor();
});

That neatly tucks everything away while also only running the code when the DOM is ready. That’s great for a few weeks, until the file is bustling with dozens of objects and functions. That’s when it seems like this monolithic file would benefit from being split up into multiple files.

To avoid the pitfalls caused by large files, we can split them up, then load them with <script> tags. The scripts can be placed at the end of the document, causing them to be loaded after the majority of the document has been parsed.

At this point we’re back to the original problem: we’re loading perhaps dozens of <script> tags inefficiently. Also, scripts are unable to express dependencies between each other. If dependencies between scripts can be expressed, then they can be shared between projects and loaded on demand more intelligently.

Loading, Optimising, and Dependencies

The <script> tag itself has an async attribute. This helps indicate which scripts can be loaded asynchronously, potentially decreasing the time the browser blocks when loading resources. If we’re going to use an API to somehow express dependencies between scripts and load them quickly, then it should load scripts asynchronously when possible.

Five years ago this was surprisingly complicated, mainly due to legacy browsers. Then solutions like RequireJS appeared. Not only did RequireJS allow scripts to be loaded programmatically, but it also had an optimiser that could concatenate and minimise files. The lines between loading scripts, managing dependencies, and file optimisation are inherently blurred.

AMD

The problem with loading scripts is it’s asynchronous: there’s no way to say load('/script.js') and have code that uses script.js directly afterwards. The CommonJS Modules/AsynchronousDefinition, which became AMD (Asynchronous Module Definition), was designed to get around this. Rather than trying to create the illusion that scripts can be loaded synchronously, all scripts are wrapped in a function called define. This is a global function inserted by a suitable AMD implementation, like RequireJS.

The define function can be used to safely namespace code, express dependencies, and give the module a name (id) so it can be registered and loaded. Module names are “resolved” to script names using a well-defined format.
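
A minimal AMD module looks something like this (the module names and paths here are made up for illustration):

// models/user.js, loadable by an AMD implementation such as RequireJS
define('models/user', ['lib/ajax'], function(ajax) {
  function User(id) {
    this.id = id;
  }

  User.find = function(id, callback) {
    ajax.get('/users/' + id, callback);
  };

  // Whatever the factory returns is what dependent modules receive
  return User;
});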

Although this means every module you write must be wrapped in a call to define, the authors of RequireJS realised it meant that build tools could easily interpret dependencies and generate optimised builds. So your development code can use RequireJS’s client-side library to load the necessary scripts, then your production version can preload all scripts in one go, without having to change your HTML templates (r.js is used to do this in practice).

CommonJS

Meanwhile, Node was becoming popular. Node’s module system is characterised by using the require statement to return a value that contains the module:

var User = require('models/user');
User.find(1);

Can you imagine if every Node module had to be wrapped in a call to define? It might seem like an acceptable trade-off in client-side code, but it would feel like too much boilerplate in server-side scripting when compared to languages like Python.

There have been many projects to make this work in browsers. Most use a build tool to load all of the modules referenced by require up front – they’re stored in memory so require can simply return them, creating the illusion that scripts are being loaded synchronously.
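
As a toy illustration of that idea – not how any particular bundler is implemented – every module factory can be registered up front, turning require() into a synchronous lookup:

var registry = {};

function register(name, factory) {
  registry[name] = { factory: factory, exports: null };
}

function require(name) {
  var mod = registry[name];
  if (!mod.exports) {
    mod.exports = {};        // cache the result so the factory runs only once
    mod.factory(mod.exports);
  }
  return mod.exports;
}

// A bundler would emit calls like this for each source file
register('models/user', function(exports) {
  exports.find = function(id) { return { id: id }; };
});

var User = require('models/user');
User.find(1);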

Whenever you see require and exports you’re looking at CommonJS Modules/1.1. You’ll see this referred to as “CommonJS”.

Now you’ve seen CommonJS modules, AMD, and where they came from, how are they being used by modern frameworks?

Modules in the Wild

Dojo uses AMD internally and for creating your own modules. It didn’t originally – it used to have its own module system. Dojo adopted AMD early on.

AngularJS uses its own module system that looks a lot like AMD, but with adaptations to support dependency injection.

RequireJS supports AMD, but it can load scripts and other resources without wrapping them in define. For example, a dependency between your own well-defined modules and a jQuery plugin that doesn’t use AMD can be defined by using suitable configuration options when setting up RequireJS.
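
A configuration sketch is below; the paths and plugin name are hypothetical, but the shim section is how RequireJS is told that a non-AMD jQuery plugin depends on jQuery:

requirejs.config({
  paths: {
    jquery: 'components/jquery/jquery',
    'jquery.tooltip': 'components/jquery-tooltip/jquery.tooltip' // plugin without define()
  },
  shim: {
    'jquery.tooltip': { deps: ['jquery'] } // load jQuery before the plugin
  }
});

require(['jquery', 'jquery.tooltip'], function($) {
  // The plugin has attached itself to $, so it can be used here
  $('.hint').tooltip();
});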

There’s still a disparity between development and production builds. Even though RequireJS can be used to create serverless single page applications, most people still use a lightweight development server that serves raw JavaScript files, before deploying concatenated and minimised production builds.

The need for script loading and building, and tailoring for various environments (typically development, test, and production) has resulted in a new class of projects. Yeoman is a good example of this: it uses Grunt for managing builds and running a development server, Bower for defining the source of dependencies so they can be fetched, and then RequireJS for loading and managing dependencies in the browser. Yeoman generates skeleton projects that set up development and build environments so you can focus on writing code.

Hopefully now you know all about client-side modules, so the next time you hear RequireJS, AMD, or CommonJS, you know what people are talking about!

Impossible Mission, Full-Text Indexing, Backbone.Projections

17 May 2013 | By Alex Young | Comments | Tags games backbone.js node pdf

Impossible Mission

Impossible Mission

Impossible Mission by Krisztián Tóth is a JavaScript remake of the C64 classic. You can view the source to see how it all works, if things like this.dieByZapFrames and this.searchedFurniture sound appealing to you.

Krisztián previously sent in Boulder Dash and Wizard of Wor which were similar remakes written using the same approach.

Client-Side Full-Text Indexing

Gary Sieling sent in a post he wrote about full-text indexing with client-side JavaScript, in which he looks at PDF.js and Lunr: Building a Full-Text Index in JavaScript. I briefly mentioned Lunr by Oliver Nightingale back in March.

One great thing about this type of index is that the work can be done in parallel and then combined as a map-reduce job. Only three entries from the above object need to be combined, as “fields” and “pipeline” are static.

Backbone.Projections

In relational databases, a projection is a subset of available data. Backbone.Projections (GitHub: andreypopp / backbone.projections, License: MIT, npm: backbone.projections) by Andrey Popp is the equivalent for Backbone collections – they allow a transformed subset of the values in a collection to be represented and synced.

The supported projections are Capped and Filtered. Capped collections are limited based on a size and function – the function will be used to order the results prior to truncating them. Filtered projections filter out results based on a function that returns a boolean.

Projections can be composed by passing one projection to another. This example creates a Filtered projection, and then passes it to a Capped projection to limit and order the results:

var todaysPosts = new Filtered(posts, {
  filter: function(post) {
    return post.get('date').isToday();
  }
});

var topTodaysPosts = new Capped(todaysPosts, {
  cap: 5,
  comparator: function(post) {
    return post.get('likes');
  }
});

The author has written unit tests with Mocha, and documentation is available in the readme.

AngularJS: Tests

16 May 2013 | By Alex Young | Comments | Tags angularjs mvc angularfeeds

Previously

In the last part we changed the app to support multiple feeds.

This week you’ll learn how to write a short unit test to test the app’s main controller. This will involve mocking data.

If you get stuck at any part of this tutorial, check out the full source here: commit 7b4bda.

Neat and Tidy Tests

The goal of this tutorial is to demonstrate one method for writing neat and tidy tests. Ideally mocked data should be stored in separate files and loaded when required. What we absolutely don’t want is global variables littering memory.

To run tests with the Yeoman-generated app we’ve been working on, type grunt test. It’ll use Karma and Jasmine to run tests through Chrome using WebSockets. The workflow in the console is effortless, despite Chrome appearing and disappearing in the background (it won’t trample on your existing Chrome session, it’ll make a separate process). It doesn’t steal focus away, which means you can invoke tests and continue working on code without getting interrupted.

My workflow: mock, controller, test, and a terminal for running tests

The basic approach is to use $httpBackend.whenJSONP to tell AngularJS to return some mock data when the tests are run, instead of fetching the real feed data from Yahoo. That sounds simple enough, but there’s a slight complication: leaving mock data in the test sucks. So, what do we do about this? The karma.conf.js file that was created for us by the Yeoman generator contains a line for loading files from a mocks directory: 'test/mock/**/*.js'. These will be loaded before the tests, so let’s dump some JSON in there.

Interestingly, if you run grunt test right now it’ll fail, because the app makes a JSONP request, and the angular-mocks library will flag this as an error. Using $httpBackend.whenJSONP will fix this.

JSON Mocks

Open a file called test/mock/feed.js (you’ll need to mkdir test/mock first), then add this:

'use strict';

angular.module('mockedFeed', [])
  .value('defaultJSON', {
    query: {
      count: 2,
      created: '2013-05-16T15:01:31Z',
      lang: 'en-US',
      results: {
        entry: [
          {
            title: 'Node Roundup: 0.11.2, 0.10.6, subscribe, Omelette',
            link: { href: 'http://dailyjs.com/2013/05/15/node-roundup' },
            updated: '2013-05-15T00:00:00+01:00',
            id: 'http://dailyjs.com/2013/05/15/node-roundup',
            content: { type: 'html', content: 'example' }
          },
          {
            title: 'jQuery Roundup: 1.10, jquery-markup, zelect',
            link: { href: 'http://dailyjs.com/2013/05/14/jquery-roundup' },
            updated: '2013-05-14T00:00:00+01:00',
            id: 'http://dailyjs.com/2013/05/14/jquery-roundup',
            content: { type: 'html', content: 'example 2' }
          }
        ]
      }
    }
  });

This uses angular.module().value to set a value that contains some JSON. I derived this JSON from Yahoo’s API by running the app and looking at the network traffic in WebKit Inspector, then edited out the content properties because they were huge (DailyJS has full articles in feeds).

Loading the Mocked Value

Open test/spec/controllers/main.js and change the first beforeEach to load mockedFeed:

beforeEach(module('djsreaderApp', 'mockedFeed'));

The beforeEach method is provided by Jasmine, and will make the specified function run before each test. Now the defaultJSON value can be injected, along with the HTTP backend:

var MainCtrl, scope, mockedFeed, httpBackend;

// Initialize the controller and a mock scope
beforeEach(inject(function($controller, $rootScope, $httpBackend, defaultJSON) {
  // Set up the expected feed data
  httpBackend = $httpBackend;
  $httpBackend.whenJSONP(/query.yahooapis.com/).respond(defaultJSON);

  scope = $rootScope.$new();
  MainCtrl = $controller('MainCtrl', {
    $scope: scope
  });
}));

You should be able to guess what’s happening with $httpBackend.whenJSONP(/query.yahooapis.com/) – whenever the app tries to contact Yahoo’s service, it’ll trigger our mocked HTTP backend and return the defaultJSON value instead. Cool!

The Test

The actual test is quite a comedown after all that mock wrangling:

it('should have a list of feeds', function() {
  expect(scope.feeds.length).toBe(1);
  httpBackend.flush();
  expect(scope.feeds[0].items[0].title).toBe('Node Roundup: 0.11.2, 0.10.6, subscribe, Omelette');
});

The test checks $scope has the expected data. httpBackend.flush will make sure the (fake) HTTP request has finished first. The scope.feeds value is the one that MainCtrl from last week derives from the raw JSON returned by Yahoo.

Conclusion

You should now be able to run grunt test and see some passing tests (just like in my screenshot). If not, check out djsreader on GitHub to see what’s different.

Most of the work for this part can be found in commit 7b4bda.

Node Roundup: 0.11.2, 0.10.6, subscribe, Omelette

15 May 2013 | By Alex Young | Comments | Tags node modules cli events pubsub
You can send in your Node projects for review through our contact form.

Node 0.11.2 and 0.10.6

Clearly the Node core developers have had an early summer holiday, and are now back to unleash new releases. In the space of a few days 0.11.2 and 0.10.6 were released. I was intrigued by the Readable.prototype.wrap update, which makes it support objectMode for streams that emit objects rather than strings or other data.

The 0.11.2 release has an update that guarantees the order of 'finish' events, and another that adds some new methods: cork and uncork. Corking basically forces buffering of all writes – data will be flushed when uncork is called or when end is called.
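
A minimal sketch of cork and uncork on a writable stream (this needs a Node build with the new API, such as 0.11.2):

var fs = require('fs');
var out = fs.createWriteStream('/tmp/corked-example.txt');

out.cork();            // buffer subsequent writes instead of flushing each one
out.write('first ');
out.write('second ');
out.write('third\n');
out.uncork();          // flush everything buffered while corked
out.end();             // calling end() also flushes any remaining corked data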

There is a detailed discussion about cork and the related _writev method on the Node Google Group: Streams writev API. There are some interesting comments about a similar earlier implementation by Ryan Dahl, the validity of which Isaac questions due to Node’s code changing significantly since then.

If you want to read about writev, check out the write(2) (man 2 write) manual page:

Write() attempts to write nbyte of data to the object referenced by the descriptor fildes from the buffer pointed to by buf. Writev() performs the same action, but gathers the output data from the iovcnt buffers specified by the members of the iov array…

subscribe

Azer Koçulu has been working on a suite of modules for subscribing to and observing changes on objects.

Now he’s also released a module for subscribing to multiple pub/sub objects:

var subscribe = require('subscribe');
var a = pubsub();
var b = pubsub();
var c = pubsub();

subscribe(a, b, c, function(updates) {
  updates[0].pubsub;
  // => a.onUpdate
  updates[0].params;
  // => 3, 4
  updates[1].pubsub;
  // => c.onUpdate
  updates[1].params;
  // => 5, 6
});

a.publish(3, 4);
c.publish(5, 6);

This brings a compositional style of working to Azer’s other modules, allowing subscriptions to multiple lists and objects at the same time. The next example uses subscribe to combine the new-list module with new-object:

var newList = require('new-list');
var newObject = require('new-object');

var fruits = newList('apple', 'banana', 'grapes');
var people = newObject({ smith: 21, joe: 23 });

subscribe(fruits, people, function(updates) {
  updates[0].params[0].add;
  // => melon

  updates[1].params[1].set;
  // => { alex: 25 }
});

fruits.push('melon');
people('alex', '25');

Omelette

Omelette (GitHub: f / omelette, License: MIT, npm: omelette) by Fatih Kadir Akın is a template-based autocompletion module.

Programs and their arguments are defined using an event-based completion API, and then they can generate the zsh or Bash completion rules. There’s an animated gif in the readme that illustrates how it works in practice.

jQuery Roundup: 1.10, jquery-markup, zelect

14 May 2013 | By Alex Young | Comments | Tags jquery plugins markup templating select
Note: You can send your plugins and articles in for review through our contact form.

jQuery 1.10

A new 1.x branch of jQuery has been released, jQuery 1.10. This builds on the work in the 1.9 line:

It’s our goal to keep the 1.x and 2.x lines in sync functionally so that 1.10 and 2.0 are equal, followed by 1.11 and 2.1, then 1.12 and 2.2…

A few of the included fixes were things originally planned for 1.9.x, and others are new to this branch. As always the announcement blog post contains links to full details of each change.

jquery-markup

jquery-markup (GitHub: rse / jquery-markup, License: MIT) by Ralf S. Engelschall is a markup generator that works with several template engines (including Jade and Handlebars). By adding a <markup> tag, $(selector).markup can be used to render templates interpolated with values.

Ralf said this about the plugin:

I wanted to use template languages like Handlebars but instead of having to store each fragment into its own file I still wanted to assemble all fragments together. Even more: I wanted to logically group and nest them to still understood the view markup code as a whole.

The <markup> tag can include a type attribute that is used to determine the templating language – this means you can use multiple template languages in the same document.

zelect

zelect (GitHub: mtkopone / zelect, License: WTFPL) by Mikko Koponen is a <select> component. It’s unit tested, and has built-in support for asynchronous pagination.

Unlike Chosen, it doesn’t come with any CSS, but that might be a good thing because it keeps the project simple. Mikko has provided an example with suitable CSS that you can use to get started.

If Chosen seems too large or inflexible for your project, then zelect might be a better choice.

Unix: It's Alive!

13 May 2013 | By Alex Young | Comments | Tags node modules unix

On a philosophical level, Node developers love Unix. I like to think that’s why Node’s core modules are relatively lightweight compared to other standard libraries (is an FTP library really necessary?) – Node’s modules quietly get out of the way, allowing the community to provide solutions to higher-level problems.

As someone who sits inside tmux/Vim/ssh all day, I’m preoccupied with command-line tools and ways to work more efficiently in the shell. That’s why I was intrigued to find bashful (GitHub: substack / bashful, License: MIT, npm: bashful) by substack. It allows Bash to be parsed and executed. To use it, hook it up with some streams:

var bash = require('bashful')(process.env);
bash.on('command', require('child_process').spawn);

var s = bash.createStream();
process.stdin.pipe(s).pipe(process.stdout);

After installing bashful, running this example with node sh.js will allow you to issue shell commands. Not all of Bash’s built-in commands are supported yet (there’s a list and to-do in the readme), but you should be able to execute commands and run true and false, then get the last exit status with echo $?.

How does this work? Well, the bashful module basically parses each line, character-by-character, to tokenise the input. It then checks anything that looks like a command against the list of built-in commands, and runs it. It mixes Node streams with a JavaScript bash parser to create a Bash-like layer that you can reuse with other streams.

This module depends on shell-quote, which correctly escapes those gnarly quotes in shell commands. I expect substack will make a few more shell-related modules as he continues work on bashful.
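
For example, shell-quote exposes quote and parse functions (output shown approximately):

var quote = require('shell-quote').quote;
var parse = require('shell-quote').parse;

console.log(parse('echo "hello world" && ls'));
// -> roughly [ 'echo', 'hello world', { op: '&&' }, 'ls' ]

console.log(quote(['echo', 'hello world', '$PATH']));
// -> a single string with the arguments safely quoted and escaped for the shell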

ShellJS (GitHub: arturadib / shelljs, npm: shelljs) by Artur Adib has been around for a while, but still receives regular updates. This module gives you shell-like commands in Node:

require('shelljs/global');

mkdir('-p', 'out/Release');
cp('-R', 'stuff/*', 'out/Release');

It can even mimic Make, so you could write your build scripts with it. This would make sense if you’re sharing code with Windows-based developers.

There are plenty of other interesting Unix-related modules that are alive and regularly updated. One I was looking at recently is suppose (GitHub: jprichardson / node-suppose, License: MIT, npm: suppose) by JP Richardson, which is an expect(1) clone:

var suppose = require('suppose')
  , fs = require('fs')
  , assert = require('assert')

suppose('npm', ['init'])
  .debug(fs.createWriteStream('/tmp/debug.txt'))
  .on(/name\: \([\w|\-]+\)[\s]*/).respond('awesome_package\n')
  .on('version: (0.0.0) ').respond('0.0.1\n')
  // ...

It uses a chainable API to allow expect-like expressions to capture and react to the output of other programs.

Unix in the Node community is alive and well, but I’m sure there’s also lots of Windows-related fun to be had – assuming you can figure out how to use Windows 9 with a keyboard and mouse that is…

P, EasyWebWorker, OpenPGP.js

10 May 2013 | By Alex Young | Comments | Tags node modules crypto p2p webworkers

P

P (GitHub: oztu / p, License: Apache 2, npm: onramp, bower: p) by Ozan Turgut is a client-side library with a WebSocket server for creating P2P networks by allowing browser-to-browser connections.

The onramp Node module is used to establish connections, but after that it isn’t necessary for communication between clients. The author has written up documentation with diagrams to explain how it works. Like other similar projects, the underlying technology is WebRTC, so it only works in Chrome or Firefox Nightly.

EasyWebWorker

EasyWebWorker (GitHub: ramesaliyev / EasyWebWorker, License: MIT) by Rameş Aliyev is a wrapper for web workers which allows functions to be executed directly, and can execute global functions in the worker.

A fallback is provided for older browsers:

# Create web worker fallback if browser doesnt support Web Workers.
if this.document isnt undefined and !window.Worker and !window._WorkerPrepared
  window.Worker = _WorkerFallback

The _WorkerFallback class is provided, and uses XMLHttpRequest or ActiveXObject.

The source code is nicely commented if you want to look at what it does in more detail: easy-web-worker.coffee.

OpenPGP.js

Jeremy Darling sent in OpenPGP.js (GitHub: openpgpjs / openpgpjs, License: LGPL), which is an OpenPGP implementation for JavaScript:

This is a JavaScript implementation of OpenPGP with the ability to generate public and private keys. Key generation can be a bit slow but you can also import your own keys.

Jeremy found that OpenPGP.js is used by Mailvelope, which is a browser extension that brings OpenPGP to webmail services like Gmail. That means Mailvelope can encrypt messages without having to upload a private key to a server.