DailyJS

The JavaScript blog.

Featured

functional libraries reactive

ADT Streams

Posted on .

More libraries are starting to blend ideas from functional programming, streams, and reactive programming. Yassine Elouafi sent in ADT Streams (License: MIT, npm: adtstream), a library that offers a stream API using promises and algebraic data types. It has methods for wrapping other streams and event sources, like Stream.fromReadable for use with Node's Readable streams, Stream.fromEmitter for objects that extend EventEmitter, and Stream.fromDomEvent for working with DOM events in the browser.

There are utility methods, like Stream.seq for yielding elements from a sequence over time. The blog post Promises + FP = Beautiful Streams explains each of the library's methods with examples, first in Haskell and then in JavaScript. It also has some background on streams and reactive programming:

The basic idea is that you can represent different asynchronous data like events or http responses by an unifying concept called observable collections. Then you can manipulate those collections by standard array operations like map, filter, reduce, forEach ...etc.

By using Yassine's library, you can compose streams with different sources: you can mix events and readable streams with the same API.
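
To give a rough idea of how this reads in practice, here's a minimal sketch based on the methods mentioned above. The exact argument lists for Stream.fromEmitter and Stream.seq, and how the Stream type is exported, are assumptions on my part, so check the adtstream README before copying this.

const Stream = require('adtstream').Stream; // assumption: Stream may be exposed differently
const EventEmitter = require('events').EventEmitter;

const emitter = new EventEmitter();

// Wrap an EventEmitter as a stream and transform it with array-like operations
Stream.fromEmitter(emitter, 'data')
  .map(s => s.toUpperCase())
  .filter(s => s.length > 3)
  .forEach(s => console.log(s));

// Events pushed into the emitter should flow through the pipeline above
emitter.emit('data', 'hello');

// Yield the elements of a sequence over time
Stream.seq([1, 2, 3], 0, 100).forEach(n => console.log(n));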

The author also wrote From callback to (Future -> Functor -> Monad), which explains why callbacks don't compose and looks at alternative solutions:

With composition, you never hit a wall, because you must always ask yourself "What is the return value of this function?", or put another way "What is the meaning of this thing?"

I don't mind using callbacks in my code. In fact, callbacks are great for things such as representing side effects or event notification, but once your start using them to manage your control flow, you're trapped, why? because they do not compose.

Both of these posts will be of interest to you if you're a JavaScript programmer without much functional or reactive programming experience.

Featured

frameworks node ES6 iojs

Nodal: An ES6 API Server

Posted on .

Sometimes I feel like client-side developers are adopting ES6 faster than server-side developers. I certainly have projects where the client uses ES6 and the Node server code is more ES5, mainly because I don't really want to use a transpiler on the server, although I sometimes do like to use Harmony flags.

Alternatively, I could use io.js. Keith Horwood recently sent in a new module that aims to provide an API server and web framework that takes advantage of ES6 features. It's called Nodal (GitHub: keithwhor/nodal, License: MIT, npm: nodal) and currently runs on io.js.

Nodal includes support for models, controllers, templates, migrations, routing, and query composition. It has a command-line tool for creating RESTful resources, and it's designed to work with PostgreSQL.

The example code makes heavy use of new language features like const and classes. The result is very Rails/Django-like. Here's a snippet of a model class:

module.exports = (function() {  
  'use strict';

  const Nodal = require('nodal');

  class Person extends Nodal.Model {
    __preInitialize__() {}
    __postInitialize__() {}
  }

  Person.prototype.schema = Nodal.my.Schema.models.Person;

  Person.prototype.externalInterface = [
    'id',
    'name',
    'age',
    'created_at'
  ];

  return Person;
})();

It doesn't have the same kind of auto-loading magic that you see in Rails -- notice the Nodal module is loaded explicitly. It still feels like idiomatic JavaScript rather than using ES6 to pretend to be something else.

Nodal doesn't use an external ORM; it has its own models backed by any-db-postgres. That means Keith is developing features like model relationships himself. I've tried to build my own ORM before, and the ORMs I've used have been very mixed in terms of quality and consistency between releases, so I don't envy the work he has ahead. However, the idea of a RESTful web framework that takes advantage of ES6 for code organisation and clarity is interesting, so let's see what he does with it!

Featured

apps browser routing chat

GitterCLI, Wayfarer

Posted on .

GitterCLI

I'm pleased to see that applications made with Blessed are starting to trickle into the DailyJS inbox! GitterCLI (GitHub: RodrigoEspinosa/gitter-cli, License: MIT, npm: gitter-cli) by Rodrigo Espinosa Curbelo is a Gitter client for the terminal. Those of us who are seasoned IRC veterans aren't too fond of web chat clients, and although Gitter has IRC access, GitterCLI could potentially take advantage of Gitter-specific features.

It uses the node-gitter module, and works using the standard HTTP authentication API implemented by Gitter. One thing I noticed is that it drops authentication tokens into a "secrets" JSON file, and I couldn't work out whether it checks that the file's permissions are safe (i.e., not readable system-wide). So you might want to check the secrets file when running it on a shared server. It stores the authentication token rather than your password, but naturally it still needs some level of protection.
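
If you want to check that yourself, a few lines of Node will do it. This is a generic sketch: the filename below is a placeholder, so look up where the client actually stores its secrets file.

const fs = require('fs');
const path = require('path');

// Placeholder path -- substitute the real location of the secrets file
const secretsFile = path.join(process.env.HOME || '', '.gitter-cli-secrets.json');

const mode = fs.statSync(secretsFile).mode;

// Warn if any group/other permission bits are set (i.e. looser than 0600)
if ((mode & 0o077) !== 0) {
  console.warn('Secrets file is readable by other users; consider running: chmod 600 ' + secretsFile);
}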

Update: Corrected the GitHub URL.

Wayfarer

Wayfarer (GitHub: yoshuawuyts/wayfarer, License: MIT, npm: wayfarer) by Yoshua Wuyts is a client-side router, a bit like react-router. Wayfarer is slightly different because it's a "method-less" router with an EventEmitter-inspired API, and it supports composition through mounting subrouters.

This is what the super-pretty ES6 syntax looks like:

const wayfarer = require('wayfarer')

const router = wayfarer('/404')

router.on('/', () => console.log('/'))  
router.on('/404', uri => console.log('404 %s not found', uri))  
router.on('/:user', (uri, param) => console.log('user is %s', param.user))

router('/tobi')  
// => 'user is tobi' 

router('/uh/oh')  
// => '404 /uh/oh not found' 

The combination of Wayfarer's API and fat arrow is very easy to follow, and the API should be easy to use with your other favourite client-side libraries.
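
The composition side isn't shown above, but because a router is itself just a function that takes a path, you can wire up something like mounting by hand. This is my own sketch using only the API shown above, so wayfarer's built-in support for subrouters may well look different:

const wayfarer = require('wayfarer')

// Child router that handles everything under /admin
const admin = wayfarer('/404')
admin.on('/users', () => console.log('admin user list'))
admin.on('/404', uri => console.log('no admin page for %s', uri))

// Parent router delegates the rest of the path to the child router
const router = wayfarer('/404')
router.on('/', () => console.log('home'))
router.on('/admin/:rest', (uri, param) => admin('/' + param.rest))

router('/admin/users')
// => 'admin user list'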

Featured

node iojs

io.js Joins the Node.js Foundation

Posted on .

When io.js got started there were those who saw it as the "new Node", a more aggressively developed fork that quickly implemented features that were lacking from 0.11 and 0.12. And the lack of the mystical Node.js 1.0 in 2014 certainly encouraged that kind of attitude. We were waiting for an updated V8 with the ES6 features that we wanted and more mature core modules, but what we got was leadership changes and talk of a "foundation".

A week ago the io.js Technical Committee met and voted to merge with Node under the Node.js Foundation. This is summarised in the May 13th Technical Committee meeting notes:

Voting Question: The io.js TC agrees to:

  1. have the io.js project join the Node Foundation

  2. rename the entire "iojs" GitHub org to be "nodejs"

  3. invite the current Node.js TC on to our TC to form the basis of a Node Foundation TSC under the policies of the Node Foundation

  4. moving the io.js Working Groups to be under the Node Foundation

Mikeal Rogers wrote a post about why io.js needs a foundation, which includes some background about why io.js forked from Node, what it achieved, and what needs to happen next:

However, without a legal entity to own property it means that various io.js assets are in reality owned by individuals and companies. The domain name is owned by Fedor, the billing contact for the GitHub org is Colin, the keys used for signing the releases are owned by NodeSource, etc. With all the current owners acting in good faith this ownership isn’t an immediate problem, just as it wasn’t a problem for node.js in 2012, but the more successful we are the worse it could be, so this is something that keeps me up at night.

There's also a GitHub issue about this with a lot of positive support from the community. Later on, Mike Dolan (Director of Strategic Programs at The Linux Foundation) summarised the io.js merge on the official Node blog in Node.js and io.js leaders are building an open, neutral Node.js Foundation to support the future of the platform:

Most recently the io.js TC voted to join in the Foundation effort and planning is already underway to begin the process of converging the codebases.

It sounds like everyone is doing the right thing: Node has shifted to an open model with support from the Linux Foundation, and io.js has recognised the need to seek a more formal management model.

The next steps from a technical perspective will be difficult: merging commits from io.js won't be trivial, and even resolving details like moving GitHub issues will take time.

I expect the merged Node.js will not be a 0.12 release, and from the io.js commits I've been watching over the last few months there may be some small backwards compatibility issues, but hopefully this brings us closer to a Node that can keep up with modern JavaScript. It seems like we're finally seeing a Node with the right kind of management and technical progress that we expected from the heady success of earlier releases.

Featured

tutorials testing angularjs error-handling

AngularJS Form Errors, Angular Integration Tests

Posted on .

AngularJS: API Error Code Handling in Forms

When you're writing forms with Angular, it's not always clear how to handle errors in a reusable way. For example, what format should the server use to send back structured error messages to the client? How do you use these responses to annotate fields with errors?

Szymon Kosno sent in AngularJS: API Error Code Handling in Forms, a tutorial that shows how to handle API errors without putting explicit error handling code in view controllers. It makes suggestions for how to structure detailed JSON error responses, and how to support different types of errors: for example, global errors and errors for specific form fields.
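
The exact response format is up to you; something along these lines (my own illustration rather than the structure the article settles on) is enough to tell global errors apart from field-specific ones:

{
  "errors": {
    "global": [
      { "code": "RATE_LIMITED", "message": "Too many attempts, please try again later" }
    ],
    "fields": {
      "email": [
        { "code": "ALREADY_TAKEN", "message": "This email address is already registered" }
      ],
      "password": [
        { "code": "TOO_SHORT", "message": "Passwords must be at least 8 characters" }
      ]
    }
  }
}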

The form's markup uses an api-validate directive for input elements that should be validated by the server and will show error annotations. This allows the underlying error handling code to determine that the field is API error aware, so errors can be displayed or hidden as required.

With the right directives and services, Szymon is able to remove custom error handling code, which really cleans up the associated controllers. This article is useful if you've ever been confused about how to handle server-side errors, which you should always do for security and sanity.

Angular Integration Tests

Torgeir Helgevold sent in Angular integration tests, which gives some background on integration testing and includes a detailed example towards the end of the article.

In the following example I will demonstrate how to test a series of nested components with shared state. My sample code includes a parent directive where a user can add an inputed number to a list by clicking a button. Nested within there are two directives, one for calculating the sum of all items, and a second directive for building a comma separated string from the items in the list.

The example uses Angular's controllerAs syntax to simulate isolated scopes and pass changes between components. If you've ever used something like Selenium for integration tests, you might want to compare Torgeir's approach; it's more lightweight and focused.
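
For a rough idea of what this style of test looks like, here's a generic Angular 1.x sketch using ngMock and Jasmine; the module, directive, and controller property names are made up for illustration rather than taken from Torgeir's code:

describe('item list integration', function() {
  var element;
  var scope;

  // 'itemApp' and the 'itemList' directive are hypothetical names
  beforeEach(angular.mock.module('itemApp'));

  beforeEach(angular.mock.inject(function($compile, $rootScope) {
    scope = $rootScope.$new();
    // Compile the parent directive, which pulls in the nested sum/join directives
    element = $compile('<item-list></item-list>')(scope);
    scope.$digest();
  }));

  it('updates the rendered sum when an item is added', function() {
    var ctrl = element.controller('itemList'); // controllerAs-style controller instance
    ctrl.input = 2;   // hypothetical property bound to the number input
    ctrl.add();       // hypothetical method behind the "add" button
    scope.$digest();
    expect(element.text()).toContain('2');
  });
});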