


Node Roundup: npm and Front-end Packaging, npm Package Store


npm and Front-end Packaging

There's a post on the npm blog about npm and front-end packaging. It directly addresses something that lots of people are confused about: is it a good idea to use npm for front-end development, even if the project doesn't use CommonJS modules?

It's useful to see the npm team's thoughts on this, because some front-end developers seem to use npm out of convenience, while others use it because their modules work well in both Node and browsers. Writing code for browsers has different requirements from Node, and the post highlights the major ones before discussing potential solutions.

One future feature that should help front-end developers is ecosystems: a way to create subsets of packages that share a common base. In theory you could place React, jQuery, Gulp, and Grunt packages into separate sub-repositories.

Another recommendation in the article is to use additional metadata in the package.json file. Lots of packages already do this, so it seems to be an increasingly popular approach.

My preferred approach to front-end development is npm with npm scripts and Browserify, so it's encouraging to see that mentioned in the post:

We also think browserify is amazing and under-utilized, and an end-to-end solution that worked with it automatically at install time is a really, really interesting idea (check out browserify’s sweet handbook for really great docs on how to use it effectively).
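For reference, that workflow can be as small as a single npm script that shells out to Browserify. This is just a sketch; the filenames and version number here are invented:

{
  "scripts": {
    "build": "browserify index.js -o bundle.js"
  },
  "devDependencies": {
    "browserify": "^5.0.0"
  }
}

Running npm run build then produces a browser bundle from the CommonJS entry point, with no global install required.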

Building dependencies from ./node_modules is a pain because every module seems to have a different entry point filename, so it would be really nice if more front-end developers used the main property in package.json.
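For example, a front-end package that declares its entry point might have a package.json like this (the names are invented, and the browser field is a Browserify convention rather than an npm one, used to swap in a browser-specific entry point):

{
  "name": "my-widget",
  "version": "1.0.0",
  "main": "lib/index.js",
  "browser": "lib/browser.js"
}

With main set, require('my-widget') just works, regardless of what the entry file is actually called.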

If you're looking for an example of an npm-based React project, then I recently received this tutorial from yiminghe: npm-based front-end development using browserify and browser loader library.

npm Package Store


Travis Horn sent in npm Package Store (GitHub: travishorn/npm-package-store, License: MIT, npm: npm-package-store). This is a web application that shows updates for your globally installed npm packages.

It's an Express app that gets the list of globally installed packages by running npm ls -g --json, then fetches the latest version of each one from http://registry.npmjs.org using request.
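The basic idea is simple enough to sketch out. This isn't the app's actual source, just a rough approximation of the approach (error handling omitted):

var exec = require('child_process').exec;
var request = require('request');

// List the globally installed packages as JSON, as the app does
exec('npm ls -g --json', function(err, stdout) {
  var installed = JSON.parse(stdout).dependencies;

  // Ask the registry for the latest published version of each package
  Object.keys(installed).forEach(function(name) {
    request('http://registry.npmjs.org/' + name + '/latest', function(err, res, body) {
      console.log(name + ': ' + installed[name].version + ' -> ' + JSON.parse(body).version);
    });
  });
});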



Streams, Peerflix, and node-webkit


Recently a web service called Joker has been in the technology press. It's a web application that downloads torrents based on magnet URIs, and allows users to stream video content and fast forward to any point in the video. Of course, it's already been taken down, and it didn't always work as well as my description sells it, but it was an interesting experiment all the same.

I remember being completely astounded that Popcorn Time was built with JavaScript. Popcorn Time is a web application that runs in a desktop wrapper so you can use it on Mac OS, Windows, and even Android. The pirates know what they're doing: cross-platform support from day one, with a cool interface that integrates data from web services like IMDB.

Two things make Popcorn Time possible: node-webkit-builder and peerflix.

node-webkit-builder makes it easy to build cross-platform desktop apps with node-webkit:

node-webkit is an app runtime based on Chromium and node.js. You can write native apps in HTML and JavaScript with node-webkit. It also lets you call Node.js modules directly from the DOM and enables a new way of writing native applications with all Web technologies.
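Using node-webkit-builder looks something like this; a minimal sketch based on the project's README, where the file glob and platform list are just examples:

var NwBuilder = require('node-webkit-builder');

var nw = new NwBuilder({
  files: './app/**/**',      // glob of files to package
  platforms: ['osx', 'win']  // platforms to build for
});

// Downloads the right node-webkit binaries and writes builds to ./build
nw.build().then(function() {
  console.log('All done!');
}, function(err) {
  console.error(err);
});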

peerflix will stream torrents from a magnet link to an HTTP server that video players like VLC can connect to. It's based on torrent-stream, a streaming torrent client for Node with a friendly API.


torrent-stream uses lots of small modules to do the job. For example, magnet-uri can parse magnet URIs, and peer-wire-swarm is a swarm implementation.
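To get a feel for the APIs, here's a rough sketch of the peerflix idea: parse a magnet URI, pick the largest file in the torrent, and serve it over HTTP for a player like VLC. The magnet URI is a placeholder, and error handling is omitted:

var magnet = require('magnet-uri');
var torrentStream = require('torrent-stream');
var http = require('http');

var uri = 'magnet:?xt=urn:btih:...'; // placeholder
console.log(magnet(uri).infoHash);   // magnet-uri parses the link into an object

var engine = torrentStream(uri);

engine.on('ready', function() {
  // The largest file in the torrent is usually the video
  var file = engine.files.reduce(function(a, b) {
    return a.length > b.length ? a : b;
  });

  // Serve it so a video player can connect to http://localhost:8888
  http.createServer(function(req, res) {
    file.createReadStream().pipe(res);
  }).listen(8888);
});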

Reading through these modules is like a showcase of Node's stream API. Academically they're fascinating, despite the obvious grey market connotations.

Which brings me to the TV/movie/music "PVR"-like applications. Media cataloguing doesn't have to be for pirated content: I have lots of DRM-free music, video, and books that could be presented in a better way. Combining my music purchases from Amazon, Apple, and Google into a cool desktop media browser powered by Node with a friendly RESTful API would be really fun and useful.

There's actually a node-webkit apps list, but I haven't yet found my perfect Node-powered media browser. Let me know if you've made your own Node media browser (or anything similar) and I'll check it out!



Recreating the Spectrogram Face



Enjoying Syro? I'll admit I was a little bit too obsessed by the cover art, which contains a list of Aphex Twin's expenses and a list of the audio hardware used on the album. So I thought it was pretty cool to see Spectroface by Daniel Rapp. This project uses the Web Audio API to recreate Aphex Twin's spectrogram face that was hidden in the Windowlicker b-side.

Daniel's website explains how spectrograms work, and the source code is heavily commented, so with a little bit of effort you should be able to follow it.

A spectrogram is a visual representation of the frequencies which make up a sound. Say you whistle a pure "middle C", then a spectrogram would light up right at 261.6 Hz, which is the corresponding frequency for that tone. Likewise, the "A" note makes the spectrogram turn bright white at 440 Hz.

If you hover over "middle C" and "A" on the original page (http://danielrapp.github.io/spectroface) it'll actually play the notes, which is a nice touch.

I tried out Daniel's examples and found they work best in Chrome with the webcam and mic active. You should play the sounds through speakers rather than headphones, so the microphone can pick them up and you can see the image encoding effect in the spectrogram visualisations.
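If you want to experiment with the underlying technique yourself, the heart of it is the Web Audio API's AnalyserNode. This is a minimal sketch rather than Daniel's code, and depending on your browser you may need vendor prefixes like webkitAudioContext and webkitGetUserMedia:

navigator.getUserMedia({ audio: true }, function(stream) {
  var audioCtx = new AudioContext();
  var analyser = audioCtx.createAnalyser();
  audioCtx.createMediaStreamSource(stream).connect(analyser);

  var bins = new Uint8Array(analyser.frequencyBinCount);

  (function draw() {
    // Each entry in bins is the intensity (0-255) of one frequency band;
    // drawing them as a vertical strip each frame builds up a spectrogram
    analyser.getByteFrequencyData(bins);
    requestAnimationFrame(draw);
  })();
}, function(err) {
  console.error(err);
});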

The source is concise, and amazingly doesn't require too much hacking to get the audio values translated into pixels. For example, the code that determines the shade of grey to use for position x, y in the image looks like this:

var getImageDataIntensity = function(imgData, x, y) {
  // Canvas pixels are stored as four bytes each (RGBA), row by row
  var r = imgData.data[ ((imgData.width * y) + x) * 4     ]
    , g = imgData.data[ ((imgData.width * y) + x) * 4 + 1 ]
    , b = imgData.data[ ((imgData.width * y) + x) * 4 + 2 ]
    , avg = (r + b + g) / 3
    , intensity = avg / 255; // normalise to the range 0..1

  return intensity;
};

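Given the image data from a canvas 2D context, you can then look up how bright any pixel is (a hypothetical usage, not taken from the project):

var imgData = ctx.getImageData(0, 0, canvas.width, canvas.height);
var intensity = getImageDataIntensity(imgData, 10, 20); // 0 is black, 1 is white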
The last example treats the video as an instrument, so you can wave things in front of the camera to produce different sounds. Although it sounds extremely loud and strange, it's very interesting and comes from a surprisingly small amount of code.

It starts acting like an instrument! Notice that if you place your finger over the webcam, the sound is muted. You can produce low-pass or high-pass filters by covering only part of the webcam.



Route360, ngVideo




The Route360 JavaScript API (GitHub: route360/r360-js, License: MIT, Bower: route360) by Henning Hollburg and Daniel Gerber is a library for Leaflet that adds some cool features:

  • Generate polygons for reachable areas based on a point on the map
  • Support for walking, car, bike, and transit routing
  • Map controls: travel time slider, date and time chooser, and travel type chooser
  • Routing information from source to destination, including travel time and transit trips
  • Support for elevation data

This is a snippet of the polygon API:

var cpl = r360.route360PolygonLayer();

var travelOptions = r360.travelOptions();
travelOptions.setTravelTimes([300, 600, 900, 1200, 1500, 1800]);

r360.PolygonService.getTravelTimePolygons(travelOptions, function(polygons) {
  // The callback receives one polygon per travel time, ready to be
  // drawn on the Leaflet map
});


ngVideo (GitHub: Wildhoney/ngVideo, License: MIT) by Adam Timberlake is a set of AngularJS directives for working with video. Videos are wrapped in an ng-video container:

<section class="video" ng-video>  

The container should hold a video element. ngVideo also includes a service which you can use to attach videos:

myApp.controller('VideoController', ['$scope', 'video', function($scope, video) {
  video.addSource('mp4', 'http://www.example.com/alice-in-wonderland.mp4');
}]);

The project includes a lot of other directives for handling buffering, UI controls, data about play position and elapsed time, full screen support, and playlists.



WebRTC Video Mixing with Mixology


Gearcloud Labs has open sourced their Mixology project under an MIT license. The source is available on GitHub: gearcloudlabs/Mixology.

This project allows video streams to be combined using a Node server. It uses WebRTC, the W3C standard for browser video, audio, and P2P. Google recently switched Google Hangouts over to WebRTC, which you can try in developer builds of Chrome:

Google+ Hangouts no longer requires a separate plugin to be installed in Chrome for video and voice chat to work. Using the Web Real-Time Communication API (WebRTC) and Native Client (NaCl) Google is able to provide a native video chat experience out of the box in Chrome.

Mixology uses collections of web pages that communicate using WebRTC. There's also a JSON manifest file that defines how the streams are mixed:

  "channelName": "Your mix name",
  "topology": ["filename-basename.output-streamname | filename-basename.input-streamname", ...],
  "partitionSize": integer

The Node project uses Express and Socket.IO. It's currently one monolithic file that doesn't declare its dependencies in a package.json, so it expects you to have Express and Socket.IO installed globally. Refactoring it into a more modular Express application might be a nice exercise for someone looking to contribute to an open source project...
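If you fancy trying it, the entry point of a more modular version might look something like this. The file layout is hypothetical; the point is that routes and signalling handlers move into their own modules and the dependencies get declared in package.json:

var express = require('express');
var http = require('http');

var app = express();
var server = http.createServer(app);
var io = require('socket.io')(server);

app.use(express.static(__dirname + '/public')); // serve the mixer pages
require('./sockets')(io);                       // signalling handlers in their own module

server.listen(3000);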