If you use npm private packages, there's a serious data leak you should be aware of: metadata for private packages was accidentally made available from the public replication endpoint.
I've included the full announcement below:
We need to notify you of a serious security incident that was discovered today, July 2nd.
Starting on June 26th, scoped package metadata was available via the public replication endpoint from npm. This means that third parties were aware of metadata about scoped packages, including private packages. This metadata was limited to:
versions and version publication dates
It's important to make clear that this does not include the packages themselves: package contents and source code were never available. User information such as passwords and billing information was not part of the information that leaked.
If your package metadata contained sensitive information, please take mitigation steps immediately. Because this information was replicated, we will be making a public disclosure of the leak. However, to give you time to react (we are aware that it is a holiday weekend in the US) we will be holding off on the public announcement until Monday, July 6th.
We apologize wholeheartedly for this mistake and have taken steps to prevent this error. We are conducting a thorough review of our processes to avoid both this specific problem and any similar errors in the future.
Thank you for your continued support of npm. If you have any further questions or concerns please reach out to firstname.lastname@example.org.
Check your readme files and package names to ensure nothing sensitive has been leaked. This is unfortunate, but npm is handling it promptly and professionally. With any service you rely on for commercial work, like GitHub, Bitbucket, npm, and CDNs, you should review what you publish before it's stored on remote systems.
If you want to try the npm 3 beta, use npm install -g email@example.com, but be careful: not only is it a beta, you can also break your npm installation if the global install fails. How do I know? The permissions were messed up on /usr/local/bin/npm, so when I tried to upgrade I saw "Error: EACCES, unlink ... node_modules/npm/.eslintrc". From that point npm was no longer in my $PATH. I reinstalled my current version of Node to fix it quickly, but I can imagine people getting very confused and frustrated by command not found: npm.
If you're wondering what npm 3 will do for you, then a big thing is actually the UI: installation has a different appearance (it's more like npm ls), and npm outdated has changed. The "location" column shows which module required a dependency rather than where it is on disk.
Big projects should feel a little saner thanks to less nesting:
Your dependencies will now be installed maximally flat. Insofar as is possible, all of your dependencies, and their dependencies, and THEIR dependencies will be installed in your project's node_modules folder with no nesting.
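The difference is easiest to see as a directory layout. This is a sketch, not real npm output, for a hypothetical app depending on modules a and b, which both depend on c:

```
# npm 2: each dependency nests its own copies
node_modules/
  a/
    node_modules/
      c/
  b/
    node_modules/
      c/

# npm 3: deduplicated and flat
# (nesting only happens when version constraints conflict)
node_modules/
  a/
  b/
  c/
```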
If you make use of peerDependencies for modules that you distribute, be aware that npm 3 no longer installs peer dependencies automatically. Instead, the user is warned about an unmet peer dependency.
This shifts the responsibility for fulfilling peer dependencies from library framework / plugin maintainers to application authors, and is intended to get users out of the dependency hell caused by conflicting peerDependency constraints. npm's job is to keep you out of dependency hell, not put you in it.
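As an illustration, a plugin for a hypothetical framework might declare (all names here are made up):

```json
{
  "name": "my-framework-plugin",
  "version": "1.0.0",
  "peerDependencies": {
    "my-framework": "^2.0.0"
  }
}
```

Under npm 2, installing my-framework-plugin would also install my-framework; under npm 3, the application author adds my-framework to their own dependencies, and npm only warns if the ^2.0.0 constraint is unmet.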
npm 3 is considered a rewrite by the authors. The level of redevelopment means we'll hopefully see some rich new features in the near future for front-end developers. If you're like me and you use Node for lightweight server-side apps, with rich client-side logic, then this will appeal to you.
The changelog for version 3 is actually very good and you really should read the full thing if you manage Node projects or distribute modules with npm.
io.js 2.3.1 was also released this week. One of the big changes in this release is performance improvements for require:
module: The number of syscalls made during a require() have been significantly reduced again (see #1801 from v2.2.0 for previous work), which should lead to a performance improvement (Pierre Inglebert) #1920.
This sounds very nice for large projects.
NodeDay is a conference being held in London on 26th June (this Friday), for free! Speakers include Lin Clark, who writes npm's excellent blog posts, and programmers from the BBC, Red Hat, and other companies that are using Node for interesting things.
nodeday is a Node.js conference by the enterprise, for the enterprise. Now in its second year, this one-day industry conference brings together people from companies that have adopted or are planning to adopt Node.js, and focuses on the issues that these companies face. It gives participants a forum to discuss and share their experiences with Node.js, share advice, tips and tricks, and drive forward both the technology and the community.
I apologise for not writing about this sooner, but I only just found out about it! If you have conferences you want me to cover on DailyJS, you can use the contact forms or message me on Twitter (@alex_young).
I really wanted to go to NodeDay but I can't make it this time.
This is a framework for building data layers that map directly to REST APIs. It's a bit like ORM for HTTP. It comes with base classes for routing, REST resources, models, and validation, and the models can serialise data to RethinkDB. To talk to other databases a new base model class would have to be written.
To me this looks like C# without the extra syntax for strong typing. The validation API looks similar to what you might have seen before with modules like Mongoose.
I like the idea of object-resource mapping at the HTTP level. In Node web apps we seem to spend a lot of time thinking about HTTP servers and APIs, so this feels like it could reduce the amount of boilerplate required to interface from that step to the database layer.
Are you a fan of Node's crypto API? Like the other core modules, it tries to strike a balance between providing the bare minimum and just enough abstraction. You shouldn't need to be a cryptography expert to use it, but at the same time it doesn't want to be too limiting. However, whenever I need to quickly md5 or sha1 something, it feels like too much work. I also find it a little difficult to explain to beginners.
Sindre Sorhus has written a module called hasha (GitHub: sindresorhus/hasha, License: MIT, npm: hasha). It provides a more friendly wrapper around crypto that uses sensible defaults. To get a hex encoding of a string, you can just use hasha(new Buffer('unicorn')). If you want to hash a file, then use hasha.fromFile with a callback. You can supply the desired hashing algorithm as well:
Rozu (GitHub: avoidwork/rozu, License: BSD-3-Clause, npm: rozu) by Jason Mulligan is a webhook API server that uses io.js or Node, MongoDB, and Redis. MongoDB is used for persistence, and the Redis pub/sub API is used for inbound events.
It's built with tenso for the REST API, and keigai for data storage with MongoDB. It can both send and receive webhook events: a logged-in user makes a POST to /send to send an event, and inbound JSON payloads are received on /receive. Receiving events uses a token, so it's easy to wire up with sites like GitHub.
Once you've installed it you can create an account by making a GET request to /register. If you visit / (whether signed in or not) you'll get a list of API methods.
The problem is that we use several different packages to build our application, taking full advantage of the awesome npm private modules that were recently released. It has become increasingly hard to keep track of which environmental variables are required to make the project function, or are just generally available, because they're all accessed at different levels of the application's dependency tree.
With environmentalist, you define a JSON file that specifies each environmental variable:
```json
"description": "Used to compute the password hash of users",
```
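That fragment comes from a larger definition. A complete file might look like this (the variable names and values here are hypothetical, so check the module's readme for the exact schema):

```json
[
  {
    "name": "PASSWORD_SALT",
    "description": "Used to compute the password hash of users",
    "required": true
  },
  {
    "name": "PORT",
    "description": "HTTP port the server listens on",
    "default": "3000"
  }
]
```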
Being able to include a description is already useful: in my experience, most people export (or heroku config:set) a variable and then forget about it. I also like that you can set a default value and the required state. The name is the only required field, so if the values are obvious and self-documenting you can keep things simple.
When you run environmentalist in the command-line, you'll see a table that shows all of the environmental variables in the current package and its dependencies, which means you can construct projects from submodules with environmentalist JSON files and it'll dig them all out.
If you've created a Node application but you want to support plugins, what do you do? package.js (GitHub: s-a/package.js, License: MIT and GPL, npm: package.js) by Stephan Ahlf is designed to allow you to open up your project to plugin developers through npm or other package managers.
It allows you to load modules from a set of paths, and then ensure each module has a property (expectedPackageIdentifier) in its package.json. It's a small module with unit tests and an example in the readme.