Each npm package becomes a deb package and can be installed through apt-get
This sounds tricky, especially since node packages seem to change so often. From hearing David talk about it, it sounds like dependency versions really matter to Limn. If we had multiple nodejs apps on the same machine, each might need a different version of the same dependency, and it'd be difficult to globally install multiple versions of the same package.
I think 'npm install' might need to be part of what a nodejs app maintainer does before they tarball it up for a release.
I'm not sure if this is the best way to go about it, but I keep saying 'tarball it up' because a few of the .deb tutorials I have read expect that you are not actually the source code maintainer, but rather a downstream dude who just wants to make a .deb for someone else's code. It might be easier if we try to conform to that expectation. Not sure though.
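If we do go that downstream-packager route, I imagine the build side would look roughly like this. The package name, version, and paths here are made-up placeholders, and the generated debian/ files would still need real content; this is the standard Debian flow from memory, not something I've tested against a node app:

    # start from the release tarball the app maintainer produced,
    # with node_modules/ already bundled in (see the npm install step in my earlier mail below)
    tar xzf limn-0.1.0.tar.gz
    cd limn-0.1.0
    dh_make --single -f ../limn-0.1.0.tar.gz   # sets up a debian/ skeleton and the .orig tarball
    # fill in debian/control, debian/rules, and debian/changelog, then:
    debuild -us -uc                            # builds an unsigned .deb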
On Sep 21, 2012, at 9:33 AM, Diederik van Liere dvanliere@wikimedia.org wrote:
It would be totally awesome if we can automate the conversion of node.js apps into .deb packages
I can think of two alternative methods:
1. Create a custom rules file that basically contains a number of copy commands; the other Debian packaging steps remain the same.
2. Create a DAG (directed acyclic graph) from package.json and convert npm packages to deb packages using https://github.com/jordansissel/fpm/wiki/ConvertingNodeNPM. Each npm package becomes a deb package and can be installed through apt-get.
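If fpm's npm support works the way that wiki page describes, the per-package conversion might be as simple as this (using 'express' purely as an example package; I haven't tried it myself):

    gem install fpm             # fpm is distributed as a Ruby gem
    fpm -s npm -t deb express   # fetches the npm package and should emit something like node-express_VERSION_ARCH.deb
    # repeat (or script) this for every node in the dependency graph derived from package.json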
Definitely curious to see if https://github.com/jordansissel/fpm/wiki/ConvertingNodeNPM works,
D
On Fri, Sep 21, 2012 at 9:21 AM, Andrew Otto otto@wikimedia.org wrote:
Cool!
Ori, I think David is right. If anything is going to be puppetized on a live production system, ops is going to want to see a .deb package for it. That's the reason I stopped working on moving reportcard over to stat1001. For a nodejs .deb, I think you're going to have to npm install all of your dependencies locally and then create a tarball of that to use as the frozen upstream source release. After that, I'm not so sure. Heh, good luck! Lemme know if I can help.
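Concretely, I'm picturing something like this for the freeze step (limn-0.1.0 is just a placeholder name and version):

    cd limn-0.1.0
    npm install                             # pulls everything in package.json into node_modules/
    cd ..
    tar czf limn-0.1.0.tar.gz limn-0.1.0    # this tarball becomes the frozen "upstream" source release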
Once you've got a .deb, then I would be happy to help with the puppetization and installation on stat1001.
On Sep 20, 2012, at 7:10 PM, Ori Livneh ori@wikimedia.org wrote:
Hello analytics,
I've set up an article edit / insert feed at http://kubo.wmflabs.org/editstream.html. It's receiving updates using a node.js WebSockets server running on stat1. I'd like to productionize it and wanted to solicit your input on how to do it right. I think it'd be useful to provide this stream as a service to the community.
Each edit event is ~300 bytes of gzip-compressed JSON data. With ~140,000 edits a day, that works out to roughly 0.5 KB/s (about 4 kbit/s) per client, or around 42 MB/day. No filtering or buffering happens on the server, so I think it'll scale quite well. Should I simply submit a puppet patch to configure this service to run on stat1001?
It'd be good to map the service onto a URL on bits, so that it's easily accessible from JS code running on Wikipedia.
Thoughts? Let me know!
Thanks, Ori
-- Ori Livneh ori@wikimedia.org
Analytics mailing list Analytics@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/analytics