> Each npm package becomes a deb package and can be installed through apt-get

This sounds tricky, especially since node packages seem to change so often. From hearing David talk about it, it sounds like dependency versions really matter to Limn.
On Sep 21, 2012, at 9:33 AM, Diederik van Liere <dvanliere@wikimedia.org> wrote:

It would be totally awesome if we can automate the conversion of node.js apps into .deb packages. I can think of two alternative methods:

1) Create a custom rules file that basically contains a number of copy commands; the other Debian packaging steps remain the same.
2) Create a DAG (directed acyclic graph) using package.json and convert npm packages to deb packages using https://github.com/jordansissel/fpm/wiki/ConvertingNodeNPM. Each npm package becomes a deb package and can be installed through apt-get.

Definitely curious to see if https://github.com/jordansissel/fpm/wiki/ConvertingNodeNPM works,
D
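A minimal sketch of what method 2 could look like, assuming npm and fpm (gem install fpm) are available on the path. The recursive walk over the dependency DAG and the use of "npm view" to read each package's dependencies are illustrative choices, not a tested pipeline:

    # Rough sketch of method 2: walk the dependency DAG via the npm
    # registry and hand each package to fpm, so dependencies are
    # converted before the packages that need them.
    import json
    import subprocess

    seen = set()

    def deps_of(name):
        # "npm view <pkg> dependencies --json" prints the dependency map,
        # or nothing if the package has no dependencies.
        out = subprocess.check_output(
            ["npm", "view", name, "dependencies", "--json"]).decode().strip()
        return list(json.loads(out)) if out else []

    def convert(name):
        if name in seen:              # visit each node of the DAG only once
            return
        seen.add(name)
        for dep in deps_of(name):
            convert(dep)              # depth-first: children before parents
        # fpm fetches the package from the npm registry and emits a .deb
        subprocess.check_call(["fpm", "-s", "npm", "-t", "deb", name])

    with open("package.json") as f:
        for dep in json.load(f).get("dependencies", {}):
            convert(dep)

On Fri, Sep 21, 2012 at 9:21 AM, Andrew Otto <otto@wikimedia.org> wrote: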
Cool!
Ori, I think David is right. If anything is going to be puppetized on a live production system, ops is going to want to see a .deb package for it. That's the reason I stopped working on moving reportcard over to stat1001. For a nodejs .deb, I think you're going to have to npm install all of your dependencies locally and then create a tarball of that to use as the frozen upstream source release. After that, I'm not so sure. Heh, good luck! Lemme know if I can help.
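A minimal sketch of that freeze step, assuming GNU tar and a hypothetical package name and version ("limn" 0.1.0, placeholders only):

    # Sketch of the freeze step: vendor the dependencies, then snapshot
    # the whole tree as the frozen upstream source tarball.
    import subprocess

    # Pull every dependency into ./node_modules so the later Debian
    # build needs no network access.
    subprocess.check_call(["npm", "install"])

    # Debian convention: the orig tarball lives one directory up and
    # unpacks into <name>-<version>/.
    subprocess.check_call([
        "tar", "-czf", "../limn_0.1.0.orig.tar.gz",
        "--exclude=.git",
        "--transform", "s,^\\.,limn-0.1.0,",
        ".",
    ])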
Once you've got a .deb, then I would be happy to help with the puppetization and installation on stat1001.
> Hello analytics,
>
> I've set up an article edit / insert feed at http://kubo.wmflabs.org/editstream.html. It's receiving updates using a node.js WebSockets server running on stat1. I'd like to productionize it and wanted to solicit your input on how to do it right. I think it'd be useful to provide this stream as a service to the community.
>
> Each edit event is ~300 bytes of gzip-compressed JSON data. With ~140,000 edits a day, that works out to roughly 0.5 KB/s per client (300 B × 140,000 / 86,400 s ≈ 490 B/s, about 4 kbps). No filtering or buffering happens on the server, so I think it'll scale quite well. Should I simply submit a puppet patch to configure this service to run on stat1001?
>
> It'd be good to map the service onto a URL on bits, so that it's easily accessible from JS code running on Wikipedia.
>
> Thoughts? Let me know!
>
> Thanks,
> Ori
>
> --
> Ori Livneh
> ori@wikimedia.org
>
>
>
_______________________________________________
Analytics mailing list
Analytics@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/analytics