On Fri, Sep 21, 2012 at 9:39 AM, Andrew Otto <otto@wikimedia.org> wrote:
Each npm package becomes a deb package and can be installed through apt-get
This sounds tricky, especially since node packages seem to change so often.  From hearing David talk about it, it sounds like dependency versions really matter to Limn.  
But in a Debian package you can explicitly specify the dependencies and their versions, so you would have the same level of control as with npm install. However, my first option was to make a custom rules file that would just copy all the dependencies, tar them, and create one big regular .deb package :)
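For instance, a debian/control stanza can pin versions much like npm does (package names and version numbers below are purely illustrative, not Limn's actual dependencies):

```
Package: limn
Version: 0.1.0-1
Architecture: all
Depends: nodejs (>= 0.6.0), node-express (= 2.5.11), node-d3 (= 2.9.1)
Description: Limn dashboard server
 Bundles the Limn node.js app; dependency versions are pinned above.
```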
D



On Sep 21, 2012, at 9:33 AM, Diederik van Liere <dvanliere@wikimedia.org> wrote:

It would be totally awesome if we could automate the conversion of node.js apps into .deb packages.

I can think of two alternative methods:
1) Create a custom rules file that basically contains a number of copy commands; the other Debian packaging steps remain the same.

2) Create a DAG (directed acyclic graph) using package.json and convert npm packages to .deb packages using https://github.com/jordansissel/fpm/wiki/ConvertingNodeNPM
Each npm package becomes a deb package and can be installed through apt-get
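For option 1, the debian/rules file might be little more than a copy step (the package name and paths here are illustrative, assuming dh-style packaging):

```
#!/usr/bin/make -f
# Sketch: no build step, just copy the app tree (including a
# pre-populated node_modules) into the package root.
%:
	dh $@

override_dh_auto_install:
	mkdir -p debian/limn/usr/lib/limn
	cp -r server.js lib node_modules debian/limn/usr/lib/limn/
```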

Definitely curious to see if https://github.com/jordansissel/fpm/wiki/ConvertingNodeNPM works.
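The dependency walk behind option 2 could be sketched like this: read each package.json, visit dependencies first, and emit packages in an order where every package comes after the packages it depends on. The manifests below are an in-memory stand-in (a real version would read node_modules/<name>/package.json), and the package names and versions are hypothetical:

```javascript
// Walk the dependency DAG declared in package.json files and emit
// packages dependency-first, so each npm package can be converted to a
// .deb before anything that depends on it.
function depOrder(rootName, manifests) {
  const ordered = [];
  const seen = new Set();
  function visit(name) {
    if (seen.has(name)) return;           // each node visited once (it's a DAG)
    seen.add(name);
    const deps = (manifests[name] && manifests[name].dependencies) || {};
    Object.keys(deps).forEach((dep) => visit(dep));  // dependencies first
    ordered.push(name);
  }
  visit(rootName);
  return ordered;
}

// Hypothetical example: limn depends on express and d3; express on connect.
const manifests = {
  limn:    { dependencies: { express: '2.5.x', d3: '2.9.x' } },
  express: { dependencies: { connect: '1.8.x' } },
  connect: { dependencies: {} },
  d3:      { dependencies: {} },
};
// Prints the packages in dependency-first order.
console.log(depOrder('limn', manifests).join(' '));
```

Each emitted name could then be handed to something like `fpm -s npm -t deb <name>`, per the wiki page above.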

D

On Fri, Sep 21, 2012 at 9:21 AM, Andrew Otto <otto@wikimedia.org> wrote:
Cool!

Ori, I think David is right.  If anything is going to be puppetized on a live production system, ops is going to want to see a .deb package for it.  That's the reason I stopped working on moving reportcard over to stat1001.  For a nodejs .deb, I think you're going to have to npm install all of your dependencies locally and then create a tarball of that to use as the frozen upstream source release.  After that, I'm not so sure.  Heh, good luck!  Lemme know if I can help.
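Roughly, freezing the upstream source could look like the following (the project name and version are hypothetical; the npm install step is shown as a comment because it needs network access):

```shell
# Sketch: vendor the npm dependencies into node_modules, then tar the
# whole tree as the frozen upstream source for the .deb build.
set -e
mkdir -p limn
printf '{ "name": "limn", "version": "0.1.0" }\n' > limn/package.json
# In the real workflow you would run `npm install` inside limn/ here,
# so node_modules is fully populated before the tarball is made.
tar czf limn_0.1.0.orig.tar.gz limn/
tar tzf limn_0.1.0.orig.tar.gz
```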

Once you've got a .deb, then I would be happy to help with the puppetization and installation on stat1001.


On Sep 20, 2012, at 7:10 PM, Ori Livneh <ori@wikimedia.org> wrote:

> Hello analytics,
>
> I've set up an article edit / insert feed at http://kubo.wmflabs.org/editstream.html. It's receiving updates using a node.js WebSockets server running on stat1. I'd like to productionize it and wanted to solicit your input on how to do it right. I think it'd be useful to provide this stream as a service to the community.
>
> Each edit event is ~300 bytes of gzip-compressed JSON data. With ~140,000 edits a day, the bandwidth per client is about 0.5 KB/s (~4 kbit/s). No filtering or buffering happens on the server, so I think it'll scale quite well. Should I simply submit a puppet patch to configure this service to run on stat1001?
>
> It'd be good to map the service onto a URL on bits, so that it's easily accessible from JS code running on Wikipedia.
>
> Thoughts? Let me know!
>
> Thanks,
> Ori
>
> --
> Ori Livneh
> ori@wikimedia.org
>
>
>
> _______________________________________________
> Analytics mailing list
> Analytics@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/analytics
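As a sanity check on the numbers quoted above (~300 bytes per event, ~140,000 edits/day):

```javascript
// Back-of-envelope check of the per-client bandwidth for the edit feed.
const bytesPerEvent = 300;
const eventsPerDay = 140000;
const secondsPerDay = 86400;

const bytesPerSec = (bytesPerEvent * eventsPerDay) / secondsPerDay;
// ~486 B/s, i.e. roughly 0.5 KB/s, or about 3.9 kbit/s.
console.log(`${bytesPerSec.toFixed(0)} B/s = ${((bytesPerSec * 8) / 1000).toFixed(1)} kbit/s`);
```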

