Hello analytics,
I've set up an article edit / insert feed at http://kubo.wmflabs.org/editstream.html. It's receiving updates using a node.js WebSockets server running on stat1. I'd like to productionize it and wanted to solicit your input on how to do it right. I think it'd be useful to provide this stream as a service to the community.
Each edit event is ~300 bytes of gzip-compressed JSON. At ~140,000 edits a day, that works out to roughly 0.5 KB/s (about 4 kbit/s) per client. No filtering or buffering happens on the server, so I think it'll scale quite well. Should I simply submit a puppet patch to configure this service to run on stat1001?
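For reference, the back-of-the-envelope math:

    300 bytes/event x 140,000 events/day ≈ 42 MB/day
    42 MB/day / 86,400 s/day ≈ 490 bytes/s ≈ 0.5 KB/s (~4 kbit/s) per client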
It'd be good to map the service onto a URL on bits, so that it's easily accessible from JS code running on Wikipedia.
Thoughts? Let me know!
Thanks, Ori
-- Ori Livneh ori@wikimedia.org
This is super awesome! What are the requirements for running it? Does it connect directly to MySQL? If not, the only hurdle is that ops prefers that all production software be installed via packages in our Debian repository. This presents a perhaps unobvious obstacle: any dependencies that would otherwise be installed by npm must also be turned into debs.
At the moment, we have no expertise (to say nothing of best practices) in transforming a node package into a .deb. The automated tools fail monstrously and/or do not handle dependencies.
If you're interested in cracking that nut, by all means, please do! I agree this thing is awesome and would be a huge asset to the community.
\o/
Notice that the stream includes a boolean flag ("using_api") distinguishing revisions made via the API from regular edits.
On Sep 20, 2012, at 4:38 PM, Dario Taraborelli dtaraborelli@wikimedia.org wrote:
\o/
\o/
It's awesome, and works perfectly on Chrome-like browsers. On Firefox 15.0.1, though, it just prints empty gray boxes, probably because of the CSS "@-webkit-keyframes" rules; I'm not sure yet, still testing.
Jonas
On 20/09/2012, Dario Taraborelli dtaraborelli@wikimedia.org wrote:
notice that the stream includes a boolean flag ("using_api") for revisions made via the API versus regular edits.
Cool!
Ori, I think David is right. If anything is going to be puppetized on a live production system, ops is going to want to see a .deb package for it. That's the reason I stopped working on moving reportcard over to stat1001. For a nodejs .deb, I think you're going to have to npm install all of your dependencies locally and then create a tarball of that to use as the frozen upstream source release. After that, I'm not so sure. Heh, good luck! Lemme know if I can help.
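A rough sketch of that flow (untested; the app name, version, and file layout here are hypothetical):

    # vendor the npm dependencies into the source tree
    cd editfeed-0.1.0
    npm install

    # freeze the result as the .orig tarball the Debian tools expect
    cd ..
    tar czf editfeed_0.1.0.orig.tar.gz editfeed-0.1.0/

    # then add a debian/ directory (e.g. with dh_make) and build
    cd editfeed-0.1.0
    debuild -us -uc    # -us -uc: skip package signing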
Once you've got a .deb, then I would be happy to help with the puppetization and installation on stat1001.
It would be totally awesome if we could automate the conversion of node.js apps into .deb packages.
I can think of two alternative methods:
1) Create a custom rules file that basically contains a number of copy commands; the other Debian packaging steps remain the same.
2) Create a DAG (directed acyclic graph) from package.json and convert the npm packages to deb packages using https://github.com/jordansissel/fpm/wiki/ConvertingNodeNPM. Each npm package becomes a deb package and can be installed through apt-get.
Definitely curious to see if https://github.com/jordansissel/fpm/wiki/ConvertingNodeNPM works.
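If it does what that page suggests, the conversion is a one-liner per npm package. A sketch (untested; "socket.io" is just an example package):

    # fpm is distributed as a ruby gem
    gem install fpm

    # convert an npm package into a .deb
    fpm -s npm -t deb socket.io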
D
Each npm package becomes a deb package and can be installed through apt-get
This sounds tricky, especially since node packages seem to change so often. From hearing David talk about it, it sounds like dependency versions really matter to Limn. If we had multiple nodejs apps on the same machine, they each might need different versions of the same dependency, and it'd be difficult to globally install different versions of the same package.
I think 'npm install' might need to be part of something that a nodejs app maintainer does before they tarball it up for a release.
I'm not sure if this is the best way to go about it, but I keep saying 'tarball it up' because a few of the .deb tutorials I have read expect that you are not actually the source code maintainer, but rather a downstream dude who just wants to make a .deb for someone else's code. It might be easier if we try to conform to that expectation. Not sure though.
On Fri, Sep 21, 2012 at 9:39 AM, Andrew Otto otto@wikimedia.org wrote:
Each npm package becomes a deb package and can be installed through apt-get
This sounds tricky, especially since node packages seem to change so often. From hearing David talk about it, it sounds like dependency versions really matter to Limn.
But in a Debian package you can specify the dependencies and versions explicitly, so you would have the same level of control as with npm install. However, my first option was to make a custom rules file that would just copy all the dependencies, tar them up, and create one big regular deb package :)
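For that option the rules file could be little more than a single debhelper override. A sketch (assuming the app ships its node_modules/; "editfeed" and the file names are hypothetical):

    # debian/rules is a Makefile; recipe lines must be indented with tabs
    cat > debian/rules <<'EOF'
    #!/usr/bin/make -f
    %:
    	dh $@

    override_dh_auto_install:
    	mkdir -p debian/editfeed/usr/lib/editfeed
    	cp -a server.js package.json node_modules debian/editfeed/usr/lib/editfeed/
    EOF
    chmod +x debian/rules

D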
But in a debian package you can specify explicitly the dependencies and versions so you would have the same level of control as with npm install.
Not if the dependencies were installed globally as separate packages and you wanted to run multiple nodejs apps (Ori's EditFeed, Limn, etc.) on the same machine. Say Limn depends on node-foo 1.2 and EditFeed on node-foo 2.1. If we create .debs for each nodejs dependency version, we couldn't install both versions of the same package at the same time (well, I think we could, but it gets complicated), and therefore couldn't run both EditFeed and Limn on the same machine.
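Concretely (hypothetical package names; apt keeps only one version of a package installed at a time):

    apt-get install node-foo=1.2    # what Limn needs
    apt-get install node-foo=2.1    # replaces 1.2 for EditFeed -- and breaks Limn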
However, my first option was to make a custom rules file that would just copy all the dependency, tar it and create a regular one big dep package :)
'npm install' by default installs all the dependencies locally, so you wouldn't need a custom rules file to do any copying, just a rule to run npm install. But actually, I think it might be better NOT to run npm install as part of rules. Instead, we should have the maintainer run it manually and have the debian/ stuff treat node_modules/ as part of the author's upstream source, rather than as explicitly handled dependencies.
On Fri, Sep 21, 2012 at 6:33 AM, Diederik van Liere dvanliere@wikimedia.org wrote:
It would be totally awesome if we could automate the conversion of node.js apps into .deb packages.
Thanks, guys. dsc and I talked about this a little and I'm going to give it a shot. We both like using npm and expect to want to deploy node apps periodically, so I think it's worth a look, especially if fpm is as straightforward as it seems. I'm going to try and package the dependencies for both the edit stream and reportcard if I can. If I'm at all successful I'll document the process on wikitech. I'll write a follow-up in this thread regardless.
Hey Ori,
Very much looking forward to your experiences with fpm; if it delivers what it promises, that would make our lives much easier! D