This is super awesome! What are the requirements for running it? Does it connect directly
to MySQL? If not, the only obstacle is that ops prefers all production software
installations occur via packages in our Debian repository. This presents a perhaps
non-obvious complication: any dependencies that would otherwise be installed
by npm must also be turned into debs.
At the moment, we have no expertise (to say nothing of best practices) in transforming a
node package into a .deb. The automated tools fail monstrously and/or do not apply to
dependencies.
If you're interested in cracking that nut, by all means, please do! I agree this thing
is awesome and would be a huge asset to the community.
--
David Schoonover
dsc@wikimedia.org
On Thursday, 20 September 2012 at 4:10 PM, Ori Livneh wrote:
Hello analytics,
I've set up an article edit / insert feed at
http://kubo.wmflabs.org/editstream.html.
It's receiving updates using a node.js WebSockets server running on stat1. I'd
like to productionize it and wanted to solicit your input on how to do it right. I think
it'd be useful to provide this stream as a service to the community.
Each edit event is ~300 bytes of gzip-compressed JSON data. With ~140,000 edits a day,
the bandwidth per client works out to roughly 0.5 KB/s (about 4 kbit/s). No filtering or
buffering happens on the server, so I think it'll scale quite well. Should I simply submit
a puppet patch to configure this service to run on stat1001?
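As a sanity check on those figures, here is the back-of-envelope arithmetic (in node, since that's what the server runs on; the inputs are just the numbers quoted above):

```javascript
// Back-of-envelope check of per-client bandwidth for the edit stream,
// using the figures quoted above: ~300 bytes of gzip-compressed JSON
// per edit event, ~140,000 edits per day.
const bytesPerEvent = 300;
const editsPerDay = 140000;
const secondsPerDay = 24 * 60 * 60; // 86,400

// Average throughput a single subscribed client would see.
const bytesPerSecond = (bytesPerEvent * editsPerDay) / secondsPerDay;
const kilobitsPerSecond = (bytesPerSecond * 8) / 1000;

console.log(`${bytesPerSecond.toFixed(0)} bytes/s`);   // ~486 bytes/s, i.e. ~0.5 KB/s
console.log(`${kilobitsPerSecond.toFixed(1)} kbit/s`); // ~3.9 kbit/s
```

So even a few thousand simultaneous clients would only be a few MB/s aggregate, which supports the scaling claim.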
It'd be good to map the service onto a URL on bits, so that it's easily
accessible from JS code running on Wikipedia.
Thoughts? Let me know!
Thanks,
Ori
--
Ori Livneh
ori@wikimedia.org
_______________________________________________
Analytics mailing list
Analytics@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/analytics