Hi everyone!
Wikimedia is releasing a new service today: EventStreams
<https://wikitech.wikimedia.org/wiki/EventStreams>. This service allows us
to publish arbitrary streams of JSON event data to the public. Initially,
the only stream available will be good ol’ RecentChanges
<https://www.mediawiki.org/wiki/Manual:RCFeed>. This event stream overlaps
functionality already provided by irc.wikimedia.org and RCStream
<https://wikitech.wikimedia.org/wiki/RCStream>. However, this new service
has advantages over these (now deprecated) services.
1. We can expose more than just RecentChanges.
2. Events are delivered over streaming HTTP (chunked transfer) instead of
IRC or socket.io. This requires less client-side code and fewer special
routing cases on the server side.
3. Streams can be resumed from the past. By using EventSource, a
disconnected client will automatically resume the stream from where it left
off, as long as it resumes within one week. In the future, we would like
to allow users to specify historical timestamps from which they would like
to begin consuming, if this proves safe and tractable.
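As a rough sketch of how that resumption works: EventSource is built on the
Server-Sent Events wire format, where each message carries an `id` field
that a reconnecting client echoes back in a `Last-Event-ID` header so the
server can pick up where it left off. The parser below follows the SSE
field names from the spec; the sample payload and its field values are
illustrative, not a verbatim EventStreams message.

```python
import json

def parse_sse(text):
    """Parse Server-Sent Events wire format into (last_id, event, data) tuples.

    Messages are blocks of `field: value` lines separated by blank lines;
    `data:` lines accumulate, and `id:` sets the value a reconnecting
    client sends back in the Last-Event-ID header to resume the stream.
    """
    messages = []
    event_id, event_type, data_lines = None, "message", []
    for line in text.splitlines():
        if line == "":  # a blank line terminates the current message
            if data_lines:
                messages.append((event_id, event_type, "\n".join(data_lines)))
            event_type, data_lines = "message", []
            continue
        field, _, value = line.partition(":")
        value = value.lstrip(" ")
        if field == "id":
            event_id = value
        elif field == "event":
            event_type = value
        elif field == "data":
            data_lines.append(value)
    return messages

# Illustrative payload (shape only, not a real recentchange event):
sample = (
    "event: message\n"
    'id: [{"topic":"example","offset":42}]\n'
    'data: {"wiki": "enwiki", "type": "edit"}\n'
    "\n"
)
msgs = parse_sse(sample)
```

In practice a browser's built-in EventSource, or an SSE client library,
handles all of this (including the automatic reconnect) for you; the point
is only that resumption is a property of the protocol, not something each
client has to reimplement.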
I did say deprecated! Okay okay, we may never be able to fully deprecate
irc.wikimedia.org. It’s used by too many (probably sentient by now) bots
out there. But we do plan to obsolete RCStream, and to turn it off within a
reasonable amount of time. The deadline iiiiiis July 7th, 2017. All
services that rely on RCStream should migrate to the HTTP-based
EventStreams service by that date. We are committed to assisting you in
this transition, so let us know how we can help.
Unfortunately, unlike RCStream, EventStreams does not yet support server-side
event filtering (e.g. by wiki). Whether and how this should be done
is still under discussion <https://phabricator.wikimedia.org/T152731>.
The RecentChanges data you are used to remains the same, and is available
at https://stream.wikimedia.org/v2/stream/recentchange. However, we may
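Until server-side filtering lands, a client can simply JSON-decode each
event's `data` payload and filter locally. A minimal sketch, assuming the
events carry a top-level `wiki` field (the other sample fields below are
illustrative):

```python
import json

def filter_by_wiki(raw_events, wiki):
    """Yield decoded events whose `wiki` field matches; skip bad payloads."""
    for raw in raw_events:
        try:
            event = json.loads(raw)
        except ValueError:
            continue  # tolerate malformed or partial data lines
        if event.get("wiki") == wiki:
            yield event

# Illustrative payloads, not verbatim recentchange events:
raw = [
    '{"wiki": "enwiki", "type": "edit", "title": "Example"}',
    '{"wiki": "dewiki", "type": "log", "title": "Beispiel"}',
]
en_only = list(filter_by_wiki(raw, "enwiki"))
```

The obvious cost of this approach is bandwidth: every client downloads the
full firehose and discards most of it, which is exactly why server-side
filtering is being discussed in the task linked above.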
have something different for you, if you find it useful. We have been
internally producing new MediaWiki-specific events
<https://github.com/wikimedia/mediawiki-event-schemas/tree/master/jsonschema…>
for a while now, and could expose these via EventStreams as well.
Take a look at these events, and tell us what you think. Would you find
them useful? How would you like to subscribe to them? Individually as
separate streams, or would you like to be able to compose multiple event
types into a single stream via an API? These things are all possible.
I asked for a lot of feedback in the above paragraphs. Let’s try to
centralize this discussion on the mediawiki.org EventStreams talk page
<https://www.mediawiki.org/wiki/Talk:EventStreams>. In summary, the
questions are:
- What RCStream clients do you maintain, and how can we help you migrate
to EventStreams? <https://www.mediawiki.org/wiki/Topic:Tkjkee2j684hkwc9>
- Is server-side filtering, by wiki or arbitrary event field, useful to
you? <https://www.mediawiki.org/wiki/Topic:Tkjkabtyakpm967t>
- Would you like to consume streams other than RecentChanges?
<https://www.mediawiki.org/wiki/Topic:Tkjk4ezxb4u01a61> (Currently
available events are described here
<https://github.com/wikimedia/mediawiki-event-schemas/tree/master/jsonschema…>.)
Thanks!
- Andrew Otto
It's worth noting that MCS is a collection of services used by the mobile
team. It includes endpoints such as `feed` (
https://en.wikipedia.org/api/rest_v1/#!/Feed). Why not put `summaries` in
there too?
On Thu, Jun 22, 2017 at 6:52 AM Sam Smith <samsmith(a)wikimedia.org> wrote:
> It's only just occurred to me that I've been making a serious mistake in
> conflating RESTBase with RESTful services like MCS in my recent
> communications, up to and including my initial email.
>
> On Thu, Jun 22, 2017 at 2:43 PM, Marko Obrovac <mobrovac(a)wikimedia.org>
> wrote:
>
>> While it could be done in RESTBase as well, I think that this is not a
>> good long-term solution, as it introduces a dependency on the Services team
>> for something whose output you ultimately own.
>>
>
> This is an excellent point.
>
> With the above in mind, I think that the ideal solution is creating a new
> service for generating page summaries that can be consumed by multiple
> platforms. Just as with TextExtracts, page summaries are distinct from MCS.
> It's up to the Reading Web team to decide whether they want to implement it
> in Node or as a new MediaWiki API module that lives in the Popups extension.
>
> -Sam
>
> --
> IRC (Freenode): phuedx
> Timezone: BST (UTC+1)
>
Hello, I am investigating the June 13, 2017 incident involving ORES as a
volunteer for Wikimedia AI. I was wondering whether there is a way for
PDFRender to consume fewer server resources when it runs on the same
server as ORES, so as to keep ORES from failing. If not, is it possible to
move either ORES or PDFRender to different servers? If you have any
questions, comments, or concerns, please reply to me here or in #wikimedia-ai.
Thanks!
Zppix
Volunteer developer for WMF
enwp.org/User:Zppix