There are occasional community discussions about how to preserve Wikipedia
(and hopefully the sister projects) for the very long term, even in the
event of various kinds of global disasters. This archival technology might be
of interest:
http://phys.org/news/2016-02-eternal-5d-storage-history-humankind.html
Quoting from the article: "As a very stable and safe form of portable
memory, the technology could be highly useful for organisations with big
archives, such as national archives, museums and libraries, to preserve
their information and records."
I haven't studied this technology in any depth so I can't endorse it, but
it looks like it might be of interest to archivists, preservationists, and
contingency planners.
Pine
The suggestion has been raised (
https://www.mediawiki.org/wiki/Topic:Tbyqbjcuihhkhtk8) that one of the
Topics for the upcoming Developer Summit be the Community Wishlist.
It seems to me that the Community Wishlist is still not completely embraced
by engineering/devs, perhaps partly because some of the items are
impossible, or already on a roadmap, or have priorities that are out of
sync with their implementation difficulty. It is excellent work by the
Community Tech team that somehow still feels "not completely integrated".
Perhaps one way to structure a "wishlist" topic at the dev summit would be
to collaborate to improve the 'status' category of
https://meta.wikimedia.org/wiki/2015_Community_Wishlist_Survey/Results. It
would be helpful to have an engineering assessment for each wishlist item
detailing either:
(a) this is being actively worked on now by WMF staff
(b) this is on a roadmap for (roughly) XYZ date (with a link to the
roadmap),
(c) this depends on some other prior work (which is on a roadmap)
(d) this is technically sound but not a priority of the WMF (for
<reasons>, spelled out) so we are eager for community assistance
(e) there is serious disagreement about how to best accomplish this,
technically
(f) there is serious disagreement about how to best accomplish this,
non-technically (UX, social factors, mission creep, ongoing maintenance,
community A disagrees with community B, etc)
(g) this is, in the judgement of engineering, impossible or unwise.
It seems that this has been done for the top ten wishlist items, but we
could collaborate on filling out details on more of the items.
A follow-up session could concentrate on items in categories (d) and (e),
attempting to resolve roadblocks. Category (f) would need non-engineering
participation, perhaps at the next Wikimania.
--scott
--
(http://cscott.net)
The Wikimedia Developer Summit will be taking place in San Francisco,
California, USA between January 9th and 11th.
Registration is now open. The call for participation will open as soon as
the main topics of the event are defined. You are invited to join the
discussion.
See the wiki for details:
https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit
The deadline to request travel sponsorship is Monday October 24th.
Hope to see you in San Francisco!
Hi!
I'd like to raise the topic of handling change notifications and the
freshness of wiki page data in relation to page props.
First, a little background on how I arrived at the issue. I maintain the
Wikidata Query Service, which updates from Wikidata using the recent
changes API and the RDF export format for Wikidata pages. Recently, we
started using certain page properties, such as link and statement counts.
This is when I discovered the issue: the page properties are not updated
when the page (Wikidata item) is edited, but later, as I understand it,
by a deferred job.
This leads to a situation where, when I have a recent changes entry and I
look at the RDF export page - which now contains data derived from page
props - I cannot know whether the page props data is up to date. Moreover,
when the job updates the page props some unknown and undefined time later,
I get no notification, since the modification is not reflected in recent
changes. This makes using information derived from page props very hard:
you never know whether the data is stale or whether the data in page props
matches the data in the page.
The problem is described in more detail in
https://phabricator.wikimedia.org/T145712
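The race can be illustrated with a toy simulation (hypothetical names, not
actual MediaWiki internals). Note that the toy model records which revision
the props reflect, which is exactly the piece of information the real
page_props table does not expose:

```python
from collections import deque

# Toy model of the race: editing a page bumps its revision immediately,
# but the page props are refreshed later by a deferred job.

class ToyWiki:
    def __init__(self):
        self.revision = {}      # page -> latest revision number
        self.page_props = {}    # page -> (props dict, revision they reflect)
        self.job_queue = deque()

    def edit(self, page, statement_count):
        rev = self.revision.get(page, 0) + 1
        self.revision[page] = rev
        # The props update is deferred to a job, as described above.
        self.job_queue.append((page, rev, {"statements": statement_count}))
        return rev

    def run_jobs(self):
        while self.job_queue:
            page, rev, props = self.job_queue.popleft()
            self.page_props[page] = (props, rev)

    def props_are_current(self, page):
        entry = self.page_props.get(page)
        return entry is not None and entry[1] == self.revision[page]

wiki = ToyWiki()
wiki.edit("Q42", statement_count=10)
wiki.run_jobs()
wiki.edit("Q42", statement_count=11)       # second edit; job not run yet

stale = not wiki.props_are_current("Q42")  # props still reflect revision 1
wiki.run_jobs()
fresh = wiki.props_are_current("Q42")      # props caught up after the job
```

A consumer reading between the edit and the job run sees props derived from
the previous revision, with no way to detect that from the data itself.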
I'd like to find a solution for it, but I'm not sure how to proceed.
The data specific to this case can be easily generated from the data
already present in memory during the page update, but I assume there
were some reasons why it was deferred.
We could emit some kind of notification when updating page props, though
that would probably significantly increase the number of notifications and
thus slow the updates. Also, in some cases the second notification may not
be necessary, since the page props were updated before I processed the
first one - but I currently have no way of knowing that.
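If a props-update notification carried the revision it was computed from (a
hypothetical event shape, not something the current API provides), the
consumer could drop the redundant second notification itself; a sketch:

```python
# Consumer-side dedup, assuming (hypothetically) that every event
# carries the revision number it was derived from.

def should_reprocess(seen, event):
    """Return True if this event advances our view of the page."""
    page, rev = event["page"], event["rev"]
    if seen.get(page, -1) >= rev:
        return False          # already ingested data at least this new
    seen[page] = rev
    return True

seen = {}
events = [
    {"page": "Q42", "rev": 5, "type": "edit"},
    {"page": "Q42", "rev": 5, "type": "props-updated"},  # redundant
    {"page": "Q42", "rev": 6, "type": "edit"},
]
processed = [e for e in events if should_reprocess(seen, e)]
# The props-updated event for revision 5 is skipped, because data for
# that revision was already processed.
```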
Any advice on how to solve this issue?
--
Stas Malyshev
smalyshev(a)wikimedia.org
Hi,
I'm very excited to announce that Debian and Ubuntu packages for
MediaWiki are now available. These packages will follow the MediaWiki
LTS schedule and currently contain 1.27.1. If you've always wanted to
"sudo apt install mediawiki", then this is for you :)
For Debian users, you can get the mediawiki package for Jessie from the
official Debian repositories using jessie-backports, and it will be
included in the upcoming Stretch release.
Ubuntu users will need to enable a PPA for Xenial or Trusty to set up
the package.
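For Debian Jessie, this follows the standard backports procedure; a sketch
(the Ubuntu PPA name is on the wiki page linked in this mail, so it is not
guessed at here):

```shell
# Enable jessie-backports, then install mediawiki from it.
echo 'deb http://ftp.debian.org/debian jessie-backports main' | \
    sudo tee /etc/apt/sources.list.d/backports.list
sudo apt-get update
sudo apt-get -t jessie-backports install mediawiki
```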
Instructions, links, and help can all be found at
<https://www.mediawiki.org/wiki/User:Legoktm/Packages>. Please let me
know if you have any questions or feedback.
Finally, thanks to Luke Faraone, Faidon Liambotis, Moritz Mühlenhoff,
Max Semenik, Chad Horohoe, Antoine Musso, and all of the beta testers of
the package for making this a reality.
-- Legoktm
Hello!
Grrrit-wm[1] is an IRC bot that reports Gerrit activity to IRC. I've
been maintaining it for a few years now, but I no longer have the
bandwidth to do this. I'm going to remove myself as maintainer
shortly. It already has a bunch of people as maintainers on Tool Labs,
and anyone with +2 on MediaWiki core can merge changes in it.
Hopefully that's all good enough :)
Thanks for all the fish!
[1] https://wikitech.wikimedia.org/wiki/Grrrit-wm
--
Yuvi Panda T
http://yuvi.in/blog