http://devhub.wmflabs.org is a prototype of the "Data and developer hub", a
portal and set of articles and links whose goal is to encourage third-party
developers to use Wikimedia data and APIs. Check it out; your feedback is
welcome! You can comment on the talk page of the project page
https://www.mediawiki.org/wiki/dev.wikimedia.org , or file Phabricator
tickets in the dev.wikimedia.org project [1].
Since December 2013, Moiz Syed and others have discussed creating "a thing"
to expose our APIs and data to developers. When S Page became a WMF tech
writer, he wrote some articles for this on mediawiki.org and, with Quim
Gil, developed a landing page from the wireframe designs [2].
The prototype uses the Blueprint skin and runs on a Labs instance, but the
articles are all regular wiki pages on mediawiki.org that we regularly
import to http://devhub.wmflabs.org
Thanks to everyone who participated in the gestation of this idea!
-- S Page and Quim Gil
== FAQ ==
Q: How can I feature my awesome API or data set?
A: Create a task in the #dev.wikimedia.org and #documentation projects [3]
with "Article" in the title. You can draft an article yourself, following
the guidelines [4].
Q: Yet another site? Arghh!
A: Agreed, T101441 "Integrate new Developer hub with mediawiki.org" [5].
It's a separate site for now in order to present a different appearance.
Q: But why a different appearance? Why a separate skin?
A: Our competition for developer mindshare is sites like
https://developers.google.com/ . We believe looking like a 2000s wiki page
is a *deterrent* to using Wikimedia APIs and data. We hope that many
third-party developers join our communities and eventually contribute to
MediaWiki, but "How to contribute to MediaWiki" [6] is not the focus;
providing free open knowledge is.
Q: Why the Blueprint skin?
A: The Design team (now Reading Design) developed it for the OOUI Living
Style Guide [7] and it has some nice features: a fixed header, and a
sidebar that gets out of the way and combines page navigation and the TOC
of the current page.
Q: So why not use the Blueprint skin on mediawiki.org?
A: Agreed, T93613 "Deploy Blueprint on mediawiki.org as optional and
experimental skin" is a blocker for T101441. We appreciate help with it and
its blockers.
Q: I hate the appearance.
A: That's not a question :) You can forget the prototype exists and view
the same content at
https://www.mediawiki.org/wiki/API:Data_and_developer_hub
Q: What is "dev.wikimedia.org"?
A: http://dev.wikimedia.org will be the well-known shortcut to the landing
page. And dev.wikimedia.org is the project name for this "Data and
developer hub".
Q: I thought dev.wikimedia.org was going to integrate source
documentation/replace doc.wikimedia.org/enumerate all Wikimedia software
projects/cure cancer, what happened?
A: One step at a time. For now, its goal is, to repeat, "to encourage
third-party developers to use Wikimedia data and APIs".
Q: Why are the pages in the API: namespace?
A: That's temporary, they will probably end up in a dev: namespace on
mediawiki.org that uses the Blueprint skin by default (T369).
Q: Where are the talk pages?
A: It's a bug that the sidebar doesn't have a "Discussion" link (T103785).
The talk pages on the prototype all redirect to the talk pages for the
original pages on mediawiki.org, and Flow is enabled on them.
[1]
https://phabricator.wikimedia.org/maniphest/task/create/?projects=dev.wikim…
[2] https://www.mediawiki.org/wiki/Dev.wikimedia.org#Structure
[3]
https://phabricator.wikimedia.org/maniphest/task/create/?projects=dev.wikim…
[4] https://www.mediawiki.org/wiki/dev.wikimedia.org/Contributing
[5] https://phabricator.wikimedia.org/T93613 and its blockers
[6] https://www.mediawiki.org/wiki/How_to_contribute (a fine general entry
point)
[7] http://livingstyleguide.wmflabs.org/
--
=S Page WMF Tech writer
As has been announced several times (most recently at
https://lists.wikimedia.org/pipermail/wikitech-l/2015-April/081559.html),
the default continuation mode for action=query requests to api.php will be
changing to be easier for new coders to use correctly.
*The date is now set:* we intend to merge the change to ride the deployment
train at the end of June. That should be 1.26wmf12, to be deployed to test
wikis on June 30, non-Wikipedias on July 1, and Wikipedias on July 2.
If your bot or script is receiving the warning about this upcoming change
(as seen here
<https://www.mediawiki.org/w/api.php?action=query&list=allpages>, for
example), it's time to fix your code!
- The simplest solution is to include the "rawcontinue" parameter
with your request to continue receiving the raw continuation data (
example
<https://www.mediawiki.org/w/api.php?action=query&list=allpages&rawcontinue=1>).
No other code changes should be necessary.
- Or you could update your code to use the simplified continuation
documented at https://www.mediawiki.org/wiki/API:Query#Continuing_queries
(example
<https://www.mediawiki.org/w/api.php?action=query&list=allpages&continue=>),
which is much easier for clients to implement correctly.
Either of the above solutions may be tested immediately; you'll know it
works when you stop seeing the warning.
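As a concrete illustration, the simplified continuation mode amounts to a
loop that merges each response's "continue" values into the next request's
parameters. A minimal sketch in Python follows; the `fetch` callable is a
placeholder for whatever HTTP layer you use to GET api.php, not part of any
real client library:

```python
def query_continue(fetch, **base_params):
    """Yield each action=query response, following 'continue' markers.

    `fetch` is any callable that takes a dict of request parameters and
    returns the parsed JSON response (e.g. a thin wrapper around an HTTP
    GET of api.php); it is a stand-in, not a real library function.
    """
    params = dict(base_params, format="json")
    params["continue"] = ""  # opt in to the simplified continuation mode
    while True:
        result = fetch(params)
        yield result
        if "continue" not in result:
            break  # the query is complete
        # Carry every continuation value forward into the next request.
        params.update(result["continue"])
```

The same loop works unchanged for any query module; only the base
parameters differ.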
I've compiled a list of bots that have hit the deprecation warning more
than 10000 times over the course of the week May 23–29. If you are
responsible for any of these bots, please fix them. If you know who is,
please make sure they've seen this notification. Thanks.
AAlertBot
AboHeidiBot
AbshirBot
Acebot
Ameenbot
ArnauBot
Beau.bot
Begemot-Bot
BeneBot*
BeriBot
BOT-Superzerocool
CalakBot
CamelBot
CandalBot
CategorizationBot
CatWatchBot
ClueBot_III
ClueBot_NG
CobainBot
CorenSearchBot
Cyberbot_I
Cyberbot_II
DanmicholoBot
DeltaQuadBot
Dexbot
Dibot
EdinBot
ElphiBot
ErfgoedBot
Faebot
Fatemibot
FawikiPatroller
HAL
HasteurBot
HerculeBot
Hexabot
HRoestBot
IluvatarBot
Invadibot
Irclogbot
Irfan-bot
Jimmy-abot
JYBot
Krdbot
Legobot
Lowercase_sigmabot_III
MahdiBot
MalarzBOT
MastiBot
Merge_bot
NaggoBot
NasirkhanBot
NirvanaBot
Obaid-bot
PatruBOT
PBot
Phe-bot
Rezabot
RMCD_bot
Shuaib-bot
SineBot
SteinsplitterBot
SvickBOT
TaxonBot
Theo's_Little_Bot
W2Bot
WLE-SpainBot
Xqbot
YaCBot
ZedlikBot
ZkBot
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
Hi list,
is there a procedure or best practice for automatically migrating old
SVN-era extensions to Git?
Is this something the maintainer of an extension needs to do?
An example would be: https://www.mediawiki.org/wiki/Extension:DateDiff
/planetenxin
Hello,
Zeljko and I will upgrade Jenkins on Wednesday July 8th at 8:00am UTC
(10:00am CET).
There will be roughly half an hour downtime. Zuul will keep queueing the
changes and trigger the jobs whenever Jenkins comes back up.
In case of unexpected side effects, we will revert to the current version.
Expected downtime is half an hour, but I have scheduled a two-hour
maintenance window.
https://phabricator.wikimedia.org/T101884
https://wikitech.wikimedia.org/w/index.php?title=Deployments&diff=169423&ol…
--
Antoine "hashar" Musso
Hello!
In this thread
<https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(proposals)#Userpage_d…>,
there was a discussion about indexing of user space by search engines. In a
nutshell, user space pages are not subject to content policies so that
users can write drafts freely, and having those pages indexed by search
engines like Google is viewed as problematic since those pages can seem
fairly official.
I seem to recall that in the past, user pages were not indexed by search
engines by default. I'm trying to figure out whether something has changed
recently, because I'd prefer to address the root issue rather than pile on
hacks.
Does anyone know of anything that's changed recently that might've changed
the way that search engines index user space?
Thanks,
Dan
--
Dan Garry
Product Manager, Discovery
Wikimedia Foundation
Some of you may have noticed a bot [1] providing reviews for the
MobileFrontend and Gather extensions.
This is a grassroots experiment [2] to see if we can reduce regressions by
running browser tests against every single commit. It's very crude, and
we're going to have to maintain it ourselves, but we see it as a stopgap
until gerrit-bot takes care of this for us.
Obviously we want to do this for all extensions, but we wanted to get
something good enough, even if not scalable, to start exploring the idea.
So far it has caught various bugs for us, and our browser test builds are
finally becoming consistently green, a few Beta Labs flakes aside [3].
Running tests on Beta Labs is still useful, but now we can use it to
identify failures caused by other extensions. Too often we were finding
that our tests failed because we had neglected them.
In case others are interested in how this is working and want to set
one up themselves I've documented this here:
https://www.mediawiki.org/wiki/Reading/Setting_up_a_browser_test_bot
Please let me know if you have any questions, and feel free to edit and
improve this page. If you want to jump into the code that's doing this and
you know Python, check out:
https://github.com/jdlrobson/Barry-the-Browser-Test-Bot
(Patches welcome, and apologies in advance for the code.)
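For the curious, the bot's overall cycle can be sketched as follows. This
is a simplification of the control flow only; `list_open_changes`,
`run_browser_tests`, and `post_review` are hypothetical stand-ins for the
Gerrit access and test-runner pieces, not functions from the actual bot:

```python
def review_cycle(list_open_changes, run_browser_tests, post_review):
    """Run the browser-test suite against each open change and report back.

    All three arguments are caller-supplied callables; this sketch only
    captures the loop, not the real Gerrit integration.
    """
    for change in list_open_changes():
        passed = run_browser_tests(change)  # True if every scenario passed
        # Post a Gerrit-style verdict: +1 on success, -1 on failure.
        post_review(change, "+1" if passed else "-1")
```

Keeping the Gerrit and test-runner pieces injectable like this also makes
the loop trivial to exercise with fakes.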
[1] https://gerrit.wikimedia.org/r/#/q/reviewer:jdlrobson%252Bbarry%2540gmail.c…
[2] https://phabricator.wikimedia.org/T100293
[3] https://integration.wikimedia.org/ci/view/Mobile/job/browsertests-MobileFro…