Hi, anybody interested in participating at the
Open Help Conference & Sprints
September 26-30
Cincinnati, Ohio, USA
http://conf.openhelp.cc/
A Wikimedia delegation attended a couple of years ago with a Wikipedia &
user help focus, and they were happy with the event.
http://blog.wikimedia.org/2013/07/03/wikipedians-open-help-conference/
Lately we are focusing on users of our APIs, datasets, tools,
infrastructure... Would it make sense to organize something at that event?
Wikimedia volunteers, remember that we might be able to support your travel
expenses.
https://meta.wikimedia.org/wiki/Grants:TPS
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hi,
I want to build a tool that will generate statistics for my native
wiki. Before starting to build it, I searched for an existing one.
The "Super Counter"[1] has some of the features I want to implement in my
tool.
So is there any way to use the "Super Counter" as an API? If so, can anyone
please point me to the documentation?
[1] - https://tools.wmflabs.org/supercount/index.php
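For context, here is roughly the kind of data I want the tool to report: a
minimal sketch, not Super Counter code; it only uses the standard MediaWiki
Action API (which tools like Super Counter build on), and the wiki URL is
just an example to replace with your own wiki's api.php:

    import requests

    def user_stats(username, api_url="https://en.wikipedia.org/w/api.php"):
        """Fetch a user's edit count and registration date from the
        MediaWiki Action API (action=query, list=users)."""
        params = {
            "action": "query",
            "list": "users",
            "ususers": username,
            "usprop": "editcount|registration",
            "format": "json",
        }
        data = requests.get(api_url, params=params).json()
        return data["query"]["users"][0]

    print(user_stats("Example"))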
--
*Nasir Khan Saikat*
www.nasirkhn.com
Hi all,
Starting the week of June 8th we'll be transitioning our MediaWiki +
Extensions deployment cadence to a shorter/simpler one. This will begin
with 1.26wmf9.
New cadence:
Tuesday: New branch cut, deployed to test wikis
Wednesday: deployed to non-Wikipedias
Thursday: deployed to Wikipedias
This is not only a lot simpler to understand (no more "wait, we deploy twice
on Wednesday?") but it also shortens the time to get code to everyone (2 or
3 days from the branch cut, depending on how you count).
== Transition ==
Transitions from one cadence to another are hard. Here's how we'll be
doing this transition:
Week of June 1st (next week):
* We'll complete the wmf8 rollout on June 3rd
* However, we won't be cutting wmf9 on June 3rd
Week of June 8th (in two weeks):
* We'll begin the new cadence with wmf9 on Tuesday June 9th
I hope this helps our users and developers get great new features and
fixes faster.
Greg
endnotes:
* The task: https://phabricator.wikimedia.org/T97553
* I'll be updating the relevant documentation before the transition
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
+external mobile and wikitech
Shoot. I meant to send this to the external list. For those of you just
joining us, we recently got an email from Google letting us know that some
of our pages are now failing the mobile-friendly test, which has an adverse
impact on our search results. It appears that most of our pages are also
blocking style info.
Google doesn't offer much insight into penalties, but if they're sending us
the email then it has, or will have, some impact. I'd like to see if we can
better understand the other side of the equation: what is the cost of fixing
it? I think Jon's questions below are the ones to start with.
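For anyone who wants to check the robots.txt side quickly, here is a small
sketch (mine, not from the report) that uses Python's standard
urllib.robotparser to ask whether Googlebot is allowed to fetch a load.php
URL like the ones Google's report mentions:

    import urllib.robotparser

    # Fetch and parse the live robots.txt, then ask whether the Googlebot
    # user agent is allowed to fetch a ResourceLoader (load.php) URL.
    rp = urllib.robotparser.RobotFileParser("https://en.wikipedia.org/robots.txt")
    rp.read()

    css_url = "https://en.wikipedia.org/w/load.php?modules=startup&only=scripts"
    print(rp.can_fetch("Googlebot", css_url))  # False if /w/ is disallowed for Googlebot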
-J
On Fri, May 29, 2015 at 1:35 PM, Jon Robson <jrobson(a)wikimedia.org> wrote:
> It's all in the report. We block /w/ in robots.txt, and always have.
>
> There have been a bunch of changes to improve the performance of the site
> overall which might have led to that. Mailing this to the public wikitech
> mailing list would give you a better idea.
>
> Our site /is/ mobile friendly; it's just that we tell Google not to load
> styles from /w/load.php, so there's no need to panic.
>
> The question is: what is the penalty for us failing this test? Does it
> impact our Google search rankings?
>
> The fix is trivial: update robots.txt. But first, the question is why we
> are blocking scripts and styles on that URL in the first place.
> On 29 May 2015 1:17 pm, "Jon Katz" <jkatz(a)wikimedia.org> wrote:
>
>> Hi Readership team and broader community,
>> Have we recently made any changes that could cause this warning to appear
>> about Googlebot not being able to access our site?
>> I consider this to be a very serious issue.
>> The examples below are not mobile, but the same issue applies when you try
>> an en.m. version of the pages.
>>
>> Best,
>> Jon
>>
>>
>> ---------- Forwarded message ----------
>> From: Wes Moran <wmoran(a)wikimedia.org>
>> Date: Fri, May 29, 2015 at 12:47 PM
>> Subject: Mobile Friendly
>> To: Jon Katz <jkatz(a)wikimedia.org>
>> Cc: Adam Baso <abaso(a)wikimedia.org>
>>
>>
>> Jon,
>>
>> Google notified us of the following...
>>
>> "We recently noticed that there was a change with how you embed CSS & JS
>> which results in us not being able to use the CSS & JS to recognize what
>> the page looks like. That's making some of your pages fail the
>> mobile-friendly-test, for example. You used to load CSS & JS from
>> bits.wikimedia.org, but now they're loaded through /w/load.php?...
>> directly from the Wikipedia host, where that path is blocked by robots.txt.
>>
>> You can see how we render the pages with Fetch as Google in Webmaster
>> Tools / Search Console, you can also see most of that with the test-page at:
>>
>> https://www.google.com/webmasters/tools/mobile-friendly/?url=http%3A%2F%2Fe…
>>
>> Some of the pages still pass the test there (example
>> <https://www.google.com/webmasters/tools/mobile-friendly/?url=http%3A%2F%2Fe…>),
>> but the CSS is broken there too since it's blocked. "
>>
>> Any ideas what could be causing this?
>>
>> Regards,
>> Wes
>>
>>
>> _______________________________________________
>> reading-wmf mailing list
>> reading-wmf(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/reading-wmf
>>
>>
> _______________________________________________
> reading-wmf mailing list
> reading-wmf(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/reading-wmf
>
>
On Fri, May 29, 2015 at 2:09 AM, Brian Wolff <bawolff(a)gmail.com> wrote:
> Shouldn't such [pywikibot] tests be run against the beta wiki, not testwiki?
Primarily this hasn't happened because pywikibot doesn't have a family
file for the beta wiki, but that is our issue, so I've done a simple
test run on the beta wiki.
Four initial beta wiki problems:
- no HTTPS
(not nice: test accounts must be created and accessed using passwords
that are sent essentially in cleartext, so reusing the password of an
account with the same name on the real wikis is a security risk)
- invalid siteinfo interwiki map; it includes entries for wikis that do
not exist. That means pywikibot can't parse wikitext links, as it can't
distinguish between iwprefix: and namespace: (the pywikibot team might be
able to reduce the impact of this wiki config problem)
- a paraminfo failure that breaks pywikibot's dynamic API detection;
pywikibot batch-loads the paraminfo for all modules, for example
issuing the following API request as part of initialisation, and it
returns a 503 Service Unavailable:
http://en.wikipedia.beta.wmflabs.org/w/api.php?action=paraminfo&modules=abu…
A little digging shows that the problematic module is fancycaptchareload
http://en.wikipedia.beta.wmflabs.org/w/api.php?action=paraminfo&modules=fan…
"Our servers are currently experiencing a technical problem. This is
probably temporary and should be fixed soon. Please try again in a few
minutes.
If you report this error to the Wikimedia System Administrators,
please include the details below.
Request: GET http://en.wikipedia.beta.wmflabs.org/w/api.php?action=paraminfo&modules=fan…,
from 127.0.0.1 via deployment-cache-text02 deployment-cache-text02
([127.0.0.1]:3128), Varnish XID 855090368
Forwarded for: [your IP], 127.0.0.1
Error: 503, Service Unavailable at Fri, 29 May 2015 07:57:09 GMT "
Could someone investigate this / fix that module?
Pywikibot devs could also help reduce the impact of this type of
problem in the future by falling back from a batch fetch to loading
individual modules, so that buggy modules can be skipped (a sketch of
this fallback follows after the Travis results link below).
- no SUL with the real wikis
(probably the best choice given the lack of HTTPS on the beta cluster,
but it complicates adding the beta wiki to our existing Travis-CI test
matrix, which includes real wikis)
Actual results at
https://travis-ci.org/jayvdb/pywikibot-core/builds/64532033
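For illustration only (this is not pywikibot code, just a sketch of the
fallback mentioned above, using the standard action=paraminfo request
against the beta cluster endpoint):

    import requests

    API = "http://en.wikipedia.beta.wmflabs.org/w/api.php"

    def fetch_paraminfo(modules):
        """Fetch paraminfo for the given modules in one batch; if the batch
        request fails (e.g. with a 503), retry module by module and skip
        the broken ones."""
        def query(mods):
            return requests.get(API, params={
                "action": "paraminfo",
                "modules": "|".join(mods),
                "format": "json",
            })

        batch = query(modules)
        if batch.ok:
            return batch.json().get("paraminfo", {}).get("modules", [])

        collected = []
        for m in modules:
            single = query([m])
            if single.ok:
                collected.extend(single.json().get("paraminfo", {}).get("modules", []))
            # else: skip the buggy module (e.g. fancycaptchareload)
        return collected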
--
John Vandenberg
Mailing list maintenance window - 2015-06-02 17:00 UTC to 19:00 UTC
I'll be taking the mailing list server offline for a (hopefully) short
window.
As of this announcement, the window is simply to process a single
request. There is a good chance another request or two will be appended to
this one before the actual window arrives.
The downtime will start on 2015-06-02 @ 17:00 UTC and end at 19:00 UTC.
Current Tasks:
T100707 - Rename pywikipedia-l to pywikibot
https://phabricator.wikimedia.org/T100707
Some may recall the last mailman maintenance window wasn't without
incident:
https://wikitech.wikimedia.org/wiki/Incident_documentation/20150519-Mailman
This has been taken into account, and testing of both the changes and
Mailman itself, before and after the maintenance, has been expanded.
During this time, all mailing list traffic, as well as online archives, may
become unavailable. Once the window has ended, all mail routing should
resume normally, and any messages sent during the window will then be
delivered accordingly.
All associated tasks and changes are linked from
https://phabricator.wikimedia.org/T100711.
This mail has (initially) been sent to the list administrators list and
wikitech-l. Please feel free to forward it to any other lists or parties
that need to know about this maintenance window.
--
Rob Halsell
Operations Engineer
Wikimedia Foundation, Inc.
E-Mail: rhalsell(a)wikimedia.org
Office: 415.839.6885 x6620
Fax: 415.882.0495
<quote name="Mukunda Modell" date="2015-05-28" time="13:42:50 -0500">
> This also means we need to be even more diligent about policing the error
> logs and eliminating noise which obscures real problems by burying them
> among the other log messages.
+1000
See also:
https://phabricator.wikimedia.org/maniphest/?statuses=open%28%29&projects=P…
(aka: open tasks filed in the #Wikimedia-log-errors project on
phabricator)
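As a rough illustration of that kind of triage (the log path and line
format below are hypothetical, not how WMF logging actually works), a
sketch that counts the most frequent messages so the loudest noise can be
filed or silenced first:

    from collections import Counter

    def noisiest_messages(log_path, top=20):
        """Count repeated error-log lines so the most frequent noise can be
        triaged first. Log path and line format are hypothetical."""
        counts = Counter()
        with open(log_path, encoding="utf-8", errors="replace") as f:
            for line in f:
                # Drop a leading "[timestamp]" prefix, if any, so identical
                # errors group together regardless of when they occurred.
                message = line.split("]", 1)[-1].strip()
                if message:
                    counts[message] += 1
        return counts.most_common(top)

    for message, n in noisiest_messages("/var/log/mediawiki/error.log"):
        print(f"{n:6d}  {message}")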
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
So, after looking at the backlog of GeSHi's pull requests, I decided that
my itch was irritating enough to warrant a click on that nasty little fork
button. Let me introduce you to Chechil Highlighter
<https://github.com/MaxSem/chechil>. So far, I've:
- Grabbed all pending GeSHi pull requests and reviewed them, merging
into Chechil whenever it made sense.
- Thrown out support for pre-antiquated PHP versions
- Ditched a couple of small features deprecated by GeSHi developers
themselves
- Started adding unit tests
My immediate plans are to cover the code with at least basic unit tests
before a major rewrite, which would include:
- Killing the system where every language definition specifies its own
styles, in favor of a unified CSS file. That would make things more
consistent, significantly reduce the size of our startup module (which
would no longer need to contain all those per-language modules), and
allow restyling the highlighted text easily and consistently.
- Converting PHP language definitions to JSON for improved security.
This is especially important since GeSHi allows you to use custom
definitions.
- Generally improving the code base, bringing it up to modern standards.
So far, this is just a personal pet project; I'm not proposing that we use
it for highlighting on WMF wikis or anything, but I feel it would be a
step forward if the rewrite succeeds. Please don't hesitate to throw your
boos or suggestions at me!
--
Best regards,
Max Semenik ([[User:MaxSem]])