Ladies That FOSS is WMDE's first Open Source hack event aimed primarily at
women who want to join a Free and Open Source Software (FOSS) project but
don't know where to start. It is a unique opportunity for nonbinary and
women coders to test uncharted waters and code FOSS by meeting some
people who have already caught the FOSS flu and are deeply involved in projects.
So this is how it works: it will be a one-day hack event where women
get the chance to dive into the FOSS movement and try out projects.
Each project will give women the chance to hack with it on something that
can be accomplished in a day. While hacking, the newcomers on the team
can get to know the FOSS movement and see what impact they can have.
This is an event which will offer newcomers the opportunity to engage
closely with the projects. Only four to six participants are matched with a
project. A unique chance to get to know some of the coolest projects and
coders out there and become part of the FOSS community. There will be
plenty of time for learning, trying, chatting, eating and *most of all,
coding FOSS*.
We already welcome Wikidata, Mozilla's Firefox browser, Coala and most
likely Apache - BUT the collaboration is still missing MediaWiki! Please
contact us if you would like to become a MediaWiki mentor and join this event.
Please get in touch with julia.schuetze(a)wikimedia.de and me if you are
interested and possibly able to help.
Thanks!
Rachel
Hi,
Per [1], I have given Glaisher +2 rights in all mediawiki repositories.
Congrats, and thanks for all of your contributions so far!
[1] https://phabricator.wikimedia.org/T141730
-- Legoktm
Hi all,
first of all: If you don't use ConfirmEdit with the ReCaptcha (NOT the
ReCaptchaNoCaptcha) module, you can STOP reading here :)
TL;DR
Please upgrade to the ReCaptchaNoCaptcha module to stay on a supported
version in the future, and please respond to [1] if you have an opinion
about the deprecation/removal plan(s).
Long version:
As you may know, ConfirmEdit[2] (the MediaWiki extension that helps you
fight spam) supports different CAPTCHA modules, such as MathCaptcha
(where the user has to solve a math problem) or QuestyCaptcha
(where the user has to answer a pre-defined question). Another module is
Google's ReCaptcha[3][4], in both versions: the old v1 and the newer v2
(also called NoCaptcha). In December 2014, Google announced a new version of
the ReCaptcha CAPTCHA service, called ReCaptcha NoCaptcha (or ReCaptcha v2).
ConfirmEdit currently supports both versions, as Google did until
recently. Now, as you can read on the ReCaptcha FAQ page[5], Google has
stopped supporting ReCaptcha v1. This means that no new features will be
developed and that newly registered keys will work with version 2 only.
This leads me to the conclusion that ReCaptcha v1 will not be supported by
Google at all in the near future. Given that, we should ask ourselves how
long we want to support the old ReCaptcha module in ConfirmEdit (which,
compared to the NoCaptcha module, _seems_ to be less effective). The
mid-term plan is clear: remove the old ReCaptcha module. There are two
ways to achieve this:
* Deprecate the old ReCaptcha module in version 1.27 of MediaWiki (a
so-called backport) and remove it in the upcoming 1.28 release
* Deprecate the old ReCaptcha module in the upcoming version 1.28 of
MediaWiki and remove it:
  * in MediaWiki 1.29, or
  * when Google no longer supports the old ReCaptcha
Because this is a huge problem for existing third-party wikis (and because
we don't have any usage statistics), I'm not sure which plan we should
choose. That's why I sent this e-mail out: to (hopefully) get some
responses and opinions.
So, the call to action: if you still use the old ReCaptcha module, please
plan to upgrade to the new version 2 of ReCaptcha as soon as possible, and
let us know (ideally with a comment on the task[1]) whether you need the old
ReCaptcha version supported for as long as Google supports it, or whether
you're fine with removing it in the upcoming 1.28 release of MediaWiki.
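If it helps, the switch is mostly a LocalSettings.php change. A sketch for
MediaWiki 1.27+ (the variable names below are taken from my reading of the
ConfirmEdit documentation - double-check them on Extension:ConfirmEdit[2] -
and the site/secret keys must be a newly registered v2 key pair):

```php
// Old ReCaptcha v1 setup, to be removed:
// wfLoadExtensions( [ 'ConfirmEdit', 'ConfirmEdit/ReCaptcha' ] );
// $wgReCaptchaPublicKey = '...';
// $wgReCaptchaPrivateKey = '...';

// New ReCaptchaNoCaptcha (v2) setup:
wfLoadExtensions( [ 'ConfirmEdit', 'ConfirmEdit/ReCaptchaNoCaptcha' ] );
$wgCaptchaClass = 'ReCaptchaNoCaptcha';
// Register a v2 key pair at https://www.google.com/recaptcha/admin
$wgReCaptchaSiteKey = 'your-site-key';     // placeholder
$wgReCaptchaSecretKey = 'your-secret-key'; // placeholder
```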
To be absolutely clear: The ReCaptcha module will be removed, the only
question is when: in MediaWiki 1.28 or in MediaWiki 1.29.
If you have any questions, feel free to reply to this e-mail, add a comment
to the linked task, contact me on IRC, or write me an e-mail (if you need a
private conversation).
Thanks for your attention and for using ConfirmEdit!
Best,
Florian
[1] https://phabricator.wikimedia.org/T142133
[2] https://www.mediawiki.org/wiki/Extension:ConfirmEdit
[3] https://en.wikipedia.org/wiki/ReCAPTCHA
[4] https://www.google.com/recaptcha/intro/index.html
[5] https://developers.google.com/recaptcha/docs/faq
Hi all,
I am running into an issue with the dump file: wikidatawiki.xml.bz2 from
https://dumps.wikimedia.org/wikidatawiki/latest/wikidatawiki-latest-pages-a…
The dump file fails the integrity test:
"bzip2 -t wikidatawiki-latest-pages-articles.xml.bz2"
I am not sure of the best way to fix this. Re-downloading seems straightforward, but there is no guarantee that the re-downloaded file will pass the test.
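In case it helps: the dump directories also publish a file of MD5 checksums, so after re-downloading you can verify the file against the published checksum instead of hoping for the best. A rough sketch (the checksum file name below is an assumption; check the actual directory listing on dumps.wikimedia.org):

```shell
# Illustrative file names; confirm them in the dump directory listing.
BASE=https://dumps.wikimedia.org/wikidatawiki/latest
DUMP=wikidatawiki-latest-pages-articles.xml.bz2
SUMS=wikidatawiki-latest-md5sums.txt

# -c resumes a partial download instead of starting from scratch.
wget -c "$BASE/$DUMP"
wget -O "$SUMS" "$BASE/$SUMS"

# Compare the local file against the published checksum.
grep "$DUMP" "$SUMS" | md5sum -c -

# Then re-run the structural check of the bzip2 stream.
bzip2 -t "$DUMP" && echo "bzip2 stream OK"
```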
Thanks a lot !
TL;DR: migrating two extensions to wfLoadExtension() caused problems in
production, and Logstash wasn't displaying them.
== Timeline ==
Previous days
* In a massive effort by many people, lots of extensions were converted to
extension.json, including
** Timeline in https://gerrit.wikimedia.org/r/#/c/303248/
** ContactPage in https://gerrit.wikimedia.org/r/#/c/298084/
* These changes were not compatible with our current production
configuration and thus had to be accompanied by mediawiki-config changes,
and probably should have been deployed separately to minimize the chance of
a screwup.
* Furthermore, even cursory testing of the above Timeline change would
have shown that it was broken.
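For context on why the conversions broke: with the old require_once style, an
extension's setup file runs immediately and defines its configuration
globals, while wfLoadExtension() merely queues extension.json for processing
after all settings files have run. A sketch of the failure mode (the setting
names are illustrative, not the actual production config):

```php
// Old style: Timeline.php executes right here and creates its config object.
require_once "$IP/extensions/timeline/Timeline.php";
$wgTimelineSettings->fontFile = 'FreeSans'; // object already exists, fine

// New style: registration is deferred, so at this point in
// CommonSettings.php the config object/global does not exist yet...
wfLoadExtension( 'timeline' );
// ...and touching it triggers exactly the notices seen below:
// "Creating default object from empty value" /
// "Undefined variable: wgContactConfig"
$wgTimelineSettings->fontFile = 'FreeSans';
```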
August 9
* After 12:00 SF time, Mukunda deploys the train to group0 wikis
* At 16:00 Max prepares for SWAT but sees errors in fatalmonitor and
investigates:
** Creating default object from empty value in
/srv/mediawiki/wmf-config/CommonSettings.php
on line 686
** Undefined variable: wgContactConfig in
/srv/mediawiki/wmf-config/CommonSettings.php
on line 968
* Max sees no such errors in Logstash.
* After identifying the cause, Max starts reverting the affected
extensions; however, there were a lot of intermediate commits and Reedy was
committing fixes, so Max proceeds with deploying the fixes instead.
* The fixes produced more problems. Max contemplates reverting group0 back
to wmf.13 but decides not to, because he had never done that before and
fixes kept coming. In hindsight, this was a mistake.
* Config fixes to accommodate wmf.14 started causing notices in wmf.13,
so Max resets wmf.13's Timeline to wmf.14.
* Errors indicating more breakages in Timeline prompt another batch of
fixes.
* At 17:42, everything is back to normal.
== Casualties ==
* Max's liver.
* Evening SWAT didn't happen.
* For about 10 minutes, new timeline generation on production wikis was
broken.
== Conclusions ==
* Our code review practices are lax, including merging hairy patches
without testing, and self-merges.
* Timeline has 0 (zero) tests, while even a single parser test would have
caught these problems during code review.
* Logstash fatalmonitor dashboard isn't displaying HHVM warnings/errors
right now.
* And Logstash is used by scap to verify error levels, rendering this check
useless.
* Logstash/Kibana is probably too complex a beast to be trusted to be the
definitive source of MediaWiki health information, fatalmonitor is still
more reliable. Invest time in improving it and merging with
exceptionmonitor?
* During an ongoing outage with logs full of noise, testing stuff on canary
servers is hard, as non-fatal errors are easy to miss on fluorine. Deployers
need access to HHVM logs on all appservers.
* Beta cluster isn't serving its purpose of being the first line of defense
against bugs (other than "oh, whole thing is down"). Errors in beta should
be watched as closely as in prod and should be treated with the same level
of seriousness, because otherwise the former will eventually turn into the
latter.
--
Best regards,
Max Semenik ([[User:MaxSem]])
Hello!
I am currently working on implementing HTML e-mail support. I need advice
on whether I should use Mustache for implementing the templates for it.
If not, I would love suggestions on what else I could use.
Thank you!
- Galorefitz
Hello all,
I am happy to announce that the Chandigarh hackathon at WCI 2016 closed
with 7 applications that are useful to the whole community.
*1) WikiSpeak in a native language* (Hindi): (*web application* + *Android
mobile application*)
*2) Edit Tamil Wiktionary* (Tamil community request): (*Android
application*)
*3) Audio file upload to Wikidata* (Punjabi community request): (*Android
application*)
*4) Wikipedia articles on Google Maps*: find Wikipedia articles on a
Google map (by latitude and longitude) for a given city.
*5) OCR (native application)*: converts scanned book copies to
Indian-language text via Google Docs (tested for Hindi and Malayalam).
*6) Communication platform [WebRTC] (web application)*: an audio/video
web-conferencing application the community can use to talk or hold
conferences.
*7) Notifications (event-based)*: shows a popup on events, e.g. when a
recent change happens, a popup shows which article was updated.
Thank you to all the participants who took part in the hackathon and made
it a big success.
*Note : I will share code, links and other details very shortly.*
Thank you
*Santosh Shingare.*
*User:cherishsantosh*
Hey Everyone,
A big shout out to the reading web team (and Adam B), who rolled out a
number of subtle, but important changes to the mobile web sites of all our
projects earlier this week.
Through a series of UX improvements, the team made it easier for people to
switch between languages within an article on the mobile web. To all you
monolinguals out there: this is a very common and vital ability for people
who speak more than one language. In addition to being a big improvement
for our multilingual users, the development of this feature followed many
of the best practices we aspire to at the Foundation:
- language switching on mobile was initially identified as a problem to
be solved through quantitative research
- tied to the reading team's early strategic objective of better serving
global readers
- development stages included:
- community consultation
<https://www.mediawiki.org/wiki/Reading/Web/Projects/Improve_in-article_lang…>
- iteration based on results
- a/b testing and analysis in beta
- qualitative research with live users
<https://docs.google.com/a/wikimedia.org/presentation/d/1gbqV0fDHUDJjDZgYNDY…>
- iteration based on results
- collaboration with the language team (they are simultaneously rolling
compact language links
<https://www.mediawiki.org/wiki/Universal_Language_Selector/Design/Interlang…>
out of beta on desktop)
- iteration based on collaboration
Am I missing anything? More on the project here
<https://www.mediawiki.org/wiki/Reading/Web/Projects/Improve_in-article_lang…>,
and a screenshot of the new language-switching button is below.
Best,
J