Hi, I'd like to present a new RFC for your consideration:
https://www.mediawiki.org/wiki/Requests_for_comment/Minifier
It is about how we can shave 10-15% off the size of JavaScript
delivered to users.
Your comments are highly welcome! :)
--
Best regards,
Max Semenik ([[User:MaxSem]])
When api.php was basically the only API in MediaWiki, calling it "the API"
worked well. But now we've got a Parsoid API, Gabriel's work on a REST
content API, Gabriel's work on an internal storage API, and more on the
way. So just saying "the API" is getting confusing.
So let's bikeshed a reasonably short name for it that isn't something awful
like "the api.php API". I'm horrible at naming.
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
Thank you for the quick fix!
Best,
--
Sukyoung
On Jan 29, 2014, at 9:55 AM, Nathan wrote:
> FYI in case you aren't subscribed to the list.
>
> ---------- Forwarded message ----------
> From: Yair Rand <yyairrand(a)gmail.com>
> Date: Tue, Jan 28, 2014 at 7:25 PM
> Subject: Re: [Wikitech-l] Bug in the Wikipedia main web page
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
>
>
> Thank you for pointing out this bug. Your suggested change to
> MediaWiki:Gadget-wm-portal.js has been implemented by Meta-Wiki
> administrator User:PiRSquared17.
>
>
> On Tue, Jan 28, 2014 at 6:50 PM, Sukyoung Ryu <sukyoung.ryu(a)gmail.com> wrote:
>
> > Dear all,
> >
> > We are researchers at KAIST in Korea working on finding JavaScript bugs in
> > web pages. While analyzing top websites from Alexa.com, we found an issue,
> > which seems to be a bug, on the Wikipedia main web page (wikipedia.org).
> > We would be grateful if you could either confirm that it is a bug (and,
> > even better, fix it) or let us know what we're missing.
> >
> > Here's the issue. When a user selects a language in which search results
> > are displayed via the language selection button from the Wikipedia main web
> > page, the following JavaScript function is executed:
> >
> > 1  function setLang(lang) {
> > 2      var uiLang = navigator.language || navigator.userLanguage, date = new Date();
> > 3
> > 4      if (uiLang.match(/^\w+/) === lang) {
> > 5          date.setTime(date.getTime() - 1);
> > 6      } else {
> > 7          date.setFullYear(date.getFullYear() + 1);
> > 8      }
> > 9
> > 10     document.cookie = "searchLang=" + lang + ";expires=" + date.toUTCString() + ";domain=" + location.host + ";";
> > 11 }
> >
> > Depending on the evaluation result of the conditional expression on line
> > 4, "uiLang.match(/^\w+/) === lang", the function either leaves or does not
> > leave the selected language information on the user's computer through a
> > cookie. But we found that the expression "uiLang.match(/^\w+/) === lang"
> > always evaluates to false, which means the function always leaves cookies
> > on users' computers. We think that changing the conditional expression
> > "uiLang.match(/^\w+/) === lang" to "uiLang.match(/^\w+/) == lang" will
> > solve the problem.
> >
> > This problem may occur in the main web pages of all the Wikimedia sites.
> > Could you kindly let us know what you think? Thank you in advance.
> >
> > Best,
> > Changhee Park and Sukyoung Ryu
> >
> >
> > _______________________________________________
> > Wikitech-l mailing list
> > Wikitech-l(a)lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
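For anyone curious about the root cause: a quick console illustration of why the strict comparison described above always fails (the values here are illustrative, not taken from any particular browser):

```javascript
// String.prototype.match() returns an Array (or null), never a string,
// so a strict comparison against a string can never be true.
var lang = "en";
var result = "en-US".match(/^\w+/);  // ["en"] -- an Array

console.log(result === lang);     // false: an Array is never === a string
console.log(result == lang);      // true: == coerces the array to "en"
console.log(result[0] === lang);  // true: comparing the matched substring
```

Comparing `result[0]` (or using `String(result)`) makes the intent explicit, whereas the suggested `==` fix relies on implicit array-to-string coercion.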
I've written a simple MediaWiki extension that uses an instance of the
W3C Validator service (via the Services_W3C_HTMLValidator
<http://pear.php.net/package/Services_W3C_HTMLValidator> PEAR package)
to validate SVG images hosted on a wiki. It is meant to replace the
current system on Commons, which relies on individual contributors adding
templates (e.g. InvalidSVG
<https://commons.wikimedia.org/wiki/Template:InvalidSVG>) by hand to
file description pages.
It exposes a simple API (and a Scribunto module as well) to get the
validation status of existing SVG files, can emit warnings when trying
to upload invalid ones, and is well integrated with MediaWiki's native
ObjectCache mechanism.
I'm in the process of publishing the code, but have some questions I
think the community could help me answer.
* Given that the W3C Validator can also parse HTML files, would it be
useful to validate wiki pages as well? While some validation errors
appear to be caused by MediaWiki itself, others can stem from
malformed templates.
* Does storing the validation status of old revisions of images
(and/or articles) make sense?
* Do you think the extension should use the extmetadata property of
ApiQueryImageInfo instead of its own module?
* Is it advisable to store validation data permanently in the database?
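To make the API question concrete, here is a rough sketch of how a client might query the validation status. The "svgvalidation" prop name and the query shape are placeholders I've invented for illustration; the published extension may expose a different interface:

```javascript
// Hypothetical sketch: building an api.php query for a file's
// validation status. "svgvalidation" is a placeholder module name.
function buildValidationQuery(apiBase, fileTitle) {
    var params = new URLSearchParams({
        action: "query",
        prop: "svgvalidation",  // placeholder prop name
        titles: fileTitle,
        format: "json"
    });
    return apiBase + "?" + params.toString();
}

var url = buildValidationQuery(
    "https://commons.wikimedia.org/w/api.php",
    "File:Example.svg"
);
console.log(url);
```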
Hi all,
My apologies if this is the wrong place to start a discussion on this, but
it's a better place than nowhere. I recently took part in two very
different Wikipedia workshops -- one in Uganda for schoolchildren aged
14-17, and one in Bodø, Norway, for GLAM people aged 35-55. One glaringly
obvious barrier to entry that was common to both groups is that the
CAPTCHA we use is too freaking hard.
The main concern is obviously that it is really hard to read, but there are
also some other issues, namely that all the fields in the user registration
form (except for the username) are wiped if you enter the CAPTCHA
incorrectly. So when you make a mistake, not only do you have to re-type a
whole new CAPTCHA (where you may make another mistake), you also have to
re-type the password twice *and* your e-mail address. This takes a long
time, especially if you're not a fast typist (which was the case for the
first group), or if you are on a tablet or phone (which was the case for
some in the second group).
So I would like to start a discussion about changing to a CAPTCHA that is
more user-friendly, and hopefully one that isn't as
English/Latin-alphabet-centric as the one we currently use. If Ugandan
children and older Norwegian people, who all use the Latin alphabet, have
such problems deciphering the CAPTCHA, what about people speaking languages
that don't use the Latin alphabet? I would prefer something simpler,
like some sort of math- or image-based CAPTCHA, instead of the one we
currently use.
--
mvh
Jon Harald Søby <http://meta.wikimedia.org/wiki/User:Jon_Harald_S%C3%B8by>
Please join us for the following tech talk:
*Tech Talk:* What's New with MediaWiki-Vagrant? Simple Use Cases and
Beyond
*Presenter:* Bryan Davis & Dan Duvall
*Date:* November 25th
*Time:* 1830 UTC
<http://www.timeanddate.com/worldclock/fixedtime.html?msg=What%27s+New+with+…>
Link to live YouTube stream <http://www.youtube.com/watch?v=I66xR-fq2O8>
*IRC channel for questions/discussion:* #wikimedia-office
Google+ page <http://www.youtube.com/watch?v=I66xR-fq2O8>, another place
for questions
*Talk description:* We'll start off by giving a brief refresher on how
MW-Vagrant works and how it differs from stock Vagrant. Next, we'll
showcase some of the newest and most useful features of MWV such as
multi-wiki support, SSH/HTTP sharing, Labs integration, advanced
customization using Hiera and local roles. Finally, we'd like to show how
MWV can be useful in test-driven development by demonstrating how to run
unit and browser tests. The last 15 minutes will be reserved for Q/A.
== Outline (WIP) ==
* (10 minutes) What is Vagrant, MediaWiki-Vagrant, Puppet (dan + bryan)
* (15 minutes) Local customizations (bryan)
* (15 minutes) Running unit tests and browser tests under MW-V (dan)
* (5 minutes) Vagrant sharing (it's awesomesauce!!!) (bryan)
* (15 minutes) Q & A
Please join the Analytics Engineering team for...
Office Hours: EventLogging & Dashboarding
Hosts: Dan and Nuria
Date: January 14
Time: 20:00 UTC - Convert to Local Time
<http://www.timeanddate.com/worldclock/fixedtime.html?msg=EventLogging+and+D…>
Hangout: https://plus.google.com/hangouts/_/wikimedia.org/a-batcave
IRC: #wikimedia-analytics
Description:
Teams need metrics on how their product or feature is performing, and then
they need to visualize those metrics. This is accomplished by instrumenting
code with EventLogging, mashing the data with some queries, and setting up
a Limn dashboard. The Analytics Engineering team is open for office hours
to answer questions about the process, help solve any issues, and listen to
feedback on the process. Feel free to drop into the Google Hangout linked
above or ask questions on the IRC channel during our office hours.
All,
TL;DR:
* References made using Cite will be configurable with a different system
* New approach being prototyped in Parsoid's native implementation of the
Cite extension
The Cite extension[0], which provides in-page footnotes, is a crucial part
of how many of us use wikis, especially for Wikipedias. However, it was
written a very long time ago, even by MediaWiki standards, and in its
creation, we made some design choices which made sense at the time, but
don't work very well for us anymore. As CSS has gotten more features, and
those features have become more reliably implemented in modern browsers, we
have more choices now.
One of the great things about the Cite extension is its flexibility, such
as each wiki being able to choose how the footnotes display to readers.
Unfortunately, the means by which this flexibility is provided is currently
done through a series of 14 different MediaWiki namespace wikitext messages
which are interpreted to create the HTML to display to the user.
This has a number of disadvantages. Most wikis use the default values (and
even very experienced users are probably unaware of how it works or even
that it can be changed). Because they are implemented in wikitext,
per-user changes such as with a gadget or a user script aren't possible to
do neatly (and instead need JavaScript to re-write the content, which is
slower and more complex), and changes take up to 30 days to be visible to
anonymous users whilst they wait for the cache to change. Due to how
MediaWiki's messages system works, changes to the display styles need to be
copied into each of the ~300 display languages for users, else those users
with different languages will see different reference styles on the same
page. Any system that wishes to display MediaWiki references has to
implement most of MediaWiki just to format these references correctly.
To fix this, the Parsoid and editing teams are proposing to replace the
current way to configure the Cite extension with some CSS rules, which will
solve these issues. Experimenting will be much easier to do on a per-user
or gadget-basis in advance of making larger changes. Styles will be shared
by all users of the wiki, regardless of their preferred language, and
complicated user scripts or gadgets that slow down the system will not be
needed to change the style. Changes will be instant for all users, even
those logged out, and non-MediaWiki users can use the styles to correctly and
consistently display references. A currently-under-review patch[1] for
Parsoid demonstrates use of CSS to do variations found on enwiki, eswiki,
and fawiki. We think that this approach will allow us to match each of the
current styles used on Wikimedia wikis. If there is any variation
(currently in use) that we should experiment with, please let us know.
We plan to continue to prototype this approach in Parsoid and work through
any rendering issues. Given that most Wikimedia wikis don't customize the
extension, we expect this will work with the basic configuration (and for
the few Wikipedias which need different configuration, we will add
styling). Once we port this solution over to the
master branch of the Cite extension, this will be a breaking configuration
change in the MediaWiki 1.25 release. If you have a gadget or user script
which changes how references appear or relies on something local to your
wikis, you may need to tweak it.
Given that older browsers (IE6 and IE7 most notably) may not implement the
CSS features that we need for this, we don't plan to rely solely on a
CSS-based approach. Our current approach is to use a default bare HTML
style on browsers that don't support content generation (like ::after and
counters).
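As a rough illustration of the approach (the class names below are assumptions for the sketch, not necessarily the markup Parsoid will emit), footnote markers can be generated entirely with CSS counters:

```css
/* Illustrative only: selectors and class names are assumptions,
   not the actual Parsoid output. */
body { counter-reset: mw-ref; }

.mw-ref { counter-increment: mw-ref; }
.mw-ref::after { content: "[" counter(mw-ref) "]"; }

/* A wiki (or a single user, via personal CSS) could switch the
   numbering style without touching any wikitext messages: */
.mw-ref.lower-alpha::after { content: "[" counter(mw-ref, lower-alpha) "]"; }
```

Because the marker text is generated at render time, a rule change like this takes effect immediately for every reader in every language, which is exactly the caching and localization problem the current wikitext-message system suffers from.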
We invite your comments, feedback, and help with this.
Thanks,
Marc, Parsoid and Editing teams.
[0] – https://www.mediawiki.org/wiki/Extension:Cite
[1] – https://gerrit.wikimedia.org/r/#/c/170936/