Hi, I'd like to present a new RFC for your consideration:
https://www.mediawiki.org/wiki/Requests_for_comment/Minifier
It is about how we can shave 10-15% off the size of JavaScript
delivered to users.
Your comments are highly welcome! :)
--
Best regards,
Max Semenik ([[User:MaxSem]])
When api.php was basically the only API in MediaWiki, calling it "the API"
worked well. But now we've got a Parsoid API, Gabriel's work on a REST
content API, Gabriel's work on an internal storage API, and more on the
way. So just saying "the API" is getting confusing.
So let's bikeshed a reasonably short name for it that isn't something awful
like "the api.php API". I'm horrible at naming.
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
Thank you for the quick fix!
Best,
--
Sukyoung
On Jan 29, 2014, at 9:55 AM, Nathan wrote:
> FYI in case you aren't subscribed to the list.
>
> ---------- Forwarded message ----------
> From: Yair Rand <yyairrand(a)gmail.com>
> Date: Tue, Jan 28, 2014 at 7:25 PM
> Subject: Re: [Wikitech-l] Bug in the Wikipedia main web page
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
>
>
> Thank you for pointing out this bug. Your suggested change to
> MediaWiki:Gadget-wm-portal.js has been implemented by Meta-Wiki
> administrator User:PiRSquared17.
>
>
> On Tue, Jan 28, 2014 at 6:50 PM, Sukyoung Ryu <sukyoung.ryu(a)gmail.com> wrote:
>
> > Dear all,
> >
> > We are researchers at KAIST in Korea working on finding JavaScript bugs in
> > web pages. While analyzing top websites from Alexa.com, we found an issue,
> > which seems to be a bug, on the Wikipedia main web page (wikipedia.org).
> > We would be grateful if you could either confirm that it is a bug (and,
> > even better, fix it) or let us know what we're missing.
> >
> > Here's the issue. When a user selects a language in which search results
> > are displayed via the language selection button from the Wikipedia main web
> > page, the following JavaScript function is executed:
> >
> > 1  function setLang(lang) {
> > 2      var uiLang = navigator.language || navigator.userLanguage,
> >            date = new Date();
> > 3
> > 4      if (uiLang.match(/^\w+/) === lang) {
> > 5          date.setTime(date.getTime() - 1);
> > 6      } else {
> > 7          date.setFullYear(date.getFullYear() + 1);
> > 8      }
> > 9
> > 10     document.cookie = "searchLang=" + lang + ";expires=" +
> >            date.toUTCString() + ";domain=" + location.host + ";";
> > 11 }
> >
> > Depending on the result of the conditional expression on line 4,
> > "uiLang.match(/^\w+/) === lang", the function either removes or keeps
> > the selected language information in a cookie on the user's computer.
> > However, we found that the expression "uiLang.match(/^\w+/) === lang"
> > always evaluates to false (String.prototype.match returns an array,
> > never a string), which means the function always leaves cookies on
> > users' computers. We think that changing the conditional expression to
> > "uiLang.match(/^\w+/) == lang" will solve the problem.
> >
> > This problem may occur in the main web pages of all the Wikimedia sites.
> > Could you kindly let us know what you think? Thank you in advance.
> >
> > Best,
> > Changhee Park and Sukyoung Ryu
> >
> >
> > _______________________________________________
> > Wikitech-l mailing list
> > Wikitech-l(a)lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
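For reference, the comparison issue described above can be reproduced in
isolation. This is a standalone sketch, not the live gadget code:
`String.prototype.match` returns an array (or `null`), never a string, so
strict comparison with a string can never succeed.

```javascript
// Standalone illustration of the comparison in setLang (not the live
// gadget code). match() returns an Array such as ["en"], not "en".
var uiLang = "en-US"; // what navigator.language might report
var m = uiLang.match(/^\w+/);

console.log(m === "en");    // false: an Array is never strictly equal to a string
console.log(m == "en");     // true: loose equality coerces ["en"] to "en"
console.log(m[0] === "en"); // true: comparing the matched substring itself
```

Comparing `m[0] === lang` (after a null check) may be a clearer fix than
relying on `==` coercion.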
I've written a simple MediaWiki extension that uses an instance of the
W3C Validator service (via the Services_W3C_HTMLValidator
<http://pear.php.net/package/Services_W3C_HTMLValidator> PEAR package)
to validate SVG images hosted on a wiki. It is meant to replace the
current system on Commons, which relies on individual contributors
adding templates (e.g. InvalidSVG
<https://commons.wikimedia.org/wiki/Template:InvalidSVG>) by hand to
file description pages.
It exposes a simple API (and a Scribunto module as well) to get the
validation status of existing SVG files, can emit warnings when trying
to upload invalid ones, and is well integrated with MediaWiki's native
ObjectCache mechanism.
I'm in the process of publishing the code, but have some questions I
think the community could help me answer.
* Given that the W3C Validator can also parse HTML files, would it be
useful to validate wiki pages as well? Even though some validation
errors are caused by MediaWiki itself, others stem from malformed
templates.
* Does storing the validation status of old revisions of images
(and/or articles) make sense?
* Do you think the extension should use the extmetadata property of
ApiQueryImageInfo instead of its own module?
* Is it advisable to store validation data permanently in the database?
The new PSR-3 debug logging system brought namespaced external code
(Psr\Log\LoggerInterface) into use in MediaWiki core. The classes I
built out to work with this system are using faux namespaces by virtue
of class names like "MWLoggerFactory", "MWLoggerLegacyLogger" and
"MWLoggerMonologSyslogHandler". Before 1.25 starts rolling out as a
tarball release I'd like to change these classes to use actual PHP
namespaces rather than this clunky collision avoidance mechanism. [0]
There is also a task [1] to backport minimal PSR-3 support to the 1.23
LTS branch to simplify backports of code and extensions that adopt
direct use of PSR-3, and I'd like to only do that once if possible.
The color I have picked for this namespace bikeshed is
MediaWiki\Core\Logger. The MediaWiki root namespace is a pretty
obvious choice. "Core" is inserted to distinguish this fundamental
MediaWiki functionality from any existing or future extensions that
might use namespaces. I'm hoping "Logger" is sufficiently distinct
from other uses of the term "log" in MediaWiki which generally mean
"audit trail" rather than "debugging information". I'd be fine with
throwing Debug in between Core and Logger too if consensus found for
that instead.
I'd also like to start organizing these files in a directory structure
that would be compatible with the PSR-4 autoloader standard. PSR-0
required all namespace elements to be in the file path as directories,
but PSR-4 allows a common prefix for all classes to be dropped. I was
thinking an includes/Core/ directory could be used as the common base
for these and future namespaced classes in MediaWiki core.
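To make the proposed mapping concrete, a class like the current
MWLoggerFactory might end up looking something like this (illustrative
names only, not the actual patch):

```php
<?php
// includes/Core/Logger/LoggerFactory.php
//
// Under PSR-4, the common prefix MediaWiki\Core\ maps to includes/Core/,
// and the remainder of the namespace (Logger) mirrors the directory
// structure below it.
namespace MediaWiki\Core\Logger;

class LoggerFactory {
    // ... what is now MWLoggerFactory ...
}
```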
We had some discussion last summer [2] about namespace use in
extensions that seemed to end with "cool do it when you want" and "we
don't really need any standard conventions". Since I'm suggesting
namespace usage in core I figured this was worth another (hopefully
short) round of discussion with the larger community than is likely to
see my patches when they land in Gerrit.
[0]: https://phabricator.wikimedia.org/T93406
[1]: https://phabricator.wikimedia.org/T91653
[2]: http://www.gossamer-threads.com/lists/wiki/wikitech/476296
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Hi,
I'd like to be able to calculate the molar mass of chemical compounds
using a Lua module so that I could use the output in my infoboxes for
chemical compounds and drugs alike. The problem is, I haven't the
foggiest how to set up a module, even one that sounds so simple. I was
hoping that someone may be able to set things up for me, or at least
show me how to do so myself^1 if I gave them the basic idea of what I
was hoping this module would do.
Say we call the module Molar mass calculator (i.e., its Lua code lives
at /Module:Molar mass calculator/ on my local Wiki, and the template
that invokes it is /Template:Molar mass calculator/^2 ). I was thinking
of the Lua module using a pair of vectors: one (\vec{A}) containing the
user-defined variables^3 for all 84 chemical elements found in
appreciable quantities in nature, and the other (\vec{M}) containing the
average atomic mass of each of these elements. The module would then
compute the Lua equivalent of a dot product (i.e., \vec{A} \cdot \vec{M}
= \sum_{i=1}^{84} A_i M_i) between these two vectors and use the result
as its output, which would in turn be used by the template as its
output.
Footnotes
1. Keep in mind that I am a programming noob, especially when it
comes to Lua, so talk to me like a maths guy who just
understands a little MATLAB, NumPy, SciPy, Python and Wikitext
and no other programming languages, as this is fairly accurate.
2. /Template:Molar mass calculator/ presently has this Wikitext
(if a change is required, please alert me to it):
{{#invoke:Molar mass calculator}}<noinclude>{{Clr}}
{{documentation}}</noinclude>
3. These variables are those provided to /Template:Molar mass
calculator/ as arguments. For example, if I want to call the
template in a Wiki page it may look like this for Ethanol (C_2
H_6 O)
{{Molar mass calculator
|C = 2
|H = 6
|O = 1
}}
and should provide the output of 46.0694 g/mol.
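For what it's worth, here is a minimal sketch of what such a Scribunto
module might look like. It is untested against a live wiki, and the mass
table lists only a few illustrative elements; a real module would need
entries for all 84.

```lua
-- Module:Molar mass calculator (sketch). Average atomic masses in
-- g/mol; only a handful of elements are shown here for illustration.
local atomicMass = {
    H = 1.008, C = 12.011, N = 14.007, O = 15.999, S = 32.06,
}

local p = {}

-- Sums count * atomic mass over the template's arguments, i.e. the
-- dot product between the argument vector A and the mass vector M.
function p.calc( frame )
    local args = frame:getParent().args
    local total = 0
    for element, count in pairs( args ) do
        local mass = atomicMass[element]
        local n = tonumber( count )
        if mass and n then
            total = total + n * mass
        end
    end
    return string.format( '%.4f g/mol', total )
end

return p
```

With the atomic masses in the table above, {{Molar mass calculator|C=2
|H=6|O=1}} would yield 46.0690 g/mol; the exact figure depends on which
published atomic mass values the table uses.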
Thanks for your time,
Brenton
Hi there,
I created a Request for comments[1] and a corresponding Phabricator
Task[2] regarding a feature that enables users to watch categories for
page additions and removals.
If you are interested in this topic or somehow involved in
Extension:Echo or watchlist features, feel free to participate in the
discussion.
Cheers,
Kai
[1]: https://www.mediawiki.org/wiki/Requests_for_comment/Watch_Categorylinks
[2]: https://phabricator.wikimedia.org/T94414
--
Kai Nissen
Software-Entwickler
Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt
für Körperschaften I Berlin, Steuernummer 27/681/51985.
Hello,
Happy to inform you that we have successfully completed the global
deployment[1] of Extension:BounceHandler, and the Wikipedias are now
handling bounce emails effectively. The current threshold for the number
of allowed bounces is 5, and we were able to test the unsubscribe action
live on en.wikipedia.org with the help of sysops (I couldn't send more
than 2 emails, thanks to the anti-spam checks).
As of now, 'bounce_records' has 37338 entries, mainly from group0 and
group1 wikis (in ~20 days). We expect this number to grow rapidly now
that en-wiki, with its high bounce volume, is included.
This marks the completion (deployment) of my GSoC 2014 project[2] with
Jeff Green and Legoktm. Thanks to everyone who helped along the way; it
was real fun!
[1] https://phabricator.wikimedia.org/T92877
[2] mediawiki.org/wiki/VERP
Thanks,
Tony Thomas <http://tttwrites.wordpress.com/>
FOSS@Amrita <http://foss.amrita.ac.in>
*"where there is a wifi, there is a way"*