Hey!
I have started a Developer hub on Meta [1] - it's a page for developers
(not only PHP) who are looking for a project they can participate
in. I am myself a developer of Huggle (a utility used on Wikipedia for
reverting vandalism, written in C#), and it's really hard to find
more devs who could join us, so I guess this could help a bit,
especially for people who know languages other than PHP and
are looking for projects to work on. Feel free to improve that
page or add other projects, like AWB, etc. I don't even know
all the tools we are using on Wikimedia, so it would be good to have
them all in one place.
Thanks!
1 - https://meta.wikimedia.org/wiki/Wikimedia_developer_hub
Some folks may be interested in my blog post about high-density displays
and how using higher-density or vector images directly can greatly improve
rendering and legibility of diagrams and charts:
http://leuksman.com/log/2011/12/04/high-density-displays-mobile-and-beyond/
If anybody's interested in fiddling around with JavaScript to swap in
high-density PNG images and scalable SVG images (on either regular or
MobileFrontend view -- though it's more relevant to MobileFrontend view),
try this bookmarklet:
http://leuksman.com/misc/density-bookmarklet/
(source version: <https://github.com/brion/density-bookmarklet>)
It's fairly simplistic and won't work with everything. It attempts to
replace PNG thumbnails with double-sized ones (which may fail on some
images) and rasterized SVG thumbnails with the original SVGs, but it does
make some charts and graphs look much nicer on an iPhone 4 or iPod Touch
with a retina display!
Many Android phones have intermediate density displays (Android "hdpi"
approx 240dpi, with the new Galaxy Nexus sporting an "xhdpi" 320dpi screen
that's closer to the 326dpi Retina display); unfortunately the picture is
complicated by Android 2.x devices not supporting SVG in the browser!
-- brion
Hi all,
I spent some time this weekend spiffing up the code review stats page:
http://toolserver.org/~robla/crstats/
There may still be some bugs, especially since my "cross browser
testing" involved loading in Chrome and Firefox on a single Linux box,
but it should generally work a lot better than the old version. I've
changed graphing libraries from flot to jqplot; the latter lets you
zoom along both the x- and y-axes (y-axis zooming being a frequently
requested feature).
I've also untangled some of the spaghetti in the code, which means it
should be easier for someone who wants to contribute to do so. Here's
the source:
https://gitorious.org/mwcrstats
(docs are still sorely lacking though...but feel free to bug me if
you're interested).
The numbers for trunk don't look great at all. I'll probably dig more
into the numbers tomorrow, but we're really lagging behind our
projections there (in fact, we've lost ground this past week).
The numbers for trunk/phase3 look better, though I think we may still
be going slower than what would be needed for a release.
Rob
On Sat, Dec 3, 2011 at 5:37 PM, Jeremy Baron <jeremy(a)tuxmachine.com> wrote:
> My first guess is that it's related to the use of HTML Tidy. Seems to
> be enabled currently on the WMF cluster:
> see $wgUseTidy @
> http://noc.wikimedia.org/conf/highlight.php?file=CommonSettings.php
>
> -Jeremy
Thanks! That's it. Adding $wgUseTidy = true; in LocalSettings.php
addresses the first issue: </sub> is now converted to </sup>.
It doesn't quite match the other two behaviors (detailed below).
However, I think this is the right track. All of the behavior I've
seen so far involves changing HTML elements.
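For anyone following along, the relevant knobs live in LocalSettings.php. This is a minimal sketch, assuming the Tidy-related variable names from DefaultSettings.php of that era ($wgTidyBin, $wgTidyConf, $wgTidyInternal); the paths and values are placeholders to adjust for your own install:

```php
# LocalSettings.php -- minimal sketch for enabling Tidy post-processing
# of parser output. Paths below are placeholders; compare the defaults
# in includes/DefaultSettings.php for your MediaWiki version.
$wgUseTidy      = true;                       # run parser output through Tidy
$wgTidyBin      = 'tidy';                     # external binary, used when the
$wgTidyInternal = extension_loaded( 'tidy' ); # PHP tidy extension isn't loaded
$wgTidyConf     = "$IP/includes/tidy.conf";   # Tidy configuration file
```

With $wgTidyInternal true, MediaWiki should use the PHP tidy extension (php_tidy.dll on Windows) rather than spawning an external "tidy" process, which may explain which binary is actually being run.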
One other follow-up question. Is MediaWiki using the php_tidy.dll that
is part of my PHP 5.2 environment?
My environment is a Windows XP box with apache 2.2 and php 5.2. It's a
relatively clean install.
I've done some debugging and see that MediaWiki is calling
execExternalTidy in parser\Tidy.php.
It seems to spawn a process called "tidy". However, I can't find any
tidy exe/dll on my box. I'm fairly certain I have not downloaded it
manually.
I'm hoping it's php_tidy.dll and that I can swap out other versions of
php_tidy to get (2) and (3) (as well as the other dozen or so)
working.
If not, I'll muck around with the tidy source, and/or look at the
tidy.conf options later.
Thanks!
== differences ==
(2) <span /> is preserved (it used to be escaped). Wikipedia splits it
into <span></span>
(3) No change; empty <i> </i> tags are still preserved.
Hi all. My apologies if this is not the right group.
I'm trying to find the official MediaWiki version which Wikipedia uses.
According to http://en.wikipedia.org/wiki/Wikipedia:About, Wikipedia
is running MediaWiki version 1.18wmf1 (r104604).
However, when I download r104604, I get a version that is close to
Wikipedia, but not quite.
For example, when parsing the following text: a<sup>2</sub>b
MediaWiki produces "a<sup>2</sub>b</sup>"
Wikipedia produces "a<sup>2</sup>b</p>"
Wikipedia converts the </sub> tag to a </sup> tag. You can see this in
http://en.wikipedia.org/wiki/Algebraic_geometry with the following
excerpt: rectangular box ''a''<sup>2</sub>''b''
I've noticed other discrepancies as well (so far about a dozen). I
list two more below, and can list more if anyone is interested or
thinks it's relevant.
I'm new to MediaWiki development, so I might be doing something stupid.
However, I've gone through other versions (SVN commands listed below),
and I can't figure it out. I've spent well over an hour on it.
Can anyone point me in the right direction?
Thanks.
SVN commands
official release (r104604): svn checkout
http://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3@104604 wiki
bleeding edge release: svn checkout
http://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3 wiki
latest trunk release?: svn checkout
http://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3@105083 wiki
URLs for testing the behavior. I've been using the "Show preview" button
(it invokes the same code in Parser.php/Sanitizer.php). Modify the
localhost URL as per your environment.
http://en.wikipedia.org/w/index.php?title=Wikipedia:Sandbox&action=submit
http://localhost/wiki/index.php?title=Main_Page&action=submit
Other divergent behavior
== <span/> converted to <span></span> ==
Text : <span id="a" />
MediaWiki: <span id="a" />
Wikipedia: <span id="a"></span>
Ref : http://en.wikipedia.org/wiki/Altaic_languages
Excerpt:
<!-- NOTE:
Please leave the following id's which were the previous titles of this
section. Many pages link to these section titles.
--><span id="Controversy" /> <span id="The controversy over Altaic" />
== Empty tags pruned ==
Text : a '' '' b
MediaWiki: a <i> </i> b
Wikipedia: a b
Ref : http://en.wikipedia.org/wiki/Arkansas
Excerpt :
'''53.21%''' '' ''505,823
Today we held our "Are we gonna have a 1.18 point release right now" bug
triage.
We decided not to do a point release right now. Instead, we'll start
documenting the issues that people are running into when they upgrade to
or install version 1.18 and committing any necessary patches to the
REL1_18 branch. I've attempted to start an FAQ at
<https://www.mediawiki.org/wiki/MediaWiki_roadmap/1.18/FAQ>.
Please refer to that FAQ and update it as you see fit. If you know of
any other issues with 1.18, file a bug report on Bugzilla
(http://bugzilla.wikimedia.org/) and, if you like, let me know.
Thanks,
Mark.
Hey guys!
The WMF features team has been experimenting with holding our weekly
status meetings over IRC.
One of the ideas that came up was that we might want to try having an
"open" meeting - sort of like office hours - that anyone in the
community can join and possibly ask questions.
So, we're inviting you.
This coming Tuesday, December 6th, at 11 AM PST, we'll be gathering on
irc.freenode.net in channel #wikimedia-dev. For the first bit, we'll
probably want to hold questions and answers until after we've had a
chance to give our statuses, but after that, feel free to ask about
anything.
And if no one else shows up, I'll at least have a whole IRC channel to
myself.
-b.
--
Brandon Harris, Senior Designer, Wikimedia Foundation
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
Looking over our USERINFO files, we have 82 (out of our 330) users who
obfuscate their e-mail address.
It's rather trivial to de-obfuscate them, and we will need to for the git
migration.
Have we considered asking these users if they would like a different
e-mail address to be used for git when we migrate?
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
Hello,
I'm seeking help in order to make a definitive decision.
For some months, I've been looking for a Java API that helps me access
Wikipedia and get the content of articles. My project is to build a
taxonomy of concepts for a given domain.
Details:
1. I have a corpus of domain texts, from which I extract a first set of
terms (that represent the domain).
2. I search Wikipedia for the articles on these words in order to extract
their definitions. The definition of a word helps me to find the
hyperonym of that word. The calls to Wikipedia will surely be done in a
Java loop.
3. I search the definitions of the hyperonyms found in the previous step
to find their hyperonyms, and so on.
4. I draw a graph linking the words to their hyperonyms.
My problem is that, for step 2, I cannot make a definitive decision.
1. I wrote Java code to access Wikipedia online. It works, but the speed
of my connection determines whether the execution succeeds or fails with
a set of exceptions. Sometimes the execution gives me only 2 or 3
articles.
2. I tried to use JWPL to process Wikipedia dumps. I failed because I
don't have enough RAM.
3. I'm now hesitating between a set of Java APIs. Please give me your
points of view if you have already done something along these lines. I
made a serious investigation and found the following links:
1-
http://wdm.cs.waikato.ac.nz:8080/wiki/Wiki.jsp?page=Installing%20the%20Java…
2- http://jwikiapi.sourceforge.net/index.html
3- http://code.google.com/p/gwtwiki/
4- http://www.mediawiki.org/wiki/API%3aMain_page
5- http://jwbf.sourceforge.net/
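For what it's worth, option 4 (the plain MediaWiki web API) can be called from Java with nothing but the standard library. The sketch below is my own illustration under assumptions (the endpoint, User-Agent string, timeouts, and retry/backoff values are all arbitrary placeholders), not code from any of the listed projects: it builds an action=query request for a page's latest revision text and retries the fetch, which may help with the connection-dependent failures mentioned in point 1.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.UnsupportedEncodingException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class WikiFetch {
    /** Build an api.php URL asking for the latest revision wikitext of one page. */
    static String buildQueryUrl(String title) throws UnsupportedEncodingException {
        return "https://en.wikipedia.org/w/api.php"
                + "?action=query&prop=revisions&rvprop=content&format=json"
                + "&titles=" + URLEncoder.encode(title, "UTF-8");
    }

    /** Fetch a URL, retrying on IOException to smooth over a flaky connection.
     *  attempts must be >= 1. */
    static String fetchWithRetry(String url, int attempts)
            throws IOException, InterruptedException {
        IOException last = null;
        for (int i = 0; i < attempts; i++) {
            try {
                HttpURLConnection conn =
                        (HttpURLConnection) new URL(url).openConnection();
                conn.setRequestProperty("User-Agent",
                        "taxonomy-builder/0.1 (example placeholder)");
                conn.setConnectTimeout(10_000);
                conn.setReadTimeout(30_000);
                StringBuilder body = new StringBuilder();
                try (BufferedReader in = new BufferedReader(
                        new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        body.append(line).append('\n');
                    }
                }
                return body.toString();
            } catch (IOException e) {
                last = e;                       // remember the failure and back off
                Thread.sleep(1000L * (i + 1));  // simple linear backoff before retrying
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        String url = buildQueryUrl("Algebraic geometry");
        System.out.println(url);
        // System.out.println(fetchWithRetry(url, 3)); // uncomment to actually fetch
    }
}
```

Whatever library you settle on, throttling the loop and sending a descriptive User-Agent is good etiquette for bulk requests against Wikipedia.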
Give me your suggestions please.
Regards
Khalida Ben Sidi Ahmed