Hi Benedikt and Marcus,
I had some time and was reading about current issues around MediaWiki. I saw
an issue about tab names - Bug 29310 Namespace tab doesn't handle fallback
the same way as core (breaks nstab) - and created a Selenium regression test
to verify when it is fixed. It is called
javascript-unit-testing-checking-tabs-and-content. It verifies that the tab
names are correct. I also created a test suite - test-suite-check-tabs. The
suite includes a test to check the tabs on the main MediaWiki site (a test
that passes) and the test
javascript-unit-testing-checking-tabs-and-content. Would you provide
feedback on how useful the tests are and how to improve them?
Michelle Knight
(503) 345-4350
mknight1130(a)gmail.com
Hi everyone,
Thanks to the heroic efforts of many folks here, we're down to a
manageable number of revisions. See the newly repaired chart here:
http://toolserver.org/~robla/crstats/crstats.118all.html
Now that we've got a small enough list, I'm reviving the revision
report for 1.18, which is a list of all of the revisions remaining for
review, broken down by preferred reviewer. That's available here:
http://www.mediawiki.org/wiki/MediaWiki_roadmap/1.18/Revision_report
This also has "fixme"s, broken down by the committer.
Revisions with the following tags will be excluded from this report:
* nodeploy - won't ever affect deployment (e.g. installer)
* 1.18revert - reverted from the 1.18 branch
* 1.18ok - still needs a full review, but we'll let it slide for this
deployment
These tags roughly map onto what we used for 1.17, though not exactly.
During that deploy cycle, it was far, far more important to choose
our battles to have any hope of getting done soon, whereas now, we
know we want to get through everything anyway. So, you should feel a
tinge of guilt using these tags. :)
These are manually generated (currently by me, but Hexmode took over
last time, and I wouldn't be surprised if he does it again). Let me
know if there's anything obviously wrong with the list. Otherwise,
we're all looking forward to seeing this list shrink to zero!
Rob
Hi folks
I've been working on bringing back AjaxCategories for the last two days (r92112 and followups); try it out by setting $wgUseAJAXCategories = true;
Obviously it can't edit or remove transcluded categories, so it would be good to not show the edit buttons on transcluded categories.
Question is, how could I get those? Anyone got an idea?
( I could do something like

    $templatesCache = $holders->parent->mTplExpandCache;
    foreach ( $templatesCache as $template ) {
        // strpos() can return 0 for a match at the start of the string,
        // so compare against false explicitly
        if ( strpos( $template, $line ) !== false ) {
            break;
        }
    }

inside Parser::replaceInternalLinks2(), around line 1980, but that's pretty hacky & not really stable either )
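One approach that avoids reaching into Parser internals at all: compare the page's rendered category list against the [[Category:...]] links that appear in the page's own wikitext, and treat anything not declared directly as transcluded. A rough client-side sketch (function names are mine, and it doesn't handle localized namespace names or links hidden in comments/nowiki):

```python
import re

def directly_declared_categories(wikitext):
    # Match [[Category:Name]] or [[Category:Name|sortkey]] in the page's own source.
    pattern = re.compile(r'\[\[\s*Category\s*:\s*([^\]|]+)', re.IGNORECASE)
    return {m.group(1).strip() for m in pattern.finditer(wikitext)}

def split_transcluded(all_categories, wikitext):
    """Partition a page's rendered categories into directly editable ones
    and those presumably added by templates."""
    direct = directly_declared_categories(wikitext)
    editable = [c for c in all_categories if c in direct]
    transcluded = [c for c in all_categories if c not in direct]
    return editable, transcluded
</imports>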
Leo
Hi folks,
Any special:export experts out there?
I'm trying to download the complete revision history for just a few
pages. The options, as I see it, are using the API or special:export.
The API returns XML that is formatted differently than special:export
and I already have a set of parsers that work with special:export data
so I'm inclined to go with that.
I am running into the problem that when I try to use POST, so that
I can iteratively grab revisions in increments of 1000, I am
denied (I get a WMF "servers down" error). If I use GET, it works, but
then I can't use the parameters that allow me to iterate through all
the revisions.
Code pasted below. Any suggestions as to why the server won't accept POST?
Better yet, does anyone already have a working script/tool handy that
grabs all the revisions of a page? :)
Thanks, all! (Excuse the cross posting, I usually hang out on
research, but thought perhaps folks on the developers list would have
insight.)
Andrea
class Wikipedia {
	public function __construct() { }

	public function searchResults( $pageTitle = null, $initialRevision = null ) {
		// Special:Export expects the form fields in the POST body;
		// enabling CURLOPT_POST while leaving everything in the query
		// string sends an empty body, which the servers reject.
		$url = "http://en.wikipedia.org/w/index.php?title=Special:Export&action=submit";
		$fields = array(
			'pages'  => $pageTitle,
			'offset' => 1,
			'limit'  => 1000,
		);
		$curl = curl_init();
		curl_setopt( $curl, CURLOPT_URL, $url );
		curl_setopt( $curl, CURLOPT_RETURNTRANSFER, 1 );
		curl_setopt( $curl, CURLOPT_POST, true );
		curl_setopt( $curl, CURLOPT_POSTFIELDS, http_build_query( $fields ) );
		curl_setopt( $curl, CURLOPT_USERAGENT,
			"Page Revisions Retrieval Script - Andrea Forte - aforte(a)drexel.edu" );
		$result = curl_exec( $curl );
		curl_close( $curl );
		return $result;
	}
}
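For anyone who just wants all revisions of a page, the API route with continuation is an alternative to Special:Export (its XML/JSON shape differs, as noted above). A sketch using the action=query / prop=revisions module; the helper names are mine, the continuation handling follows recent API behaviour, and this is untested against the live site:

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def revision_params(title, extra=None):
    """Build the query parameters for one batch of revisions."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids|timestamp|user|comment|content",
        "rvlimit": "500",
        "format": "json",
    }
    if extra:
        params.update(extra)
    return params

def all_revisions(title):
    """Follow the API's continuation until every revision is fetched."""
    cont = {"continue": ""}
    while cont is not None:
        url = API + "?" + urllib.parse.urlencode(revision_params(title, cont))
        req = urllib.request.Request(url, headers={"User-Agent": "revision-grabber"})
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        for page in data["query"]["pages"].values():
            for rev in page.get("revisions", []):
                yield rev
        cont = data.get("continue")  # None once the history is exhausted
```

Each response carries a "continue" object that is merged into the next request until the history is exhausted.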
I've made a minor API change to ApiUpload, for stashed files. It's not a
commonly used API and the change shouldn't break anything external, but I
figured I should post just in case.
Temporarily stashed files now have their metadata stored in the database
instead of the session. This allows for some future feature expansion with
regard to the image review and categorization process, and works around a
memcached race condition that was preventing simultaneous uploads from
working (see bug 26179: https://bugzilla.wikimedia.org/show_bug.cgi?id=26179).
The actual change to the API is pretty simple. First, the 'sessionkey'
parameter has been superseded by 'filekey', though 'sessionkey' remains for
compatibility. You can see that here:
http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/includes/api/ApiUplo…
Second, the 'invalid-session-key' error has been replaced with
'invalid-file-key'.
Here is the entire change in code review:
http://www.mediawiki.org/wiki/Special:Code/MediaWiki/92009
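In practice the change just renames one request parameter. A small illustrative sketch of building the second-stage upload request both ways (the key, filename, and token values here are hypothetical, and the helper is mine):

```python
def finish_upload_params(key, filename, token, use_legacy_name=False):
    """Parameters for completing a stashed upload via action=upload.
    'filekey' is the new parameter name; 'sessionkey' remains accepted
    for backward compatibility."""
    name = "sessionkey" if use_legacy_name else "filekey"
    return {
        "action": "upload",
        "filename": filename,
        name: key,
        "token": token,
        "format": "json",
    }
```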
-Ian
Hi everyone,
Per our earlier list conversations, Wikimedia Foundation employees are
going to be spending more time (20% for most engineers) on things like
code review, shell requests, and such. As we've discussed what that
actually means, it became clear to everyone that many engineers will
need training (or at least refreshers) in order to make the most
effective use of that time. Additionally, we would benefit from group
discussion about what code review is.
Since we have a few remote developers visiting us in San Francisco,
we're setting up the first training session for this. It will be
optimized for local participation in San Francisco, but we're going to
at least try to get video for purposes of expanding documentation, and
perhaps as educational videos for developers who can't be here but
prefer to learn that way.
This meeting is scheduled for July 19, at 2:30pm PDT. Here are some of
the areas we're planning to have short talks on:
* The basics
* Stability and performance
* UI considerations
* Security
* Unit testing
* General code review philosophy
This won't be the last time we'll be presenting these things. I
imagine we'll be refining these talks for Hackathons and other events.
If you aren't an employee of Wikimedia Foundation, but you'll be in SF
and you'd like to come, let me know and I'll see if we can accommodate
you. If you have ideas for areas that we should cover that aren't
listed above, please let us know.
Thanks!
Rob
Hi,
I'm working on a wiki-journalism project where it's useful to support a "read-state" that allows a user to keep up with
developments without having to continually skim the whole article or look at markup diffs. I thought a good way to
support this would be to allow a period to be selected (e.g. with a slider) and have all content changed over that
period highlighted on the article page itself.
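One way to back such a slider with the core API: find the last revision at or before the start of the selected period, then ask the action=compare module for the diff between that baseline and the current revision. A sketch under those assumptions (the helper names and timestamp-based lookup are mine):

```python
def compare_params(fromrev, torev):
    """Request parameters for the API's action=compare module,
    which returns the diff between two revisions."""
    return {
        "action": "compare",
        "fromrev": str(fromrev),
        "torev": str(torev),
        "format": "json",
    }

def boundary_revision(revisions, period_start):
    """Given revisions sorted oldest-first as (revid, timestamp) pairs
    (ISO timestamps compare correctly as strings), return the id of the
    last revision at or before the period start -- the baseline that
    everything newer would be highlighted against."""
    baseline = None
    for revid, ts in revisions:
        if ts <= period_start:
            baseline = revid
        else:
            break
    return baseline
```

Rendering the returned diff as in-place highlighting on the article itself is the part no core feature provides, which is presumably where the extension work would go.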
Does anyone know of an extension or other project that implements this, or if anyone's currently working on it?
I've opened a bug as a point of reference:
https://bugzilla.wikimedia.org/show_bug.cgi?id=29860
Thanks.
Hi,
The Wikimedia Foundation's Operations team will perform network
maintenance today around 3pm CEST / 1pm UTC.
For more timezones, please see timeanddate.com's full table:
http://ur1.ca/4oyyq
Disruption should be limited, but you may be unable to access
Wikimedia sites for a few minutes.
--
Guillaume Paumier
WHAT: IRC Triage of caching-related bugs
WHERE: #wikimedia-dev on freenode
WHEN: 2011 July 13, 2300UTC, conversion to local time at
http://hexm.de/53
URL: http://etherpad.wikimedia.org/BugTriage-2011-07
Tomorrow (July 13th, Wednesday) at 2300UTC, I will hold a triage on
IRC of Caching-related bugs.
I'm doing this because since 1.17 we have gotten quite a few reports in
Bugzilla of caching issues, some of which (like
https://bugzilla.wikimedia.org/28613) reportedly affect a large
number of users. Hopefully getting the right people in the same IRC
channel discussing these issues will mean that we can get closer to a
resolution.
I've tried to make this triage more convenient for people in the
Pacific, so my apologies to those of you in Europe. If you can't
attend, but have something of value to add, please leave a note on the
etherpad page (above).
Thanks!
Mark.
Hi!
What's the proper way to generate thumbnails for the Ogg media handler, so
that it works as it does on Commons?
First, I've downloaded and compiled the latest ffmpeg (from
git://git.videolan.org/ffmpeg.git) using the following configure
options:
./configure --prefix=/usr --disable-ffserver --disable-encoder=vorbis
--enable-libvorbis
The prefix matches the usual CentOS layout (which my hosting uses), and the
best options for Vorbis were suggested in this article:
http://xiphmont.livejournal.com/51160.html
I've downloaded Apollo_15_launch.ogg from Commons and uploaded it to my
wiki to check the Ogg handler. The file was uploaded fine, however the
thumbnail is broken: a few squares on a gray field are displayed
instead of a still image of the rocket.
In the Extension:OggHandler folder I found ffmpeg-bugfix.diff. However, there
is no libavformat/ogg2.c in the current version of ffmpeg. I did find the
function ogg_get_length() in another source file, but the code has
changed and I am not sure that manually comparing and applying the
patch is the right way. It seems the patch targets an ffmpeg version from
back in 2007, and I was unable to find the original sources to successfully
apply the patch.
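For reference, a current ffmpeg can extract a still frame without any patch; this is essentially what the thumbnailing step boils down to. A sketch of building that invocation (the seek offset and width are example values, and whether OggHandler itself shells out this way depends on its configuration):

```python
def thumbnail_command(video, output, seconds=1, width=320):
    """ffmpeg invocation to grab a single frame as a thumbnail.
    Putting -ss before -i seeks quickly; -vframes 1 stops after one frame."""
    return [
        "ffmpeg",
        "-ss", str(seconds),           # seek to the frame to capture
        "-i", video,
        "-vframes", "1",               # write exactly one video frame
        "-vf", "scale=%d:-1" % width,  # thumbnail width, keep aspect ratio
        "-y", output,                  # overwrite the output file if present
    ]
```

Running the resulting command (e.g. via subprocess) against the uploaded file is a quick way to check whether the broken thumbnail is ffmpeg's fault or the handler's.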
I was unable to find ffmpeg in Wikimedia svn repository. Is it there?
Then, I've tried
svn co https://oggvideotools.svn.sourceforge.net/svnroot/oggvideotools oggvideotools
but I am unable to compile either trunk or the branches/dev/timstarling
version; both bail out with the following error:
-- ERROR: Theora encoder library NOT found
-- ERROR: Theora decoder library NOT found
-- ERROR: Vorbis library NOT found
-- ERROR: Vorbis encoder library NOT found
-- ogg library found
-- GD library and header found
CMake Error at CMakeLists.txt:113 (MESSAGE):
I have the following packages installed:
libvorbis-1.1.2-3.el5_4.4
libvorbis-devel-1.1.2-3.el5_4.4
libogg-1.1.3-3.el5
libogg-devel-1.1.3-3.el5
libtheora-devel-1.0alpha7-1
libtheora-1.0alpha7-1
ffmpeg compiles just fine (with yasm from an alternate repo, of course).
But libtheoradec, libtheoraenc and libvorbisenc are available neither in the
main CentOS repository nor in the alternative one:
http://apt.sw.be/redhat/el5/en/i386/rpmforge/RPMS/
However, there does seem to be a libtheoraenc.c in ffmpeg; what is the best
source of these libraries? It seems there is no way to find proper
RPMs for CentOS, and one needs to compile them from source?
Dmitriy