Using the recently enabled (thanks!) book creation function, I collected
a first version of a book I need
http://www.mediawiki.org/wiki/User:Wikinaut/Books/MediaWiki_Developer%27s_G…
as a guide for new (and old) developers, with some important pages.
From now on, developers can print out their own book(s)...
As promised earlier this week, I'm presenting a list of issues for any
weekend warriors that want to have something to work on that will help
with the development of MediaWiki and Wikipedia.
Most of these I talked about earlier. Next Friday, though, I will have
some new bugs that aren't in Monday's Triage.
If you work on these, post a comment to the wikitech-l list letting us know
about your work.
== Blank slates ==
These bugs seem like they'd be pretty straightforward to reproduce and
fix. Unlike the next section, though, they don't have a proposed
solution. If you don't have commit access, you can attach patches
to the bugs and I'll see if I can commit them next week.
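For anyone without commit access, the usual workflow is to produce a unified diff and attach it to the bug. A minimal sketch of generating one (the file names here are invented for illustration; from a real MediaWiki checkout you would typically just run `svn diff > fix.patch`):

```shell
# Invented "before" and "after" copies of a file, for illustration only.
printf 'old line\n' > SpecialPage.php.orig
printf 'fixed line\n' > SpecialPage.php

# Produce a unified diff suitable for attaching to the bug.
# diff exits with status 1 when the files differ, so ignore that status.
diff -u SpecialPage.php.orig SpecialPage.php > fix.patch || true

# The patch contains the added line, prefixed with '+'.
grep -c '^+fixed line' fix.patch   # -> 1
```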
=== Block notice when editing talk page of blocked user doesn't go away
after block expires ===
https://bugzilla.wikimedia.org/27858
I haven't yet had a chance to track this one down properly, but
something wonky is going on here. See if you can track it down,
reproduce it on your own installation and, then, be a hero and fix it!
=== Upgrade fails "Unknown character set: 'mysql4'" ===
https://bugzilla.wikimedia.org/29102
This is an installer bug that is probably trivial to fix. Fix it before
Chad gets to it!
=== Cannot remove watchlist junk, either in Edit Watchlist page or by
editing raw watchlist ===
https://bugzilla.wikimedia.org/28936
I've added Roan's steps to reproduce the issue to the bug.
=== Implement a way for authorized users to use Special:PasswordReset on
other usernames ===
https://bugzilla.wikimedia.org/29135
Lots of details in the bug.
=== Require token for watching/unwatching pages ===
https://bugzilla.wikimedia.org/27655
Bryan doesn't have time to work on this one right now. Anyone want to
take a shot?
=== width of <gallery> always 100% ===
https://bugzilla.wikimedia.org/27540
See if you can fix my example: http://hexm.de/3b
== Patches ==
The following bugs all come with patches. Verifying them and making
sure they are sane would make a few easy projects this weekend. Even if
you don't have commit access, your comment on the bug (“hey, I tried
this before and after the patch and the patch fixed the problem!”) would
be helpful.
=== Suppressed edits still appear in Special:DeletedContributions and
Special:Undelete ===
https://bugzilla.wikimedia.org/19725
=== lang and hreflang attributes for interwiki links ===
https://bugzilla.wikimedia.org/4901
=== Email notification mistakes log action for new page creation ===
https://bugzilla.wikimedia.org/14901
=== WAI-ARIA landmark roles to improve accessibility ===
https://bugzilla.wikimedia.org/18338
If you have a screen reader and want to make Wikipedia more accessible,
this may be a good place to start.
== Happy Hacking ==
See you all on Monday!
Mark.
Max Seminik raised the issue of codurr spamming the channel with a
useless list of revisions that broke the tests.
Example:
[19:39:08] <codurr> Something broke. See
<http://ci.tesla.usability.wikimedia.org/cruisecontrol/buildresults/mw>.
Possible culprits: aashrh/r89027 /r89028 /r89029 nbiabriket/r89035
krenkli/r89036 eryed/r89037 /r89038 eixal/r89039 /r89040 /r89041 /r89043
/r89044 /r89047 /r89049 /r89051 /r89061 /r89062
... and so on.
I repaired CruiseControl a few days after the hackathon. At that point,
tests were broken and I fixed most of them. Then I added back the
Database and Parser groups, which added some more interesting test
breakage (including issues with the order in which files are loaded by
PHPUnit).
Since CruiseControl *does not remember the state of tests* from
previous builds, it just assumes the current build broke everything and
hence reports all the old breakages on each build :-(
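Because of that statelessness, separating genuinely new breakage from the old, known breakage has to be done by hand. One way to do it (a sketch with made-up test names and files, not anything CruiseControl itself provides) is to save each build's failure list and compare consecutive builds:

```shell
# Made-up failure lists from two consecutive builds, one test per line, sorted.
printf 'testA\ntestB\n' > build287-failures.txt
printf 'testA\ntestB\ntestC\n' > build288-failures.txt

# comm -13 suppresses lines unique to the first file and lines common to
# both, leaving only the failures that are new in the second build.
comm -13 build287-failures.txt build288-failures.txt   # -> testC
```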
To fix them you have to get to CruiseControl:
http://ci.tesla.usability.wikimedia.org/cruisecontrol/
Clicking on 'mw' in the orange box will show you the test results for the
latest build; below that, you will find the latest revision tested.
At the moment we can see:
#288 - testParserTest with data set #287
Clicking on it will show the long list of tests. Look for the above
message in the list; next to it is a "Failure >>" link which shows the
test output. In this case:
ParserTests::testParserTest with data set #287 ('pre-save transform:
Signature expansion in nowiki tags (bug 93)', 'Shall not expand:
<snip parsertest output with diff and backtrace>
To reproduce it locally:
$ cd tests
$ php parserTests.php --quiet --filter 'bug 93'
This is MediaWiki version 1.19alpha (r89273).
<snip>
Passed 0 of 1 tests (0%)... 1 tests failed!
$ echo $? # show PHPUnit exit code:
1
$
$ svn blame tests/parser/parserTests.txt | grep 'bug 93'
89191 platonides pre-save transform: Signature expansion in nowiki
tags (bug 93)
$
So it was probably introduced in r89191. Check it out and run the tests:
$ svn co -r 89191
<snip>
$ php parserTests.php --quiet --filter 'bug 93'
This is MediaWiki version 1.19alpha (r89191).
<snip>
Passed 0 of 1 tests (0%)... 1 tests failed!
$
That one was broken on commit anyway.
http://www.mediawiki.org/wiki/Special:Code/MediaWiki/89191
The BlockTests are broken too, most probably due to the Block class
rewrite. You can easily reproduce this with the test suite, and most
project managers love the easy report functionality (--testdox):
$ cd tests/phpunit
$ ./phpunit.php --filter BlockTest --testdox
PHPUnit 3.5.13 by Sebastian Bergmann.
ApiBlock
[ ] Make normal block
Block
[ ] Initializer functions return correct block
[x] Bug 26425 block timestamp defaults to time
[ ] Bug 29116 load with empty ip
[ ] Bug 29116 new from target with empty ip
$
Have fun :-)
--
Ashar Voultoiz
Hello,
I have shamelessly copy-pasted some code on cruisecontrol and enabled
continuous integration for the REL1_18 branch. You can find it at the
cruisecontrol entry point:
http://ci.tesla.usability.wikimedia.org/cruisecontrol/
It comes without the tests in the 'Database' group (thus excluding
ParserTests) since they were heavily broken at branch time. Once the
tests are stable enough in trunk, I will probably back-port the fixes in
1.18 and enable the Database group.
The build results for codurr are published in irc-publish-REL1_18.txt
:-)
--
Ashar Voultoiz
Just created two more accounts in SVN:
Patrick Nagel - nagelp - Semantic MediaWiki stuff
Guillaume Paumier - gpaumier - WMF staff, maintaining the blogs' skin
Welcome :)
-Chad
Hi.
Is there a status update about more regular code deployments to Wikimedia
wikis? I know it's been discussed endlessly, but I was under the impression
that it was a real goal going forward. Is that still the case?
MZMcBride
http://schema.org/
An initiative by Google, Yahoo and Bing to create a tagging vocabulary
that makes things more findable in search engines.
Is there anything in this for us? schema.org tags in templates?
Presumably this would require software work too, and require us to
cross levels between software and content, at least a little ...
- d.
Hi, I'm Kevin Brown, a GSoC student this year. I live in Melbourne, Florida,
and am attending Brevard Community College. My previous projects include work
on English Wikipedia bots for tagging uncategorized pages and for new
page patrol cleanup.
Almost since the web’s inception, link rot has been a major problem. Web-based
content comes and goes, sometimes within a matter of hours. This presents a
major problem, both for users seeking to access this information and
for Wikipedia's
core content policy of verifiability. While Wikipedia policy does not
require users to use web citations, they are by far the most popular form of
citation, because they're easy for readers and editors to access.
To help solve this and ensure adherence to verifiability (WP:V), I plan to
create an archival system over the summer, so users can access all external
links even if they go down. This preemptive archiving should effectively
solve the problem of link rot, as long as the source site allows caching of
its content.
user input/request and to seamlessly integrate with existing page parsing
and rendering. Such a system will allow users to focus on content
creation, rather
than the distracting technical aspects of archival.
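The first step for any such archiver is finding the external links in a page's wikitext. A rough sketch of that extraction (the sample wikitext is invented, and a real implementation would lean on MediaWiki's own parser rather than a regex):

```shell
# Invented sample wikitext containing two external links.
wikitext='See [http://example.org/paper a paper] and <ref>[http://example.com/data]</ref>.'

# Crude extraction: a URL runs until whitespace or a closing bracket.
echo "$wikitext" | grep -oE 'https?://[^] ]+'
# -> http://example.org/paper
#    http://example.com/data
```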
I would appreciate your help with the project. Specifically, I'd appreciate
it if communities could start discussing this on their projects' local village
pumps, so that we can start developing consensus for deployment.
Also, please feel free to email me or find me on IRC under the nick kevin_brown
regarding any questions you may have.
I am currently drafting proposal and design documents and will be linking
them as they become available. For now, please see a few relevant
proposals:
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_External_links/Webcitebo…
http://en.wikipedia.org/wiki/Wikipedia_talk:Link_rot#Proposal_for_new_WikiP…
http://en.wikipedia.org/wiki/Wikipedia:Bots/Requests_for_approval/WebCiteBOT
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Council/Proposals/Dead_L…
(Thanks to Neil and Sumana for helping me write this.)
Best,
Kevin