I am having some difficulties editing with Google Chrome. The browser
adds extra spaces with many edits and is unable to add lines to the
middle of infoboxes (when you hit the Enter key it just moves you to
the bottom of the infobox code). I am not sure if this is a Chrome
problem or a wiki problem. I do wish to get a Chromebook, but unless
this can be addressed it will just cause me frustration.
--
James Heilman
MD, CCFP-EM, Wikipedian
Hi everyone,
A while back we set up a 'Wikimedia' SVN repository [1] (as opposed to the
'MediaWiki' SVN repository [2]) to hold Wikimedia-related yet non-MediaWiki
code. This has been great, particularly for miscellaneous
fundraising-related code as well as for our instance of CiviCRM and its
associated modules.
I wanted to give a big thanks to Sam Reed and Casey Brown who earlier this
week helped set up a 'Wikimedia-commits' mailing list. If you're interested
in following along with commits to the Wikimedia repo, feel free to
subscribe [3]. Reedy also helped get the CodeReview tool set up for the
repo [4], which will help us a great deal in following code review
conventions similar to those used for the MediaWiki repository. So yeah, thanks!
Arthur
[1] http://svn.wikimedia.org/viewvc/wikimedia/
[2] http://svn.wikimedia.org/viewvc/mediawiki/
[3] https://lists.wikimedia.org/mailman/listinfo/wikimedia-commits
[4] http://www.mediawiki.org/wiki/Special:Code/Wikimedia
--
Arthur Richards
Software Engineer
Fundraising/Features/Offline/Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
I ran into a number of problems when trying to replicate the localurl:
parser function with Python's urllib.quote(), including some annoying bugs.
Is the following correct and complete?
"/wiki/"+urllib.quote(name-of-page.replace(" ","_"),"/!,;:.-_")
where name-of-page is utf-8 encoded.
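To be explicit, this is the full sketch I am testing (local_url is just my
own helper name, and the safe-character list is only my guess at what
MediaWiki leaves unescaped):

    import urllib

    def local_url(page_name):
        # page_name is a unicode title; replace spaces with underscores
        # as MediaWiki does, then percent-encode the UTF-8 bytes.
        title = page_name.replace(u" ", u"_")
        # "/!,;:.-_" is my guess at the characters left unescaped.
        return "/wiki/" + urllib.quote(title.encode("utf-8"), "/!,;:.-_")

    print local_url(u"Città di Castello")  # -> /wiki/Citt%C3%A0_di_Castello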
Thanks! I apologize for such a banal question.
Alex brollo
On Wed, Jun 29, 2011 at 10:19 AM, Ashar Voultoiz <hashar+wmf(a)free.fr> wrote:
> You gave me the git repo at one point. Still haven't managed to look at
> the code and enhance it. Maybe we could add it to the subversion
> repository and have volunteers enhance it.
Here's the repo:
https://gitorious.org/mwcrstats/mwcrstats
On Wed, Jun 29, 2011 at 10:22 AM, Chad <innocentkiller(a)gmail.com> wrote:
> Even cooler would be integrating this into CodeReview itself,
> perhaps alongside what we have here:
> http://www.mediawiki.org/wiki/Special:Code/MediaWiki/stats
That would be cool. The code I wrote is in Python, so it would need to be
ported to PHP. I took an approach that gave me a lot of flexibility, but it
also made things a little complicated. Bryan Tong Minh's approach may be a
better starting point for the data generation part. I don't remember where
he posted the queries he uses, but I'll post them here now:
mysql -hsql-s3 mediawikiwiki_p -N -e"
  SELECT MAX(cpc_timestamp) AS ok_timestamp
  FROM code_rev
  JOIN code_prop_changes ON cpc_repo_id = 1 AND cpc_rev_id = cr_id
  WHERE cr_repo_id = 1
    AND cr_status IN ('ok', 'resolved')
    AND cr_path LIKE '/trunk/phase3%'
  GROUP BY cr_id
  ORDER BY ok_timestamp ASC" > code_ok.txt

mysql -hsql-s3 mediawikiwiki_p -N -e"
  SELECT cr_timestamp
  FROM code_rev
  LEFT JOIN code_prop_changes ON cpc_repo_id = 1 AND cpc_rev_id = cr_id
  WHERE cr_repo_id = 1
    AND cr_status IN ('ok', 'resolved', 'fixme', 'new')
    AND cr_path LIKE '/trunk/phase3%'
    AND (cpc_rev_id IS NOT NULL OR cr_status = 'new')
  GROUP BY cr_id
  ORDER BY cr_timestamp ASC" > code_all.txt
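For illustration, here is a rough Python sketch (not the actual script I
use; it assumes the dump files produced above and MediaWiki's 14-digit
timestamp format) of turning those files into cumulative series suitable
for plotting:

    from datetime import datetime

    def load_timestamps(path):
        # One MediaWiki-style timestamp (YYYYMMDDHHMMSS) per line.
        with open(path) as f:
            return sorted(datetime.strptime(line.strip(), "%Y%m%d%H%M%S")
                          for line in f if line.strip())

    ok_times = load_timestamps("code_ok.txt")    # revisions marked ok/resolved
    all_times = load_timestamps("code_all.txt")  # all revisions considered

    # Cumulative counts: each point is (timestamp, revisions seen so far).
    ok_series = [(t, i + 1) for i, t in enumerate(ok_times)]
    all_series = [(t, i + 1) for i, t in enumerate(all_times)]

    print "reviewed %d of %d revisions" % (len(ok_series), len(all_series))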
Nimish and Erik Zachte are in the process of working on some jqPlot-based
reporting tools, and it might be best to leverage their work rather than
using Flot (which is what my code uses). jqPlot is a fork of Flot, from
what I understand, so much of what is possible in Flot should also be
possible in jqPlot. Some more info on that is here:
http://www.mediawiki.org/wiki/Wikimedia_Report_Card_2.0
...and
http://www.mediawiki.org/wiki/Wikimedia_Report_Card_2.0/status
Rob
The sortable table system in MediaWiki was completely rewritten in r86088;
unfortunately this was done without the benefit of any unit or regression
testing, and a number of new bugs appear to have been introduced.
I've started adding test cases for table sorting in r90595; this adds a
qunit test suite that creates some short tables (listing planets Mercury
through Saturn and their radii) and tries setting up and triggering sorting,
then compares the resulting order to the expected value.
The ascending sort by name usually works, but if called twice on two tables,
the second table usually gets sorted *completely* incorrectly. This can
easily be confirmed by manual inspection: copy a page such as
http://test.wikipedia.org/wiki/Planets_of_doom to your trunk wiki and
sort first one table, then the other.
Numeric sorting on the radius column is also incorrect; the data set
includes values with two different orders of magnitude (> 10k and < 10k),
and they don't sort correctly. You can confirm on the test.wikipedia page
that these sort as expected in 1.17.
These specific bugs will also need to have test cases added, all of which
are claimed to be fixed by r86088 or its follow-ups:
* https://bugzilla.wikimedia.org/show_bug.cgi?id=8028
* https://bugzilla.wikimedia.org/show_bug.cgi?id=8115
* https://bugzilla.wikimedia.org/show_bug.cgi?id=15406
* https://bugzilla.wikimedia.org/show_bug.cgi?id=17141
* https://bugzilla.wikimedia.org/show_bug.cgi?id=8732
* https://bugzilla.wikimedia.org/show_bug.cgi?id=28775
As a reminder, you can run the qunit tests from your browser by simply going
to the tests/qunit/ subdirectory of your wiki installation.
General info: http://www.mediawiki.org/wiki/Manual:JavaScript_unit_testing
-- brion
The number of un-reviewed revisions for 1.18 has dropped below 500 for
all of /trunk and below 200 for phase3:
mysql> select count(*), cr_status from code_rev
    ->     where cr_repo_id = 1 and cr_id > 47450 and cr_id < 87529
    ->     and cr_path like '/trunk%' group by cr_status;
+----------+-----------+
| count(*) | cr_status |
+----------+-----------+
| 13951 | deferred |
| 82 | fixme |
| 339 | new |
| 16534 | ok |
| 1240 | old |
| 1585 | resolved |
| 793 | reverted |
+----------+-----------+
7 rows in set (0.06 sec)
mysql>
This is great news!
From this and
<http://toolserver.org/~robla/crstats/crstats.118all.html>, it looks
like we *could* drive the number of “new” revisions to 0 by the end of
the week.
That would be stunning.
Mark.
Hello folks,
we have 12 open positions in Wikimedia engineering right now, and a
few more will go up in coming weeks. If you have connections, we'd
appreciate your help hiring for these roles, forwarding this note to
appropriate listservs, tweeting it, etc.
The full listing is at
http://wikimediafoundation.org/wiki/Job_openings , but here are some
key roles that we need help with:
1) Product Managers: We're looking for great people to act as product
owners for three very important initiatives:
- New editor engagement: helping Wikimedia to attract, nurture and
retain new contributors.
- Analytics: supporting the development of systems and tools for
measuring our impact.
- Mobile: helping us reach hundreds of millions of people on mobile
devices and engaging hundreds of thousands of them to contribute in
useful ways.
The product manager role at WMF entails grooming and prioritizing the
product backlog, engaging with the Wikimedia community, commissioning
and organizing research, and hands-on testing, as well as helping with
across-the-board priority triage for ongoing product development.
These folks don't have to be product managers by trade, but they need
to be comfortable negotiating compromise while holding the product
vision. They need to treat engineers as equal partners, and be
excellent communicators. Ideally they have strong domain expertise
relevant to their focus area.
We have two of these currently posted:
http://wikimediafoundation.org/wiki/Job_openings/Product_Manager_(Features)
http://wikimediafoundation.org/wiki/Job_openings/Product_Manager_(Analytics)
The mobile one will go up soon, and we'll refine the definition
further. But please use these as reference points for now.
2) Analytics Engineers: We're hiring two systems engineers to
build out our analytics infrastructure. What exists so far is still
fairly rudimentary, so we need to build scalable logging and tracking
systems for various purposes, e.g.:
- geographic breakdown of access and editing activity
- usage data for specific features; A/B testing of features
- search activity, real-time editor retention measures, new activity
visualizations, and more.
The ideal candidate here is likely someone who is very strong at building
out large-scale distributed systems and has experience with NoSQL
technologies, distributed computing, etc.
The relevant JD is here:
http://wikimediafoundation.org/wiki/Job_openings/Systems_Engineer_-_Data_An…
3) A strong QA Lead who can help us write and perform test plans with
shoestring and duct tape, i.e. using a combination of test automation,
work with outside vendors, and volunteer-driven testing to strengthen
our product quality. The relevant JD is here:
http://wikimediafoundation.org/wiki/Job_openings/QA_Lead
4) Strong frontend and backend engineers: for features development,
code review, deployment and release management support, and so forth.
Demonstrable open source experience is always a major plus, and while
PHP is learnable, not being predisposed against it helps. :-)
http://wikimediafoundation.org/wiki/Job_openings/Software_Developer_Fronten…
http://wikimediafoundation.org/wiki/Job_openings/Software_Developer_Backend…
Your outreach and support is always appreciated.
All best,
Erik
--
Erik Möller
Deputy Director, Wikimedia Foundation
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
We'll be holding a public IRC bug triage in about 2-3 minutes on
#wikimedia-dev for all who are interested. We will be using
http://etherpad.wikimedia.org/BugTriage-2011-06 to keep notes as well.
See you there!
Mark.
I'm not sure who would be in charge of this, but I think it would be
useful if the WMF were a liaison member of the Unicode Consortium:
http://unicode.org/consortium/memblogo.html
This body makes all sorts of important decisions about the Unicode
standard, decisions that affect many aspects of our projects. If an
issue were to come up that adversely affected us, we would not currently
have a formal way to object. Being a liaison member would give us
official standing with the organization, allowing us to participate
alongside Google, Apple, and Microsoft in any Unicode-related
discussions that are important to us.
Other open source projects that are currently liaison members:
The GNOME Foundation
The Mozilla Project
OpenOffice.org
I'm not sure if there is any fee for becoming a liaison member. The
instructions simply say to "contact the Unicode Office for details".
Would it be worth contacting them to find out?
Ryan Kaldari
Tomorrow is the first IRC bug triage (finally!). We'll start the
meeting at 2300 UTC (see http://hexm.de/44 for the UTC-impaired, like
myself).
The triage agenda is all set on
<http://etherpad.wikimedia.org/BugTriage-2011-06>. Feel free to make
comments, but please do not delete items.
Here is the broad overview:
* Start with bugs that are affecting Wikipedia users *now*
* Follow up on bugs discussed in previous triages and get updates on
progress
* Discuss an outstanding API issue that a tracking bug covers
Following that, I'd like to get brief input on bugs that were marked
“High” priority going into the 1.18 cycle and have been marked
deployment or tarball blockers as a result. Let's discuss if the
deployment blockers should really be blocking deployment.
Some are older bugs, so we may remain relatively unconcerned about them
and decide that they shouldn't block 1.18. Or, maybe we'll find a
willing hacker to fix them.
So join us in #wikimedia-dev for our first ever IRC Bug Triage.
See you there,
Mark.