Since MediaWiki 1.18 we have the variable $wgUseCombinedLoginLink [1],
which is set to true by default.
During edit workshops with students and seniors I noticed that new
editors are confused by the combined login page: they tried to
register new accounts on the login page.
These observations are surely not representative, but I think that
usability could be improved by setting $wgUseCombinedLoginLink = false.
If I missed a prior discussion about this issue I apologize and would be
happy if someone could point me to the discussion.
Otherwise I suggest setting $wgUseCombinedLoginLink to false for all
WMF wikis.
Raimond.
[1] https://www.mediawiki.org/wiki/Manual:$wgUseCombinedLoginLink
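For reference, turning the combined link off is a one-line change in
LocalSettings.php (a sketch; on the WMF cluster this would of course go
through the site configuration files instead):

```php
// Show separate "Log in" and "Create account" links in the personal
// toolbar instead of the combined "Log in / create account" link.
$wgUseCombinedLoginLink = false;
```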
Here is a quick suggestion. These colors should be more neutral and
easier on the eyes if you spend a lot of time looking at Gerrit. I wish
there were a simple way to change the values in the CSS and upload a
preview, but I did a view-source on Gerrit and received quite a fright.
Leaving it to someone else to show us how this scheme looks:
1) backgroundColor -- no change
2) topMenuColor -- #DBDCFF
3) textColor -- no change (unless it's possible to have different
classes for different texts)
4) trimColor -- optional #A4A5BF
5) selectionColor -- #FFE4CE, alternate #F1F1FF
--
j.mp/ArunGanesh
Hi all,
with this email I want to start the discussion on how to
operationalize the deployment of Wikidata. This email gives a general
outline and a link to the page on Meta where I will collect the
results of this discussion. Discussion, feedback and comments are
welcome.
The goal for Wikidata is to release early, release often, and to
eventually follow WMF's lead with their bi-weekly deployment cycle.
The very first version of Wikidata to deploy is almost there. We still
have a number of bugs and wrinkles we are polishing, but in general we
are ready to start moving towards deployment. This will lead to a very
organic introduction of Wikidata data into the Wikipedias. We can
react to problems and use cases early. I think a plan where we
implement the three phases completely and only then deploy them is
bound to lead to a less widely accepted solution.
The suggestion is to deploy in the following steps. All of this still
covers only phase I.
* Step 1: start the Wikidata repository wiki. This only allows adding
language links, and they are not displayed anywhere yet.
* Step 2: deploy the Wikidata client extension on one language edition
of Wikipedia for testing (Q: How to select it?)
* Step 3: deploy the Wikidata client extension on a second language
edition of Wikipedia (Q: How to select it?)
* Step 4: deploy the Wikidata client extension on the English edition
of Wikipedia
* Step 5: deploy on all Wikipedias.
Details are on this page on Meta:
<http://meta.wikimedia.org/wiki/Wikidata/Notes/Deployment>
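To make the per-wiki rollout in steps 2-4 concrete, the gating could
look roughly like the sketch below. This is purely illustrative: the
variable name, the wiki list, and the entry-point path are placeholders,
not WMF's real configuration.

```php
// Hypothetical sketch: enable the Wikidata client extension only on
// selected wikis, keyed off the wiki's database name.
$wmgUseWikidataClient = in_array( $wgDBname, array( 'testwiki' ) );
if ( $wmgUseWikidataClient ) {
	require_once( "$IP/extensions/Wikibase/client/WikibaseClient.php" );
}
```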
I would like to find the appropriate means to define how to implement
the above steps whether on this list or off it.
Even though Rob has given me unlimited license to be obnoxious on this
list about this, I will try not to. Mind the "try" :)
Cheers,
Denny
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
Hi all,
how are you? I'd like to know about the possibility of solving an old
issue with CAPTCHA for Wikipedias in languages other than English.
This bug
https://bugzilla.wikimedia.org/show_bug.cgi?id=5309
was created in 2006. There is a discussion here about having CAPTCHA
in other languages from February 2012
http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/51951/
but it seems there was no conclusion. After working on campus with new
editors in Brazil, I've seen that this is a real obstacle, since most
people here cannot read English at all.
I'd like to know if there are plans to solve this issue - I hope I
don't sound rude; maybe this can seem a minor issue when we don't see
the difficulties people from a different place can face. I think this
is important for Wikipedias other than the English one (just read
people's comments in the bug), and we can be losing new contributors
because of their first impressions. Thanks,
Tom
--
Everton Zanella Alvarenga (also Tom)
Wikimedia Brasil
Wikimedia Foundation
hello,
I'd like to do things the right way, so:
I have to make an extension similar to BoilerTemplate.
Right now I have made a special page, and I'd like to add a form that
lets the user choose a template.
Some questions:
1) Must I use some APIs to render the form?
2) How do I set the form target? Hidden input tags are a workable
solution, but how can I get a form with clean URLs?
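On 1), MediaWiki's HTMLForm class can render the form for you, including
the form target, so hand-written hidden inputs shouldn't be needed. A
minimal sketch, assuming a recent MediaWiki (the special-page name, the
option values, and the message key are placeholders):

```php
class SpecialChooseTemplate extends SpecialPage {
	public function __construct() {
		parent::__construct( 'ChooseTemplate' );
	}

	public function execute( $par ) {
		$this->setHeaders();
		// One drop-down listing the templates the user may pick from.
		$formDescriptor = array(
			'Template' => array(
				'type' => 'select',
				'label-message' => 'choosetemplate-label', // placeholder key
				'options' => array(
					'Template A' => 'TemplateA',
					'Template B' => 'TemplateB',
				),
			),
		);
		$form = new HTMLForm( $formDescriptor, $this->getContext() );
		$form->setSubmitCallback( array( $this, 'onSubmit' ) );
		$form->show();
	}

	public function onSubmit( array $data ) {
		// $data['Template'] holds the chosen value; act on it here.
		return true;
	}
}
```

HTMLForm takes care of the action URL, the CSRF token, and redisplaying
the form on validation errors, which answers 2) as a side effect.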
I'm ready to read any tutorial you can suggest,
thanks for your attention,
ADM
Hi all wikitech-ers,
I wanted to summarize a few lessons learned from the Wikimania Hackathon
and make myself available to discuss any of these issues at greater length
on the list.
Also, if you (as an attendee) have other thoughts you want to share about
the Hackathon, feel quite free to email the list, just myself, or just
Sumana if for any reason you want your identity or your feedback to be
more private.
(Note: This is probably going to get long! If you just want the "lessons",
look at just the header names.)
= Preface =
My major interest in the Hackathon was to encourage newcomers to step up
their abilities, in terms of editing the encyclopedia, programming bots,
or whatever exciting tech-related thing they want to be better at.
= Thanks =
Major thanks for the Hackathon go to:
* Sumana Harihareswara for inviting OpenHatch to lead the newcomer
portion of the Hackathon;
* Greg Varnum, for being a hugely awesome local co-organizer;
* James Hare, for managing the local logistics and generally helping make
things go smoothly behind the scenes;
* Katie Filbert, for helping tremendously with pre-event planning; and
* Mike Linksvayer, who was my other OpenHatch co-organizer, for taking
care of lots of essential tasks like emailing local logistics folks and
signing attendees in.
= Thoughts =
== The combination of newcomers and experienced people seemed to work ==
We got two different notes in the exit survey about this -- one from a
person who said they were happy working in the big open main room and
didn't feel distracted by noise, and one from another person who said
they went off because they found the main room too noisy.
I did really like being able to send people to nearby experienced folks to
have a chat. This was especially helpful as I walked around and asked
people what they wanted help on, or what they wanted to work on.
I did notice some experienced people, by the second day, had wandered off
to quieter rooms than the big main room. I'm glad we had those rooms. (I'd
love to hear from those people if they felt "pushed out", or if they
instead felt happy that the quieter rooms were available.)
== We made a good impression by just running the event ==
One prospective attendee who, sadly, couldn't make it, indicated to me
that just by reading the survey we sent to prospective attendees, and
skimming the list of tasks, it was something that they'd be interested in
attending, and that it was great that such an event was going on. In
particular, they suggested it felt more like a "play-with-stuff-a-thon"
rather than a "hack-a-thon", and described that as making them feel
welcome.
If anything, we should have capitalized on this more. I heard from other
prospective attendees that they didn't know the Hackathon was intended to
be newcomer-friendly this year.
Personally, I think the term "Hackathon" gives an exclusionary vibe, and
that a newcomer-oriented event should have a different name. For example,
Boston Ruby recently started using "project night" at our (OpenHatch's)
suggestion and it seems to have gone well for them:
https://openhatch.org/blog/2012/the-steps-boston-ruby-is-taking-to-become-f…
== Many people did cool things for the first time ==
I heard from at least two people who made their first edit to Wikipedia
during the Hackathon! I saw one person write a bot for what I believe was
the first time, as well as get a labs account and work on moving that bot
there. In the exit survey, one attendee wrote, "I learned a lot about
batch upload." Another wrote about making their first commits with Git and
Gerrit.
I think this is the most exciting thing about a newcomer-friendly
Hackathon -- it creates an environment where people can step up to the
next level, and hopefully stay at that level through self-driven follow-up
practice. This is how community growth happens.
That reminds me: Two people did indicate on the exit survey they'd be
interested in follow-up mentorship. I'm going to see how we can best
support them; I might just send them a periodic email to see what they're
up to and see how I can personally help or direct them to help.
== Wi-fi was a serious obstacle ==
I personally had a lot of trouble with my laptop failing to stay on the
wifi. I would estimate at least 10-25% of attendees had a serious problem
with this.
It's especially tough because on one hand, the Hackathon is a very
wifi-dependent event. On the other hand, the Hackathon is a pre-Wikimania
event, which means that unless there's serious pre-pre-event testing of
the wifi, Hackathon attendees are the ones that will run into any problems
that occur.
(For those interested in minutiae, it seemed that roaming between access
points triggered the problem. The problem as users experienced it was that
randomly they would suddenly see 100% packet loss with no obvious way to
fix it.)
A sample of things we saw from this:
* At least one person reports in the exit survey that it was impossible to
get work done with the wifi the way it was.
* Mike and I couldn't sign people in using the wiki because our own wifi
had failed, so we used a spreadsheet local to his computer. This meant
that it was harder to use the wiki to locate like-minded people.
One organizer attempted to set up a separate wifi network, which was
operational toward the end of the event. In the future, we need more
testing of this, and more of a plan for what to do if it fails.
== Logistics concerns ==
The room that we had agreed we would use turned out not to have power
strips lining the bottom of every table, so we switched rooms and had to
update signs across the event.
Some exit survey respondents indicated they wanted the event to start
later in the morning, and that they wanted the room to not close at 6pm.
I received more than a few emails from people who were, according to the
wikimania2012 registration system, signed up, but who replied to me
indicating that they couldn't in fact attend due to not receiving a visa
as needed. It would have been nice if those people had not been in the
registration system anymore marked as attendees.
I also sent out at least one email to the wrong address; the problem was
that the user who did the registration isn't necessarily the person who's
being registered. I think that in this case, one person in a company was
responsible for registering another person. We could fix this by making
personal email address a field that you enter at registration time (though
I do realize data will basically never be entirely perfect).
(Similarly, the script I was given for extracting information from the
registration system ended up suddenly failing, which slowed down the
second pre-event email.)
== Exit survey suggests people overall liked the event ==
Of the respondents, all indicated they're either "likely" or "very
likely" to recommend a hackathon organized by OpenHatch to a friend, so
generally speaking people enjoyed themselves.
== Tutorials can use TAs ==
On the second day, I found volunteers to be "TAs" for the tutorials.
Their job was to wander around and help people with environment problems
or people just having trouble following along because e.g. their web
browser was different than the one being used by the presenter.
Another difficulty I found was that sometimes a tutorial speaker wasn't
loud enough to be audible in the back of the room.
This is in addition to the tasks I had labeled as needed for a "Talkmeister":
http://wikimania2012.wikimedia.org/wiki/Hackathon/Volunteers#Talkmeister
Also, people who are having problems following along during a tutorial
don't always speak up. I'm glad we added the TAs, although I think further
work is required to find out how to non-intrusively convince people that
asking questions is okay. The best way I've seen is to have a very small
group, no bigger than 10, preferably about six people. I almost wonder if
we'd be better-served to use pre-recorded video tutorials with a lot of
TAs available, rather than live lecturers. Then we could easily have small
group rooms, and could pause the video.
== Next time, I'll get firmer commitments from helpers ==
One thing I did was walk around the main room, asking if people needed
anything. I did find some people willing to help out with this, but didn't
have a good sense of when they could help. Next time, I'll do something
like create an hourly sign-up sheet for this.
We did receive feedback at the end of the first day that at least some
people were very happy with the helpers. One person indicated he wished
there had been even more help. While I do think that just about everyone
got the help they needed, it would have been nice to have had a clearer
list of who's helping: partly to ensure we have more capacity, partly
for me to know when people are planning to help, and partly to encourage
mentors to sign up for brief time slots and know they've helped the event.
== Favorites ==
People listed a wide variety of favorite activities, with all these being
mentioned more than once in the exit survey:
* Tutorials
* Talking to people
* Coding/hacking
The laptop setup process showed up once, too, as did break-out rooms.
Sadly for me, the list of "Tasks" I made wasn't a favorite for anyone who
filled out the exit survey. (Surprisingly and pleasingly, the laptop setup
was!)
== Diversity ==
I would estimate about 10-20% of our newcomer-oriented attendees were
women.
One difficulty was that we experienced a lot of competition for attendees
with AdaCamp DC, which took place on the same days and times. (I heard
from more than a few attendees and prospective attendees that this was a
conflict.)
(If you're not familiar with the event: "AdaCamp is an Ada Initiative event
focused on increasing women’s participation in open technology and
culture" <http://dc.adacamp.org/about-adacamp-dc/>)
Given that conflict, though, I think we did reasonably well.
In terms of other diversity, one attendee remarked to me that they were
quite impressed at the diversity of ages in attendance. I was personally
quite impressed by the diversity of experience levels in attendance.
I think we accommodated all of that reasonably well.
== Exit survey: Qualtrics, etc. ==
The exit survey we ran received only 11 responses. (Our entrance survey
received over 100, but I sent it to about 400 people. The exit survey was
sent to about 45, so a 25% response rate is somewhat consistent. We had
about 65 people sign in on our spreadsheet, and about 45 of those people
gave us email addresses.)
I used Qualtrics to run the exit survey, since it complies with
Wikimedia's guidelines on data privacy/storage and relationships with
vendors.
I ran into an issue with Qualtrics where the sample email that it sent me
to preview what the form would look like wasn't an accurate simulation --
in particular, it didn't pre-fill the email address in the form in the
same way as the real email did.
Anyway, it took too long to get this sent out, partly due to time spent
learning Qualtrics that I didn't expect to be spending, partly due to my
own conference travel post-Wikimania. Given the wifi problems, we could
have just created a paper exit survey for people to fill out, or otherwise
generally been faster at this if we had prepped the exit survey questions
beforehand. We can aim to do that for future events.
== Other thoughts ==
* One exit survey respondent wanted it to be easier to find people
interested in using MediaWiki and related software for non-Wikimedia
tasks, to chat with and work on tasks with.
* I treated laptop setup as a good thing for all attendees to go through
first, because it was a prerequisite for some of the tasks, but by no
means all of them. In hindsight, I would prefer to indicate on each "Task"
what setup steps are required. The people that ended up just mostly
editing Wikipedia didn't need to do them. (The other side of the spectrum
is that it was good for many people to go through them, to enable them to
do tasks that they might not have thought they could do!)
= Conclusion =
That's "it". If you read this far, thank you!
Discuss, if you like! I'm here to chat (although might take a day to see
all responses, if any, and respond in bulk).
-- Asheesh.
Hi everyone,
Y'all know about this, right? :)
http://www.mediawiki.org/wiki/Git/Gerrit_evaluation
As I said at the beginning of this process[1], the way this works is
we argue for a while, Brion watches, and then he makes the call. You
might have also noticed my mention about a meeting today on the
subject[2]. That was meant to be a fairly mundane meeting to make
sure that Brion had everything he needed to make a decision (and to give
him the opportunity to say "don't saddle me with this decision") :)
Brion is still on board (yay!) and has everything he needs. Here's
the list of alternatives that are under consideration now:
* Staying with Gerrit (with GitHub integration)
* GitHub (full migration, with some mirroring and limited private repos)
* Phabricator
* Gitlab
* Extended evaluation
That last one is one I just added, to make explicit an idea that has
been mentioned many times (and one I've articulated myself).
Basically, there are many people who are dissatisfied with Gerrit who
don't feel they have the time to look into an alternative, but feel
strongly that the Wikimedia Foundation should put even more
significant resources toward an evaluation.
I'll be blunt, though. Brion made his preference pretty clear that,
if he were to make a decision today, it would be to stick with Gerrit,
and I've made it pretty clear that's also my bias. The cases for the
other options (including the "extended evaluation" option) have not been
made clearly and compellingly enough to sway Brion. I wanted to
make sure that we didn't make the final decision in that room, but it
was tough not to call this one early.
However, I was careful to press Brion on what would change his mind in
the next two weeks. Basically, it's different for each system:
* For GitHub: we'll need to be convinced that it's ok for our primary
code repository to be hosted elsewhere. That means, in part, a
mitigation/backup strategy, better explanation of how account
management will work and integrate with our stuff, and we'd need to be
convinced we have a strategy for negotiating the license we need for
our community. I'd say this one is just Brion's problem, but really,
I'm not sure the rest of the org (Erik, our legal dept, etc) would let
Brion unilaterally make this call.
* For Phabricator: we've invited the lead developer to the office,
and we'll be hearing what he has to say about the system.
* For Gitlab: Roan would need to make the time to do more of an
evaluation, or someone would have to pick up with what he's started.
* For the "extended evaluation" option, we'd need to hear why it's a
better investment to set up a bunch of alternatives to play around
with rather than putting that energy into improving Gerrit, since we
likely won't have the time and energy to do both well. We'll also
have to understand why we should subject ourselves to even more
mailing list debate on this topic :)
We agreed we weren't going to set a hard deadline for how long we're
going to stick with whatever decision we make. It of course depends
on which alternative we choose. For the "stick with Gerrit" option,
a year (more or less) sounds right. Also, despite emphasizing the
importance of making a decision and moving forward, we're going to
maintain some flexibility still. If we stick with Gerrit, we'll be
supportive of:
* Pilot projects in other systems
* WMF devs spending 20% time working toward alternatives
* Reevaluating soonish if a really, really compelling alternative emerges
Rob
[1] July 11 "Gerrit evaluation process" email:
http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/62390
[2] July 26 "Criteria for 'serious alternative'" email:
http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/62648
Hi,
I received an interesting question recently: if you have a big list,
without any sections, and you search for a term that appears more than
once, the search results show only the first appearance of the keyword
in the page. Is there any way to show all the places where that word
appears and to be able to redirect the user to that precise location
(let's say "precise location" means "the last anchor before the
keyword appears")?
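I don't know of a built-in way, but the "last anchor before the keyword"
lookup itself is simple. Here is a sketch of just that logic in Python
(an illustration of the idea, not existing MediaWiki code; wiring it into
the search UI is the real work):

```python
def occurrences_with_anchors(text, keyword, anchors):
    """For every occurrence of keyword in text, return an (offset, anchor)
    pair, where anchor is the name of the nearest anchor at or before the
    occurrence, or None if there is none.

    anchors: list of (name, character_offset) tuples, sorted by offset.
    """
    results = []
    idx = text.find(keyword)
    while idx != -1:
        last = None
        for name, offset in anchors:
            if offset <= idx:
                last = name  # still before the keyword; remember it
            else:
                break  # anchors are sorted, so we can stop here
        results.append((idx, last))
        idx = text.find(keyword, idx + len(keyword))
    return results
```

Each result could then be rendered as a link to "PageName#anchor" to send
the reader to that precise location.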
Thanks,
Strainu
To all developers:
I would like to gather some feedback on an idea that has been around for a while -- implementing an extension manager for MediaWiki.
The topic came up a few times recently at Wikimania 2012's "Ask the developers" session and got me thinking. I've been developing with MediaWiki for 7 years now and have not contributed anything significant to the community. I've taken on this project as a serious opportunity to contribute, and would like to get some comments, opinions, and general feedback (if not some interested parties willing to help out!). I've expanded the current "request for comments" page to include a history of the idea and a number of ideas for features.
You can view the proposal at [ http://www.mediawiki.org/wiki/Requests_for_comment/Extension_release_manage… ]. Please leave your comments on the Talk page (and don't forget to sign your name!)
-Daniel
MW: http://www.mediawiki.org/wiki/User:DanielRenfro
IRC: daniel_renfro
Sam Reed had asked me to help him understand how to query the Bugzilla
API so that he could use it in reporting during the rollouts of new code.
I've already got a PHP library that I wrote (in SVN still:
https://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/bugzilla/client/)
so I used that to write a simple script to query the API several times
and print the counts for each query.
The commented script is here:
https://www.mediawiki.org/wiki/Special:Code/MediaWiki/115632
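For those who'd rather not use PHP, the same idea fits in a few lines in
any language with XML-RPC support. A Python sketch follows; Bug.search is
part of Bugzilla's XML-RPC web service, but the query labels and criteria
below are made-up examples, not the ones Sam's reports actually use.

```python
import xmlrpc.client

# Example queries -- illustrative labels and search criteria.
QUERIES = {
    'open MediaWiki bugs': {'product': 'MediaWiki', 'status': 'NEW'},
    'open site requests': {'product': 'Wikimedia', 'status': 'NEW'},
}

def count_bugs(proxy, criteria):
    """Return the number of bugs matching one search-criteria dict."""
    return len(proxy.Bug.search(criteria)['bugs'])

def report(proxy, queries=QUERIES):
    """Run each query and return a {label: count} mapping."""
    return {label: count_bugs(proxy, criteria)
            for label, criteria in queries.items()}

# Usage (talks to the live server, so not run here):
# proxy = xmlrpc.client.ServerProxy('https://bugzilla.wikimedia.org/xmlrpc.cgi')
# for label, count in report(proxy).items():
#     print(label, count)
```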
Since I was the only real user of this code (until now), I didn't write
any documentation, but if there is some interest, then I can start
working on that.
Mark.
--
http://hexmode.com/
Human evil is not a problem. It is a mystery. It cannot be solved.
-- I Don't Believe in Atheists, Chris Hedges