Hi,
I have created my User page, incorporating the ideas I shared in my last
mail - https://en.wikipedia.org/wiki/User:Shivansh13
Please let me know what you think of the prospects of the ideas suggested.
Regards,
--
Shivansh Srivastava | +91-955-243-5407 | mr.shivansh.srivastava(a)gmail.com
http://in.linkedin.com/pub/shivansh-srivastava/17/a50/b18
Secretary, BITS Alumni Affairs Division | Web Expert, Newsletter, BITSAA International
3rd Year Undergraduate | B.E. (Hons.) - Electronics & Instrumentation
BITS-Pilani.
Hi,
You write on https://www.mediawiki.org/wiki/MediaWiki_1.19/Roadmap:
> Wednesday, March 1 (-2), 23:00-03:00 UTC (3pm-7pm PST):
> Stage 5 deployment to:
> - All Wikipedia sites
Do you mean Wednesday, Feb 29, or Thursday, March 1? :-)
--
Bináris
Which extensions could use more active attention? Let me know so I can
suggest this work to developers with interest & spare time. For
example, if volunteers are writing good patches that await review, maybe
we could encourage them to take over triaging bugs and maintain the
extensions more actively.
This is especially worth investigating for extensions that WMF deploys.
The more developers who learn an extension's codebase and take a hand
in maintaining it, the more quickly we can respond to possible problems.
I looked at
https://www.mediawiki.org/wiki/Category:Extensions_used_on_Wikimedia [0]
and ran a Bugzilla search for open bugs on those extensions with
severity Normal or higher and priority Normal or higher.
http://ur1.ca/85xoz
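Roughly, the query was of this shape (a sketch only; the field names follow Bugzilla's standard buglist.cgi parameters, but the exact status/severity/priority values here are illustrative assumptions, not the saved search behind the short URL above):

```shell
# Sketch of the query shape using Bugzilla's standard buglist.cgi parameters.
# The specific status/severity/priority values are illustrative assumptions,
# not the exact saved search.
BASE="https://bugzilla.wikimedia.org/buglist.cgi"
QS="bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED"
QS="$QS&bug_severity=normal&bug_severity=major&bug_severity=critical"
QS="$QS&priority=Normal&priority=High&priority=Highest"
echo "$BASE?$QS"
```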
Numbers below are from a couple of weeks ago, when I was drafting this mail,
but they wouldn't be much different right now. Patches awaiting review:
ParserFunctions: 8
Math (texvc): 5
Cite: 5
AntiSpoof: 4
CentralAuth: 4
ConfirmEdit: 3
DismissableSiteNotice: 3
DumpHTML: 3
ProofreadPage: 3
CheckUser: 2
DynamicPageList2: 2
Lucene Search: 2
MobileFrontend: 2
SyntaxHighlight (GeSHi): 2
WikiEditor: 2
CategoryTree: 1
CharInsert: 1
EasyTimeline: 1
ImageMap: 1
Nuke: 1
OggHandler: 1
Poem: 1
ReaderFeedback: 1
UploadWizard: 1
Vector: 1
As of today there are 137 patches awaiting review for MediaWiki, and 60
patches awaiting review for extensions that WMF deploys.
[0] If someone else wants to figure out & align the various lists of
extensions deployed on WMF servers, the sources to use are
https://translatewiki.net/wiki/Main_Wikimedia_extensions ,
http://svn.wikimedia.org/viewvc/mediawiki/trunk/translatewiki/MediaWiki/Wik…
and https://www.mediawiki.org/wiki/Category:Extensions_used_on_Wikimedia .
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
Hi everyone,
We'd like to postpone the Git migration by 2.5 weeks, with a new final
migration date of Wednesday, March 21. Here's the convergence of
factors that led us to the new date:
* The 1.19 deployment has kept us busy enough that none of the rest
of us in Platform Engineering have had spare cycles to help Chad out
* It's quite likely we'll have many fixes after we deploy to the
bigger wikis (enwiki and friends) on Wednesday
* We have a number of unresolved issues in our Git+Gerrit deployment [1][2]
* Code review is falling behind again. As of right now, we have 38
unreviewed revisions in core (phase3), and another 189 unreviewed
revisions in extensions. That's up from 4 in core + 28 in extensions on
February 4. We basically let code review get out of hand again as
we turned our focus toward bugfixing for the deployment.
Chad and Ryan also discovered today that the machine we're using for
Git and Gerrit (formey, which also hosts SVN) just isn't up to hosting
the whole mess. So, there's a machine deployment we need to do as
well.
Here's the new plan:
* Week of 3/5 - MediaWiki 1.19RC1 release. Code review stats by
end-of-week: 20 new on phase3, 100 new on phase3+extensions
* Week of 3/12 - MediaWiki 1.19.0 release. Code review stats by
end-of-week: as close to zero as possible in phase3+extensions.
Possibly even 1.20wmf1 (first mini deployment untethered to release
schedule, first of many...1.20wmf2, 1.20wmf3, etc)
* Week of 3/19 - Git migration week. Migration day: Wednesday, 3/21
A MediaWiki tarball release *should* be a relatively minor endeavor.
A deployment during this time is a stretch goal. We should be able to
make a deployment from a more recent point on trunk if we're
disciplined about actually getting through the code review backlog and
we do a good job in review. Doing a good job means reverting when we
need to.
The top priority for Platform Engineering will be the Git migration,
so anything that distracts from that (like, for instance, a 1.20wmf1
deploy) may get postponed while we finish this off once and for all.
Thank you everyone for your patience on this transition.
Rob
[1] http://www.mediawiki.org/wiki/Git/Conversion#Unscheduled_items
[2] https://bugzilla.wikimedia.org/showdependencytree.cgi?id=22596&hide_resolve…
In my spare time at Redwerks I've been working on a Short URL
configuration tool:
http://shorturls.redwerks.org/
Our Short URL manual pages have been very bad for quite a while. Every last
one of them documents bad practices, and I only managed to fix one of them.
Considering that we have so many different manual pages simply because
people have slightly different configuration requirements (e.g. one for /w
and /wiki/, another for / and /wiki/, another for /subpath/wiki, etc.; one
for .htaccess, another for Alias in the Apache config, another for RewriteRule
in the Apache config, another for Nginx, etc.), rather than trying to fix
them all I started writing a tool to build the configuration instead.
The tool tries to auto-detect as much as possible (practically everything,
in fact): the type of server, the SAPI (mod_php vs. ?), your script path,
etc. It even tries to jump ahead of you and guess what kind of article
path you were intending to use, and it has an early feature that tries to
preemptively detect whether you're likely to have root access (the idea is
to figure out what kind of host you're on using the reverse DNS of the
server).
The configuration generator actually isn't a simple thing; there are a lot
of conditionals involved in the tool. It can handle the special cases
needed for root /$1 style URLs. It knows how to add an extra rewrite
when you use /w and /wiki and need / to redirect to your wiki. It uses
Apache's %{DOCUMENT_ROOT} in RewriteRules, but also lets you expand an
absolute docroot when you use an Alias (you can't use %{DOCUMENT_ROOT}
there). I've loaded the Nginx config full of deny rules and conditions
that most people never bother to configure properly (and yes, it can
handle root URLs, and even knows how to stop Nginx from executing PHP in
uploaded files). Heck, this tool can actually handle TWN's wacky /w/i.php
script path.
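As one illustration of the kind of rule I mean, here is roughly the shape of the deny-PHP-in-uploads protection (the /images/ path is an assumption; the tool's generated config differs per site):

```nginx
# Illustrative shape only: keep Nginx from handing uploaded files to PHP,
# even if someone uploads a file named something.php. The /images/ path is
# an assumption; adjust it to your upload directory.
location ^~ /images/ {
    # Serve uploads as static files; never pass them to the PHP backend.
    location ~ \.php$ {
        return 403;
    }
}
```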
The tool is definitely beta right now. I have Apache and Nginx support
written. I'll probably want to have some discussion with Lighttpd and IIS
users to fix the configurations some of them are using and add code to the
tool to support those types of servers. There are still some conditions it
might not handle just yet; for example, I haven't written the code to
handle root-style URLs in root Apache config files.
Feel free to start using it when you set up a MediaWiki installation. I'd
love to know when a configuration doesn't work and what kind of tweaks I
need to make. I'd also like to see how the configurations handle
different shared-host setups, and to see people on different kinds of
hosts (shared hosting, VPS, dedicated, etc.) running Apache jump in and
answer the yellow message's question about whether you're in a root or
shared environment. If you use that specific feature, it keeps track of
the reverse DNS and your response, which I can use to preload rules
indicating which hosts' reverse DNS patterns belong to shared hosting
users and which to VPS servers and the like.
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
Hi all!
Hi! This is my first post to this list, so here is a quick intro in case you missed the other ones. I'm Andrew Otto, an engineer on the new Analytics team. I'm working with David Schoonover (a new hire as well), Fabian Kaelin, and Diederik van Liere. Right now we're working on some prototypes for a Wikimedia report card.
I think we are the first team doing active work in git using Gerrit, and RobLa asked me to reach out here to describe our experiences and ask for help. We're struggling right now to be productive with Gerrit (I spent 3 hours today just trying to merge a branch), but it could be due to our lack of experience with it. There have been a couple of emails bouncing around to Ryan Lane and Roan, but it might be more productive to make this conversation more visible here. I'll start with some questions.
1. Will Gerrit allow us to create branches without using the web GUI, and without having to be a Gerrit admin for a project?
One of the points of using git is being able to create branches at will. We're finding this very difficult right now, not only because creating a branch requires GUI admin access, but also for the other reasons explained below.
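From what I've read (unverified, so please correct me), Gerrit can accept branch creation over plain git push once a project admin grants the "Create Reference" permission on refs/heads/* for the project, with no web GUI needed per branch. A sketch of what that push looks like, using a local bare repo as a stand-in for Gerrit so it runs anywhere:

```shell
# Sketch of branch creation over plain git push. Against real Gerrit this
# needs the "Create Reference" permission on refs/heads/* (granted once by
# a project admin). A local bare repo stands in for Gerrit here so the
# sketch is runnable anywhere.
set -e
tmp="$(mktemp -d)"
git init -q --bare "$tmp/gerrit-standin.git"
git init -q "$tmp/work"
cd "$tmp/work"
git config user.email otto@example.org
git config user.name "Example"
git remote add gerrit "$tmp/gerrit-standin.git"
echo data > metrics.txt
git add metrics.txt
git commit -qm 'initial commit'

# The branch-creating push itself -- no web GUI involved:
git push -q gerrit HEAD:refs/heads/prototype
git ls-remote gerrit refs/heads/prototype | cut -f2
```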
2. Do I need to rebase every time I push for review?
I don't quite understand what is going on here. I've installed git-review and am using it to push to git. It does a rebase by default, and I'm not sure whether I should be turning that off. Rebases seem like a bad idea unless you really need them. I think git-review rebases by default so it can squash all of your local commits into one big review commit before pushing. Yuck! This would surely mean fewer commits to review in Gerrit, but it destroys the real history. It makes git work more like Subversion, where you just work locally until everything is good and then have one big commit. I should be able to commit often and share my commits with other developers before everything is reviewed.
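In case it helps others, the two knobs I've found for this (option names taken from git-review's documentation as I understand it; please verify against your installed version): a one-off flag to skip the rebase for a single push, and a per-repository setting in the .gitreview file. A runnable sketch:

```shell
# Two ways to stop git-review's automatic rebase (option names per
# git-review's docs as I understand them -- verify with your version):
#
#   git review -R    # -R / --no-rebase: skip the rebase for one push
#
# Or permanently, via the repository's .gitreview file:
set -e
cd "$(mktemp -d)"
cat > .gitreview <<'EOF'
[gerrit]
host=gerrit.wikimedia.org
port=29418
project=analytics/reportcard.git
defaultrebase=0
EOF
grep defaultrebase .gitreview
```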
3. How does Gerrit handle merges? Do all merge commits need to be re-approved?
4. What should I do in the following situation?
I have a branch I recently made from master. I've made some changes and pushed them to gerrit. My changes have been approved. Now I want to sync master into my branch. I do
git merge master
Then resolve any conflicts and commit. How should I push these changes? The commits that make up the merge have already been approved in gerrit on the master branch. Do I need to push for review using git-review? They've already been approved, so I would think not. But gerrit will currently not allow me to push without using git-review (is that because the commits need a Change-Id?).
Since gerrit doesn't let me do a regular git push to push my master merge to the remote branch I am tracking, I do git-review. This does rebase by default, so for some reason I am stuck having to resolve every single commit that was made to master in order to get the merge to push. This takes quite a while, but I did it, and once the interactive rebase was finished I was able to git-review to push the merge from master.
Great. Now that my branch is in sync with master again, I want to merge it into master.
git checkout master
git merge my_branch
All good. Then what? Since I can't do just 'git push', I try git-review again. The same thing happens. I have to run through the whole interactive rebase routine and resolve each of my commits from my_branch manually. I do that, then run 'git-review' again. Now I get this error message:
remote: Hint: A potential Change-Id was found, but it was not in the footer of the commit message.
To ssh://otto@gerrit.wikimedia.org:29418/analytics/reportcard.git
! [remote rejected] HEAD -> refs/for/master/master (missing Change-Id in commit message)
error: failed to push some refs to 'ssh://otto@gerrit.wikimedia.org:29418/analytics/reportcard.git'
Each of the commits I merged from my_branch comes with its own Change-Id in the commit message. But these commits are now merge commits (I think?), so they have information about the merge and any conflicts in the commit message below the original Change-Id. I think this confuses Gerrit, because it doesn't see the Change-Id in the footer.
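To make the footer problem concrete, here is a self-contained reproduction and the workaround I believe applies: amend the message so the Change-Id ends up in the last paragraph. (This is my reading of Gerrit's rule that only the final paragraph counts as the footer; the Change-Id value below is a made-up example.)

```shell
# Reproduce a commit whose Change-Id is NOT in the footer (last paragraph),
# then fix it with --amend. Gerrit, as I understand it, only scans the
# final paragraph of the message for a Change-Id.
set -e
cd "$(mktemp -d)"
git init -q repo
cd repo
git config user.email otto@example.org
git config user.name "Example"
echo demo > f.txt
git add f.txt

# Broken shape: merge-conflict notes landed below the Change-Id.
git commit -q -m 'Sync master into my_branch

Change-Id: I0123456789abcdef0123456789abcdef01234567

Conflicts:
    reportcard/models.py'

# Fix: rewrite the message so the Change-Id is the last paragraph.
git commit -q --amend -m 'Sync master into my_branch

Conflicts resolved in reportcard/models.py

Change-Id: I0123456789abcdef0123456789abcdef01234567'

# Confirm the Change-Id line is present (now in the final paragraph):
git log -1 --format=%B | grep '^Change-Id:'
```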
Now I'm stuck; I'm really not sure how to push anymore. I want to get Diederik some of my changes, but I can't push them to master.
Thanks for the help, everybody! It sounds like we in Analytics are the git+gerrit workflow guinea pigs, eh? We're happy to fill that role, but SCMs are supposed to streamline and improve workflow, and right now Gerrit is being a big ol' obstacle. Help us iron this out so we can keep working!
- otto
http://ottomata.com
http://www.flickr.com/photos/OttomatonA
http://www.couchsurfing.org/people/otto
This past Friday, at the request of Michael Movchin, I added a Huggle
product to Bugzilla. Michael and Petr Bena are working on a web version
of Huggle that can be used if you don't have access to Windows (which,
having not used Huggle at all, is where I understand Huggle currently
runs).
I'd like to add more of these sort of tools to Bugzilla, just as we have
a lot of MediaWiki extensions in Bugzilla, since I think this could
encourage cross pollination between the WMF projects and people who are
interested in things that aren't core to WMF.
--
Mark A. Hershberger
Bugmeister
Wikimedia Foundation
mah(a)wikimedia.org
When using Wikipedia and clicking on an image, it opens up on a new page.
Wouldn't it be nice if it just scaled up on the same page using JS? Is
there an extension for this? And if yes, why isn't it implemented on
Wikipedia?
--
With Regards
Nischay Nahata
B.tech 3rd year
Department of Information Technology
NITK,Surathkal
irc-nick nischayn22
Thanks to Platonides for his comment, and also to Olivier (the author of the Realnames extension), who told me to forward the following patch to wikitech-l (which I just subscribed to) for advice, comments, and criticism.
I was just wondering if this small patch in User.php (function idFromName) was enough in most cases:
$dbr = wfGetDB( DB_SLAVE );
$s = $dbr->selectRow( 'user', array( 'user_id' ), array( 'user_name' => $nt->getText() ), __METHOD__ );
if ( $s === false ) {
	// Start patch (replaces the original: $result = null;)
	$stwo = $dbr->selectRow( 'user', array( 'user_id' ), array( 'user_email' => $nt->getText() ), __METHOD__ );
	if ( $stwo === false ) {
		$result = null;
	} else {
		$result = $stwo->user_id;
	}
	// End patch
} else {
	$result = $s->user_id;
}
Then just enter your e-mail address on a standard wiki in place of your username, and you will be authenticated as the first ID (and user_name) having that e-mail.
The importance of e-mails as a simple way to authenticate on modern sites can't be ignored.
If you want to enter your standard username for authentication you can do it too.
But if your username is not in the Roman alphabet but in Arabic, Thai, Japanese, etc., or even a French username with accents, and you decided to have your authorship recognized in your own language and not only in an English transliteration, you can also do that with a standard MediaWiki installation. But if you are working with somebody who has an English-only keyboard, copy-pasting your Unicode username may be tedious, and you would prefer to enter your e-mail address.
The modified Login form could be:
Username (or e-mail address): |___________________|
Password: |___________________|
If someone could test the patch above and report on security issues as well as performance, that would be great for us.
We are managing Demopaedia.org and are willing to open the site to professional demographers (those already subscribed to a national or international union for the scientific study of population). We will not use LDAP authentication processes but the local standard MediaWiki database. The usual way to be authenticated is with e-mail and password, and we want to keep this option. If you look (for example) at the work of Mikael, his work is authored in Cyrillic: Михаил Денисенко on http://ru-ii.demopaedia.org/w/index.php?title=90&action=history; other Russian authors use the transliteration. It is a question of taste.
If Mikael is traveling and doesn't have a Cyrillic keyboard, he would be pleased to enter his e-mail to authenticate. The password to enter is the one linked with his username.
For people having multiple usernames (pseudonyms) with the same e-mail but a different password for each, a better patch could test the password entered and link to the matching username. But I am not an expert in MediaWiki and PHP and don't know how to get the password within the function idFromName.
I understand that e-mails should not be revealed and the above patch satisfies this condition.
Comments, advice, criticism, and code are welcome.
Nicolas
On 15 February 2012, at 23:57, Platonides wrote:
> On 13/02/12 19:56, Nicolas Brouard INED wrote:
>> Thanks John for your comment. It would mean that people logging with an email will have a default account (lowest ID with the same email or whatever rule).
>>
>> For authorship Wikimedia doesn't encourage multiple account names (multiple (>3?) pseudos are blamed). And usually, for a corporate wiki you don't have multiple accounts. If you decide to change your name for any reason (divorce for example) you are supposed to have a (new) unique name. You usually can also have email aliases.
>>
>> And if you want to log on a specific account name, you can copy and paste your account name (if your keyboard doesn't allow you to enter your real, not transliterated, name).
>>
>> Thus, I am not sure that it is strong objection for corporate wikis at least.
>>
>> PS: I am trying to understand how to have a working MyAuthPlugin.php and to get the email in authenticate but it requires time and I haven't found so many examples on the Web.
>>
>> Any hint or comment is welcome.
>>
>> Nicolas
>
> I don't think you can do that with just an auth plugin. You would need
> to modify the SpecialUserLogin code to look for that email.
>
> PS: What's the big issue with copy&paste or transliteration? Doesn't
> your users have a keyboard layout able to type *their own name*?
> I understand the issue when a third party needs to enter them, but eg.
> Russian people usually have/can switch to a cyrillic keyboard layout.
>
>
Nicolas Brouard INED
brouard(a)ined.fr
I've finished Nginx and Lighttpd support for the short URL building tool
(http://shorturls.redwerks.org/) and I also have unfinished support for
thumbnail handlers on the alpha (http://shorturls.jade.redwerks.org/).
I am wondering if anyone has spare resources they can use to give me a
temporary test environment for IIS with administrator privileges.
Essentially I want to set up IIS, install MediaWiki on it, and then
configure short URLs on it, determining the best way to do so as I have
done for both Nginx and Lighttpd. After that, I'll implement support for
it in the short URL builder and then re-configure the server and wiki to
make sure the config works.
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]