> Message: 5
> Date: Wed, 29 Feb 2012 02:03:10 +0530
> From: Shivansh Srivastava <shivansh.bits(a)gmail.com>
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Subject: Re: [Wikitech-l] [Wikimediaindia-l] GSoC'12 Proposal : List
> of Ideas
> Message-ID:
> <CAPJSHrnjCxeOE3R8iJxbDEsV4b90_96ifxkVJJQCnFeZnV74kg(a)mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
> I am in fact working on the News Ticker. It is taking a lot more time than
> I had imagined. That's why I have started working on it early, to get past
> the initial hurdles!
Note, we already have something fairly similar. (The main difference
from what you're describing is that the existing ticker works on a
static list of pages that isn't updated, instead of doing
auto-update magic with Ajax.) See
https://en.wikinews.org/wiki/Template:Ticker and
https://en.wikinews.org/wiki/User:Bawolff/sandbox/ticker .
Cheers,
-bawolff
Hello,
I would appreciate any help regarding inserting custom text into articles.
The problem setting is as follows:
I am currently trying to create a wiki where each article is about a
specific entity (say, for example, a book). Above each article's body
content, I want to insert custom text (such as the author of the book, the
retail price of the book, the publisher of the book, etc) in the form of
HTML. I do not want users to be able to edit this custom text, and so I
want to "inject" it as the content of the page gets loaded.
What would be the cleanest approach to solve this problem?
I have looked at the list of hooks available at
http://www.mediawiki.org/wiki/Manual:Hooks,
but was unable to find any hook that satisfied my need. I have also tried
modifying the core code (the outputPage() method of the Skin class, for
example), but things got ugly pretty quickly, and it was hard to maintain
the code. I would appreciate any help.
Thanks in advance,
Naoki Orii
Hi folks,
As you may know, WMF's Platform Engineering group plans to embark on a
major performance initiative this year, and has chosen inline
scripting as having the biggest potential impact given what's
practical now. Tim Starling built a Lua prototype last year which
showed a lot of promise for making things much faster. One major
question before embarking on this effort was whether
we'd stick with Lua or try another language such as JavaScript or
Victor's WikiScript implementation. I wanted to make a decision by the
end of the month[1], and I think we've done it.
We've decided to build a deployable version of Lua as a new
alternative to wiki markup for templates, barring some scandalous
revelation about Lua's lurid past or other unforeseen barrier. Tim
will be leading this effort, and will start on the implementation some
time after the dust settles on the 1.19 deployment and the Git
migration. The project page for this is located here:
http://www.mediawiki.org/wiki/Lua_scripting
Rough notes from our meeting yesterday are also available [2]
Rob
[1] http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/57769…
[2] http://www.mediawiki.org/wiki/Lua_scripting/Meeting_2012-01-25
Sascha Schueller (schuellersa) is developing the Mediawiki extension:
http://www.mediawiki.org/wiki/Extension:SolrStore.
Kim Eik (netbrain) is working on several things:
" - Extending the Maps extension to include a tag <display_lines> which
will display maps with lines between points.
- Enable copy coordinates through secondary mouse click
- Read kml files through | kml=http://..../[[Link]]
- Additional option to allow map balloons to contain numbers and
letters.
- ....
- Making it possible for the <gallery> tag to include links to a page
instead of the actual image. (e.g. | link=)
Proposed syntax:
<gallery>
File:filename.ext|caption|link=some url
</gallery>
- New extension side_bar which simplifies creating drop down menus
Proposed syntax:
<sidebar>
+/-Main chapter (+=show by default, -=hide by default)
*subchapter level1
**subchapter level2
***subchapter level3
+/-New main chapter
</sidebar>"
Thibault Marin (thibaultmarin) works on
http://www.mediawiki.org/wiki/Extension:TimelineTable and says, "I also
have in mind a few other extensions I would like to work on, such as
conversion of LaTeX documents to wiki pages."
Sascha, Kim, and Thibault all have extensions access. Also, Petr Bena
now has core commit access.
Congratulations and welcome!
We have now responded to all our commit access requests and the queue is
empty; if you are waiting for a response from us, please check your spam
folder.
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
Hi,
I have created my User page, envisaging the ideas I shared through my last
mail - https://en.wikipedia.org/wiki/User:Shivansh13
Please let me know your thoughts on the prospects of the ideas suggested.
Regards,
--
Shivansh Srivastava | +91-955-243-5407 |
http://in.linkedin.com/pub/shivansh-srivastava/17/a50/b18
mr.shivansh.srivastava(a)gmail.com
Secretary, BITS Alumni Affairs Division | Web Expert, Newsletter, BITSAA
International
3rd Year Undergraduate | B.E. (Hons.) - Electronics & Instrumentation
BITS-Pilani.
Hi,
You write on https://www.mediawiki.org/wiki/MediaWiki_1.19/Roadmap:
> Wednesday, March 1 (-2), 23:00-03:00 UTC (3pm-7pm PST):
Stage 5 deployment to:
- All Wikipedia sites
Do you mean Wednesday, Feb 29, or Thursday, March 1? :-)
--
Bináris
Which extensions could use more active attention? Let me know so I can
suggest this work to developers with interest & spare time. For
example, if volunteers are writing good patches that await review, maybe
we could encourage them to take over triaging bugs and maintaining the
extensions more actively.
This is especially worth investigating for extensions that WMF deploys.
The more developers who learn an extension's codebase and take a hand
in maintaining it, the more quickly we can respond to possible problems.
I looked at
https://www.mediawiki.org/wiki/Category:Extensions_used_on_Wikimedia [0]
and ran a Bugzilla search for open bugs on those extensions with
severity Normal or higher and priority Normal or higher.
http://ur1.ca/85xoz
Numbers below are from a couple weeks ago when I was drafting this mail,
but wouldn't be that different right now. Patches awaiting review:
ParserFunctions: 8
Math (texvc): 5
Cite: 5
AntiSpoof: 4
CentralAuth: 4
ConfirmEdit: 3
DismissableSiteNotice: 3
DumpHTML: 3
ProofreadPage: 3
CheckUser: 2
DynamicPageList2: 2
Lucene Search: 2
MobileFrontend: 2
SyntaxHighlight (GeSHi): 2
WikiEditor: 2
CategoryTree: 1
CharInsert: 1
EasyTimeline: 1
ImageMap: 1
Nuke: 1
OggHandler: 1
Poem: 1
ReaderFeedback: 1
UploadWizard: 1
Vector: 1
As of today there are 137 patches awaiting review for MediaWiki, and 60
patches awaiting review for extensions that WMF deploys.
[0] If someone else wants to figure out & align the various lists of
extensions deployed on WMF servers, the sources to use are
https://translatewiki.net/wiki/Main_Wikimedia_extensions ,
http://svn.wikimedia.org/viewvc/mediawiki/trunk/translatewiki/MediaWiki/Wik…
and https://www.mediawiki.org/wiki/Category:Extensions_used_on_Wikimedia .
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
Hi everyone,
We'd like to postpone the Git migration 2.5 weeks, with a new final
migration date of Wednesday, March 21. Here's the convergence of
factors that led us to the new date:
* The 1.19 deployment has kept us busy enough that none of the rest
of us in Platform Engineering have had spare cycles to help Chad out
* It's quite likely we'll have many fixes after we deploy to the
bigger wikis (enwiki and friends) on Wednesday
* We have a number of unresolved issues in our Git+Gerrit deployment [1][2]
* Code review is falling behind again. As of right now, we have 38
unreviewed revisions in core (phase3), and another 189 unreviewed
revisions in extensions. That's up from 4 in core + 28 in extensions on
February 4. We basically let code review get out of hand again as
we turned our focus toward bugfixing during the deployment.
Chad and Ryan also discovered today that the machine we're using for
Git and Gerrit (formey, which also hosts SVN) just isn't up to hosting
the whole mess. So, there's a machine deployment we need to do as
well.
Here's the new plan:
* Week of 3/5 - MediaWiki 1.19RC1 release. Code review stats by
end-of-week: 20 new on phase3, 100 new on phase3+extensions
* Week of 3/12 - MediaWiki 1.19.0 release. Code review stats by
end-of-week: as close to zero as possible in phase3+extensions.
Possibly even 1.20wmf1 (first mini deployment untethered to release
schedule, first of many...1.20wmf2, 1.20wmf3, etc)
* Week of 3/19 - Git migration week. Migration day: Wednesday, 3/21
A MediaWiki tarball release *should* be a relatively minor endeavor.
A deployment during this time is a stretch goal. We should be able to
make a deployment from a more recent point on trunk if we're
disciplined about actually getting through the code review backlog and
we do a good job in review. Doing a good job means reverting when we
need to.
The top priority for Platform Engineering will be the Git migration,
so anything that distracts from that (like, for instance, a 1.20wmf1
deploy) may get postponed while we finish this off once and for all.
Thank you everyone for your patience on this transition.
Rob
[1] http://www.mediawiki.org/wiki/Git/Conversion#Unscheduled_items
[2] https://bugzilla.wikimedia.org/showdependencytree.cgi?id=22596&hide_resolve…
In my spare time at Redwerks I've been working on a Short URL
configuration tool:
http://shorturls.redwerks.org/
Our Short URL manual pages have been VERY bad for quite a while. Every last
one of them has bad practices in it, and I only managed to fix one of them.
Considering how many different manual pages we have simply because
people have slightly different configuration requirements (e.g. one for /w
and /wiki/, another for / and /wiki/, another for /subpath/wiki, etc.; one
for .htaccess, another for Alias in the Apache config, another for RewriteRule
in the Apache config, another for Nginx, etc.), rather than trying to fix
them all I started writing a tool to build the configuration instead.
The tool tries to auto-detect as much as possible (practically everything,
in fact): the type of server, the SAPI (mod_php vs. ?),
your script path, etc. It even tries to jump ahead of you and guess what
kind of article path you were intending to use. It also has an early
feature that tries to preemptively detect whether you're likely to have
root access (the idea is to detect what kind of host you're
on using the server's reverse DNS).
The configuration generator actually isn't really a simple thing. There
are a lot of conditionals involved in the tool. It can handle the special
cases needed for root /$1 style urls. It knows how to add an extra rewrite
when you use /w and /wiki and need / to redirect to your wiki. It uses
Apache's %{DOCUMENT_ROOT} in RewriteRules but also lets you expand an
absolute docroot when you use an Alias (you can't use %{DOCUMENT_ROOT}
there). I've loaded the Nginx config full of deny rules and conditions
that most people never bothered to properly configure (and yes, it can
handle root urls, and even knows how to stop Nginx from executing php in
uploaded files). Heck, this tool can actually handle TWN's wacky /w/i.php
script path.
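For illustration, here's the general shape of the .htaccess rules such a generator produces for the common /w script path + /wiki/ article path layout. This is a hand-written sketch following the MediaWiki short-URL manual, not the tool's actual output:

```apache
# /wiki/Some_Page -> /w/index.php?title=Some_Page
RewriteEngine On
RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]
# Send the bare root to the wiki as well
RewriteRule ^/?$ %{DOCUMENT_ROOT}/w/index.php [L]
```

In an Alias-based setup the %{DOCUMENT_ROOT} variable isn't usable, which is exactly the kind of case the tool handles by expanding an absolute docroot instead.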
The tool is definitely beta right now. I have Apache and Nginx support
written. I'll probably want to have some discussion with Lighttpd and IIS
users to fix the configuration some of them are using and add code to the
tool to support those types of servers. There are still some conditions it
might not handle just yet; for example, I haven't written the code to
handle root-style URLs in root Apache config files.
Feel free to start using it when you set up a MediaWiki installation. I'd
love to know when a configuration doesn't work and what kind of tweaks I
need to make. I'd also like to see how the configurations handle
different shared-host setups, and to see people on
different hosts (shared hosting, VPS, dedicated, etc.)
running Apache jump in and answer the yellow message's question on
whether you're in a root or shared environment. If you use that specific
feature, it keeps track of the reverse DNS and the response, which I can
use to preload rules indicating which hosts' reverse DNS patterns belong
to shared hosting users and which to VPS servers and the like.
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
Hi all!
And here's another hi: Hi! This is my first post to this list, so here is a quick intro in case you missed the other ones. I'm Andrew Otto, an engineer on the new Analytics team. I'm working with David Schoonover (a new hire as well), Fabian Kaelin, and Diederik van Liere. Right now we're working on some prototypes for a Wikimedia report card.
I think we are the first team doing active work in git using Gerrit, and Robla asked me to reach out here to describe our experiences and ask for help. We're struggling right now to be productive using Gerrit (I spent 3 hours today just trying to merge a branch), but it could be due to our lack of experience with it. There have been a couple of emails bouncing around to Ryan Lane and Roan, but it might be more productive to make this conversation more visible here. I'll start with some questions.
1. Will Gerrit allow us to create branches without using the web GUI, and without having to be a Gerrit admin for a project?
One of the points of using git is being able to create branches at will. We're finding this very difficult right now, not only because creating a branch requires GUI admin access, but for other reasons explained below.
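For what it's worth, plain git can create a remote branch just by pushing a new ref; whether this works against Gerrit depends on your project's access rules (I believe it needs push/create permission on refs/heads/*, but that's an assumption about the Gerrit setup). A minimal local demo of the ref-push mechanics, using a bare repository standing in for the Gerrit remote:

```shell
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/gerrit.git"     # bare repo standing in for Gerrit
git clone -q "$tmp/gerrit.git" "$tmp/work" 2>/dev/null
cd "$tmp/work"
git config user.email dev@example.org
git config user.name dev
echo hello > README
git add README
git commit -qm "initial commit"
git push -q origin HEAD:refs/heads/master
# Creating a new remote branch is just pushing to a new ref under
# refs/heads/. On a real Gerrit server this needs the appropriate
# permission on refs/heads/* for your group.
git push -q origin HEAD:refs/heads/experiments
git ls-remote origin refs/heads/experiments
```

If Gerrit rejects such a push, that would confirm branch creation is locked down to the web GUI for us.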
2. Do I need to rebase every time I push for review?
I don't quite understand what is going on here. I've installed git-review and am using it to push to git. It does a rebase by default. I'm not sure whether I should be turning that off or not. Rebases seem like a bad idea unless you really need them. I think git-review does a rebase by default so it can squash all of your local commits into one big review commit before pushing. Yuck! This would surely mean fewer commits to review in Gerrit, but it destroys the real history. It makes git work more like Subversion, where you just work locally until everything is good and then make one big commit. I should be able to commit often and share my commits with other developers before everything is reviewed.
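To make the squash-vs-merge distinction concrete, here's a small local experiment (not Gerrit-specific; the branch and file names are made up) showing how squashing collapses a branch's commits into one, losing the individual history a normal merge would keep:

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
git config user.email dev@example.org
git config user.name dev
echo base > file.txt; git add file.txt; git commit -qm "base"
git checkout -qb feature
echo one >> file.txt; git commit -qam "feature commit 1"
echo two >> file.txt; git commit -qam "feature commit 2"
git checkout -q -                    # back to the original branch
git merge --squash feature >/dev/null
git commit -qm "feature, squashed"
# The two feature commits are collapsed into a single new commit,
# so the branch history is just: base + squashed commit.
git rev-list --count HEAD
```

A regular `git merge feature` instead would have kept both feature commits (plus a merge commit) in the history.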
3. How does Gerrit handle merges? Do all merge commits need to be re-approved?
4. What should I do in the following situation?
I have a branch I recently made from master. I've made some changes and pushed them to gerrit. My changes have been approved. Now I want to sync master into my branch. I do
git merge master
Then resolve any conflicts and commit. How should I push these changes? The commits that make up the merge have already been approved in gerrit on the master branch. Do I need to push for review using git-review? They've already been approved, so I would think not. But gerrit will currently not allow me to push without using git-review (is that because the commits need a Change-Id?).
Since gerrit doesn't let me do a regular git push to push my master merge to the remote branch I am tracking, I do git-review. This does rebase by default, so for some reason I am stuck having to resolve every single commit that was made to master in order to get the merge to push. This takes quite a while, but I did it, and once the interactive rebase was finished I was able to use git-review to push the merge from master.
Great. Now that my branch is in sync with master again, I want to merge it into master.
git checkout master
git merge my_branch
All good. Then what? Since I can't do just 'git push', I try git-review again. The same thing happens. I have to run through the whole interactive rebase routine and resolve each of my commits from my_branch manually. I do that, then run 'git-review' again. Now I get this error message:
remote: Hint: A potential Change-Id was found, but it was not in the footer of the commit message.
To ssh://otto@gerrit.wikimedia.org:29418/analytics/reportcard.git
! [remote rejected] HEAD -> refs/for/master/master (missing Change-Id in commit message)
error: failed to push some refs to 'ssh://otto@gerrit.wikimedia.org:29418/analytics/reportcard.git'
Each of the commits I merged from my_branch comes with its own Change-Id in the commit message. But these commits are now merge commits (I think?), so they have information about the merge and any conflicts in the commit message below the original Change-Id. I think this is confusing Gerrit, because it doesn't see the Change-Id in the footer.
Now I'm stuck; I'm really not sure how to push anymore. I want to get Diederik some of my changes, but I can't push them to master.
Thanks for the help, everybody! It sounds like we in Analytics are the git+gerrit workflow guinea pigs, eh? We're happy to fill this role, but SCMs are supposed to streamline and improve workflow, and right now Gerrit is being a big ol' nasty nancy. Help us iron this out so we can keep working!
- otto
http://ottomata.com
http://www.flickr.com/photos/OttomatonA
http://www.couchsurfing.org/people/otto