Hello everyone. I'm applying for GSoC to work on Wikipedia Offline,
streamlining the process to make collection and generation much easier.
My proposal is at http://www.mediawiki.org/wiki/User:Yuvipanda. It'll
be awesome if I can have feedback on this.
I promise you invisible pink unicorns as prizes :)
Yuvi Panda
This is Peng Wan. I have submitted my GSoC application to Wikimedia.
My project title is "Figuring out the most popular pages".
Here is the project's short description:
The feature aims to identify the most popular and best-liked pages on
Wikimedia. The most popular pages are calculated from page clicks: each
click event sends an action to the database, and we can then determine
the most popular pages by querying the destination URLs. As for the
best-liked pages, I want to add a "like" or "+1" link to every page. If
a user likes the content of a page, s/he just needs to click the "like"
link to increment the "like" count in the database.
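For concreteness, here is a minimal sketch of the two counters described above, in Python with SQLite standing in for the production database. The table and column names (`page_hits`, `clicks`, `likes`) and the function names are my own invention for illustration, not an existing MediaWiki schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE page_hits ("
    "  title  TEXT PRIMARY KEY,"
    "  clicks INTEGER NOT NULL DEFAULT 0,"   # page-view counter
    "  likes  INTEGER NOT NULL DEFAULT 0)"   # "+1" counter
)

def _bump(title, column):
    # Create the row on first sight, then increment the chosen counter.
    conn.execute("INSERT OR IGNORE INTO page_hits (title) VALUES (?)", (title,))
    conn.execute(
        "UPDATE page_hits SET %s = %s + 1 WHERE title = ?" % (column, column),
        (title,),
    )

def record_click(title):
    _bump(title, "clicks")   # fired on every page view

def record_like(title):
    _bump(title, "likes")    # fired when a user presses the "like" link

def most_popular(n=10):
    # "Most popular" is simply the top rows by click count.
    return conn.execute(
        "SELECT title, clicks FROM page_hits ORDER BY clicks DESC LIMIT ?",
        (n,),
    ).fetchall()
```

In production the click counter would more likely be fed from sampled request logs than from per-request database writes, but the queries stay the same shape.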
Here is my proposal link:
I would appreciate your advice on my proposal.
Extension and Core:
* Paul Copperman (pcopp)
Paul has contributed several patches to core MediaWiki and FlaggedRevs.
Code Maintenance Engineer
San Francisco, CA
Hmmm, maybe I should never have 'linked so deep':

Notice: Undefined property: SkinVector::$mTitle in
/home/jidanni/mediawiki/LocalSettings.php on line 55
Fatal error: Call to a member function quickUserCan() on a non-object.
The code that used to work was:
if (preg_match("/^\d/", $mTitle->getText())
    && $mTitle->getNamespace() <= NS_TALK) {
    return true;
}
return false;
Maybe I should just say in my message 'if the page you were trying to
edit was a talk page, then ...', instead of trying to determine it
myself with the ever-changing internal parameters.
I saved the HTML source of a typical Page: page from it.source, the
resulting file being ~28 kB; then I saved the "core html" only, i.e.
the content of <div class="pagetext">, and this file is 2.1 kB; so
there's a more-than-tenfold ratio between "container" and "real content".
Is there a trick to download the "core html" only? And, most important:
could this save a little server load/bandwidth? I humbly think that the
"core html" alone could be useful as a means to obtain well-formed page
content, and that this could help produce derived formats of the page
(e.g. ePub).
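As a client-side workaround, the content div can be cut out of an already-saved page with a small parser. A sketch in Python's stdlib html.parser (the class value "pagetext" is taken from the message above; the CoreExtractor name and the nesting-depth approach are my own):

```python
from html.parser import HTMLParser

class CoreExtractor(HTMLParser):
    """Collect the inner HTML of <div class="pagetext">, honouring nested divs."""
    def __init__(self):
        super().__init__()
        self.depth = 0      # nesting level inside the target div; 0 = outside
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if self.depth:
            if tag == "div":
                self.depth += 1
            self.chunks.append(self.get_starttag_text())
        elif tag == "div" and dict(attrs).get("class") == "pagetext":
            self.depth = 1  # entered the target div; don't emit its own tag

    def handle_endtag(self, tag):
        if self.depth:
            if tag == "div":
                self.depth -= 1
                if self.depth == 0:
                    return  # closing tag of the target div itself
            self.chunks.append("</%s>" % tag)

    def handle_data(self, data):
        if self.depth:
            self.chunks.append(data)

def core_html(full_page):
    parser = CoreExtractor()
    parser.feed(full_page)
    return "".join(parser.chunks)
```

On the server side, if I remember correctly, appending ?action=render to an index.php page URL already returns just the parsed content HTML without the skin, which may be the bandwidth saving being asked about; worth verifying against the target wiki.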
I recently fixed a bug (14901) and attached some patches to the bug
ticket. Included were some changes to languages/messages/MessagesEn.php.
Following the directions given at
http://www.mediawiki.org/wiki/Localisation#Changing_existing_messages, I
contacted #mediawiki-i18n and
asked them for instructions how to proceed with the internationalization
part of the work.
After looking over my changes, they pointed out that my new message text
violated some requirements, specifically:
+ I had used "\n" escapes rather than literal newlines in the text
+ I had put newlines in front of some messages
+ I had put trailing whitespace in some messages
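The three constraints above are mechanical enough to check automatically. A minimal sketch of such a checker, taking a message text as it would appear as a value in a MessagesEn.php $messages array (the function name is my own, not part of any MediaWiki tooling):

```python
def message_problems(text):
    """Return a list describing which of the three constraints `text` violates."""
    problems = []
    if "\\n" in text:  # a literal backslash-n escape instead of a real newline
        problems.append("uses '\\n' instead of a real newline")
    if text.startswith("\n"):
        problems.append("starts with a newline")
    if any(line != line.rstrip() for line in text.split("\n")):
        problems.append("has trailing whitespace")
    return problems
```

Running this over every changed message before attaching a patch would catch these issues before the i18n reviewers do.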
Since I was (and am) unfamiliar with MW internationalization workflow and
since this text is destined for email messages, not display on a wiki
page, I was puzzled by these requirements. However, it was explained to
me that internationalization happens by displaying the text on
translatewiki.net, which means the text must conform to web-page display
rules. Consequently, the problems listed above interfere with the
translation process.
I understand I need to fix these problems. However, before proceeding I
wonder if there are other constraints that I must observe. I have read
the material at http://www.mediawiki.org/wiki/Localisation, but did not
find the above constraints mentioned there. Is there another place where
they are specified?
-- Dan Nessett
Today, some users on ro.wp reported that the citations used more than
once in the text had changed appearance. Instead of "^a,b", we now
have something like "↑ 34,0 34,1". This was considered disturbing by
some users. Furthermore, the backlinks from the notes section to the
article are not functioning. This does not happen on en.wp or other
wikis. See http://ro.wikipedia.org/wiki/Dorin_Goian vs.
From [[Special:Version]], I checked that the versions of the Cite
extension were identical, then I searched for a relevant bug in
bugzilla, both at the "Site requests" and at the "Cite extension", but
I could find nothing. Does anybody know what the matter is with the
links? I did not file a bug report because I'm not sure if this is a
PHP problem or some setting on ro.wp that makes the extension behave
this way.
Thanks a lot for any pointers,
> Message: 1
> Date: Tue, 5 Apr 2011 23:35:00 +0300
> From: Strainu <strainu10(a)gmail.com>
> Subject: [Wikitech-l] Need some help with a problem with the Cite
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Message-ID: <BANLkTik6Lbr=oC1sgCpbMhY0QJnWt6MJTQ(a)mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
From what I can tell, it seems as if a wrong version of
somehow got cached somewhere. It was changed to its current value in
April 2007; however, it was showing the wrong value in
[[Special:AllMessages]]. Anyway, after purging that page (or perhaps
this is all a big coincidence), the right version appeared in
[[Special:AllMessages]] and the correct behaviour was observed on
[[Dorin_Goian]] (after purging that page too, to get rid of its
parser-cached version).
So I'm not sure what the deal with that was, but it seems to be a
one-off. (Or, also entirely possible, it fixed itself while I was
looking at it, and I just think purging the page had something to do
with it, when really it's all a big coincidence.)