WikiProject Extensions is presenting our first ever "Extension Page Review Drive" - http://www.mediawiki.org/wiki/Project:WikiProject_Extensions/Projects/Page_…
Several extension pages are long overdue for review. Many are lacking the appropriate tags, both in the header and within the extensions themselves, which causes confusion for other developers and sysadmins. In addition to time wasted on bad installations, there is also time wasted by other developers providing tech support for those bad installations. This will also help with future drives reviewing the actual code of extensions (although you're welcome to do thorough code review during this drive if you'd like) and with moving wikicode-based extensions to the code repository.
The goal is to review as many of the pages as possible during the 1st quarter of this calendar year (so by March 31st). We've just officially started, and we're already 1% done. :)
During this drive, all extension pages are marked with an additional category. These will be removed once the drive is completed. Page drive specific template modifications and wikicode will also be removed upon the page drive's completion.
Any interested participants are welcome to sign on as a participant on: http://www.mediawiki.org/wiki/Project:WikiProject_Extensions/Projects/Page_…
Feel free to email me with any questions or feedback. :)
-greg aka varnent
-------
Gregory Varnum
Lead, Aequalitas Project
Lead Administrator, WikiQueer
Founding Principal, VarnEnt
@GregVarnum
fb.com/GregVarnum
Hi,
with regard to bug #24607, I wanted to enable external storage
on my local PostgreSQL-backed wiki. The documentation at
<URI:http://www.mediawiki.org/wiki/Manual:$wgExternalServers>
reads:
| An array of external MySQL servers.
| e.g.
| $wgExternalServers = array( 'cluster1' => array( 'srv28', 'srv29', 'srv30' ) );
but executing maintenance/storage/compressOld.php with that
configuration yields:
| [...]
| PHP Warning: Invalid argument supplied for foreach() in /var/www/html/w/includes/db/LoadBalancer.php on line 68
| PHP Warning: Invalid argument supplied for foreach() in /var/www/html/w/includes/db/LoadBalancer.php on line 68
| You must update your load-balancing configuration. See DefaultSettings.php entry for $wgDBservers.
| Backtrace:
| #0 /var/www/html/w/includes/db/LoadBalancer.php(571): LoadBalancer->reallyOpenConnection('0rv28', false)
^
| #1 /var/www/html/w/includes/db/LoadBalancer.php(492): LoadBalancer->openConnection(0, false)
| #2 /var/www/html/w/includes/ExternalStoreDB.php(56): LoadBalancer->getConnection(-2, Array, false)
| #3 /var/www/html/w/includes/ExternalStoreDB.php(150): ExternalStoreDB->getMaster('cluster1')
| #4 /var/www/html/w/maintenance/storage/compressOld.php(347): ExternalStoreDB->store('cluster1', 'O:27:"Concatena...')
| #5 /var/www/html/w/maintenance/storage/compressOld.php(94): CompressOld->compressWithConcat(0, 20, '', '', 'cluster1', false)
| #6 /var/www/html/w/maintenance/doMaintenance.php(105): CompressOld->execute()
| #7 /var/www/html/w/maintenance/storage/compressOld.php(408): require_once('/var/www/html/w...')
| #8 {main}
and a look at the code and wmf-deployment's addwiki.php and
renamewiki.php seems to indicate that the structure is a bit
more complex. Could someone update the documentation,
please?
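For the record, judging from LoadBalancer.php it looks like each cluster needs full server-info arrays in the same format as $wgDBservers, rather than bare hostnames. A sketch of what I suspect the structure should be (field names taken from the $wgDBservers documentation; all values here are placeholders):

```php
// Guesswork, not verified documentation: each cluster seems to map to
// an array of $wgDBservers-style server descriptions, not hostnames.
$wgExternalServers = array(
    'cluster1' => array(
        array(
            'host'     => 'srv28',
            'dbname'   => 'externalstore', // placeholder
            'user'     => 'wikiuser',      // placeholder
            'password' => 'secret',        // placeholder
            'type'     => 'mysql',         // presumably 'postgres' in my case
            'load'     => 1,
        ),
    ),
);
```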
TIA,
Tim
Thibault may not know that WorkingWiki [1] handles LaTeX documents in
wiki pages. - LW
[1] http://lalashan.mcmaster.ca/theobio/projects/index.php/WorkingWiki
> From: Sumana Harihareswara<sumanah(a)wikimedia.org>
> Subject: [Wikitech-l] New committers schuellersa, netbrain, and
> thibaultmarin
>
> Thibault Marin (thibaultmarin) works on
> http://www.mediawiki.org/wiki/Extension:TimelineTable and says, "I also
> have in mind a few other extensions I would like to work on, such as
> conversion of LaTeX documents to wiki pages."
Hi everyone,
Just a reminder that we're deploying MediaWiki 1.19 to most Wikipedias
in a few hours (including enwiki), starting at 23:00 UTC (3pm PST).
In January, Wikipedia represented 94.7% of our page traffic:
http://stats.wikimedia.org/EN/TablesPageViewsMonthlyAllProjects.htm
Since we've already deployed to Polish (pl), Dutch (nl), and Esperanto
(eo), we need to subtract that out. Those represent 2.85%, 1.20%, and
0.05% of our Wikipedia traffic respectively:
http://stats.wikimedia.org/EN/TablesPageViewsMonthlyOriginalCombined.htm
...for a total of 4.1% of Wikipedia traffic, and 3.88% of our overall
traffic. That means that today's deployment to the other Wikipedia
sites (roughly 90.8% of traffic) is clearly the big one.
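For anyone checking the numbers, the arithmetic above works out as:

```php
// pl + nl + eo as a share of Wikipedia traffic:
$deployed  = 2.85 + 1.20 + 0.05;     // = 4.1% of Wikipedia traffic
// Convert to a share of overall traffic (Wikipedia is 94.7% of it):
$overall   = $deployed * 94.7 / 100; // ≈ 3.88% of overall traffic
// What's left for today's deployment:
$remaining = 94.7 - $overall;        // ≈ 90.8% of overall traffic
```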
Our plan is to deploy to enwiki first, then if things go well, march
down the list in traffic order.
Some things will inevitably break during this time, so please refer to
our maintenance notice for how to get in touch with us:
https://meta.wikimedia.org/wiki/Wikimedia_maintenance_notice
In short, you should be able to follow what's going on on the
#wikimedia-tech IRC channel.
We still have a few bugs that could use everyone's attention:
https://bugzilla.wikimedia.org/buglist.cgi?resolution=---&target_milestone=…
Please take a look, and if you can help, please do.
Thanks!
Rob
*Code Mixer*
Hackathon Event by MixOrg
*Event Introduction :*
Build a web application that focuses on a single industry vertical, such
as education (engineering colleges), healthcare (hospitals), publishing
(magazines), fashion wear, or hospitality (restaurants and hotels). These
are just some of the ideas that can go into the app. We will leave the idea
open and see if participants can come up with something more
innovative/attractive.
*Eligible Industries:*
1. Restaurant
2. Super Market (e.g. Walmart, Big Bazaar, Vishal Mega Mart, etc.)
3. Shopping Malls
*Technology :*
While we do not want students to build a Facebook application as such, we
want it ultimately to be usable there. So the web app should follow the
size guidelines of Facebook Canvas (510 pixels in width by up to 2000
pixels in length). (Recommended: building a Facebook app requires an SSL
certificate, so students are better off building a stand-alone web app.)
They can use any platform/technology, and can refer to existing
applications on Facebook for inspiration. They can also use social
components such as like/comment within their app.
*Scoring:*
The apps will be scored on the basis of the novelty of the ideas that go
into them, the design and user interaction, the reusability of the app, etc.
*Terms & Conditions:*
1. Ideas should not be copied or replicated from any other web app. If
copying is found, the registration will be cancelled.
2. Entries submitted after closing time will not be considered valid.
3. Extra points for well-indented code. Marks will be deducted for
hard-coded values.
4. Submit the application as both .tar.gz and .zip archives.
5. Include a README file containing all the details about the
application.
*Contest Duration:*
Registration opens on 25th February, 2012
Contest start announced on 27th February, 2012
Registration closes at midnight on 1st March, 2012
Applications due by 5:00 PM on 3rd March, 2012
Results announced on 4th March, 2012
NOTE: Details of uploading will be mailed to all participants after
registration, on the Google Docs page of Mix Coder.
*Contact: *
Sheel Sindhu
+91 9711357056
Amit Shah
+91 9891044714
*Website Link:* http://algorhythm12.in/hackathon
--
-------------------------------------------------------------
Sheel Sindhu Manohar (शील सिंधु मनोहर)
Manager, JMILUG (www.jmilug.org)
Founder, Linux Adda (www.linuxadda.org)
-------------------------------------------------------------
> Message: 5
> Date: Wed, 29 Feb 2012 02:03:10 +0530
> From: Shivansh Srivastava <shivansh.bits(a)gmail.com>
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Subject: Re: [Wikitech-l] [Wikimediaindia-l] GSoC'12 Proposal : List
> of Ideas
> Message-ID:
> <CAPJSHrnjCxeOE3R8iJxbDEsV4b90_96ifxkVJJQCnFeZnV74kg(a)mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
> I am infact working on the News Ticker. It is taking a lot of time than I
> had imagined. Thats why I have started working on it, to get rid of the
> initial ridges!
Note, we already have something fairly similar. (The main difference
from what you're describing is that the existing ticker works on a
static list of pages that isn't updated, instead of doing some
auto-update magic with Ajax.) See
https://en.wikinews.org/wiki/Template:Ticker and
https://en.wikinews.org/wiki/User:Bawolff/sandbox/ticker .
Cheers,
-bawolff
Hello,
I would appreciate any help regarding inserting custom text into articles.
The problem setting is as follows:
I am currently trying to create a wiki where each article is about a
specific entity (say, for example, a book). Above each article's body
content, I want to insert custom text (such as the book's author, retail
price, publisher, etc.) in the form of HTML. I do not want users to be
able to edit this custom text, and thus I want to "inject" it as the
page content is loaded. What would be the cleanest approach to this
problem?
I have looked at the list of hooks available at
http://www.mediawiki.org/wiki/Manual:Hooks,
but was unable to find any hook that satisfied my need. I have also tried
modifying the core code (the outputPage() method of the Skin class, for
example), but things got ugly pretty quickly, and it was hard to maintain
the code. I would appreciate any help.
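To give a concrete idea of the kind of injection I'm after, here is a rough sketch of what I imagined a hook-based solution might look like. I'm assuming a hook along the lines of OutputPageBeforeHTML (which lets you modify the rendered body text), and buildBookInfoHtml() is a hypothetical helper of my own, not a MediaWiki function:

```php
// Sketch only: buildBookInfoHtml() is a hypothetical helper that would
// look up the entity's metadata and return it as an HTML fragment.
$wgHooks['OutputPageBeforeHTML'][] = function ( $out, &$text ) {
    $title = $out->getTitle();
    // Prepend the generated metadata block above the article body.
    $text = buildBookInfoHtml( $title ) . $text;
    return true;
};
```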
Thanks in advance,
Naoki Orii
Hi folks,
As you may know, WMF's Platform Engineering group plans to embark on a
major performance initiative this year, and has chosen inline
scripting as the area with the biggest potential impact given what's
practical now. Tim Starling built a Lua prototype last year which
showed a lot of promise for making things much faster. One major
decision before embarking on this effort was whether we'd stick with
Lua or try another language such as JavaScript or Victor's WikiScript
implementation. I wanted to make a decision by the end of the
month[1], and I think we've done it.
We've decided to build a deployable version of Lua as a new
alternative to wiki markup for templates, barring some scandalous
revelation about Lua's lurid past or other unforeseen barrier. Tim
will be leading this effort, and will start on the implementation some
time after the dust settles on the 1.19 deployment and the Git
migration. The project page for this is located here:
http://www.mediawiki.org/wiki/Lua_scripting
Rough notes from our meeting yesterday are also available [2]
Rob
[1] http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/57769…
[2] http://www.mediawiki.org/wiki/Lua_scripting/Meeting_2012-01-25