Pretty cool! It's really possible to generate the descriptions from
that. Thanks!
-----
Yury Katkov, WikiVote
On Sun, Apr 14, 2013 at 12:49 AM, Merlijn van Deen <valhallasw(a)arctus.nl> wrote:
> Hi Yury,
>
> Cleaning up my pywikipedia backlog, and in the category 'better late
> than never':
>
> On 6 August 2012 14:05, Yury Katkov <katkov.juriy(a)gmail.com> wrote:
>> If no such extension exists, could anybody tell me how to programmatically
>> get the parameters of a given template?
>
> The 'generatexml' function in the API should do roughly what you want:
>
> http://www.mediawiki.org/w/api.php?action=parse&page=Template:Gerrit-review…
>
> Then walk over all nodes using //tplarg/title
>
> Merlijn
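For anyone landing on this thread later, walking the parse tree can be done in a few lines of Python. The XML below is a hypothetical, simplified sample of what the API returns for a template page; in practice you would fetch the real tree from api.php as in the link above:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified sample of a template's parse tree; the real
# XML would come from api.php?action=parse&...&generatexml.
PARSE_TREE = """<root>
  Hello <tplarg><title>name</title></tplarg>, you have
  <tplarg><title>count</title><part><name index="1"/><value>0</value></part></tplarg>
  messages.
</root>"""

def template_parameters(parse_tree_xml):
    """Collect the parameter names used as {{{...}}} in a template."""
    root = ET.fromstring(parse_tree_xml)
    names = set()
    # Each {{{param|default}}} shows up as a <tplarg> node whose <title>
    # child holds the parameter name -- i.e. the //tplarg/title walk above.
    for tplarg in root.iter('tplarg'):
        title = tplarg.find('title')
        if title is not None and title.text:
            names.add(title.text.strip())
    return sorted(names)
```

Under that assumed input, template_parameters(PARSE_TREE) yields ['count', 'name'].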
Summary: do you want to know about new opportunities to contribute your
technical skills, but don't want to find them in a high traffic mailing
list?
Subscribe to receive calls for action at
https://lists.wikimedia.org/mailman/listinfo/wikitech-announce
-------- Original Message --------
Subject: Recycling wikitech-announce for tech contributors
Date: Sat, 13 Apr 2013 12:23:21 -0700
From: Quim Gil <qgil(a)wikimedia.org>
Organization: Wikimedia Foundation
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
Hi,
We were missing a way to notify tech contributors and volunteers about
new activities, and after some discussion [1] we have decided to recycle
the unused
https://lists.wikimedia.org/mailman/listinfo/wikitech-announce
Subscribers will receive CALLS FOR ACTION ONLY e.g. for activities like
the ones listed at https://www.mediawiki.org/wiki/Project:Calendar. No
announcements of new releases, features removed, etc. We have channels
already for that.
We will sync the announcements at wikitech-announce with wikitech-l
and wikitech-ambassadors.
While this is not a big deal for current contributors already following
wikitech-l and a number of wiki pages, it will help potential volunteers
who want to get involved and learn about opportunities to contribute.
[1]
https://www.mediawiki.org/wiki/Project_talk:New_contributors#How_to_solve_N…
PS: before clicking reply please read
http://www.gossamer-threads.com/lists/wiki/wikitech/282349 :)
--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
This is a notice that on Monday, April 15th between 20:00-21:00 UTC
(1-2pm PDT) Wikimedia Foundation will release security updates for
current and supported branches of the MediaWiki software. Downloads
and patches will be available at that time, with the git repositories
updated later that afternoon. CVSS scores are between 4.3 and 7.1, so
most users will want to update.
fyi
-------- Original Message --------
Subject: Sponsoring travel to Wikimedia Hackathon
Date: Thu, 11 Apr 2013 10:05:53 -0700
From: Quim Gil <qgil(a)wikimedia.org>
Organization: Wikimedia Foundation
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
Hi, you have probably heard about
Wikimedia Hackathon
24-26 May in Amsterdam (Netherlands)
http://www.mediawiki.org/wiki/Amsterdam_Hackathon_2013
There are about 100 participants registered, and we have some room for
more. Registration is free but you need to sign up. There is also some
travel sponsorship budget left after a first round of approvals.
If your free software contributions and your Wikimedia love are more
valuable than the money you have in the bank, you can just register and
apply for a scholarship. Please include public URLs where we can see your
open source licensed contributions (code, pixels, wise words...). A CV
alone won't cut it, no matter how many avatars appear to endorse your
skills.
Hurry up! The organizers are reviewing applications as they come.
--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hello,
I'm a student in a free software master's program in Spain. For an
assignment, I would appreciate your cooperation in answering the
following questions:
1. How did you start collaborating on a free software project?
2. What is your motivation for participating in a free software project:
social, technological, economic, etc.?
3. Among your collaborations, which project did you find most motivating?
4. What do you think is the main motivation that the other contributors
generally have?
5. Do you think your participation in free software projects will help
improve your future career?
6. Do you plan to continue contributing to free software projects in the
future?
In advance, thank you very much.
Dear Specialists,
is there a difference in search results between:
http://wiki.openstreetmap.org/wiki/<page-title>
and
http://wiki.openstreetmap.org/wiki/<page-title>/<sub-page-title>
?
How much would results improve if we changed sub-pages into top-level
pages, or would it be the same?
(Google and MediaWiki search)
Best regards,
Markus
Hello, I am using Squid-3.1 in acceleration mode and trying to
accelerate MediaWiki 1.20.2.
I am trying to get Squid to cache MediaWiki's pages. Here is my Squid config:
acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1
acl web_ports port 80
acl purge method PURGE
http_access allow manager localhost
http_access deny manager
http_access allow web_ports all
http_access allow purge localhost
http_access deny purge
http_access deny all
http_port 10.50.79.19:80 accel
defaultsite=ec2-54-228-113-248.eu-west-1.compute.amazonaws.com vhost
ignore-cc
cache_peer 127.0.0.1 parent 8080 0 no-query originserver round-robin
name=wiki
cache_dir aufs /var/spool/squid/cache 500 16 256
refresh_pattern -i ^http: 5 100% 1440 ignore-no-cache
MediaWiki is configured to use Squid:
// Squid
$wgUseSquid = true;
$wgSquidServers = array('10.50.79.19');
$wgSquidServersNoPurge = array('127.0.0.1');
But on a request to the main page, which is dynamically generated by
MediaWiki, I get these response headers:
Cache-Control |s-maxage=18000, must-revalidate, max-age=0|
Connection |close|
Content-Language |ru|
Content-Type |text/html; charset=UTF-8|
Date |Mon, 08 Apr 2013 11:24:07 GMT|
Last-Modified |Mon, 08 Apr 2013 10:43:40 GMT|
Server |Apache/2.2.24 (Amazon)|
Vary |Accept-Encoding,Cookie|
X-Cache |MISS from ip-10-50-79-19.eu-west-1.compute.internal|
X-Cache-Lookup |MISS from ip-10-50-79-19.eu-west-1.compute.internal:80|
X-Content-Type-Options |nosniff|
X-Powered-By |PHP/5.3.20|
via |1.0 ip-10-50-79-19.eu-west-1.compute.internal (squid/3.1.10)|
So we see that Squid returns the result not from its cache but from the
Apache web server; we see corresponding records in the Apache logs on
cache misses.
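(Side note for later readers: the Cache-Control header quoted above does, by itself, allow a shared cache to store the page, since s-maxage=18000 takes precedence over max-age=0 for shared caches; a frequent culprit in setups like this is instead the "Vary: Accept-Encoding,Cookie" header combined with cookies in the request. The sketch below only illustrates the Cache-Control part and is not Squid's actual implementation.)

```python
# A minimal sketch (not Squid's real logic) of whether a Cache-Control
# header permits a shared cache to store a response.
def cacheable_by_shared_cache(cache_control):
    directives = {}
    for part in cache_control.split(','):
        name, _, value = part.strip().partition('=')
        directives[name.lower()] = value
    # no-store and private forbid shared caching outright.
    if 'no-store' in directives or 'private' in directives:
        return False
    # For a shared cache, s-maxage takes precedence over max-age, so
    # "s-maxage=18000, must-revalidate, max-age=0" is cacheable for 5 h.
    if 's-maxage' in directives:
        return int(directives['s-maxage']) > 0
    if 'max-age' in directives:
        return int(directives['max-age']) > 0
    return False
```

With the header from the response above, cacheable_by_shared_cache('s-maxage=18000, must-revalidate, max-age=0') returns True, which is why the MISS points at something other than Cache-Control.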
A request for static content, like image files, returns these headers:
Accept-Ranges |bytes|
Age |15438|
Connection |keep-alive|
Content-Length |10204|
Content-Type |text/css|
Date |Mon, 08 Apr 2013 07:06:50 GMT|
Etag |"227ef-27dc-4d9846436bc62"|
Last-Modified |Thu, 04 Apr 2013 08:02:27 GMT|
Server |Apache/2.2.24 (Amazon)|
X-Cache |HIT from ip-10-50-79-19.eu-west-1.compute.internal|
X-Cache-Lookup |HIT from ip-10-50-79-19.eu-west-1.compute.internal:80|
via |1.0 ip-10-50-79-19.eu-west-1.compute.internal (squid/3.1.10)|
Here we see that Squid returns the content from its cache.
What can be the cause of the cache MISS on the main page?
How do I get Squid to actually cache MediaWiki's article pages?
Thanks.
I've recently been battling a painful bug on
http://practicalplants.org, whereby all Cite <refs> were left as
unsubstituted UNIQ…QINU tags for anonymous users when anonymous
editing is disabled, and additionally a "You do not have permission to
edit this page" message was printed at the top of the page several
times when viewing it. The bug is not present if the page is rendered
by an authenticated user.
In going through bug reports it seems this can be triggered by a pretty
wide range of problems, most of which are reported as resolved, eg
https://bugzilla.wikimedia.org/show_bug.cgi?id=14959
Through very slow delete-save-does-it-work-yet debugging I finally
hunted it down to a small typo in a template ({{#arrayap…}} instead of
{{#arraymap…}}), and oddly to assigning a comma-separated list to the
SMW property. I have no clue why either of these things should cause
such behaviour.
Anyway, I'm curious if anyone has any experience with the internals
behind this bug (or bugs?) and could give me a starting point to
begin digging around and see if I can find a way to output a useful
error message or something. Also, if anyone has any idea why this only
affects anonymous users when anonymous editing is disabled, that might
help me find a starting point too.
Thanks
Andru
Hi,
I'm using MW 1.20.0 on my own up-to-date Linux server. Everything is
working fine except for dumpBackup.php, which I use for Lucene-search.
dumpBackup.php does not finish; it stops at random pages and MySQL stops
working. I have to restart MySQL to be able to use the wiki again.
Occasionally dumpBackup.php works fine.
I can't find any error messages (MySQL, PHP) and therefore have no idea
what the problem is.
Thanks for any help,
UdoZ
Typical output when it stalls:
Dumping t4_wiki...
2013-04-07 11:06:21: t4_wiki (ID 28260) 99 pages (1248.5|1248.5/sec
all|curr), 100 revs (1261.1|1261.1/sec all|curr), ETA 2013-04-07 11:06:32
[max 13651]
[...]
2013-04-07 11:06:22: t4_wiki (ID 28260) 3299 pages (2088.3|78866.7/sec
all|curr), 3300 revs (2088.9|2390.6/sec all|curr), ETA 2013-04-07 11:06:27
[max 13651]
A database error has occurred. Did you forget to run maintenance/update.php
after upgrading? See:
https://www.mediawiki.org/wiki/Manual:Upgrading#Run_the_update_script
Query: SELECT page_id,page_len,page_is_redirect,page_latest FROM `page`
WHERE page_namespace = '0' AND page_title = 'CHS' LIMIT 1
Function: LinkCache::addLinkObj
Error: 1053 Server shutdown in progress (localhost)
Backtrace:
#0 /home/t4_wiki/public_html/wiki/includes/db/Database.php(899):
DatabaseBase->reportQueryError('Server shutdown...', 1053, 'SELECT
page_id...', 'LinkCache::addL...', false)
#1 /home/t4_wiki/public_html/wiki/includes/db/Database.php(1322):
DatabaseBase->query('SELECT page_id...', 'LinkCache::addL...')
#2 /home/t4_wiki/public_html/wiki/includes/db/Database.php(1413):
DatabaseBase->select('page', Array, Array, 'LinkCache::addL...', Array, Array)
#3 /home/t4_wiki/public_html/wiki/includes/cache/LinkCache.php(216):
DatabaseBase->selectRow('page', Array, Array, 'LinkCache::addL...', Array)
#4 /home/t4_wiki/public_html/wiki/includes/Title.php(2808):
LinkCache->addLinkObj(Object(Title))
#5 /home/t4_wiki/public_html/wiki/includes/Title.php(2826):
Title->getArticleID(0)
#6 /home/t4_wiki/public_html/wiki/includes/WikiPage.php(762):
Title->isRedirect()
#7 /home/t4_wiki/public_html/wiki/includes/Export.php(600):
WikiPage->getRedirectTarget()
#8 /home/t4_wiki/public_html/wiki/includes/Export.php(441):
XmlDumpWriter->openPage(Object(stdClass))
#9 /home/t4_wiki/public_html/wiki/includes/Export.php(379):
WikiExporter->outputPageStream(Object(ResultWrapper))
#10 /home/t4_wiki/public_html/wiki/includes/Export.php(123):
WikiExporter->dumpFrom('')
#11 /home/t4_wiki/public_html/wiki/maintenance/backup.inc(234):
WikiExporter->allPages()
#12 /home/t4_wiki/public_html/wiki/maintenance/dumpBackup.php(77):
BackupDumper->dump(2, 0)