In a dump of the German Wikipedia from July 14th, Jakob Voss found
that the {{Personendaten}} template was used in 47785 articles.
When I went to the template page on August 22nd and clicked "what
links here", I found 42147 links:
http://de.wikipedia.org/wiki/Vorlage:Personendaten
I don't believe that the template was removed from 5638 pages in
the six weeks that have passed since the dump. So what could
explain the difference in these numbers? Is the link table out of
sync? Could/should this be fixed? How does it work?
--
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
Over the past few days, editors of the English Wikipedia have been
experiencing intermittent edit failures, including "connection refused"
and "502" errors, after which the edit sometimes went through and
sometimes didn't. For instance, a few minutes ago, I got:
500 Can't connect to en.wikipedia.org:80 (connect: Connection refused)
(My bot was attempting to edit [[Deep_Submergence_Vehicle]].)
And a few minutes later:
502 Bad Gateway
Sorry- we have a problem...
The wikimedia web server didn't return any response to your request
To get information on what's going on you can visit <a
href="irc://irc.freenode.net/wikipedia">#wikipedia</a>.<br /> An
"offsite" status page is hosted on <a
href='http://openfacts.berlios.de/index-en.phtml?title=Wikipedia_Status'
class='external'
title="http://openfacts.berlios.de/index-en.phtml?title=Wikipedia
Status"> OpenFacts</a>.
Generated Tue, 23 Aug 2005 01:06:08 GMT by srv5.wikimedia.org
(squid/2.5.STABLE9.wp20050410.S9plus[icpfix,nortt,htcpclr])
(My bot was attempting to edit [[Delta_Tau_Delta]].)
Similar symptoms have been reported by others on IRC and OpenFacts for
the past few days. Given how long these problems have gone on, we seem
to have either a shortage of hardware or a shortage of developers to
investigate. The current situation certainly makes editing quite
annoying for *me*, and I'm sure there are others who are being
discouraged from contributing entirely.
Hi folks,
Hope this is the right list for this.
I am using MediaWiki for part of a corporate intranet. I am only allowed to do this if I include the standard header and footer on each screen, giving people access to the rest of the intranet. I'm also not allowed to use frames.
Is there a simple way of adding HTML code to the top and bottom of each screen without too much PHP coding? I have looked at the various sections on customisation, hacking and development, but they all seem to involve developing modules etc. Have I missed something?
Regards,
Andrew
Hi,
Recently we have had several cases where spammers uploaded images on
different Wikipedias and used the pictures in HTML emails, trying to
sell chairs, gramophones or whatever.
Usually the uploads were the only contributions of the user in
question, and the license was missing.
It may be worth considering disabling uploads on the projects which
don't really need it and can use Commons instead (such as the German
Wikipedia, whose image upload policy is entirely compatible with
Commons, and especially all the smaller wikis, which are not constantly
watched).
This is a crosspost to wikitech-l and wikipedia-l, but please answer on
foundation-l, since this is a project-wide policy issue.
greetings,
elian
PS: I'll be away on holiday for the next three weeks, so don't expect
an answer from me; I'm only posting this to the list to make people
aware of the issue.
Hi,
Can someone explain to me why I got the following when trying to join
the Wikimedia IRC channels?
Thanks,
Yann
--- Cannot join #wikimedia-tech (You are banned).
--- Cannot join #wikimedia (You are banned).
--- Cannot join #mediawiki (You are banned).
--
http://www.non-violence.org/ | Collaborative site on non-violence
http://www.forget-me.net/ | Alternatives on the Net
http://fr.wikipedia.org/ | The free encyclopedia
http://www.forget-me.net/pro/ | Linux training and services
Thank you Brian,
I found the directory (though there seems to be no link to it?), but I have
some questions:
- As far as I can see, all articles in the XML dump are from namespace 0.
Am I right?
- Is there any documentation about the format? I don't know the exact
meaning of <revision>, and I don't know which <id> is the database id
(primary key): the page's <id> or the revision's <id>?
- Will XML dumps be produced regularly from now on?
- And, please take this question with a wink: are there no other users of
the dumps besides me, since I seem to be the only one pestering you about
this topic? ;-)
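For anyone comparing notes, the export format seems to nest like this (my reading of a dump, sketched below; real dumps also carry an xmlns declaration and extra fields such as <timestamp> and <contributor>, which I've omitted). The page-level <id> is the page's database key, while each <revision> carries its own <id>:

```python
import xml.etree.ElementTree as ET

# Minimal mock of the export format as I understand it (real dumps have
# an xmlns declaration and more fields; this keeps only the two <id>s).
dump = """<mediawiki>
  <page>
    <title>Example</title>
    <id>42</id>
    <revision>
      <id>1001</id>
      <text>Some wikitext here</text>
    </revision>
  </page>
</mediawiki>"""

root = ET.fromstring(dump)
pairs = []
for page in root.iter("page"):
    page_id = page.findtext("id")        # database key of the page itself
    for rev in page.iter("revision"):
        rev_id = rev.findtext("id")      # key of this particular revision
        pairs.append((page_id, rev_id))

print(pairs)
```

Since a page can have many revisions, one page-level <id> can pair with several revision-level <id>s.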
Yours
jo
Brion Vibber wrote:
> There's a 20050713 dump in the new format (check the directories).
> Another dump will run this weekend.
Hello,
1.6 alpha is causing major problems in the Hungarian Wikipedia. Something must have changed in the parsing of templates. We have used a "meta-template" to generate the content of Babel templates for some time; take a look at http://hu.wikipedia.org/wiki/Sablon:User_en to see what the templates look like now. (The English template should, of course, look like this: http://en.wikipedia.org/wiki/Template:User_en).
All the best,
Endre/KovacsUr@huwiki
On 20/08/05, Tim Starling <timstarling(a)users.sourceforge.net> wrote:
> Modified Files:
> Parser.php
> Log Message:
> With the introduction of action=render, internal links may also contain http://, and so must be hardened against replacement by replaceExternalLinks in the same way as interwiki links.
>
> Index: Parser.php
[...]
> + * Hardens some text possibly containing URLs against mangling by
> + * replaceExternalLinks()
> + */
> + function hardenURLs( $text ) {
> + return str_replace( 'http://', 'http-noparse://', $text );
> + }
Did you mean to actually use this function somewhere, or did you
change your mind and forget to remove it? I should also note that it
won't work correctly, since I changed the other use of that notation
to fix bug 3090; it now also masks things other than http://... by
using preg_replace("/\b($wgUrlProtocols)/", UNIQ_PREFIX."NOPARSE$1",
...
In fact, even the issue you were addressing would probably want that,
since a wiki could generate URLs beginning with https://
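The mask-then-restore idea under discussion can be sketched like this (in Python purely as illustration; the real code is PHP in Parser.php, and the marker and protocol list here are stand-ins, not the parser's actual values):

```python
import re

# Stand-ins for the parser's unique strip marker and $wgUrlProtocols;
# the real values live in the PHP parser.
UNIQ_PREFIX = "\x07UNIQ-"
URL_PROTOCOLS = r"https?://|ftp://"

def mask_urls(text):
    # Hide every protocol prefix, not just http://, so a later
    # external-link pass won't touch it (the https:// point above).
    return re.sub(r"\b(" + URL_PROTOCOLS + r")",
                  lambda m: UNIQ_PREFIX + "NOPARSE" + m.group(1), text)

def unmask_urls(text):
    # Restore the original text once link replacement is done.
    return text.replace(UNIQ_PREFIX + "NOPARSE", "")

s = "see https://example.org and ftp://example.net"
masked = mask_urls(s)
print(unmask_urls(masked) == s)   # round trip restores the text
```

The masked form no longer has a word boundary before the protocol, so a link-replacement regex anchored on \b won't match it; unmasking afterwards restores the text exactly.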
--
Rowan Collins BSc
[IMSoP]
This is a cross-post to wikipedia-l and wikitech-l.
I just switched the version of MediaWiki we're running on Wikimedia
websites to CVS HEAD. Currently it's labelled 1.6alpha, but development
is continuous.
What this means for users is that features will appear on the site as
soon as possible after they are written, instead of waiting for the next
major release of MediaWiki in up to 6 months' time.
There were very few user-visible changes associated with this switch;
the main difference will be in development practices going forward. One
user-visible change you might notice is that if a page is deleted while
you're editing it, you'll get a kind of edit conflict.
What this means for developers is that we'll have to be more careful
what we put into HEAD. Test everything thoroughly, or put experimental
features into their own branch. You should backport bug fixes to the
stable branch (currently REL1_5), but it's no longer necessary to
backport features.
With a more stable CVS HEAD, we should be able to release major versions
of MediaWiki more often.
This development schedule was suggested by Brion and enthusiastically
seconded by me.
-- Tim Starling
2005/8/17, Tels <nospam-abuse(a)bloodgate.com>:
> I remember that you said you would include the version somehow. Did you do
> this and how would I need to include it? Also, would you be willing to test
> an extension of mine to see that I included the info in the proper way?
Yes, you're now able to do:
$wgExtensionCredits[$type][] = array(
	'name' => 'Extension foo',
	'author' => 'Foo Barstein',
	'url' => 'http://www.example.com',
	'version' => '0.1'
);
The 'url' and 'version' keys are optional. If you want to check that
it's working, just go to Special:Version on your wiki and see if it
shows up.