An automated run of parserTests.php showed the following failures:
Running test Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html)... FAILED!
Running test Link containing double-single-quotes '' (bug 4598)... FAILED!
Running test message transform: <noinclude> in transcluded template (bug 4926)... FAILED!
Running test message transform: <onlyinclude> in transcluded template (bug 4926)... FAILED!
Running test BUG 1887, part 2: A <math> with a thumbnail- math enabled... FAILED!
Running test Language converter: output gets cut off unexpectedly (bug 5757)... FAILED!
Running test HTML bullet list, unclosed tags (bug 5497)... FAILED!
Running test HTML ordered list, unclosed tags (bug 5497)... FAILED!
Running test HTML nested bullet list, open tags (bug 5497)... FAILED!
Running test HTML nested ordered list, open tags (bug 5497)... FAILED!
Running test Parsing optional HTML elements (Bug 6171)... FAILED!
Running test Inline HTML vs wiki block nesting... FAILED!
Running test Mixing markup for italics and bold... FAILED!
Passed 394 of 407 tests (96.81%) FAILED!
I would like to load a Wikipedia dump into my installation of MediaWiki 1.6.7 at http://encyclopedia.meta99.com/, using the procedure found here:
http://en.wikipedia.org/wiki/Wikipedia:Database_download
http://meta.wikimedia.org/wiki/Data_dumps
I used mwdumper, because the tutorial says that importDump is slow.
mwdumper
mwdumper is a standalone program for filtering and converting XML dumps. It can produce output as another XML dump as well as SQL statements for inserting data directly into a database in MediaWiki's 1.4 or 1.5 schema.
I launched the command on the server:
[root@myserver maintenance]# java -jar mwdumper.jar --format=sql:1.5 enwiki-latest-pages-articles.xml.bz2 | mysql -u databaseuser -p databasename
Enter password: (I entered my database password)
1,000 pages (19.529/sec), 1,000 revs (19.529/sec)
ERROR 1062 (23000) at line 46: Duplicate entry '1' for key 1
2,000 pages (21.128/sec), 2,000 revs (21.128/sec)
3,000 pages (21.702/sec), 3,000 revs (21.702/sec)
4,000 pages (21.048/sec), 4,000 revs (21.048/sec)
5,000 pages (20.578/sec), 5,000 revs (20.578/sec)
6,000 pages (20.369/sec), 6,000 revs (20.369/sec)
7,000 pages (20.078/sec), 7,000 revs (20.078/sec)
8,000 pages (20.085/sec), 8,000 revs (20.085/sec)
9,000 pages (20.048/sec), 9,000 revs (20.048/sec)
10,000 pages (20.223/sec), 10,000 revs (20.223/sec)
11,000 pages (20.159/sec), 11,000 revs (20.159/sec)
12,000 pages (20.117/sec), 12,000 revs (20.117/sec)
13,000 pages (19.962/sec), 13,000 revs (19.962/sec)
14,000 pages (20.244/sec), 14,000 revs (20.244/sec)
15,000 pages (20.204/sec), 15,000 revs (20.204/sec)
16,000 pages (20.167/sec), 16,000 revs (20.167/sec)
17,000 pages (20.259/sec), 17,000 revs (20.259/sec)
18,000 pages (20.18/sec), 18,000 revs (20.18/sec)
.....
.....
.....
The problem is that this is the first time in my life that I have used MediaWiki, and I am not an expert.
1) I still do not have any pages on the site where I installed MediaWiki, and I cannot see any new entries/pages directly in the database. Is there a way to verify that the dump was really inserted into the database? On the site where I run MediaWiki there are still no new pages:
http://encyclopedia.meta99.com/
Is there a way to verify whether the dump pages are in the database?
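One quick way to check (a sketch; "databaseuser" and "databasename" are placeholders for your own credentials, and a default MediaWiki 1.5+ schema with no table prefix is assumed) is to count rows in the page and revision tables directly:

```shell
# Count imported pages and revisions straight from MySQL.
# Replace databaseuser/databasename with your own values.
mysql -u databaseuser -p databasename \
  -e "SELECT COUNT(*) AS pages FROM page; SELECT COUNT(*) AS revs FROM revision;"
```

If the counts match the numbers mwdumper reported, the rows made it into the database and the problem lies elsewhere (e.g. caching or configuration).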
In a message on your list I found a report of a similar problem:
http://64.233.183.104/search?q=cache:3LLMxLN9gPcJ:mail.wikipedia.org/piperm…
"2) Are you using read-only mode on the wiki? "
I installed MediaWiki using the standard procedure, and so far I have not changed any of the configuration/settings. Do I have to make any particular settings/configuration changes in order to use the dump?
2) Why do I get this error after launching the script?
[root@myserver maintenance]# java -jar mwdumper.jar --format=sql:1.5 enwiki-latest-pages-articles.xml.bz2 | mysql -u databaseuser -p databasename
Enter password: (I entered my database password)
1,000 pages (19.529/sec), 1,000 revs (19.529/sec)
ERROR 1062 (23000) at line 46: Duplicate entry '1' for key 1
Do you have any idea?
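A hedged guess: ERROR 1062 usually means the target tables already contain rows (for instance, the Main Page created when MediaWiki was installed), so the dump's page_id 1 collides with an existing row. A common workaround, assuming you do not need the existing content, is to empty the relevant tables and re-run the import:

```shell
# WARNING: this deletes all existing pages and revisions.
# databaseuser/databasename are placeholders for your own values.
mysql -u databaseuser -p databasename \
  -e "TRUNCATE TABLE page; TRUNCATE TABLE revision; TRUNCATE TABLE text;"

# Then re-run the import with the same command as before.
java -jar mwdumper.jar --format=sql:1.5 enwiki-latest-pages-articles.xml.bz2 \
  | mysql -u databaseuser -p databasename
```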
3) When mwdumper.jar has finished its work, what do I have to do to use the dump on my system? I have not found instructions about that on the MediaWiki site. Once mwdumper has finished, do I need to make any particular settings/configuration changes in my MediaWiki installation in order to use the pages on my site?
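After the import completes, the link tables and search index are typically stale, since mwdumper writes raw page/revision rows without updating them. A sketch of the usual follow-up, assuming a standard MediaWiki 1.6 maintenance directory:

```shell
# Run from the MediaWiki maintenance directory after the import finishes.
# Rebuilds the link tables and the search index from the imported text.
cd /path/to/mediawiki/maintenance
php rebuildall.php
```

On a large dump this can take a long time; until it runs, pages may exist but not appear in category listings or search results.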
Hi, I need a system so that, while people are reading sentences written in MediaWiki markup, the translated word or sentence from another wiki page shows up in another frame or window. Ideally this should happen each time the mouse hovers over the link (like a popup), but clicking on the link is also OK.
This does not need to be across all browsers, even if this is
supported by Firefox/Safari it would be good enough.
Any suggestions or ideas? Thx, Kent
I've patched up some old problems with Special:Import and Special:Export:
* Import updates categories etc
* Imports are logged and reviewable in Special:Log/import
* Imported pages also get a null edit in the history indicating the import and
its source
* Export is fixed up to allow fetching history for shorter pages, while still
aborting to avoid bogging down the servers on longer pages (currently set to a
cutoff of 100 edits)
* Transwiki import allows selecting the import-with-history
Wikis wishing transwiki import capability should let us know which wikis they
want to be able to import directly from (for instance, from a wikipedia ->
wiktionary) and we can enable it.
Please pass this notice on to wikis where it will be of interest.
-- brion vibber (brion @ pobox.com)
Speaking of bots, over the past few days and in recent months, bots that
the community had come to rely upon have stopped working, and we don't
have the source available to fix them.
In the most recent case, the en: crypticbot was stopped and banned by a misguided administrator. The ban was overturned, but the damage was done. Unfortunately, it was the bot responsible for the daily sub-pages at Templates for Deletion and elsewhere, and for maintaining the Village Pump and other pages. Cryptic cannot restore operation until returning to work on Wednesday.
This arrangement seems too fragile to depend on individual users; it would benefit from toolserver or central access to the database to avoid overhead (and, hopefully, to ensure technical competence).
What needs to be done to have a centralized system for running such
hourly and daily tasks?
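As a minimal sketch of what central scheduling could look like: a shared toolserver account could drive such tasks from cron. The script names and paths below are hypothetical placeholders, not existing bots:

```shell
# Hypothetical crontab entries for a shared toolserver bot account.
# All script names and paths are placeholders for illustration only.

# Create the daily Templates for Deletion sub-page at 00:05 UTC.
5 0 * * *  /home/bots/tfd_daily_subpages.sh >> /var/log/bots/tfd.log 2>&1

# Run an hourly maintenance task (e.g. Village Pump archiving) at minute 15.
15 * * * * /home/bots/village_pump_archive.sh >> /var/log/bots/pump.log 2>&1
```

The point is less the exact mechanism than that the schedule and source live in one shared place, so a single user going offline does not halt the task.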
> "William Allen Simpson" wrote
> > ..
> > And AWB is capable of running without user interaction, so it's a bot.
>
> AWB can be used in bot mode. But then, the user must be logged in and registered
> at http://en.wikipedia.org/wiki/Wikipedia:AutoWikiBrowser/CheckPage#Bots
> under the section "Bots" (a fully protected page).
>
> And AWB cannot do page moves.
>
> --Adrian
The source code is available to anyone with a little technical savvy...
~Mdd4696
Triple crosspost, I suggest following up to wikitech-l only.
As part of a change to improve the parsing speed for double-brace entities,
I'm changing the syntax of a few unusual function-like constructs, to make
them more like the other parser functions (unless anyone can think of a good
reason not to).
We currently have a number of statistics variables which accept an optional
"raw mode" suffix. This suppresses language-dependent number formatting.
They are:
{{NUMBEROFPAGES|R}}
{{NUMBEROFUSERS|R}}
{{NUMBEROFARTICLES|R}}
{{NUMBEROFFILES|R}}
{{NUMBEROFADMINS|R}}
This will be changed to:
{{NUMBEROFPAGES:R}}
{{NUMBEROFUSERS:R}}
{{NUMBEROFARTICLES:R}}
{{NUMBEROFFILES:R}}
{{NUMBEROFADMINS:R}}
This brings them into line with parser functions such as {{localurl:}}, and
thus allows them to take advantage of the new more efficient handling of
such constructs that I have just implemented.
The non-raw syntax, e.g. {{NUMBEROFPAGES}}, will stay the same.
-- Tim Starling
An automated run of parserTests.php showed the following failures:
Running test Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html)... FAILED!
Running test Link containing double-single-quotes '' (bug 4598)... FAILED!
Running test Magic Word: {{SCRIPTPATH}}... FAILED!
Running test Magic Word: {{SERVER}}... FAILED!
Running test Magic Word: {{SERVERNAME}}... FAILED!
Running test message transform: <noinclude> in transcluded template (bug 4926)... FAILED!
Running test message transform: <onlyinclude> in transcluded template (bug 4926)... FAILED!
Running test BUG 1887, part 2: A <math> with a thumbnail- math enabled... FAILED!
Running test Language converter: output gets cut off unexpectedly (bug 5757)... FAILED!
Running test HTML bullet list, unclosed tags (bug 5497)... FAILED!
Running test HTML ordered list, unclosed tags (bug 5497)... FAILED!
Running test HTML nested bullet list, open tags (bug 5497)... FAILED!
Running test HTML nested ordered list, open tags (bug 5497)... FAILED!
Running test Parsing optional HTML elements (Bug 6171)... FAILED!
Running test Inline HTML vs wiki block nesting... FAILED!
Running test Mixing markup for italics and bold... FAILED!
Passed 391 of 407 tests (96.07%) FAILED!