An automated run of parserTests.php showed the following failures:
Running test Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html)... FAILED!
Running test Magic Word: {{NUMBEROFFILES}}... FAILED!
Running test BUG 1887, part 2: A <math> with a thumbnail- math enabled... FAILED!
Running test Language converter: output gets cut off unexpectedly (bug 5757)... FAILED!
Running test HTML bullet list, unclosed tags (bug 5497)... FAILED!
Running test HTML ordered list, closed tags (bug 5497)... FAILED!
Running test HTML ordered list, unclosed tags (bug 5497)... FAILED!
Running test HTML nested bullet list, open tags (bug 5497)... FAILED!
Running test HTML nested ordered list, closed tags (bug 5497)... FAILED!
Running test HTML nested ordered list, open tags (bug 5497)... FAILED!
Passed 303 of 313 tests (96.81%) FAILED!
Hey,
No, I've got a supposed 8M Road Runner connection - not that I actually get
that on download.
Anyway, that's not really the problem - it never gets that far. It claims
that it's downloading a 3.5G file but then downloads what appears to be a
valid, empty tar that is only 700 bytes. My tar program opens it, doesn't
complain about the format, and says it's empty.
I am one of those dreaded Windows users, but I didn't have any trouble
downloading the articles (I know, much smaller), and the download help
page seems to say that NTFS on Win2k/XP is okay for large files, so I
thought I'd be fine.
Anyway, with the apparently valid, empty tar downloading, I thought it was
worth checking whether the 75G file was really still there. If it is, I
should probably look elsewhere for the solution to my problem.
Thanks for all your help.
Tom
"Steve Bennett" <stevage-Re5JQEeQqe8AvxtiuMwx3w(a)public.gmane.org> wrote in
message
news:<f1c3529e0605050751r1dd5047cv1c106dcd53fff4b9(a)mail.gmail.com>...
> On 5/5/06, Tim Starling
<t.starling-JAjqph6Yjy+R5oW9C2NzqLs0Z2mYlMph(a)public.gmane.org> wrote:
> > What happens when you try? I believe some clients have trouble
downloading
> > files bigger than 2GB.
>
> Maybe he was on dialup...?
>
> Steve
I'm trying to use a taxobox, but it doesn't work. Do I need some special
template or anything else? I can see the raw text, for example
Template:Taxobox_begin, ...., but I can't see the table.
Thanks!!
I was curious what the current plans are for MW 2.0. Will it just be
announced sometime during the quarterly release schedule? Will it be a
considerable rewrite?
If the latter, I have an interesting idea for SoC: how about somebody
rewrites the MW 1.6 core? Details of the rewrite would include
*Changing everything to PHP 5 (better OO support in core)
**If not PHP 5, then at least better OO'ize the code
*Eliminating $wg global usage.
**Perhaps replacing with a registry of sorts
**Perhaps creating a master MediaWiki object that holds the $wgTitle and
$wgUser variables and gets passed around
*Numerous little things that everybody hates about the current core
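To make the registry idea above concrete, here is a minimal sketch (mine, not
from the proposal; written in Python for brevity, though the real thing would
of course be PHP 5, and the names are invented for illustration):

```python
class Registry:
    """One place for state that currently lives in $wg... globals."""

    def __init__(self):
        self._services = {}

    def set(self, name, value):
        self._services[name] = value

    def get(self, name):
        if name not in self._services:
            raise KeyError("service %r was never registered" % name)
        return self._services[name]


# Instead of reaching into the global scope for $wgUser, code would
# receive the registry (or a master context object) explicitly:
def render_page(context):
    return "page for " + context.get("user")


context = Registry()
context.set("user", "SomeUser")
print(render_page(context))  # -> page for SomeUser
```

The point of passing the object around rather than using globals is that each
caller's dependencies become visible in its signature, which makes the core
easier to test and refactor.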
We call the result of this rewrite MediaWiki 1.99, or something ridiculous.
MediaWiki 1.99 has the full functionality of MediaWiki 1.6. The only
difference is a re-factored core. Now, we have two branches of MediaWiki
for developers to work on. We have the 1.6 branch for minor updates on
Wikipedia, etc. We have the 1.99 branch for drastic changes to how
MediaWiki works. The 1.99 branch allows developments such as LiquidThreads,
a well-thought-out API, a permissions system, wiki farms, etc. to be worked
on independently of the stable 1.6 branch. In addition, the database schema
can be changed easily.
Thoughts?
Gregory Szorc
gregory.szorc(a)gmail.com
On fr, a border (in CSS) has been added (I don't remember who did it) to
emphasize indentation.
> It is not always very clear, but it's a bit better.
>
> You can have a look at:
> http://fr.wikipedia.org/w/index.php?title=Wikip%C3%A9dia:Le_Bistro
>
>
>
> On 5/4/06, Uwe Brauer <oub(a)mat.ucm.es > wrote:
> >
> >
> > Hello
> >
> > While signing a contribution in a discussion page makes it clear when
> > that contribution finishes it is *not* clear where it starts.
> >
> > Usually it is recommended to use indentation; however, some users
> > sometimes use indentation in their editing in order to emphasise a
> > point. So could a "here starts my contribution" tag be implemented?
> >
> > One possibility could be
> >
> > '''Re: [[User:Foo]]''' where [[User:Foo]] is the signature of user Foo.
> >
> > I can do something like this with my editor, but it would be nice and
> > helpful to have it implemented.
> >
> >
> > Uwe Brauer
> >
> >
> > _______________________________________________
> > Wikitech-l mailing list
> > Wikitech-l(a)wikimedia.org
> > http://mail.wikipedia.org/mailman/listinfo/wikitech-l
> >
>
>
Sorry if this posts in duplicate - I somehow messed up my first attempt to
post here.
Are the image downloads offline? I can't pull down the files in the normal
way from:
http://download.wikimedia.org/images/wikipedia/en/
Thanks,
Tom Dichiaro
Hi
Google gets more and more negative headlines; the Europeans are building
their own search engine so that not only books from the USA get scanned.
It's called Quaero:
http://www.heise.de/newsticker/search.shtml?T=quaero&button=los%21
Would it be possible to have in Wikipedia a search engine which searches
not only the articles but also all web URLs within Wikipedia? The meta tags
for the URLs could then be the articles with all their words.
So if a URL is on a wiki page, every word on that page that the search finds
gives a query hit, and all URLs from the keyword search are shown for the
web page that has the same keyword as the article.
So... search for Germany and you get
- all web URLs from the wiki article "Germany"
- all web URLs from wiki articles which include the word Germany.
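The two-part lookup described above amounts to an inverted index from words to
the external URLs of every page mentioning that word. A toy in-memory sketch
(mine, not from the post; the article texts and URLs are made up for
illustration):

```python
from collections import defaultdict

# Toy corpus: article title -> (article text, external URLs cited on the page).
# Titles, texts and URLs are invented for illustration.
articles = {
    "Germany": ("Germany is a country in central Europe.",
                ["http://example.org/germany"]),
    "Berlin": ("Berlin is the capital of Germany.",
               ["http://example.org/berlin"]),
}


def build_url_index(articles):
    """Map every word appearing in an article to the external URLs on that page."""
    index = defaultdict(set)
    for title, (text, urls) in articles.items():
        words = {w.strip(".,").lower() for w in (title + " " + text).split()}
        for word in words:
            index[word].update(urls)
    return index


index = build_url_index(articles)
# "germany" occurs in both articles, so both pages' URLs are returned.
print(sorted(index["germany"]))
# -> ['http://example.org/berlin', 'http://example.org/germany']
```

A real implementation would query the wiki database's link tables rather than
build the index in memory, but the lookup shape is the same.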
Then we could make a new domain, www.go-wiki.org / .de, as a search engine
which, in a simple graphic mode, shows ONLY URLS, like Google, and then we
can browse from there to the internet.
See the description of socially reviewed URLs at Lycos and Yahoo.
But wiki has this already!
At download.com you also find URL Blaze, which was never successful in a
p2p style.
I have not yet subscribed to the technical list; please can you forward this
to the list?
For a socially reviewed link search engine it may be interesting to set up a
joint venture between Wikipedia, metager.de and Quaero (aka Exalead?).
Can someone coordinate this project with Wikipedia?
- All internet web links in Wikipedia are the database.
- A keyword search returns all web links in the wiki article for this
keyword
- plus all wiki links with the keyword on any page in Wikipedia.
OK, some people will drop links into Wikipedia, but since the links are
categorized by article, the users will discuss and keep only the 15 relevant
web URLs for a keyword on the page and on other pages.
This would be a great extension to wiki and Quaero!
Wolfgang, can you coordinate a meeting in Hannover?
thanks
>>>>>
Yahoo search goes social
Yahoo launched a new kind of web search today. The service, named My Web
2.0, is meant to deliver results that other users have already rated as
interesting and relevant. To that end, Yahoo is spinning social networks.
Users of the service can store their bookmarks and, as with other social
bookmarking services such as del.icio.us, label them with so-called tags.
The bookmarks can be shared with friends or with the general public. My Web
2.0 can search the full text of your own link collection as well as the
community's. The concept somewhat resembles that of Lycos iQ, where forum
posts are consulted for the result lists alongside the link collections.
(jo/c't)
http://yahoo.de/meinweb
http://del.icio.us/
http://iq.lycos.de/
Lycos: searching better with Web 2.0
Lycos Europe launched a new service today that combines internet research
with a social networking system and a bookmark manager. At Lycos iQ,
surfers can ask arbitrary questions, which other users answer. The asker
rates the quality of the answers, and a point system rewards the helpers
for their help. There are currently no tangible rewards; instead, you can
work your way up the expert hierarchy from "Student" to "Einstein". Lycos
also feeds results from the Lycos iQ pool into the classic Lycos full-text
search.
In addition, users can manage their links at Lycos iQ. So-called tagging,
labelling things with arbitrary terms chosen by the users, ties all parts
of iQ together. Other so-called social bookmarking services such as
del.icio.us, which already use tagging with great success, evidently
served as the model. Lycos hopes that through user interaction the system
will grow into a large pool of knowledge. (jo/c't)
When I want to overwrite an image file A by uploading an image file B with
exactly the same filename, I get the following error message:
Internal error: File "/tmp/phpwhT5E3" to
"/www/wiki/images/temp/20060505091417!Image.gif" copy not possible.
Warning: move_uploaded_file(/www/wiki/images/temp/20060505091417!Image.gif):
failed to open stream: No such file or directory in
/www/wiki/includes/SpecialUpload.php on line 370
Warning: move_uploaded_file(): Unable to move '/tmp/phpwhT5E3' to
'/www/wiki/images/temp/20060505091417!Image.gif' in
/www/wiki/includes/SpecialUpload.php on line 370
When I (as admin) first delete image file A and afterwards upload image file
B, everything is fine. The problem is that when a normal user (without admin
rights) wants to change his image, he gets the error message.
Has anyone ever encountered this problem before?
Thanks in advance for any reply!
Birger
(Mediawiki 1.5.2)
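Not part of Birger's report, but the two warnings suggest the copy is aimed at
an images/temp/ directory that does not exist or is not writable by the web
server. A hedged reproduction sketch of that failure mode (in Python rather
than PHP; the paths are invented for illustration):

```python
import os
import shutil
import tempfile


def diagnose_move(src, dst):
    """Explain why a move like MediaWiki's upload copy might fail."""
    dst_dir = os.path.dirname(dst)
    if not os.path.isdir(dst_dir):
        return "destination directory %s does not exist" % dst_dir
    if not os.access(dst_dir, os.W_OK):
        return "destination directory %s is not writable" % dst_dir
    shutil.move(src, dst)
    return "ok"


# A move into a directory that does not exist fails just like the
# move_uploaded_file() call in the warnings above:
src = tempfile.NamedTemporaryFile(delete=False).name
print(diagnose_move(src, "/nonexistent/images/temp/upload.gif"))
# -> destination directory /nonexistent/images/temp does not exist
```

If that is the cause here, creating images/temp/ and giving the web server
write access to it would let non-admin re-uploads succeed as well.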
Hi all
> Replication of enwiki is running, it's currently still 360'000 seconds
> lagged, but it reduces the lag by some 100'000 s/day, so it should be
> up to date on monday. See
> http://tools.wikimedia.de/~leon/stats/replag/replag-en-daily.png
Replication for enwiki is running on zedler (using an external
replication agent developed by DaB), but the database has been corrupted
- something about auto_increment values being out of sync. It's
currently unclear when and how that will be solved. Please direct
questions and suggestions to [[de:User:DaB.]] or to Toolserver-l - DaB
said he's not reading Wikitech.
Regards
-- Daniel
--
Homepage: http://brightbyte.de
Hallo!
Comparing some of the special pages, I wonder if it would be possible to
add / expand the sort criteria in the MySQL queries, especially because most
of the pages here are limited and cached:
[[Special:Mostcategories]]
sort by
1. number of references (members)
*and* 2. title
[[Special:Mostimages]]
sort by
1. number of references (how often image syntax is used)
*and* 2. title
[[Special:Mostlinked]]
sort by
1. number of references (links, category members, how often image syntax is
used)
*and* 2. namespace (by "ns value")
*and* 3. title
[[Special:Mostlinkedcategories]]
sort by
1. number of references (members)
*and* 2. title
[[Special:Mostrevisions]]
sort by
1. number of versions
*and* 2. title
[[Special:Shortpages]] (ascending)
and [[Special:Longpages]] (descending)
sort by
1. number of bytes
*and* 2. title
[[Special:Wantedcategories]]
sort by
1. number of references (members)
*and* 2. title
[[Special:Wantedpages]]
sort by
1. number of references (links, category members, how often image syntax is
used)
*and* 2. namespace (by "ns value")
*and* 3. title
[[Special:Whatlinkshere]]
*sort by*
1. namespace (by "ns value")
2. title
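The multi-level orderings requested above all reduce to "count descending,
then (namespace and) title ascending"; the title acts as a tie-break for
equal counts. A minimal sketch of the tie-break (my toy data, not real query
results):

```python
# (title, reference count) pairs; the counts and titles are made up.
pages = [("Foo", 10), ("Baz", 7), ("Bar", 10)]

# Primary key: reference count, descending; secondary key: title, ascending.
pages.sort(key=lambda page: (-page[1], page[0]))
print(pages)  # -> [('Bar', 10), ('Foo', 10), ('Baz', 7)]
```

In SQL terms this would just be an extra column in the ORDER BY clause, so
the cached queries should be able to accommodate it cheaply.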
see http://bugzilla.wikimedia.org/show_bug.cgi?id=2306
Special:Whatlinkshere currently lists results ordered by creation timestamp
- proper ordering and filtering should be possible.
best regards reinhardt [[user:gangleri]]