Hi developers,
would it be possible to add a function {{CURRENTSECOND}}, so that we could
have a sort of random function? What I want to do is the following: I
want to give the portal I've built a random logo. I'll make several logos
and have one pop up randomly. With a function {{currentsecond}} this
would be possible.
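To illustrate the idea in code: with a second-of-the-minute value available, picking one of several logos is just a modulo choice. A rough Python sketch (the logo filenames are made up; on the wiki itself this would be a template switching on {{CURRENTSECOND}}):

```python
import time

# Hypothetical logo filenames; on the wiki this would live in a
# template keyed on {{CURRENTSECOND}}, not in Python.
LOGOS = ["logo1.png", "logo2.png", "logo3.png"]

def pick_logo(second=None):
    """Pick a logo from the current second (0-59): a cheap
    pseudo-random rotation without a real random number generator."""
    if second is None:
        second = time.localtime().tm_sec
    return LOGOS[second % len(LOGOS)]
```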
Another request, much less important: could someone refresh
speciaal:wantedpages on the Dutch wiki (nl)? Thanks for listening.
Best regards,
Mig de Jong
--
Migdejong(a)gmail.com
Rode Kruislaan 1121 A
1111 XA Diemen
The Netherlands
00-31-6-14800370
Hi there...
I've just downloaded a Wiktionary dump, and MediaWiki 4.12.
The problem is that the wiki code redirects URLs whose search phrase has a
lower-case first character to the upper-case first character.
For example, if I type
http://localhost/wiki412/index.php/true it is redirected to
http://localhost/wiki412/index.php/True
So I cannot reach the first URL, with the word "true". What should I do to
fix it?
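As far as I can tell this uppercasing is MediaWiki's normal title normalization, and the $wgCapitalLinks = false; setting in LocalSettings.php looks relevant. Conceptually the normalization seems to do something like this (an illustrative Python sketch, not the actual PHP):

```python
def normalize_title(title, capital_links=True):
    """Sketch of MediaWiki-style title normalization: spaces become
    underscores, and with capital links enabled the first letter is
    uppercased; that is why /true redirects to /True."""
    title = title.replace(" ", "_")
    if capital_links and title:
        title = title[0].upper() + title[1:]
    return title
```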
Thank you
Sergey
hi all,
Can somebody help me with the following:
In Wikipedia, when we view the page of an article, it displays the
content of the page and not the metadata such as author(s), page
timestamps, etc. Part of this metadata is visible in the history
section, which lists all the page revisions. I want to change the
rendering of the page so that we have a new tab (say, "metadata") where
all the metadata for that page is displayed.
I would like to know how this can be done. I am using
MediaWiki 1.5 and have already tried hacking through the code. I have a
vague idea, but it's not very clear and concrete. Your advice, instructions,
or help will be highly appreciated.
Thanking you,
C
Hi,
I wonder why all fields in the MySQL database that hold text are
implemented as binary data (blob, mediumblob, etc.) while MediaWiki expects
to read and write them as UTF-8.
That may be OK for MediaWiki itself, but it is VERY inconvenient when it
comes to accessing the data from other tools.
I've managed to recreate my database just by replacing all `blob'
occurrences with `text'. My wiki still operates the same, but now I can
read the Russian text in the database with any tool.
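The rewrite can be scripted; roughly what I did, sketched in Python (illustrative only, and note that switching columns to text types also changes collation behaviour, so check your indexes afterwards):

```python
import re

def blobs_to_text(schema_sql):
    """Rewrite blob column types in a schema dump to their text
    equivalents: blob -> text, mediumblob -> mediumtext, and so on.
    This only touches the schema, not the row data itself."""
    return re.sub(r"\b(tiny|medium|long)?blob\b",
                  lambda m: (m.group(1) or "") + "text",
                  schema_sql)
```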
Regards,
Vladimir
Hello!
You are receiving this email because your project has been selected to
take part in a new effort by the PHP QA Team to make sure that your
project still works with to-be-released PHP versions. With this we hope
to make sure that you are either aware of things that might break, or
that we don't introduce any strange regressions. With this effort
we hope to build a better relationship between the PHP Team and the major
projects.
If you do not want to receive these heads-up emails, please reply to me
personally and I will remove you from the list; but we hope that you
want to actively help us make PHP a better and more stable tool.
The fifth and hopefully final RC of PHP 5.1.0 was released today;
it can be downloaded from http://downloads.php.net/ilia/. Since this is
hopefully the final release candidate, we ask you to test it extensively
with your software to ensure that no regressions have occurred. If you
discover any (we hope not), please notify PHP's QA team at
"php-qa(a)lists.php.net".
If you think that other projects should also receive these kinds of
emails, please let me know privately and I will add them to the list of
projects to contact.
regards,
Ilia Alshanetsky
5.1 Release Master
dpbsmith wrote:
>On my next flight, I hope my pilot will not be using these maps.
That's because the maps are marked "ALPHA QUALITY - NOT FINAL
PRODUCTION VERSIONS" and are written in pencil. Never mind that they're
among the top 40 maps in the world - the project has peaked way too early.
Damn, we need the rating feature. What's holding it up right now? A list,
please, referring to the current version of the code. (I know the servers
are creaking ...)
- d.
Hi everybody,
Representatives of the Serbian Wikipedia have asked a number of times
how they can get their conversion engine installed for sr.wikipedia.
1) It was approved in a vote by Serbian Wikipedians
2) It will make Wikipedia more usable to many people, including most
Montenegrins
3) It does not require any new software; rather, it is an already fully
working MediaWiki component.
So far, there has been no response from this community.
Please, somebody, respond -- how is it that the Chinese Wikipedia got
a conversion system implemented as soon as it was written, but the
Serbian Wikipedia is waiting and waiting and waiting?
Mark
--
"Take away their language, destroy their souls." -- Joseph Stalin
Hi everybody!
I don't want to discuss any of the details regarding the two requested new Wikipedias here, because both have been discussed very thoroughly in the appropriate places and can be considered approved by the community now; if they weren't, we would certainly not request their creation here.
Beyond all personal opinions, there are two very simple facts:
* both proposals have been discussed exhaustively and have gained the support of a vast majority of users (in fact, they are supported by more people than some of the Wikipedias created recently)
* both proposals face only minimal opposition (one anonymous user in one case, one anonymous plus one registered user in the other) and meet all formal requirements
Hence, allowing them at Wikipedia is simply a question of fairness.
In my opinion, the ongoing attempts at discrediting them are plainly anti-social and contradict basic principles of the Wikipedia community ("Assume Good Faith", "Wiki-Love" ...). Everybody has the right to dislike certain ideas, but such destructive behaviour is not acceptable here.
Having said that, I would like to support the petition made by our fellow Wikipedians Servien and Gerard, and kindly ask for the creation of the two wikis mentioned above (ISO codes are "rmy" and "nds-NL").
Thank you very much for your consideration!
Arbeo
Brion Vibber brion at pobox.com wrote:
>I already told Magnus, if there's anything terribly wrong with it it'll
>get fixed after it's turned on (and if necessary, back off).
Indeed :-)
>It's a solution in search of a problem; it doesn't solve the
>validated-version-display issue in any way, it's just a survey form that
>might, in theory, produce data that might, in theory, be interesting or
>useful to someone one day.
We (or I) have a plan to apply it. Call it a medium-term solution,
trying to go straight to the deep answer :-)
(I could be completely wrong and the results could be complete
rubbish, of course.)
>Will it be worth the trouble of turning it on and possibly having to
>deal with fixing it when further problems become evident? Who knows.
>> So. What's up with Special:Validate?
>It's on my list for this week, I'll see about getting it turned on and
>working.
:-D
Please let me know when you do, as I have a profound interest in this
and will be watching it closely!
Thanks, Brion :-)
- d.
Hello,
I tried dumping a local Wikipedia to HTML using dumpHTML.php and found
two bugs:
- The most important one is in includes/Title.php, in the getHashedDirectory
function. When $dbkey contains characters such as ".", the character is
used verbatim, so for example if $dbkey is "1. ", the generated directory
name is "1/./_/", which of course is just "1/_", and all links in that
file stop working (assuming I'm using a depth of 3).
The fix is easy (borrowed from getHashedFilename). Just adding
$chars[$i] = strtr( $chars[$i], '/\\*?"<>|~.', '__________' );
to the else branch of the if inside the for loop fixes the problem.
- When generating special pages (I suggest also generating Special:Allpages,
not only the categories), we should either make it possible to navigate
through the result set using static HTML or, for smaller wikis, get rid
of the limits and navigation altogether. I worked around this problem by
increasing the limits and deleting the forms from the particular pages.
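To make the first fix concrete, the patched behaviour can be modelled like this (a Python sketch of the directory hashing, not the actual PHP):

```python
def hashed_directory(dbkey, depth=3):
    """Model of the patched getHashedDirectory: the first `depth`
    characters of the page key become nested directory names, with
    any character unsafe in a path (including '.') replaced by '_'."""
    unsafe = '/\\*?"<>|~. '
    parts = ["_" if ch in unsafe else ch for ch in dbkey[:depth]]
    return "/".join(parts) + "/"
```

With this, a key of "1. " maps to "1/_/_/" instead of the broken "1/./_/".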
Also, the generation is painfully slow; is it possible to speed it up
somehow? I had to rerun it on a Sun Fire V20 with two processors to get an
at least somewhat reasonable generation time (two hours for the sk and cs
Wikipedias).
One last question: how do I get rid of the interwiki links to the other
language editions? They don't work in a local wiki, of course...
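A workaround I'm considering is to strip the interlanguage links from the wikitext before rendering; roughly (Python sketch, and the language-code list is just an example subset):

```python
import re

def strip_interwiki(wikitext, codes=("en", "de", "fr")):
    """Remove interlanguage links such as [[en:Truth]] for the
    given language codes, including a trailing newline if present."""
    pattern = r"\[\[(?:%s):[^\]]*\]\]\n?" % "|".join(codes)
    return re.sub(pattern, "", wikitext)
```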
Sincerely,
Juraj Bednar.