Okay, Plan B (which we all use anyway, after all, so why even try to
come up with a plan A, eh?):
If I copy the current (locked, admin only) [article] page contents to
the (open) [discussion] page to start off our "sort of edit every page"
wiki, is there a neat, fast MySQL/PHPMyAdmin command to copy the
database table contents of [article] into the table for [discussion] the
moment before we announce and go live, so the discussion pages start as
pure copies of the articles? Any link to such a routine?
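If it helps, here is a rough, untested SQL sketch of the kind of copy you
describe, assuming MediaWiki's default schema (namespace 0 is articles,
1 is their talk pages). It duplicates only the page rows and leaves both
copies pointing at the same latest revision, which MediaWiki does not
normally expect, so a maintenance script or Special:Export/Import is the
safer route; in any case, test it on a copy of the database first:

    -- Untested sketch: copy every article-namespace page row into the
    -- talk namespace. Revisions and text are NOT copied, both pages
    -- share page_latest, and the insert fails for any talk page that
    -- already exists (unique index on namespace+title).
    INSERT INTO page (page_namespace, page_title, page_restrictions,
                      page_counter, page_is_redirect, page_is_new,
                      page_random, page_touched, page_latest, page_len)
    SELECT 1, page_title, '', 0, page_is_redirect, 1,
           RAND(), page_touched, page_latest, page_len
    FROM page
    WHERE page_namespace = 0;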
(Otherwise, Plan C is to hand cut and paste - argh!) Thanks.
Hi
I was trying the Varnish cache but ran into this problem:
the URL changes and the backend port gets appended to it, which keeps MediaWiki
from using the Varnish cache. So instead of http://mw.wiki/wikistuff , it
changes to http://mw.wiki:backendport/wikistuff , which of course gives
errors. Is there a way to make MediaWiki stick to the URL without that
port (port 80 only)?
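One thing worth checking before touching Varnish itself: MediaWiki builds
its links from $wgServer, so if that setting is missing or includes the
backend port, every URL it emits will too. A minimal sketch for
LocalSettings.php (the hostname is just a placeholder):

    // LocalSettings.php -- pin the canonical URL to the public
    // hostname on port 80 so MediaWiki never appends the backend port.
    $wgServer = "http://mw.wiki";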
demo:
http://delicieux.us.to/mwsvn/
it will give an error with the port 4000; if you modify it (in the address
bar), it works...
Thanks
--user:alnokta
Hello,
For example, in mediawiki-1.10.2\includes\zhtable\ I can find some
basic tables (they are all very small), which contain some
common conversions such as "内存" to "記憶體" (RAM in English);
however, I can't find some of the extra conversions that are currently on
Wikipedia, such as "罗纳尔多" to "羅納度" (the football player `Ronaldo`), e.g. at
http://zh.wikipedia.org/w/index.php?title=%E7%BD%97%E7%BA%B3%E5%B0%94%E5%A4…
Do I need to download the table from somewhere else?
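(For comparison, the generated tables end up in includes/ZhConversion.php as
plain PHP arrays, so an extra mapping can in principle be appended locally;
the array name below matches what I see in 1.10-era sources, but verify it
in your copy. Also, if I remember right, many of zh.wikipedia's extra
conversions live in on-wiki pages like MediaWiki:Conversiontable/zh-tw
rather than in the tarball's tables.)

    // ZhConversion.php holds simplified-to-traditional mappings as
    // ordinary PHP arrays; an extra entry could be appended locally.
    $zh2Hant['罗纳尔多'] = '羅納度';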
Thanks.
Gentlemen, how might I best rig it so my Wikipedia cookies last longer
than a month? I live on top of a hill on a Pacific island and nobody
will be borrowing my terminal. Assume I don't know the owner of
Wikipedia so can't have them adjust things on their end.
Do I just close my browser and hack the dates
in ~/.mozilla/firefox/*/cookies.txt?
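For reference, that file uses the old Netscape cookie format, one cookie
per line, with the expiry as a Unix timestamp in the fifth field; the
cookie name and value below are only illustrative:

    # domain          subdomains  path  secure  expiry      name         value
    .wikipedia.org    TRUE        /     FALSE   1325376000  enwikiToken  <value>

Editing that fifth field to a later date (with the browser closed, as you
say) is exactly the kind of hack you describe, though the server may still
expire the session token on its own schedule.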
I am running mediawiki on MySQL on a Windows server. I am writing a TAG extension which needs to pull data from a MS SQL data source. In the extension's PHP code, I have the line:
$sqlconnect=odbc_connect($dsn,$username,$password);
When I run this PHP code outside of the wiki (e.g., at the DOS prompt by typing "php code.php"), the connection works and I actually pull data from the database. But inside the extension code, $sqlconnect ends up being null. Is there a setting I am missing that allows me to connect to an external database? Or is one not allowed to connect to a database in TAG extension code? (I am new to extensions.)
I also tried pulling the same data via web services. I tried the line:
$client = new SoapClient("http://www.mydomain.com/MyService.asmx?wsdl");
Again, I was able to pull data out of the web service successfully when I ran the code outside of the wiki at the DOS prompt. But once it is in the wiki extension code, my wiki pages go blank!
Again, is there a setting that I am missing that will allow me to connect to databases and web services in TAG extensions?
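For what it's worth, a blank page usually means PHP hit a fatal error, and
a classic cause is that the command-line PHP and the web server's PHP read
different php.ini files, so the soap/odbc extensions can be loaded for one
but not the other. A hedged sketch of how a tag hook could surface the real
error instead of dying silently (function name and variables are made up):

    // Illustrative tag-hook body: verify the needed PHP extensions are
    // loaded in the web server's PHP and report failures as page text.
    function renderMyTag( $input, $args, &$parser ) {
        // $dsn, $username, $password assumed to be defined elsewhere.
        if ( !extension_loaded( 'soap' ) || !extension_loaded( 'odbc' ) ) {
            return 'soap and/or odbc is not loaded in this PHP';
        }
        try {
            $client = new SoapClient( 'http://www.mydomain.com/MyService.asmx?wsdl' );
            // ... call $client->SomeMethod() here ...
        } catch ( SoapFault $e ) {
            return htmlspecialchars( $e->getMessage() );
        }
        $conn = @odbc_connect( $dsn, $username, $password );
        if ( $conn === false ) {
            return htmlspecialchars( odbc_errormsg() );
        }
        // ... odbc_exec( $conn, ... ) and build the tag output here ...
        return '...';
    }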
Thanks
George A. Kiraz
-------------------------------------------------
George A. Kiraz, M.St. (Oxon), M.Phil., Ph.D. (Cantab)
Hello,
A discussion has been under way on the MediaWiki Bugzilla (
http://bugzilla.wikimedia.org/show_bug.cgi?id=12131) about the fact that
different system messages follow different naming styles rather than a
single one. In the end, Brion suggested that we should use hyphens (-)
wherever possible, and avoid underscores or spaces in the names of
system messages.
I would like to ask all developers to follow this standard for all new
system messages they define.
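A concrete example of the suggested style, using the $messages array an
extension registers (the extension name and keys are invented for
illustration):

    // Hyphen-separated message keys, per the convention above.
    $messages['en'] = array(
        'myextension-desc'        => 'Adds an example parser tag.',
        'myextension-empty-input' => 'No input was given.',
    );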
Any input in this regard is more than welcome on the Bugzilla page. I
would also be happy to hear your ideas on renaming the existing system
messages to follow a single naming convention.
Cordially,
Hojjat (aka Huji)
>
> From: "Charlotte Webb" <charlottethewebb(a)gmail.com>
> Date: 2007/12/20 Thu PM 12:33:41 EST
> To: "Wikimedia developers" <wikitech-l(a)lists.wikimedia.org>
> Subject: Re: [Wikitech-l] Relative External Links?
>
> On 12/20/07, Andre-John Mas <ajmas(a)sympatico.ca> wrote:
> > I have recently installed Mediawiki as part of my website. Because I am
> > integrating it into an existing site I need to be able to link to stuff
> > already there. The issue I am currently having is that the site is accessed
> > via two URLs (internal and external). For this reason I need links referring
> > to the existing site to be relative.
>
> If I understand correctly, you're saying the second half of such a link
> (to non-mediawiki parts of your site) remains the same, but the first
> half varies depending on how the site is being accessed.
>
> If this is the case, and if the "internal" URL is only used by the
> webmaster (you), you could probably "internalize" all of the links
> (for you only) using a personal javascript at
> "User:Your_account_on_the_wiki/monobook.js" by doing a search and
> replace on all urls matching the "external" format.
>
> Something like:
> > addOnloadHook(function(){
> >   // Rewrite each link so the public hostname points at the internal one.
> >   var a = document.getElementsByTagName("a");
> >   for(var x = 0; x < a.length; x++)
> >     a[x].href = a[x].href.replace(/yoursite\.com/i, "back.door.of.yoursite.com");
> > });
>
I would like to make the support a bit more general than this, allowing
for links such as:
[/page page]
If I knew which class/method was responsible for parsing this I
would be willing to see what it would take to make the changes,
since I am not afraid of diving in; it's just that I don't know where I
should be looking.
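For what it's worth, the external-link pass lives in
Parser::replaceExternalLinks() in includes/Parser.php (at least in the 1.x
sources I have read), and it only fires on URLs with a recognized protocol.
A crude pre-parse rewrite gives the flavor of what [/page page] support
might involve; the regex and base URL here are only a sketch:

    // Hedged sketch: expand [/page label] into an ordinary external
    // link before the parser's external-link pass sees the text.
    $wikitext = preg_replace(
        '!\[/([^\] ]*) ([^\]]+)\]!',      // matches [/page label]
        '[http://www.example.com/$1 $2]', // rewrite to a full URL
        $wikitext
    );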
Any ideas?
Andre
Hi,
since October 9 no German dump has been available. The last dump run on
November 14 failed, and the last general status update at
http://download.wikimedia.org/backup-index.html is from December 5th.
Can someone please tell us when dumping will be continued? Or has the
address of the download page changed?
Thank you!
jo
On 12/21/2007 05:02 PM, Lars Aronsson wrote:
> Hi Johann,
>
>> When I just want current pages with no discussion and
>> no history but with all the templates and category lists
>> working correctly, which of all the files in
>> http://download.wikimedia.org/enwiki/latest/
>
> That should be the pages-articles.xml
> "the one most people will want".
>
What puzzles me about that is that there are so many
other files in the download dir ... don't I need any
of those?
When I last tried this (I downloaded pages-articles.xml,
converted it with xml2sql, and imported it with mysqlimport),
templates did not display properly
and no category lists were available at all.
>> do I have to download and
>> how do I have to import the data so that the import
>> is complete and fast?
>
> Fast? Haha, no, it won't be.
>
> You should try it out on a smaller language than English.
> Try Faroese (fo) with 2,700 articles or Anglo-Saxon (ang) with 900
> articles.
>
Well, I meant *relatively* fast :)
It seems there are different ways to do the import, which
might differ in speed for the same dump.
I also wonder if there are any settings that would
speed up the import into the local DB ... e.g., I
turned off MySQL's binary log, which seemed to speed
things up. I only want to read the local copy, not
update anything.
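Turning off the binary log is one of the standard tricks; a few
session-level settings usually help too when the copy is read-only anyway.
These are stock MySQL options, but note they are per-session, so they only
apply if you feed the SQL through the same mysql client session (with
mysqlimport you would need the global equivalents):

    -- Typical bulk-import settings for a local, read-only copy.
    SET autocommit = 0;
    SET unique_checks = 0;
    SET foreign_key_checks = 0;
    -- ... load the data here ...
    COMMIT;
    SET unique_checks = 1;
    SET foreign_key_checks = 1;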
Cheers,
Johann