Hi all,
I'm experiencing an extremely slow import of pagelinks.sql (MyISAM). It's only
about 13,000,000 rows of data, so it shouldn't take many minutes, yet it
takes hours.
I noticed that the time for inserts increases as the table grows. Peeking
with SHOW PROCESSLIST sometimes shows:
+------+------------+--------------------------------+
| Time | State      | Info                           |
+------+------------+--------------------------------+
|   38 | ** DEAD ** | INSERT INTO 'pagelinks' ...    |
+------+------------+--------------------------------+
I thought it'd be enough to disable keys etc. during the import, like so:
SET AUTOCOMMIT=0;
SET UNIQUE_CHECKS=0;
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE `pagelinks` DISABLE KEYS;
...and after import activate them again like so:
SET UNIQUE_CHECKS=1;
SET FOREIGN_KEY_CHECKS=1;
ALTER TABLE `pagelinks` ENABLE KEYS;
COMMIT;
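One thing worth knowing here: on MyISAM, DISABLE KEYS only suspends updates to *non-unique* indexes; the primary key and any unique indexes are still maintained row by row, which would match inserts getting slower as the table grows. For what it's worth, the sequence above can also be scripted so each test run is repeatable. This is only a sketch under assumptions (connection details are placeholders, and the statement splitter is naive: it assumes the dump ends each statement with ";" at end of line and contains no stored routines):

```php
<?php
// Naive splitter: assumes every statement in the dump ends with ";"
// at end of line, and skips blank chunks and "--" comment-only chunks.
function split_dump_statements(string $sql): array {
    $parts = preg_split('/;\s*\n/', $sql);
    $parts = array_map('trim', $parts);
    $parts = array_filter($parts, function ($s) {
        return $s !== '' && strncmp($s, '--', 2) !== 0;
    });
    return array_values($parts);
}

// Replays a dump with the same session settings as in the post.
function import_dump(mysqli $db, string $path): void {
    $db->query('SET UNIQUE_CHECKS=0');
    $db->query('SET FOREIGN_KEY_CHECKS=0');
    $db->query('ALTER TABLE `pagelinks` DISABLE KEYS');
    foreach (split_dump_statements(file_get_contents($path)) as $stmt) {
        $db->query($stmt);
    }
    $db->query('ALTER TABLE `pagelinks` ENABLE KEYS');
    $db->query('SET UNIQUE_CHECKS=1');
    $db->query('SET FOREIGN_KEY_CHECKS=1');
}

// Usage (placeholder credentials/database):
// import_dump(new mysqli('localhost', 'root', '', 'wikidb'), 'pagelinks.sql');
```

If the unique-index maintenance really is the bottleneck, a larger key_buffer_size in my.ini is the usual server-side lever to try.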
Since it took so long, I aborted the import (I first tried importing the
tables in alphabetical order), imported all the other tables first, and
am now importing pagelinks last.
I have a dual-core machine with 2 GB RAM, running XAMPP Lite on Windows.
What am I doing wrong? Does import order really matter that much? (I'd
have already tried every trick if it didn't take soooo loooooong to try
'em all when it's this slow.)
Any ideas about how to reduce time /significantly/ from hours down to
minutes? I use this data for testing ("consuming" it by processing it)
and thus I need to often "reinstall" the data, in several databases.
// Rolf Lampa
I got an email from someone accessing my wiki:
http://against-the-day.pynchonwiki.com/wiki/index.php?title=Main_Page
I'm using MediaWiki 1.9.3.
Here's the message I received:
I get a message from Avast anti-virus that the page is infected with
HTML:Iframe-inf. I have contacted Avast, who say this is not a false
positive. You can Google the virus name; it seems this infection is
fairly common.
Please let me know if the infection is real--I would like to use the
wiki.
Any idea of what I should do to address this?
Thanks!
Tim
Tim Ware
HyperArts .. 201 4th Street, Ste 404 .. Oakland CA 94607
t: (510) 339-6084 .. f: (510) 339-6086 .. e: tim(a)hyperarts.com
twitter.com/hyperarts .. http://www.hyperarts.com
I'm wondering how I would go about overriding the getCategoryLinks method.
I figured that in the extension PHP code I have, I could just put:
class Mine extends Skin {
    function getCategoryLinks() {
        return 'new';
    }
}
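One likely reason a subclass like this has no effect: MediaWiki never instantiates a Skin subclass just because it's defined — the override only runs on the skin class the wiki actually selects, so you'd normally extend the skin in use (e.g. SkinMonoBook) and register/select it. The sketch below uses a stub Skin class purely so the override mechanics are visible on their own; the stub and its return value are invented for illustration, not MediaWiki's real class:

```php
<?php
// Stub standing in for MediaWiki's Skin class so this runs standalone.
// On a real wiki you would extend the active skin and make MediaWiki
// select it; an unused subclass is simply never called.
class Skin {
    function getCategoryLinks() {
        return 'original category links';
    }
}

class Mine extends Skin {
    function getCategoryLinks() {
        return 'new';
    }
}

// The override only matters on the instance that actually gets created:
$skin = new Mine();
echo $skin->getCategoryLinks(), "\n"; // prints "new"
```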
But no go. Any ideas?
Thanks,
-Adam
I've been away for a bit but am now able to continue developing my site.
Last time I searched I didn't find one, but I still wonder: is there a
personal-ad extension for MediaWiki somewhere to be downloaded? Or, if
there's nothing dedicated, maybe someone is using something else for this
kind of thing?
What I'm looking for would let people market their products through
small ads on the site (duh). Of course I could create a template for
this, but something that handles stuff automagically would be nice.
Regards,
Martin S
I'm writing a plugin that needs to know the last section number of an
article, or how many sections a page has, or even a way to list all the
sections of an article (maybe by name?).
I'm using the API for it.
Does anyone know of anything like this in the API?
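For reference, the core API's action=parse with prop=sections returns every section of a page with its index, level and heading text, which covers all three variants of the question. A minimal sketch (the api.php endpoint and page name are placeholders for your wiki):

```php
<?php
// Build a request against action=parse&prop=sections, which lists a
// page's sections with their index, level and heading.
function sections_url(string $api, string $page): string {
    return $api . '?' . http_build_query([
        'action' => 'parse',
        'page'   => $page,
        'prop'   => 'sections',
        'format' => 'json',
    ]);
}

// Pull the last section's index out of the decoded JSON response.
function last_section_index(array $response): ?string {
    $sections = $response['parse']['sections'] ?? [];
    $last = end($sections);
    return $last ? $last['index'] : null;
}

// Usage (uncomment against a real wiki):
// $data = json_decode(file_get_contents(
//     sections_url('http://example.org/w/api.php', 'Main_Page')), true);
// echo last_section_index($data);
```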
-Adam
It was very much preferable when this was still a maintenance script, mostly
because it had the useful property of working!
/var/www/emergent/extensions/DumpHTML$ php dumpHTML.php /tmp/emergent_static/ -k monobook
Initialising destination directory...
Warning: mkdir(): Permission denied in /var/www/mediawiki/bleeding_edge/includes/GlobalFunctions.php on line 2072
Unable to create destination directory.
Backtrace:
#0 /var/www/mediawiki/bleeding_edge/extensions/DumpHTML/dumpHTML.php(137): DumpHTML->setupDestDir()
#1 {main}
Now, let's comment out line 137 of dumpHTML.php and see what happens (I
had already created this directory anyway; it doesn't need to do that).
/var/www/emergent/extensions/DumpHTML$ php dumpHTML.php /tmp/emergent_static/ -k monobook
Creating static HTML dump in directory /var/www/mediawiki/code/static.
Using database localhost
Starting from page_id 1 of 813
Warning: mkdir(): Permission denied in /var/www/mediawiki/bleeding_edge/includes/GlobalFunctions.php on line 2072
Warning: file_put_contents(/var/www/mediawiki/code/static/misc/logo.png): failed to open stream: No such file or directory in /var/www/mediawiki/bleeding_edge/extensions/DumpHTML/dumpHTML.inc on line 778
Warning: mkdir(): Permission denied in /var/www/mediawiki/bleeding_edge/includes/GlobalFunctions.php on line 2072
Warning: file_put_contents(/var/www/mediawiki/code/static/misc/favicon.ico): failed to open stream: No such file or directory in /var/www/mediawiki/bleeding_edge/extensions/DumpHTML/dumpHTML.inc on line 778
Processing ID: 1
Warning: mkdir(): Permission denied in /var/www/mediawiki/bleeding_edge/includes/GlobalFunctions.php on line 2072
Error: unable to create directory '/var/www/mediawiki/code/static/articles/m/a/i'.
Warning: file_put_contents(/var/www/mediawiki/code/static/articles/m/a/i/Main_Page_(old)_0e7f.html): failed to open stream: No such file or directory in /var/www/mediawiki/bleeding_edge/extensions/DumpHTML/dumpHTML.inc on line 564
Can't open file '/var/www/mediawiki/code/static/articles/m/a/i/Main_Page_(old)_0e7f.html' for writing.
Check permissions or use another destination (-d).
I don't know why this script isn't working. But one thing is clear - it is
thinking *entirely too hard* about what it's supposed to be doing. Whatever
changes have been made since this was a maintenance script have not been
great changes for me.
I tried using wget. Something like: wget -e robots=off --mirror -I emergent
--convert-links --no-parent http://grey.colorado.edu/emergent
But MediaWiki is a dynamic-content trap when it comes to wget, so I am
presently without a solution.
Does anyone see what's wrong with this script?
Thanks,
Brian
Hello. I am using two languages on my site, Arabic and English. The
problem I have is that Arabic script is read from right to left. How can
I make certain lines read right-to-left (for Arabic) and certain lines
left-to-right (for English)?
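For reference, the usual approach is HTML's dir attribute, which MediaWiki passes through in wikitext. A minimal sketch, assuming raw HTML like <div> is allowed on the wiki (the sample text is just placeholder content):

```html
<div dir="rtl" lang="ar">نص عربي هنا</div>
<div dir="ltr" lang="en">English text here</div>
```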
Please help.
Thanks,
Alan.
I can't find or figure out how to do an API query through plain PHP.
I can do them, and am doing them, perfectly through AJAX/JSON right now,
but I'd like to do it with PHP if possible.
Is this doable?
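It is: the same api.php query issued from AJAX can be made server-side. A minimal sketch, assuming allow_url_fopen is enabled (otherwise cURL is the usual fallback); the endpoint and parameters below are placeholders:

```php
<?php
// Build an api.php URL from a parameter array, always asking for JSON.
function api_url(string $api, array $params): string {
    $params['format'] = 'json';
    return $api . '?' . http_build_query($params);
}

// Fetch and decode in one step. Needs allow_url_fopen; swap in cURL
// if your host disables remote file_get_contents.
function api_get(string $api, array $params): array {
    return json_decode(file_get_contents(api_url($api, $params)), true);
}

// Usage (placeholder wiki and query):
// $data = api_get('http://example.org/w/api.php', [
//     'action' => 'query',
//     'titles' => 'Main_Page',
//     'prop'   => 'info',
// ]);
// print_r($data);
```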
Thanks,
-Adam
Hi everyone,
One of my users has an odd problem, and I'm hoping the fix is simple :).
This user clicks links in the wiki and, seemingly at random, some pages
are shown as raw code in the browser while others render fine. Clicking
the same link again renders the page properly. This screenshot
(http://yfrog.com/5avdlt) is an example of what happens when the browser
displays the code instead of rendering it.
Any ideas?
Thanks,
Bryan
Is there a way to find/show a list of Wiki pages (or sub-pages) that
have Talk pages with content?
I've been poking around in the Extensions and doing some searching but
haven't come up with anything yet.
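Absent a dedicated special page or extension, the database can answer this directly: in MediaWiki's schema each talk namespace is its subject namespace + 1 (the odd numbers), so joining the page table to itself lists subject pages whose talk page exists. The sketch below uses an in-memory SQLite table purely so the query is runnable anywhere; on a live wiki you'd run the same SELECT against the MySQL wiki database (minding any $wgDBprefix on the table name):

```php
<?php
// In-memory stand-in for the wiki's page table, just to demo the query.
$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE page (
    page_namespace INTEGER,
    page_title     TEXT,
    page_len       INTEGER
)');
$db->exec("INSERT INTO page VALUES
    (0, 'HasTalk', 100),
    (1, 'HasTalk',  42),
    (0, 'NoTalk',  100)");

// Subject pages (even namespaces) that have a non-empty talk page
// (same title, namespace + 1).
$rows = $db->query(
    "SELECT subj.page_title
       FROM page AS subj
       JOIN page AS talk
         ON talk.page_namespace = subj.page_namespace + 1
        AND talk.page_title     = subj.page_title
      WHERE subj.page_namespace % 2 = 0
        AND talk.page_len > 0"
)->fetchAll(PDO::FETCH_COLUMN);

print_r($rows); // lists "HasTalk" only
```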
C.
--
Clayton Cornell ccornell(a)openoffice.org
OpenOffice.org Documentation Project co-lead
StarOffice - Sun Microsystems, Inc. - Hamburg, Germany