Rich,
To rebuild the images, change into the maintenance folder and run rebuildImages.php.
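In shell terms, a sketch of the invocation (the install path is a placeholder; substitute your own wiki root):

```shell
# Placeholder path -- substitute your MediaWiki root.
MW=/var/www/wiki
# From the maintenance folder, refresh metadata for files in the image table:
#     cd "$MW/maintenance" && php rebuildImages.php
# With --missing, it instead adds files present in images/ but absent from the table:
#     cd "$MW/maintenance" && php rebuildImages.php --missing
echo "$MW/maintenance/rebuildImages.php"
```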
Thanks,
Kevin
On Wed, Feb 6, 2019 at 7:00 AM mediawiki-l-request@lists.wikimedia.org wrote:
Send MediaWiki-l mailing list submissions to mediawiki-l@lists.wikimedia.org
To subscribe or unsubscribe via the World Wide Web, visit https://lists.wikimedia.org/mailman/listinfo/mediawiki-l or, via email, send a message with subject or body 'help' to mediawiki-l-request@lists.wikimedia.org
You can reach the person managing the list at mediawiki-l-owner@lists.wikimedia.org
When replying, please edit your Subject line so it is more specific than "Re: Contents of MediaWiki-l digest..."
Today's Topics:
- Re: help importing image into a new wiki (Manuela)
- Re: help importing image into a new wiki (Evans, Richard K. (GRC-H000))
- Re: What's the best way to improve performance, with regard to edit rate (Ariel Glenn WMF)
- Re: What's the best way to improve performance, with regard to edit rate (Brian Wolff)
Message: 1
Date: Tue, 5 Feb 2019 10:01:14 -0700 (MST)
From: Manuela <pressephotografin@gmail.com>
To: mediawiki-l@lists.wikimedia.org
Subject: Re: [MediaWiki-l] help importing image into a new wiki
Message-ID: <1549386074545-0.post@n8.nabble.com>
Content-Type: text/plain; charset=us-ascii
Hi Rich,
I don't know why your approach does not work.
This is what I do: I export the database as SQL using phpMyAdmin, import it on the new host, and copy the images folder. This has always worked for me.
Manuela
-- Sent from: http://mediawiki-i.429.n8.nabble.com/
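Manuela's full-database approach can be sketched from the command line (phpMyAdmin's Export tab does the equivalent of step 1; all names and paths below are placeholders):

```shell
# Placeholder names and paths -- substitute your own.
DB=wikidb
WIKI=/var/www/wiki
# 1. Dump the whole database on the old host:
#        mysqldump -u root -p "$DB" > wiki.sql
# 2. Load it into an empty database on the new host:
#        mysql -u root -p "$DB" < wiki.sql
# 3. Copy the images directory across, preserving permissions and timestamps:
#        rsync -a "$WIKI/images/" newhost:"$WIKI/images/"
echo "$DB $WIKI"
```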
Message: 2
Date: Tue, 5 Feb 2019 17:13:25 +0000
From: "Evans, Richard K. (GRC-H000)" <richard.k.evans@nasa.gov>
To: MediaWiki announcements and site admin list <mediawiki-l@lists.wikimedia.org>
Subject: Re: [MediaWiki-l] help importing image into a new wiki
Message-ID: <DC3FB55EE7FEFD409A6FC1EA00D50D0B0FD456B3@NDJSMBX203.ndc.nasa.gov>
Content-Type: text/plain; charset="utf-8"
Unfortunately I'm not exporting the database as SQL directly as you are. I'm using the exportDump.php maintenance script to get the pages and page content as an XML file. Your approach works because the database transfer includes all data in all tables; my approach carries just the page names and page content as XML, imported as new articles.
I can't (really don't want to) export the database directly as SQL, because the old site runs a very old version, MW 1.17, and the new site is MW 1.30.
The XML export from 1.17 and import into 1.30 went wonderfully well (except for images). I have the images folder from the 1.17 site copied into the 1.30 site; I just need to figure out how to tell MediaWiki to "rebuild the image pages" from the images folder.
Anyone?
/Rich
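For the archives, the whole XML round-trip plus the image re-registration step can be sketched as follows. Paths are placeholders, and the script names are those shipped in a stock maintenance/ tree (dumpBackup.php, importDump.php, rebuildImages.php):

```shell
# Placeholder paths -- OLD is the 1.17 tree, NEW the 1.30 tree.
OLD=/var/www/oldwiki
NEW=/var/www/newwiki
# 1. On the old wiki, dump all pages (current revisions only) as XML:
#        php "$OLD/maintenance/dumpBackup.php" --current > pages.xml
# 2. On the new wiki, import the XML:
#        php "$NEW/maintenance/importDump.php" < pages.xml
# 3. Copy the old images/ directory into the new tree, then register the files:
#        php "$NEW/maintenance/rebuildImages.php" --missing
echo "$OLD -> $NEW"
```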
MediaWiki-l mailing list
To unsubscribe, go to: https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
Message: 3
Date: Wed, 6 Feb 2019 10:54:00 +0200
From: Ariel Glenn WMF <ariel@wikimedia.org>
To: MediaWiki announcements and site admin list <mediawiki-l@lists.wikimedia.org>
Subject: Re: [MediaWiki-l] What's the best way to improve performance, with regard to edit rate
Message-ID: <CALCvg_5c1HVRFNhMivtYHTK5JG6SDJd2mtDFY5Hr6mZCEG50Bw@mail.gmail.com>
Content-Type: text/plain; charset="UTF-8"
It's not split up (sharded) across servers, at least as far as the page and revision tables go. There is one active master at any given time that handles all writes; the current host has 160GB of memory and 10 physical cores (20 with hyperthreading). The actual revision *content* for all projects is indeed split up across several servers, in an 'external storage' cluster. The current server configuration is available at https://noc.wikimedia.org/conf/highlight.php?file=db-eqiad.php
You can get basic specs as well as load information on these servers by looking up each one in grafana; here's db1067 (the current enwiki master) as an example:
https://grafana.wikimedia.org/d/000000607/cluster-overview?orgId=1&var-d...
Ariel
On Wed, Nov 28, 2018 at 4:34 PM Hershel Robinson hershelsr@gmail.com wrote:
I've heard that Wikimedia splits their enwiki database up among more than one server; is that how they're able to handle several page saves per second on the master?
That is correct. See here https://meta.wikimedia.org/wiki/Wikimedia_servers for more details.
-- http://civihosting.com/ Simply the best in shared hosting
Message: 4
Date: Wed, 6 Feb 2019 10:46:03 +0000
From: Brian Wolff <bawolff@gmail.com>
To: MediaWiki announcements and site admin list <mediawiki-l@lists.wikimedia.org>
Subject: Re: [MediaWiki-l] What's the best way to improve performance, with regard to edit rate
Message-ID: <CA+oo+DWr-qEeL=su2Pbt31XX04LrDmw8A15XnuAHURksXXmVHw@mail.gmail.com>
Content-Type: text/plain; charset="UTF-8"
What is your caching setup (e.g. $wgMainCacheType and friends)? Caching probably has more of an effect on read time than save time, but it will also have an effect on save time, probably a significant one. If it's just one server, APCu (i.e. CACHE_ACCEL) is probably the easiest to set up.
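A minimal sketch of what that looks like in LocalSettings.php, assuming the APCu PHP extension is installed and enabled:

```php
// Minimal single-server cache setup -- assumes the APCu PHP extension is loaded.
$wgMainCacheType   = CACHE_ACCEL;  // objects and messages in the PHP accelerator
$wgParserCacheType = CACHE_ACCEL;  // cache parsed page HTML there too
```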
There are a number of factors that can affect page save time. In many cases it depends on the content of your page edits (e.g. if your edits have lots of embedded images, 404 handling can result in significant improvements).
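On the 404-handling point, a sketch of the LocalSettings.php side; matching web-server rewrite rules that route requests for missing thumbnail files to thumb.php are also required (see Manual:Thumb.php on mediawiki.org):

```php
// 404 thumbnailing: don't render thumbnails during parse/save; instead let the
// web server route requests for missing thumbnail files to thumb.php.
// (A matching RewriteRule/try_files rule on the web server is also required.)
$wgGenerateThumbnailOnParse = false;
```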
The first step I would suggest is profiling: https://www.mediawiki.org/wiki/Manual:Profiling. This will tell you which part is slow, and we can give more specific advice based on that.
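From that manual page, enabling the profiler is a LocalSettings.php fragment along these lines; this is a sketch (the XHProf backend needs the tideways_xhprof or legacy xhprof PHP extension, and 'text' is only one of the output options):

```php
// Profile every request and append a plain-text call summary to the output.
// Requires the tideways_xhprof (or legacy xhprof) PHP extension.
$wgProfiler['class']  = ProfilerXhprof::class;
$wgProfiler['output'] = 'text';
```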
-- Brian
On Wed, Nov 28, 2018 at 2:17 PM Star Struck starstruck7200@gmail.com wrote:
The server is just the localhost that I use for testing; it's Apache and MySQL running on Ubuntu 18.04 on an HP Elite 3.0GHz with 4GB of RAM <https://www.amazon.com/HP-Elite-Professional-Certified-Refurbished/dp/B0094J...>.
I haven't really figured out what kind of hardware or software I want to use for production, because I haven't done a lot of server administration. (I've typically just used a VPS when I needed web hosting, but perhaps my needs have expanded beyond that, because this is going to be a huge wiki, on the same scale as Wikipedia. My main concern at the moment is making page saves, rather than page loads, more efficient, since I don't necessarily anticipate having a lot of visitors from the Internet reading the wiki, or else I'd be focusing more on things like caching; I mostly just want to set up a workable proof of concept for now.)
I've heard that Wikimedia splits their enwiki database up among more than one server; is that how they're able to handle several page saves per second on the master?
On Wed, Nov 28, 2018 at 8:19 AM Hershel Robinson hershelsr@gmail.com wrote:
What's the best way to boost performance ...
Depends on a myriad of factors, such as the OS, web server, database, hardware, etc.
The simplest answer is to increase your hardware resources, meaning if the site has one CPU, give it two. For anything more specific, we would need more details about the server software and hardware.
Hershel
Subject: Digest Footer
MediaWiki-l mailing list
MediaWiki-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
End of MediaWiki-l Digest, Vol 185, Issue 5