-------- Original Message --------
Subject: [Wikimania-l] Meetup MediaWiki release management
Date: Fri, 9 Aug 2013 14:02:57 +0000
From: Markus Glaser
Hello all,
I just proposed a meetup on MediaWiki Release Management:
http://wikimania2013.wikimedia.org/wiki/Meetups#MediaWiki_Release_Management
Time: Saturday, 1430-1530 in room N114 (next to Logo Square).
Description: MediaWiki is mainly developed for the Wikimedia sites. So,
naturally, there is some room for improvement when it comes to the
tarball releases. Besides automating the release process, we care a lot
about third party users and extension developers. So in this meetup,
we'd like to listen to you. What are your needs? Where do you feel the
current release (process) is not good enough? What are the biggest
issues? And: do you want to help?
In case you are interested, please sign up or just show up :)
Best,
Markus (mglaser) and Mark (hexmode)
Dear Officer,
This is Mr. King Wei from Wuhan Line Power Transmission Equipment Co., Ltd. We found that our company's founding year on http://supplier.wikiwhere.net/Wuhan_Line_Power_Transmission_Equipment_Co.,_…. is incorrect; we want to change 2008 to 2000. Could you please tell me which editor could modify this article?
We appreciate your further instructions and look forward to your reply!
Best regards
King Wei
Import & Export Manager
Wuhan Line Power Transmission Equipment Co., Ltd.
No.5 Yangguang Road, Miaoshan, Wuhan (430223), Hubei, China.
Tel.: +86 27 81319015 ext. 8032
Fax.: +86 27 81319175
Mob.: +86 15827280627
www.whlinepower.com
Skype: wlbjiab
From: mediawiki-l-request
Date: 2013-08-07 20:00
To: mediawiki-l
Subject: MediaWiki-l Digest, Vol 119, Issue 4
Send MediaWiki-l mailing list submissions to
mediawiki-l(a)lists.wikimedia.org
To subscribe or unsubscribe via the World Wide Web, visit
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
or, via email, send a message with subject or body 'help' to
mediawiki-l-request(a)lists.wikimedia.org
You can reach the person managing the list at
mediawiki-l-owner(a)lists.wikimedia.org
When replying, please edit your Subject line so it is more specific
than "Re: Contents of MediaWiki-l digest..."
Today's Topics:
1. Page edit question (John W. Foster)
2. dumpBackup.php (John W. Foster)
----------------------------------------------------------------------
Message: 1
Date: Tue, 06 Aug 2013 13:30:56 -0500
From: "John W. Foster" <jfoster81747(a)gmail.com>
To: Mediawiki <mediawiki-l(a)lists.wikimedia.org>
Subject: [MediaWiki-l] Page edit question
Message-ID: <1375813856.14457.1.camel(a)beast.home>
Content-Type: text/plain; charset=UTF-8
I'm trying to edit a main page on a new site; I'm getting this error and
am not sure why, as the file named in the error is installed where it is
expected. Any help is appreciated. I'm new to Semantic MediaWiki.
> Warning: Missing argument 2 for ParamProcessor
> \ParamDefinition::__construct(), called
> in /home/content/07/11469707/html/extensions/SemanticMediaWiki/includes/SMW_QueryProcessor.php on line 589 and defined in /home/content/07/11469707/html/extensions/Validator/includes/ParamProcessor/ParamDefinition.php on line 175
>
> Fatal error: Call to a member function setPrintRequests() on a
> non-object
> in /home/content/07/11469707/html/extensions/SemanticMediaWiki/includes/SMW_QueryProcessor.php on line 79
john
------------------------------
Message: 2
Date: Tue, 06 Aug 2013 21:08:37 -0500
From: "John W. Foster" <jfoster81747(a)gmail.com>
To: mediawiki-list <mediawiki-l(a)lists.wikimedia.org>
Subject: [MediaWiki-l] dumpBackup.php
Message-ID: <1375841317.23926.3.camel(a)beast.home>
Content-Type: text/plain; charset=UTF-8
I've used this to dump what I hoped would be a complete backup of my
locally hosted MediaWiki. The purpose was to import the .xml file
into a new working server. The script to import it did the job; however,
the dumped .xml file did not contain all the articles. It imported 476 of
a site that contains 5391 articles. Just wondering why. I've done it 3
times with no different results.
john
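For reference: a complete dump with full page history is normally produced with dumpBackup.php's --full option, and the import is usually followed by a recent-changes rebuild. A minimal sketch, assuming a standard maintenance/ directory; the dump file name is just a placeholder:

php maintenance/dumpBackup.php --full > dump.xml      # on the old server
php maintenance/importDump.php dump.xml               # on the new server
php maintenance/rebuildrecentchanges.php              # on the new server

A quick sanity check for a truncated dump is whether dump.xml ends with a closing </mediawiki> tag; if it does not, the dump stopped before finishing.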
------------------------------
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
End of MediaWiki-l Digest, Vol 119, Issue 4
*******************************************
Hello,
I'm having this problem every time I create a new user.
The user gets the password by email, but when they try to log in, it
fails.
Then I have to set the password with maintenance/changePassword.php, and
then it works fine.
Any thoughts on what might be causing this issue?
--
[]'s
Camponez
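For reference, the changePassword.php workaround described above is typically invoked like this; the username and password here are placeholders, not values from the original report:

php maintenance/changePassword.php --user=SomeUser --password=NewTempPass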
----------
Sent from my Nokia phone
------Original message------
From: <mediawiki-l-request(a)lists.wikimedia.org>
To: <mediawiki-l(a)lists.wikimedia.org>
Date: Wednesday, July 31, 2013 12:00:21 PM GMT+0000
Subject: MediaWiki-l Digest, Vol 118, Issue 25
Send MediaWiki-l mailing list submissions to
mediawiki-l(a)lists.wikimedia.org
To subscribe or unsubscribe via the World Wide Web, visit
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
or, via email, send a message with subject or body 'help' to
mediawiki-l-request(a)lists.wikimedia.org
You can reach the person managing the list at
mediawiki-l-owner(a)lists.wikimedia.org
When replying, please edit your Subject line so it is more specific
than "Re: Contents of MediaWiki-l digest..."
Today's Topics:
1. integrating analytics into your MediaWiki installation
(Sumana Harihareswara)
----------------------------------------------------------------------
Message: 1
Date: Tue, 30 Jul 2013 16:37:43 -0400
From: Sumana Harihareswara <sumanah(a)wikimedia.org>
To: MediaWiki announcements and site admin list
<mediawiki-l(a)lists.wikimedia.org>
Cc: Daniel Friesen <daniel(a)nadir-seen-fire.com>
Subject: [MediaWiki-l] integrating analytics into your MediaWiki
installation
Message-ID: <51F82417.8030409(a)wikimedia.org>
Content-Type: text/plain; charset=ISO-8859-1
A shoutout to Daniel Friesen -- thanks, Daniel! -- in this post about
what to change in LocalSettings.php to integrate analytics:
http://infotrope.net/2013/07/30/clicky-analytics-with-mediawiki/
The blogger, Skud, particularly recommends Clicky Analytics
<http://clicky.com/>.
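One common way to do this kind of integration (not necessarily what the post uses) is to append the provider's tracking JavaScript from LocalSettings.php through the SkinAfterBottomScripts hook. A minimal sketch; the script URL is a placeholder, not Clicky's real embed code:

// LocalSettings.php - append an analytics snippet to the bottom of every page.
$wgHooks['SkinAfterBottomScripts'][] = function ( $skin, &$text ) {
    // Replace the placeholder URL with the tracking code your provider gives you.
    $text .= '<script src="//analytics.example.com/tracker.js" async="async"></script>';
    return true;
};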
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
------------------------------
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
End of MediaWiki-l Digest, Vol 118, Issue 25
********************************************
Hi,
Has anyone tested MediaWiki with MySQL 5.6.x? I'm aware that Wikimedia
now uses MariaDB, but I'm currently stuck with MySQL on Amazon RDS and
was wondering whether I should upgrade from 5.5.31 to 5.6.12.
-Kudu.
Hi All
Just wanted to run this by some on the list. I've been scratching my head
over how to deal with file caching while using the MobileFrontend extension
on one domain. I know the preferred method is a mobile domain and a desktop
domain, but I have no way of installing AMF, and the manual page is a little
vague on the rest of what to do (.htaccess, etc.). Dealing with two domains
and not being able to create symlinks is another hurdle; I'd have to
maintain two instances of MediaWiki (as best as I can tell). A couple of
references to how Wikimedia does it are dead links, so I started
brainstorming.
I came across an old post about setting file caching dynamically with
$wgRenderHashAppend (thanks, Platonides!). Here is what I came up with:
a user-agent check to detect mobile devices, inserted at the top of my
LocalSettings.php. I know it won't handle them all.
// Mobile detect: pick a cache variant based on the requesting user agent.
if ( !defined( 'REQUEST_MOBILE_UA' ) ) {
    define( 'REQUEST_MOBILE_UA',
        '(iPhone|Android|MIDP|AvantGo|BlackBerry|J2ME|Opera Mini|DoCoMo|NetFront|Nokia|PalmOS|PalmSource|portalmmm|Plucker|ReqwirelessWeb|SonyEricsson|Symbian|UP\.Browser|Windows CE|Xiino)' );
}
// HTTP_USER_AGENT is absent when LocalSettings.php is loaded from the command
// line (maintenance scripts), so fall back to an empty string there.
$userAgent = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';
if ( preg_match( '/' . REQUEST_MOBILE_UA . '/i', $userAgent ) ) {
    $wgRenderHashAppend = 'mobile';
} else {
    $wgRenderHashAppend = 'desktop';
}
// End Mobile detect
Then I set my file cache directory to append the result:
$wgFileCacheDirectory = "$IP/cache/html/$wgRenderHashAppend";
It seems to work as far as I can tell, and you are able to switch back
and forth between desktop and mobile. Once you make the switch, you stay
in desktop until you request mobile again. It even creates the file cache
pages in the correct directory, /desktop, if they don't exist when you
switch over from mobile. Switch back to mobile view and you will be served
cached pages from the mobile directory if they exist; otherwise they are
created in the mobile directory.
Am I missing anything? Advice welcome.
Thanks
Tom