import userlib
import wikipedia

site = wikipedia.getSite()
for username in usernames:  # `usernames`: your list of user names (don't shadow the builtin `list`)
    user = userlib.User(site, username)
    user.sendMail('Subject', 'email contents')
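For what it's worth, a hedged sketch of building that user list from the web API instead of a database query. The `all_users` helper is my own, and it deliberately makes a single request; a real wiki with many users would need `aufrom` continuation, omitted here for brevity.

```python
# Sketch: fetch registered user names via the MediaWiki web API
# (list=allusers). Single request only; paginate with `aufrom`
# on wikis with more users than `aulimit` allows per request.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def all_users(api_url, limit=500):
    """Yield registered user names from api.php via list=allusers."""
    params = {"action": "query", "list": "allusers",
              "aulimit": limit, "format": "json"}
    data = json.load(urlopen(api_url + "?" + urlencode(params)))
    for entry in data["query"]["allusers"]:
        yield entry["name"]
```

Each yielded name could then be fed to the `userlib.User(...).sendMail(...)` loop above.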
On Sat, Nov 9, 2013 at 9:53 AM, John <phoenixoverride(a)gmail.com> wrote:
> If you can get a list of users, perhaps from a database query, it's about
> five lines of code to mass-email them.
>
> On Sat, Nov 9, 2013 at 9:04 AM, Fred Bauder <fredbaud(a)fairpoint.net>wrote:
>
>> > Depends on the wiki configuration. It is fairly easy to mass email all
>> > users who have email enabled, via pywikibot
>>
>> Perhaps I will struggle with pywikibot again... Can't be any harder than
>> cube roots.
>>
>> Fred
>>
>> >
>> > On Sat, Nov 9, 2013 at 7:08 AM, Fred Bauder <fredbaud(a)fairpoint.net>
>> > wrote:
>> >
>> >> Is there a way to easily send an email message to all users registered
>> >> with a wiki? Perhaps to give out a new url if the wiki has moved.
>> >>
>> >> Fred
>> >>
>> >>
>> >> _______________________________________________
>> >> MediaWiki-l mailing list
>> >> MediaWiki-l(a)lists.wikimedia.org
>> >> https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
>> >>
>> >
>>
>>
>>
>
Hi!!
I'm having a HUGE problem with spam on my wiki
(http://www.realzaragozapedia.es). I've installed ConfirmEdit and set user
registration to "false"... but the wiki doesn't make much sense with these
parameters.
I'm also using the "Merge and Delete" extension, which is a tedious and
very slow way to delete all 7,000 spam accounts registered on my wiki. Is
there any faster way to do that (I mean, while preserving database
integrity)?
Thanks for your help, and sorry for my English.
Juan
Hi,
I was wondering if there is any way in which Mediawiki could support
content federation, something like InstantCommons but for transcluding
content from external mediawiki instances. Has it been attempted before?
The reason for this is that in some countries some texts are copyrighted
while in other countries they are supposed to be in the public domain.
On Wikisource we use iwpages [1] for transcluding between different
language versions, and it would be interesting to know if we could
transclude pages from external websites like Wikilivres [2].
Thanks
Micru
[1]
https://wikisource.org/w/index.php?title=MediaWiki:InterWikiTransclusion.js
[2] http://wikilivres.ca/wiki/Main_Page
Hello,
I'm trying to install MediaWiki on CentOS with php-fpm, memcached, and
nginx over SSL. If you've seen the Ars Technica "How to set up a Safe and
Secure Web Server" (
http://arstechnica.com/gadgets/2012/11/how-to-set-up-a-safe-and-secure-web-…)
I'm basically attempting to replicate all the applications they installed
on Ubuntu to CentOS.
So long story short, I'm installing Mediawiki as per the instructions here:
http://arstechnica.com/information-technology/2013/02/web-served-7-wiki-wik…
Whenever I attempt to browse to the default index page on my wiki to
perform the install, I get a blank page. I check my nginx error logs and I
get this:
2013/11/03 11:27:37 [error] 22415#0: *1 FastCGI sent in stderr: "PHP
message: PHP Fatal error: session_start(): Failed to initialize storage
module: memcache (path: /var/run/memcached/memcached.sock) in
/var/www/html/BlindSeeker/public_html/wiki/includes/templates/NoLocalSettings.php
on line 50" while reading response header from upstream, client:
192.168.1.2, server: www.blindseeker.localdomain, request: "GET /wiki/
HTTP/1.1", upstream: "fastcgi://unix:/var/run/php-fpm/php-fpm.soc:", host:
"192.168.1.11"
So, understand my web-foo isn't strong, but from what I can make of this,
MediaWiki is refusing to use memcache for session management. php-fpm has
been told where the unix socket for memcached is located, and other php
applications (vanilla, wordpress) are able to use it effectively.
I can also use nc -U /var/run/memcached/memcached.sock and connect to the
socket successfully, so I don't know why MediaWiki is refusing to use it.
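A minimal sketch of that same check in Python, in case the `nc` output is ambiguous. The function name is mine, and the default socket path is simply the one from the error message above.

```python
# Sketch (not MediaWiki code): talk to memcached over its UNIX socket
# the same way `nc -U` does, to confirm the daemon actually answers.
import socket

def memcached_version(sock_path="/var/run/memcached/memcached.sock"):
    """Send the memcached `version` command and return the reply line."""
    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        s.connect(sock_path)
        s.sendall(b"version\r\n")
        return s.recv(1024).decode().strip()
    finally:
        s.close()
```

If this returns a `VERSION ...` line, the socket and daemon are fine, and the problem is on the PHP session-handler side rather than memcached itself.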
So I've tried hitting the #mediawiki IRC channel for some answers, and the
only suggestion I got back was to check whether safe_mode is turned on in
php.ini; other suggestions via Google say to turn on verbose logging in
LocalSettings.php, but that file doesn't exist yet -- I haven't finished
the installation, since the install page won't render.
So, I'm at a loss here. Any suggestions?
--
when does reality end? when does fantasy begin?
Tonight I went through RELEASE-NOTES-1.22 and did a little housecleaning.
In the process, I had some ideas for how we could make the 1.23 (and
future) release notes more usable. I mention this on this mailing list
because I'd like to get feedback from actual users.
You can see my proposed changes here:
https://git.wikimedia.org/raw/mediawiki%2Fcore/d1daef8be9f856b6eeff2cb2d4cd…
The ordering is as follows:
* New features
* Breaking changes (collected in a single section)
* Configuration changes
* Bug fixes
* API changes
* Language updates
* Misc changes
I tried to put the changes I thought end users would be interested in
closer to the top to make them easier to find.
What do you think?
Mark.
Our MWSearch stopped updating a while ago. I'm wondering if one of Apple's Java updates caused a problem.
Release notes say the backend is Lucene Search 2.1.3. Checking the Java version:
$ java -version
java version "1.6.0_51"
Java(TM) SE Runtime Environment (build 1.6.0_51-b11-457-10M4509)
Java HotSpot(TM) 64-Bit Server VM (build 20.51-b01-457, mixed mode)
When I try to do
sudo sh update
from the Lucene-search directory, I get a bunch of messages and then:
14420 [main] WARN org.wikimedia.lsearch.interoperability.RMIMessengerClient - Error invoking remote method enqueueFrontend() on host Hexamer : error marshalling arguments; nested exception is:
java.net.SocketException: Broken pipe
java.rmi.MarshalException: error marshalling arguments; nested exception is:
java.net.SocketException: Broken pipe
at sun.rmi.server.UnicastRef.invoke(UnicastRef.java:138)
at java.rmi.server.RemoteObjectInvocationHandler.invokeRemoteMethod(RemoteObjectInvocationHandler.java:178)
at java.rmi.server.RemoteObjectInvocationHandler.invoke(RemoteObjectInvocationHandler.java:132)
at com.sun.proxy.$Proxy0.enqueueFrontend(Unknown Source)
at org.wikimedia.lsearch.interoperability.RMIMessengerClient.enqueueFrontend(RMIMessengerClient.java:183)
at org.wikimedia.lsearch.oai.IncrementalUpdater.main(IncrementalUpdater.java:214)
Caused by: java.net.SocketException: Broken pipe
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:92)
at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:109)
at java.io.ObjectOutputStream$BlockDataOutputStream.drain(ObjectOutputStream.java:1864)
at java.io.ObjectOutputStream$BlockDataOutputStream.writeByte(ObjectOutputStream.java:1902)
at java.io.ObjectOutputStream.writeFatalException(ObjectOutputStream.java:1563)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:332)
at sun.rmi.server.UnicastRef.marshalValue(UnicastRef.java:274)
at sun.rmi.server.UnicastRef.invoke(UnicastRef.java:133)
... 5 more
14421 [main] WARN org.wikimedia.lsearch.oai.IncrementalUpdater - Error sending index update records of colipedia to indexer at Hexamer
java.rmi.MarshalException: error marshalling arguments; nested exception is:
java.net.SocketException: Broken pipe
at sun.rmi.server.UnicastRef.invoke(UnicastRef.java:138)
at java.rmi.server.RemoteObjectInvocationHandler.invokeRemoteMethod(RemoteObjectInvocationHandler.java:178)
at java.rmi.server.RemoteObjectInvocationHandler.invoke(RemoteObjectInvocationHandler.java:132)
at com.sun.proxy.$Proxy0.enqueueFrontend(Unknown Source)
at org.wikimedia.lsearch.interoperability.RMIMessengerClient.enqueueFrontend(RMIMessengerClient.java:183)
at org.wikimedia.lsearch.oai.IncrementalUpdater.main(IncrementalUpdater.java:214)
Caused by: java.net.SocketException: Broken pipe
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:92)
at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:109)
at java.io.ObjectOutputStream$BlockDataOutputStream.drain(ObjectOutputStream.java:1864)
at java.io.ObjectOutputStream$BlockDataOutputStream.writeByte(ObjectOutputStream.java:1902)
at java.io.ObjectOutputStream.writeFatalException(ObjectOutputStream.java:1563)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:332)
at sun.rmi.server.UnicastRef.marshalValue(UnicastRef.java:274)
at sun.rmi.server.UnicastRef.invoke(UnicastRef.java:133)
... 5 more
It worked before, but the person who helped set it up has left my group. Is it a Java problem? A config problem? Any help would be appreciated.
Jim
=====================================
Jim Hu
Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
What's the best way to disable the back button for users? I have a user
who uses his back button a lot, and we're finding that he accidentally
reverts his changes.
Also, we're running into an issue with images being cached, most likely in
the browser, and editors have to refresh the page several times. Is there
a way to insert a Pragma: no-cache header into each page?
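One way to see what caching headers the images are actually served with, before changing anything: a hypothetical diagnostic helper (not MediaWiki code; point it at one of the affected image URLs).

```python
# Sketch: fetch a URL and report only its caching-related response
# headers, to confirm whether the server is telling browsers to cache.
from urllib.request import urlopen

def cache_headers(url):
    """Return the caching-related response headers for a URL."""
    with urlopen(url) as resp:
        return {k: v for k, v in resp.headers.items()
                if k.lower() in ("cache-control", "pragma", "expires")}
```

If this comes back empty for an image, the browser is free to cache it as long as it likes, which would explain the repeated refreshes.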
> From: Tim Starling <tstarling(a)wikimedia.org>
>
> On 05/11/13 07:38, Jan Steinman wrote:
>> But doing this: xxx {{#ifeq:"0 "|"{{#tag:sql2wikiInline|SELECT
>> count(*) FROM s_product_harvest h INNER JOIN mw_user u ON h.who1 =
>> u.user_id WHERE u.user_name = '{{{1}}}'|database=EcoReality}}"|no
>> harvests|some harvests}} xxx
>
> #tag generates an opaque placeholder, you can't compare it to things
> with #ifeq. It would work if your sql2wikiInline extension provided a
> proper parser function interface, instead of using #tag.
You've got my attention.
Can you point me to a starting point for figuring this out? (I find crawling through MediaWiki documentation to be an "opaque placeholder." :-)
It can be used as "<sql2wikiInline>...</sql2wikiInline>", but I think you're saying that if it could be invoked as "{{#sql2wikiInline:...", it would work.
:::: Giving society cheap, abundant energy at this point would be the equivalent of giving an idiot child a machine gun. -- Paul Ehrlich
:::: Jan Steinman, EcoReality Co-op ::::
Description Fred Bauder 2013-11-06 02:40:49 UTC
Please see the short discussion at
https://www.mediawiki.org/w/index.php?title=Project:Support_desk&offset=201…
If it is necessary, in the sense that something must be changed in the
MediaWiki software to utilize this PHP accelerator if present, and it
would improve performance, please include support for
Zend_OPcache_v7.0.3-dev. I have no idea what is involved, other than what
I gained from reading the Wikipedia article:
https://en.wikipedia.org/wiki/List_of_PHP_accelerators
Comment 1 Andre Klapper 2013-11-06 09:42:22 UTC
(In reply to comment #0)
> Please see the short discussion at
> https://www.mediawiki.org/w/index.php?title=Project:
> Support_desk&offset=20131104170209#Zend_OPcache_v7.0.3-dev_35188
Could you please summarize it here, and come up with one sentence that
describes what this bug report is about, and put that sentence into the
"Summary" field?
Status: NEW → UNCONFIRMED
Ever confirmed: false
Severity: normal → enhancement
Comment 2 Fred Bauder 2013-11-06 11:38:16 UTC
(In reply to comment #1)
> (In reply to comment #0)
> > Please see the short discussion at
> > https://www.mediawiki.org/w/index.php?title=Project:
> > Support_desk&offset=20131104170209#Zend_OPcache_v7.0.3-dev_35188
>
> Could you please summarize it here, and come up with one sentence that
> describes what this bug report is about, and put that sentence into the
> "Summary" field?
I'm not sure what the summary field is or how to put anything in it, but
when installing MediaWiki the installer searches for a PHP extension that
does data caching, looking for APC, XCache, or WinCache. Does
Zend_OPcache_v7.0.3-dev add anything to MediaWiki functionality, and is
support for it needed? Or does it function without anything being added to
the MediaWiki software? It was added to the latest version of PHP in March
2013.
I guess the summary would be: Support for Zend_OPcache_v7.0.3-dev
Comment 3 Andre Klapper 2013-11-06 11:43:41 UTC
So does Zend OPcache support data caching?
Comment 4 Fred Bauder 2013-11-06 11:50:19 UTC
Not according to the Wikipedia article; however, it does opcode caching.
Is MediaWiki configured to use what it does do?
Comment 5 Andre Klapper 2013-11-06 12:39:13 UTC
Referring to the Support Desk thread where you asked "Would this Zend
OPcache v7.0.3-dev do the same thing?", you answered this yourself with
"No" in comment 4.
So I don't know what your request in this bug report is or what you would
like to achieve.
Also see
https://www.mediawiki.org/wiki/Manual:MediaWiki_architecture#Caching
and https://meta.wikimedia.org/wiki/Talk:PHP_caching_and_optimization for
general information.
Comment 6 Fred Bauder 2013-11-06 15:25:15 UTC
I am not a reliable source with respect to a question that I don't know
the answer to. There is no use in rephrasing the support request
repeatedly, but here goes: is Zend_OPcache_v7.0.3-dev supported by
MediaWiki; what does it do; should it be supported?
Comment 7 Andre Klapper 2013-11-06 15:31:43 UTC
Closing as INVALID, a bugtracker is not the place to discuss this.
Please refer to the support desk or the mediawiki-l mailing list at
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
Status: UNCONFIRMED → RESOLVED
Resolution: --- → INVALID
Rather like the other bug I submitted 10 years ago. Dismissed... but how
about Zend_OPcache_v7.0.3-dev_35188?
Fred