I just wanted to put() a simple page on a MediaWiki 1.16
instance, where I have to use screen scraping (use_api=False).
Something strange happens, however: an API call is invoked by _getBlock:
/w/api.php?action=query&format=json&meta=userinfo&uiprop=blockinfo
Here's my backtrace:
File "pywikipedia/wikipedia.py", line 693, in get
expandtemplates = expandtemplates)
File "pywikipedia/wikipedia.py", line 743, in _getEditPage
return self._getEditPageOld(get_redirect, throttle, sysop, oldid, change_edit_time)
File "pywikipedia/wikipedia.py", line 854, in _getEditPageOld
text = self.site().getUrl(path, sysop = sysop)
File "pywikipedia/wikipedia.py", line 5881, in getUrl
self._getUserDataOld(text, sysop = sysop)
File "pywikipedia/wikipedia.py", line 6016, in _getUserDataOld
blocked = self._getBlock(sysop = sysop)
File "pywikipedia/wikipedia.py", line 5424, in _getBlock
data = query.GetData(params, self)
File "pywikipedia/query.py", line 146, in GetData
jsontext = site.getUrl( path, retry=True, sysop=sysop, data=data)
getUrl(), which is also called from the API code, always seems
to call _getUserDataOld(text), where text is ... API output,
so it tries to do strange things with that and gives warnings
like
Note: this language does not allow global bots.
WARNING: Token not found on wikipedia:pl. You will not be able to edit any page.
which is nonsense, since the analyzed text is not HTML - only API output.
If getUrl() is supposed to be a low-level call, why call _getUserDataOld()
there?
http://www.mediawiki.org/wiki/Special:Code/pywikipedia/7461
introduced this call.
It's easily reproducible with this:
import wikipedia
import config
config.use_api = False
wikipedia.verbose = True
s = wikipedia.getSite("pl", "wikipedia")
p = wikipedia.Page(s, u"User:Saper")
c = p.get()
c += "<!-- test -->"
p.put(c, u"Testing wiki", botflag=False)
//Saper
Having problems with pageimport.py on a third-party wiki (WikiQueer). Anyone else having issues with that script?
I'm calling it from a script I'm playing around with, but no luck. It doesn't error out, but it doesn't import anything and reports that the import failed.
Here's the "test" script I'm working from:
import wikipedia as pywikibot
from pageimport import *

def main():
    wanted_category_title = "Apple"
    enwiki_site = pywikibot.getSite()
    importerbot = Importer(enwiki_site)  # Initializing
    importerbot.Import(wanted_category_title, project='wikipedia', prompt=True)

try:
    main()
finally:
    pywikibot.stopme()
On a related note, the ultimate goal is to import pages for "Wanted Categories" from English Wikipedia into the third-party wiki. Any ideas, tips or existing code to that end would also be appreciated.
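One idea I'm toying with (just a sketch, not yet tested against WikiQueer): fetch the target wiki's wanted categories through the MediaWiki API (list=querypage&qppage=Wantedcategories) and feed the resulting titles to Importer as above. The API URL below is a placeholder.

import requests

API = "https://wikiqueer.example/w/api.php"  # placeholder - point this at the target wiki

params = {
    "action": "query",
    "list": "querypage",
    "qppage": "Wantedcategories",
    "qplimit": 50,
    "format": "json",
}

data = requests.get(API, params=params).json()
for row in data["query"]["querypage"]["results"]:
    print(row["title"])  # e.g. "Category:Apple" - candidates to import from enwiki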
Thanks!
-greg aka varnent
-------
Gregory Varnum
Lead, Aequalitas Project
Lead Administrator, WikiQueer
Founding Principal, VarnEnt
@GregVarnum
fb.com/GregVarnum
Hi folks,
is there a piece of code in pywiki that easily determines whether the current
time zone is on summer or winter time? (Last Sunday of March, 3 o'clock --
last Sunday of October, 2 o'clock)
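If there is nothing in pywiki itself, the fallback I have in mind is the standard library; a minimal sketch (it only consults the local machine's time zone settings, which I assume is enough here):

import time

def is_summer_time():
    # tm_isdst is 1 while daylight saving (summer) time is in effect,
    # 0 during winter time, and -1 if the platform cannot tell.
    return time.localtime().tm_isdst > 0

print("summer time" if is_summer_time() else "winter time")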
--
Bináris
what is your command line to invoke the bot script?
Xqt
----- Original message -----
From: Bojan Kalkan
Sent: 02.04.2013 19:04
To: Pywikipedia discussion list
Subject: [Pywikipedia-l] Archivebot
I constantly get MissingConfigError: Missing or malformed template.
Where is the problem? This is our archive template:
http://sr.wikipedia.org/wiki/%D0%9A%D0%BE%D1%80%D0%B8%D1%81%D0%BD%D0%B8%D0%…
Thanks.
I'm a bit stumped, but I may have missed a change somewhere in how
watchlist tokens are supposed to be requested using the API.
I'm logged in and get
<http://commons.wikimedia.org/w/api.php?action=query&prop=info&intoken=watch…>
but it returns the error "Action 'watch' is not allowed for the
current user" and fails to pass me a watch token. I have also tried
this as a post rather than a get, but with the identical result.
Was there some security change that
<http://www.mediawiki.org/wiki/API:Watch> does not reflect?
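For reference, the same request written out as a minimal Python sketch (the title is just a placeholder and the session is assumed to already carry my login cookies):

import requests

session = requests.Session()  # assumed to already hold valid login cookies

resp = session.get(
    "https://commons.wikimedia.org/w/api.php",
    params={
        "action": "query",
        "prop": "info",
        "intoken": "watch",            # ask for a watch token alongside the page info
        "titles": "File:Example.jpg",  # placeholder title
        "format": "json",
    },
)
print(resp.json())  # expecting a watchtoken under query.pages, but I get the error above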
I'm not doing anything sneaky, I would just like a sensible way of
trimming down my uneditable, 90,000-page watchlist on Commons.
Cheers,
Fae
--
faewik(a)gmail.com http://j.mp/faewm
Guide to email tags: http://j.mp/mfae
I tried the script by reza1615:
http://www.wikidata.org/wiki/User:Reza1615/BOT/new_interwiki.py
It works, but only for some pages; the script skips many of them,
displaying only ---------------------- (which comes from line 362 of the script).
I run new_interwiki.py -start:foo
or
new_interwiki.py -page:foo
Unfortunately I am not able to change this code to make it work with all pages.
Can anybody help?
JAnD
Hi PyWikipedians,
Is there a simple way to get a list of pages created on a particular date?
I was hoping to find something in pagegenerators, but maybe not?
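In case pagegenerators doesn't cover it, the fallback I'm considering is asking the API directly for page creations on that day via list=recentchanges with rctype=new; a rough sketch (site, namespace and dates below are placeholders), keeping in mind that recentchanges only reaches back about 30 days:

import requests

API = "https://en.wikipedia.org/w/api.php"  # placeholder site

params = {
    "action": "query",
    "list": "recentchanges",
    "rctype": "new",                     # page creations only
    "rcnamespace": 0,                    # main namespace
    "rcstart": "2013-03-01T23:59:59Z",   # newer end (rcdir defaults to older)
    "rcend": "2013-03-01T00:00:00Z",     # older end
    "rclimit": 500,
    "format": "json",
}

data = requests.get(API, params=params).json()
for change in data["query"]["recentchanges"]:
    print(change["title"])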
Thanks for any pointers!
John
A couple of years ago somebody did stay for a couple of years ;) (although that was
the chapters conference and there was a volcano involved)
henna
On Fri, Mar 8, 2013 at 7:19 PM, Petr Bena <benapetr(a)gmail.com> wrote:
> I have one question :) why is the registration form asking me which
> year and month I will depart? Are you afraid some attendees are
> planning to stay for several years? :D
>
> On Fri, Mar 8, 2013 at 6:54 PM, Maarten Dammers <maarten(a)mdammers.nl> wrote:
> > Hi everyone,
> >
> > Wikimedia Nederland invites all developers to the Wikimedia Hackathon. The Wikimedia Hackathon will be in 2013 from 24-26 May. The registration is now open and also includes the possibility to apply for a travel, accommodation or full scholarship. You can find the form at
> > https://docs.google.com/spreadsheet/viewform?formkey=dFg2SmRRbkpxNmxCcFNFdl…
> >
> > The hackathon is an opportunity for all Wikimedia community developers and sysadmins to come together, squash bugs and write great new features & tools. Unlike the previous years (2012, 2011, etc.) this Hackathon won't be in Berlin, but in Amsterdam.
> >
> > The event is open to a wide range of developers. We welcome both seasoned and new developers as well as people working on MediaWiki, tools, pywikipedia, Wikidata, gadgets, extensions, templates … . Please suggest and discuss topics at https://www.mediawiki.org/wiki/Amsterdam_Hackathon_2013/Topics .
> >
> > You can indicate that you're coming at https://www.mediawiki.org/wiki/Amsterdam_Hackathon_2013/Attendees and/or https://www.facebook.com/events/167285526755104/ . This doesn't replace registration, it's just to let others know what you're up to.
> >
> > Keep an eye on https://www.mediawiki.org/wiki/Amsterdam_Hackathon_2013 for updates!
> >
> > Maarten
--
"Maybe you knew early on that your track went from point A to B, but unlike
you I wasn't given a map at birth!" Alyssa, "Chasing Amy"
Hey,
delete.py uses this summary for deleting a list of pages from file
(-file:file.txt option):
'delete-from-file': u'Robot: Deleting a list of files.',
It's really wrong (the option deletes a list of pages read from a file, not
a list of files), all of the translations are based on it, and all of them
need to be changed.
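For instance, the English source string could become something like (just a suggestion):
'delete-from-file': u'Robot: Deleting a list of pages.',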
Best
--
Amir