----- Original Nachricht ----
Von: Samat <samat78(a)gmail.com>
An: info(a)gno.de
Datum: 20.10.2013 10:23
Betreff: Re: Re: [Pywikipedia-l] import module --> ImportError: No module
named ...
> Dear xqt,
>
> I got my core bot from github using TortoiseSVN, it should have been the
> latest repository.
> I followed the instructions on
> https://www.mediawiki.org/wiki/Manual:Pywikibot/Installation page, so I
> used https://github.com/wikimedia/pywikibot-core/trunk
> (In fact I checked out into a pywikipedia folder instead of
> /pywikibot-core or /svn-core, but it shouldn't cause this (path?) problem.)
>
> Best,
> Samat
>
I use TortoiseSVN for my bot too. You have to set svn:externals properties on the svn working copy to include all the needed externals, as follows:
https://github.com/wikimedia/pywikibot-i18n/trunk scripts/i18n
https://github.com/wikimedia/pywikibot-externals-httplib2/trunk externals/httplib2
(maybe I should publish these properties somewhere as a file)
You can do it manually by
- right-click on the svn working copy
- choose "Properties" (last item of the context menu)
- click "Subversion" tab
- click "Properties" button
- click "New..." -> "Externals"
- click "New..."
- set "scripts/i18n" for the Local path
- set "https://github.com/wikimedia/pywikibot-i18n/trunk" for URL
- click "OK"
- Again click "New..." for the next external
- set "externals/httplib2" for the Local path
- set "https://github.com/wikimedia/pywikibot-externals-httplib2/trunk" for URL
- click "OK" to save
- click "OK" to leave the externals editor
- click "OK" to close the properties dialog
Now run an "SVN Update" to retrieve the externals.
For compat you need to include externals too as follows:
... (open properties, see above)
- set "i18n" for the Local path
- set "https://github.com/wikimedia/pywikibot-i18n/trunk" for URL
... (save it, see above)
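For reference, the TortoiseSVN dialogs above just edit a single svn:externals property on the working copy. Its value for core, in the SVN 1.5+ "URL local-path" format, is the two lines quoted at the top (this is a sketch of the property text, not a file shipped in the repository):

```
https://github.com/wikimedia/pywikibot-i18n/trunk scripts/i18n
https://github.com/wikimedia/pywikibot-externals-httplib2/trunk externals/httplib2
```

If you prefer the command line, the same value can be set non-interactively with `svn propset svn:externals -F <file> .` in the working copy, followed by `svn update`.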
Xqt
> Brave new world!
> I liked the old days when it was just simple to use Pywiki. No reinstall
> after each update, no 100-mile-long commands.
>
>
If you install core as a site package, the command line for core and compat is the same.
Without installing core, the core command is just 4 characters longer than in compat:
in compat you run for example
touch.py user:xqt/Test
in core it is
pwb.py touch user:xqt/Test
this is p+w+b+<blank> more.
btw you may shorten command lines with command files. I always use command files to invoke bot with all needed options.
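Such a command file can be as small as one line; a hypothetical Windows example (the file name and options are only an illustration, not something shipped with the bot):

```
@echo off
rem touch-test.cmd - wrapper that saves retyping the full command line
python pwb.py touch user:xqt/Test
```

Running touch-test.cmd then does the same as typing out the whole command each time.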
Binaris, just change your monitor settings. You shouldn't use a font size of 7^7 points, which makes 4 characters 100 miles long ;)
Greetings
xqt
Hi,
I used pywikibot for years, but I haven't used it for the past few years.
Now I would like to use it again on Windows 7, so I downloaded the
code from here: https://github.com/wikimedia/pywikibot-core using
TortoiseSVN, and installed Python 2.7.5. I use the latest versions of both.
I set the path for Python27, pywikipedia and pywikibot folders as PATH
system variable.
I have my own user-config.py file.
I followed the instruction on
https://www.mediawiki.org/wiki/Manual:Pywikipediabot/Windows
When I try to run login.py, I get this message:
Traceback (most recent call last):
  File "C:\Program Files\pywikipedia\pywikibot\login.py", line 15, in <module>
    import pywikibot
ImportError: No module named pywikibot
I get similar errors for other scripts: import module --> ImportError: No
module named ...
Could you please help me solve this problem?
Thanks in advance.
Best regards,
Samat
Hoi,
Core and Compat are the two faces of pywikipedia. Who is interested in
answering some questions to help me understand this better?
I would like to ask ten questions about this and publish the answers on my
blog.
Thanks,
Gerard
PS please answer off-list
Hi,
As I've said in a previous thread, I believe that feature parity
should exist between core and compat, but not necessarily vice versa
(i.e. all non-deprecated features from compat should exist in core).
I've logged bug #55880 as a tracking bug for such issues. If you know
of other useful features from compat which are not (yet) in core,
please log bugs for them and set them as blocking for bug 55880.
Also, if you can fix any of the bugs on that list, feel free to send a patch. :)
Thanks,
Strainu
Hello,
I'm trying to convert a fairly large set of scripts from compat to
core and I found a significant loss of functionality in getting image
and template info. While writing this, I've noticed that the latest
version of core also has some of these problems. I will elaborate on
this loss of functionality below, but I would like to know if this
simplification is intended or if this is part of some work in
progress.
For the image parsing, the function linkedPages(withImageLinks = True)
used to provide images that were not included through templates, while
imageLinks would provide all the images. In core, the linkedPages
function no longer provides this capability, and I haven't found any
replacement (I ported the old function into my own code).
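One way to approximate the old behaviour without the framework is to scan the wikitext directly; the following is a simplified sketch (plain Python, not part of the pywikibot API; it ignores namespace aliases, nested templates, <gallery> tags, etc.):

```python
import re

def directly_linked_images(wikitext):
    """Return image links that appear literally in the page text,
    i.e. not ones brought in by template transclusion - roughly what
    linkedPages(withImageLinks=True) used to report in compat."""
    # Strip template calls first, so images used only inside {{...}}
    # arguments are not reported.  Non-greedy and non-nested on purpose.
    no_templates = re.sub(r"\{\{.*?\}\}", "", wikitext, flags=re.S)
    return re.findall(r"\[\[(?:File|Image):([^\]|]+)", no_templates)

text = "Intro [[File:Map.png|thumb]] {{Infobox|image=Flag.svg}} [[Image:Seal.jpg]]"
print(directly_linked_images(text))  # ['Map.png', 'Seal.jpg']
```

Images that only occur inside template arguments (Flag.svg above) are skipped, matching the old "not included through templates" semantics for simple pages.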
For template parsing, templatesWithParams from class Page used to
provide a pair containing the template name and a list of parameters,
with the full "key=value" string. Nowadays, we're getting a dictionary
instead of that list. Normally there is nothing wrong with that,
except that in Python 2 the dictionary is unordered, which means that:
* the order of the parameters is forever lost
* the original text cannot be reconstructed (because of the above and
the missing whitespace information) - this means there is no easy way
to identify and/or replace a particular instance of the template in a
page with many identical templates. It used to be you could do it with
simple find/replace operations, now it takes some more work.
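A minimal illustration of the difference (plain Python with deliberately naive parsing, not the pywikibot API):

```python
# Contrast the two return shapes for a template's parameter string
# such as "name=Foo| year = 2013 ".

raw = "name=Foo| year = 2013 "

def params_as_list(s):
    # Compat-style result: raw "key=value" strings with order and
    # whitespace preserved, so the original text can be rebuilt.
    return s.split("|")

def params_as_dict(s):
    # Core-style result: stripped keys and values in a dict; on
    # Python 2 a plain dict is unordered, and the whitespace is gone.
    d = {}
    for part in s.split("|"):
        key, value = part.split("=", 1)
        d[key.strip()] = value.strip()
    return d

# The list round-trips exactly ...
assert "|".join(params_as_list(raw)) == raw
# ... but the dict cannot: whitespace (and, on Python 2, order) is lost.
assert "|".join("%s=%s" % kv for kv in params_as_dict(raw).items()) != raw
```

Because the list form round-trips, a bot can locate one specific template instance with a plain find/replace on the page text; the dict form only tells you the normalized parameter values.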
I personally would like to have the old behavior back, it would save
me and probably others a lot of work.
Thanks,
Strainu
That's what happens when a trunk user develops a script. :-) -save was
invented by me, but I have never used rewrite/core, and nobody felt like
porting this feature to core.
2013/10/14 <bugzilla-daemon(a)wikimedia.org>
> https://bugzilla.wikimedia.org/show_bug.cgi?id=55689
>
> --- Comment #3 from xqt <info(a)gno.de> ---
> Sorry, -search option is quite right but -save option is available on
> compat
> version only (yet).
>
> --
>
Crossposting to all lists of interest, sorry for the fancy long title
but should be easier to search for future reference.
I recently had the opportunity to mentor during the GHCOSD[0] for the
Wikimedia Foundation. There were two of us mentoring from the
Foundation, and I took on what we called the challenging tasks:
contributing your first patch and writing your first bot.
This e-mail is about my approach to the writing your first bot task,
posting it here for future reference in case someone finds it useful,
and for comments/opinions. My approach consisted of challenging the
participants to write a game called Wikiflashcards. The game would use
pygame[1] to display an index card with the name of a country and,
after clicking, it would reveal the name of the capital city of that
country. The frontend was all given[2] so that participants wouldn't
have to worry about pygame at all (yet, we learned all the possible
ways to install pygame on a relatively old Mac, pretty complicated),
instead, their task was to implement the backend using pywikibot to
generate the list of countries and get the capital of each one.
This would naturally introduce the concept of listing a set of pages
of interest, searching through the wikicode, mining templates,
filtering links, etc.
This approach differs from that of teaching people how to use
pywikibot to contribute directly to Wikipedia. My hypothesis is
that teaching how to use these tools to "scratch your own itch" -
personal research, a hobby, etc. - would make people match pywikibot
with their own interests, make them active users of the framework, and
eventually lead them to use their expertise to contribute to any of
the WMF projects.
After finishing a first version of the backend, I introduced the
concept and purpose of Wikidata, challenged the participants to
rewrite the backend using Wikidata items and properties and compare
the two approaches - in particular, the complexity of the first
approach vs the advantages of having a new backend ready for i18n and
whatnot. The goal was to naturally introduce the need of a structured
way to store and retrieve data, since I believe a direct introduction
to Wikidata to someone that has never been involved in a task of
mining data out of a Wikipedia looks very artificial.
In the end the challenge seemed to be very engaging for the
participants, and I got positive feedback about it, but that doesn't
really tell whether the goals listed above were achieved. If you
have further comments or questions, just let me know.
Disclaimer: I'm not implying this is a good idea (in particular, I'm
not implying this was the best idea for this particular event), just
my idea.
David E. Narvaez
[0] http://gracehopper.org/2013/conference/grace-hopper-open-source-day/
[1] http://pygame.org/news.html
[2] https://gitorious.org/wiki-flash-cards/wiki-flash-cards
Hi,
This project has made a big change to its infrastructure, and it's normal
to hit sharp corners at the beginning. Looking at how the transition has
gone for the rest of the MediaWiki / Wikimedia projects, you can be quite
confident that the change is worth it.
The discussions you are having here remind me of the discussions that
some MediaWiki core and extension developers had a year ago when
switching from SVN to Git. Nowadays most people are quite happy with the
current setup, and in fact a lot happier than before. Yes, there is a
bit more process, but it is also a lot more difficult for bugs and
regressions to sneak into your master branch and deployments.
Also, Git and code review workflows are widely adopted. We are using
standard tools. Anything you learn here will be useful in many other
software projects.
About Windows users: they exist :) and the documentation has
instructions specifically for them.
https://www.mediawiki.org/wiki/Gerrit/Tutorial
About GitHub users: they also exist, and fwiw there is a way to sync
GitHub and Gerrit repos.
https://www.mediawiki.org/wiki/User:Yuvipanda/G2G
If you still find problems please report them in Bugzilla:
https://bugzilla.wikimedia.org/enter_bug.cgi?product=Wikimedia&component=Gi…
Thank you for using the Wikimedia infrastructure.
--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
I still don't understand why SVN was abandoned and replaced with Git.
In SVN times, when there was some critical problem, there was usually a
patch after a few hours. One person wrote it and submitted it, and other
users could download it and use it.
Now there was a critical bug in interwiki.py, which appeared around the
15th of September. In those days the old SourceForge tracker was being
moved to Bugzilla, so the report got lost somewhere. After ten days I
reported the bug again[1]. Three days later there was a patch, but we had
to wait one more week until another developer reviewed it.
Now there are hundreds of new unconnected articles in the Wiktionaries,
Wikiquotes, Wikinews...
In the meantime there was a diff from which it was possible to patch the
scripts manually [2], but it was not in plain text, and it used tabs
instead of spaces; and nowhere was there a complete patched file to
download.
The second problem is Git: some people on IRC said that there were many
people at the Hackathon who weren't able to install Git correctly - and
all of them had PCs with Windows, which was about 80% of the people.
Is there a *simple manual* somewhere on how to install and run git
updates on Windows? Or a *simple manual* on how to use svn again?
And is there somewhere a possibility to download a single file of the
bot? Now there are only nightly dumps, which overwrite my changes in the
files when I unpack them...
[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=54480
[2] https://gerrit.wikimedia.org/r/#/c/86047/3/wikipedia.py
JAnD