Percolate is a system of Python scripts and modules for working with
bot-updated templates, built by WMF's grantmaking department, but reusable.
https://meta.wikimedia.org/wiki/Grantmaking_and_Programs/Learning_%26_Evalu…
"Percolate provides a relatively simple and robust way of replicating
some of the functionality that closed-source social software (like
Facebook and Quora) use to surface relevant content and recent activity
without sacrificing the flexibility that makes MediaWiki great. It does
this through profiles and views..."
(found via the monthly report:
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Report,_July_2013#Indi…
)
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
A bit old, but seemed like it might be useful.
-Sumana
-------- Original Message --------
Subject: Re: [Wikitech-l] Centralized Lua modules for Wikisource (OPW
mentor needed)
Date: Mon, 3 Jun 2013 17:31:53 +0200
From: Merlijn van Deen <valhallasw(a)arctus.nl>
Reply-To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
To: yuvipanda(a)gmail.com, Wikimedia developers
<wikitech-l(a)lists.wikimedia.org>
On 31 May 2013 17:19, Yuvi Panda <yuvipanda(a)gmail.com> wrote:
> I'm pretty sure that the 'syncing' can be accomplished by a simple
> bot, and it might even already exist(?). Will be happy to help write
> the bot if it doesn't exist yet.
Kasper Souren contributed replicate_wiki.py to pywikipedia at the last
hackathon. It allows replicating a complete namespace from one central
wiki to multiple others - see [1].
Merlijn
[1]
https://svn.wikimedia.org/viewvc/pywikipedia/branches/rewrite/scripts/repli…
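For a sense of what such a bot does, here is a minimal sketch of the replication idea. This is not replicate_wiki.py's real interface (see the linked source for that); the callables below stand in for MediaWiki API calls:

```python
# Sketch of namespace replication (hypothetical interface, not pywikipedia's):
# list every page in a namespace on the central wiki, then copy any page
# whose text differs to each destination wiki.

def replicate_namespace(list_pages, get_text, save_text, namespace, destinations):
    """Copy all pages in `namespace` from the central wiki to each destination.

    list_pages(namespace)        -> iterable of page titles on the central wiki
    get_text(site, title)        -> current wikitext, or None if the page is missing
    save_text(site, title, text) -> store new wikitext on a destination
    Returns the number of page saves performed.
    """
    saves = 0
    for title in list_pages(namespace):
        master = get_text("central", title)
        for dest in destinations:
            if get_text(dest, title) != master:
                save_text(dest, title, master)
                saves += 1
    return saves
```

With pywikipedia the listing, fetching, and saving would go through its Page objects rather than bare callables; the point here is only the compare-then-copy loop per destination.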
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Hi,
I have set up a wiki at http://wiki.x2592.com and installed (among other
things) the MathJax extension
(https://www.mediawiki.org/wiki/Extension:MathJax). (I originally tried
with the Math extension, but did not manage to get it working, despite
the fact that all the prerequisites seem to be met.)
MathJax is displaying TeX-encoded equations OK, but not MathML-encoded
equations. The less-than and greater-than characters in MathML tags
appear to be converted to the HTML entities &lt; and &gt;.
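That symptom is consistent with the raw tags being HTML-escaped (e.g. by the wikitext sanitizer) before MathJax ever runs. A quick illustration in plain Python of what MathJax ends up receiving in that case:

```python
# Illustration of the symptom: once raw MathML markup is HTML-escaped,
# MathJax sees entities rather than tags and never recognizes the
# <math> element at all.
import html

mathml = '<math><mi>x</mi></math>'
escaped = html.escape(mathml, quote=False)
print(escaped)  # -> &lt;math&gt;&lt;mi&gt;x&lt;/mi&gt;&lt;/math&gt;
```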
I'm working with a local git checkout of the MathJax package (although
the results were identical using their CDN-delivered version), and a
one-week-old git checkout of MediaWiki and the extensions repository.
The MW MathJax is the version available as of 7 days ago, too.
The invocation string used to call MathJax is:
http://wiki.x2592.com/MathJax/MathJax.js?config=TeX-AMS-MML_HTMLorMML
Examples of equations and other supported content can be seen on the
main page of http://wiki.x2592.com, showing the problem, and the page is
currently enabled for editing for anyone that feels like dabbling.
It might be useful to add that I started a thread about this in the
MathJax discussion group:
https://groups.google.com/forum/#!topic/mathjax-users/ZSVVy0Gs0nY
Is anyone able to help me troubleshoot this and get MathML displaying
OK?
Many TIA if so, :-)
--
Dave
Where can I configure my wiki's top right Search box to do only the
traditional "Search" behavior (go to search results page) and _not_ do
the "Go" behavior (go to an article if its title matches)?
The "Go" action may be appropriate for wikis that aim wide and
shallow at everything under the sun. My wiki, however, aims deep and
narrow at one topic. I find the "Go" search behavior gets in the way
of showing search results.
Here's what I've tried so far:
$wgVectorUseSimpleSearch=true; // makes the top right Search box
show a simple search icon, which I like, instead of the "Search" and
"Go" buttons, but it does the "Go" function.
$wgVectorUseSimpleSearch=false; // shows both the "Search" and "Go" buttons.
I want to show only the search _icon_ and have it do only the
_traditional_ Search behavior. Where can I configure this?
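One detail worth knowing: a request to Special:Search that carries fulltext=1 goes straight to the search results page instead of doing the "Go" exact-title jump. A small sketch (the function name is mine) of building such a URL; wiring it into the skin's search form is a separate exercise:

```python
# Build a Special:Search URL that always lands on the results page.
# The fulltext=1 parameter suppresses the "Go" title match.
from urllib.parse import urlencode

def fulltext_search_url(base, query):
    """Return a Special:Search URL that forces full-text results."""
    params = urlencode({"search": query, "fulltext": "1"})
    return f"{base}/index.php?title=Special:Search&{params}"

print(fulltext_search_url("http://teflpedia.com", "phrasal verbs"))
```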
Thanks,
Roger
Teflpedia.com
Mediawiki 1.20.2
I've been trying to set up the Collection extension and my own render
farm so I can generate zim files from my wiki. So far no luck.
Here's where I am:
I followed this guide:
http://edutechwiki.unige.ch/en/Mediawiki_collection_extension_installation
And I can create zim files from wikipedia using:
mw-zip -c :en -o test.zip Acdc Number
mw-render -c test.zip -o test.pdf -w zim
mw-zip works just as one would expect.
-------------------------
mw-zip -c :en -o test.zip Acdc Number
creating nuwiki in u'tmpuIdHyY/nuwiki'
2013-09-07T10:10:58 mwlib.utils.info >> fetching
'http://en.wikipedia.org/w/index.php?title=Help:Books/License&action=raw&tem…'
removing tmpdir u'tmpuIdHyY'
memory used: res=25.0 virt=816.4
--------------------------
I can read those files using kiwix, so I know my basic render farm setup
is solid.
The problem is that I cannot get it to work with anything other than
wikipedia.
If I try the URL in the guide:
------------------
mw-zip -c http://edutechwiki.unige.ch/mediawiki/ -o test2.zip
Mediawiki_collection_extension_installation
creating nuwiki in u'tmpRnDvRH/nuwiki'
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/gevent/greenlet.py", line
328, in run
result = self._run(*self.args, **self.kwargs)
File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line
747, in refcall_fun
fun(*args, **kw)
File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line
632, in handle_new_basepath
api = self._get_mwapi_for_path(path)
File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line
684, in _get_mwapi_for_path
raise RuntimeError("cannot guess api url for %r" % (path,))
RuntimeError: cannot guess api url for 'http://edutechwiki.unige.ch/en'
<Greenlet at 0x24d2cd0: refcall_fun> failed with RuntimeError
WARNING: (u'Mediawiki_collection_extension_installation', None) could
not be fetched
removing tmpdir u'tmpRnDvRH'
memory used: res=19.3 virt=226.7
-------------------
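The RuntimeError above means mwlib could not derive an api.php location from the URL it ended up on. This sketch (not mwlib's actual code) shows the kind of candidate paths involved for a given base URL:

```python
# Given a wiki base URL, list the usual places api.php might live.
# mwlib does something along these lines internally; when none of its
# guesses pan out, it raises "cannot guess api url".

def candidate_api_urls(base):
    """Return plausible api.php locations for a wiki base URL."""
    base = base.rstrip("/")
    return [base + "/api.php",
            base + "/w/api.php",
            base + "/mediawiki/api.php"]
```

In practice, pointing -c directly at the script directory that actually contains api.php often sidesteps the guessing entirely.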
and if I try my own:
-------------------
mw-zip --username=uuu --password=ppp -c
http://newmoon.seiner.lan/mediawiki/index.php/ -o test2.zip Test
creating nuwiki in u'tmpG82RPH/nuwiki'
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/gevent/greenlet.py", line
328, in run
result = self._run(*self.args, **self.kwargs)
File
"/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line
114, in run
api = self.get_api()
File
"/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line
28, in get_api
api.login(self.username, self.password, self.domain)
File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line
186, in login
res = self._post(**args)
File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line
106, in _post
res = loads(self._fetch(req))
File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line
23, in loads
return json.loads(s)
File "/usr/lib/python2.7/dist-packages/simplejson/__init__.py", line
413, in loads
return _default_decoder.decode(s)
File "/usr/lib/python2.7/dist-packages/simplejson/decoder.py", line
402, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python2.7/dist-packages/simplejson/decoder.py", line
420, in raw_decode
raise JSONDecodeError("No JSON object could be decoded", s, idx)
JSONDecodeError: No JSON object could be decoded: line 1 column 0 (char 0)
<Greenlet at 0x1a7b870: <bound method start_fetcher.run of
<mwlib.apps.make_nuwiki.start_fetcher object at 0x1acf790>>> failed with
JSONDecodeError
removing tmpdir u'tmpG82RPH'
memory used: res=16.8 virt=152.5
Traceback (most recent call last):
File "/usr/local/bin/mw-zip", line 9, in <module>
load_entry_point('mwlib==0.15.11', 'console_scripts', 'mw-zip')()
File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py",
line 155, in main
make_zip(output, options, env.metabook, podclient=podclient,
status=status)
File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py",
line 50, in make_zip
make_nuwiki(fsdir, metabook=metabook, options=options,
podclient=podclient, status=status)
File
"/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line
189, in make_nuwiki
pool.join(raise_error=True)
File "/usr/local/lib/python2.7/dist-packages/gevent/pool.py", line 98,
in join
raise greenlet.exception
simplejson.decoder.JSONDecodeError: No JSON object could be decoded:
line 1 column 0 (char 0)
-----------------
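The JSONDecodeError above means the login POST got something other than JSON back, typically an HTML page because -c points at index.php rather than at the directory containing api.php. A small diagnostic along these lines shows what the server actually returned before mwlib chokes on it:

```python
# Classify a raw api.php response body for debugging: valid JSON,
# an HTML page (usually index.php answering instead of api.php),
# or something else entirely.
import json

def describe_api_response(body):
    """Return a short description of what an api.php request returned."""
    try:
        json.loads(body)
        return "valid JSON"
    except ValueError:
        head = body.lstrip()[:15].lower()
        if head.startswith("<!doctype") or head.startswith("<html"):
            return "HTML page (probably index.php, not api.php)"
        return "not JSON: " + body[:40]
```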
I can't even begin to work on the actual extension interface until I
have this working. Any suggestions? Where do I go next?
Thanks!
--
Project Management Consulting and Training
http://www.ridgelineconsultingllc.com
I know this is more of an Apache question, but it should be a slam-dunk
and I am stuck.
I am trying to get short URLs working. I've configured mod_rewrite
before on other systems, and I can't find anything wrong with any of the
various setups I've tried, but mod_rewrite is being ignored.
Here's my mediawiki setup:
Alias /mediawiki /var/lib/mediawiki
<Directory /var/lib/mediawiki/>
Options +FollowSymLinks
AllowOverride All
order allow,deny
allow from all
</Directory>
# some directories must be protected
<Directory /var/lib/mediawiki/config>
Options -FollowSymLinks
AllowOverride None
</Directory>
<Directory /var/lib/mediawiki/upload>
Options -FollowSymLinks
AllowOverride None
</Directory>
This works fine, except that I have long URLs in the form of
http://newmoon.seiner.lan/mediawiki/index.php/Main_Page
I would really like
http://newmoon.seiner.lan/wiki/Main_Page
I've tried
root@NewMoon:/etc/apache2/mods-enabled# cat rewrite.conf
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule ^/wiki(/.*)?$ %{DOCUMENT_ROOT}/mediawiki/index.php [L]
RewriteLog "/var/log/apache2/rewrite.log"
RewriteLogLevel 5
</IfModule>
and various other places. If I take out the Alias in the mediawiki
file, I get 404 errors.
Nothing is written in the rewrite log.
But if I intentionally break rewrite.conf, I get an error, so I know
Apache is reading it; yet it never rewrites the URL.
I've tried adding the rewrite rules in /var/www/.htaccess,
/var/lib/mediawiki/.htaccess, no joy.
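One likely culprit, assuming a stock Debian/Ubuntu Apache layout: directives in mods-enabled/*.conf live in the main server context, and mod_rewrite does not inherit rules from there into <VirtualHost> blocks unless "RewriteOptions Inherit" is set in the vhost. If the wiki is served from a virtual host, rules in rewrite.conf parse cleanly but never fire, which would match the empty rewrite log. A sketch of moving the rule into the vhost itself (paths taken from the setup above):

```apache
# In the site's <VirtualHost> (e.g. /etc/apache2/sites-enabled/000-default),
# not in mods-enabled/rewrite.conf: rewrite rules in the main server context
# are not inherited by virtual hosts unless "RewriteOptions Inherit" is used.
<VirtualHost *:80>
    # ... existing DocumentRoot, Alias /mediawiki, etc. ...
    RewriteEngine On
    # [PT] hands the rewritten path back to the URL mapper, so the existing
    # "Alias /mediawiki /var/lib/mediawiki" can map it to disk.
    RewriteRule ^/wiki(/.*)?$ /mediawiki/index.php [PT,L]
</VirtualHost>
```

Note also that %{DOCUMENT_ROOT} in the original rule expands to the vhost's document root (likely /var/www), which does not contain the aliased /var/lib/mediawiki tree, so that rule could not have matched a real file even if it had fired.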
Any hints on where I can look?
Thanks!
--
Project Management Consulting and Training
http://www.ridgelineconsultingllc.com
Sent from my Boost Mobile phone.
mediawiki-l-request(a)lists.wikimedia.org wrote:
>Send MediaWiki-l mailing list submissions to
> mediawiki-l(a)lists.wikimedia.org
>
>To subscribe or unsubscribe via the World Wide Web, visit
> https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
>or, via email, send a message with subject or body 'help' to
> mediawiki-l-request(a)lists.wikimedia.org
>
>You can reach the person managing the list at
> mediawiki-l-owner(a)lists.wikimedia.org
>
>When replying, please edit your Subject line so it is more specific
>than "Re: Contents of MediaWiki-l digest..."
>
>
>Today's Topics:
>
> 1. Re: "Go" vs "Search" (roger(a)rogerchrisman.com)
> 2. Re: "Go" vs "Search" (roger(a)rogerchrisman.com)
> 3. content in site root (Jeremy Baron)
> 4. Re: "Go" vs "Search" (Nik Everett)
>
>
>----------------------------------------------------------------------
>
>Message: 1
>Date: Fri, 6 Sep 2013 21:25:53 -0700
>From: roger(a)rogerchrisman.com
>To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
>Subject: Re: [MediaWiki-l] "Go" vs "Search"
>Message-ID:
> <CADxRgOzwk9TQZd194kOUnJsfpdvAJzb7t45x5AaX27YN6roCBQ(a)mail.gmail.com>
>Content-Type: text/plain; charset=ISO-8859-1
>
>Wait!
>
>At the bottom of the 'Simple search' search box's drop-down suggestion
>list is a final item in a shaded box that reads 'Containing: <search
>string here>'. Selecting that takes you to a search results page. Aha!
>
>Why am I only now learning this? Maybe I'm a model average user? Not
>likely. But it is learning by playing. So maybe others can do it, too?
>What do you think?
>
>Roger
>
>On Fri, Sep 6, 2013 at 9:18 PM, <roger(a)rogerchrisman.com> wrote:
>> I've just realized that, using 'Simple search', all one has to do to
>> get past the article title suggestions is lengthen one's search string
>> till it no longer exactly matches any article titles. Then, and only
>> then, does 'Simple search' take you to a search results page. Might
>> this simply be something folks need to learn to do, by doing? And
>> might it simply work? Are average users consistently so clever?
>> Argggg. I think if we lead them in the right direction it will work
>> fine. What do you think?
>>
>> Roger
>>
>> On Fri, Sep 6, 2013 at 8:27 PM, <roger(a)rogerchrisman.com> wrote:
>>>> Now that I think about it, does the prefix search still work with that hack?
>>>> By work I mean you should be able to select one of the entries that pops
>>>> down under the search box as you type and it takes you to that article.
>>>
>>> No, with the hack in place clicking on one of the drop down search
>>> suggestions takes you to the search results page for that value, not
>>> to an article page. That _is_ perhaps what I want. I want Search to
>>> take you to a search results page, so that a user may choose which
>>> page he wants.
>>>
>>> Is the drop down list of search suggestions generated from a list of
>>> the wiki's page titles? Hm, I guess it is. I had not realized that. So
>>> how does one search the wiki's page texts? This whole search feature
>>> seems to be designed to take a user to a destination article directly
>>> whenever possible. I'm not sure that is best for my wiki because that
>>> page may not always be the one a user wants and without the search
>>> results page, how will he find the one he does want? By doing another
>>> search with more search terms? Hm, maybe we can all get used to this
>>> and maybe it will just work.
>>>
>>> Argggg, and trying to change this designed-in behavior may not be so
>>> easy for a chiefly point and click guy like me.
>>>
>>> So I may decide to use the two button ('Go' + 'Search') Search box
>>> that I get when I set $wgVectorUseSimpleSearch to false, instead,
>>> especially if this hack is not going to work in future.
>>>
>>> Thanks for the tips.
>>>
>>> Roger
>
>
>
>------------------------------
>
>Message: 2
>Date: Fri, 6 Sep 2013 21:27:46 -0700
>From: roger(a)rogerchrisman.com
>To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
>Subject: Re: [MediaWiki-l] "Go" vs "Search"
>Message-ID:
> <CADxRgOyz8C6iEZtc4En6bB03=93qxQ2KPb-_DxOddTOPEzc9Gw(a)mail.gmail.com>
>Content-Type: text/plain; charset=windows-1252
>
>> have you tried clicking the entry below the suggestions? (where it says
>> "containing…") see attachment
>>
>> -Jeremy
>
>Yes Jeremy. Thanks, just discovered that a moment ago. I guess others
>can "discover" it, too. Hope so.
>
>Roger
>
>
>
>------------------------------
>
>Message: 3
>Date: Sat, 7 Sep 2013 04:36:47 +0000
>From: Jeremy Baron <jeremy(a)tuxmachine.com>
>To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
>Subject: [MediaWiki-l] content in site root
>Message-ID:
> <CAE-2OCaRwyhRfkKEjPuVos423kK6X0NaSMfDHSoe3stdm+XjGQ(a)mail.gmail.com>
>Content-Type: text/plain; charset=UTF-8
>
>On Sep 7, 2013 12:27 AM, <roger(a)rogerchrisman.com> wrote:
>> Yes Jeremy. Thanks, just discovered that a moment ago. I guess others
>> can "discover" it, too. Hope so.
>
>btw, just noticed that your content lives in the site root. have a look at
>https://www.mediawiki.org/wiki/Manual:Wiki_in_site_root_directory when you
>get a chance.
>
>(moved offlist because I didn't notice it until now and didn't want to add
>more noise to the thread)
>
>-Jeremy
>
>
>------------------------------
>
>Message: 4
>Date: Sat, 7 Sep 2013 07:10:06 -0400
>From: Nik Everett <neverett(a)wikimedia.org>
>To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
>Cc: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
>Subject: Re: [MediaWiki-l] "Go" vs "Search"
>Message-ID: <097A3737-D691-4DCB-8E9B-8A3D4A2864F2(a)wikimedia.org>
>Content-Type: text/plain; charset=utf-8
>
>
>
>Sent from my iPhone
>
>On Sep 7, 2013, at 12:27 AM, roger(a)rogerchrisman.com wrote:
>
>>> have you tried clicking the entry below the suggestions? (where it says
>>> "containing…") see attachment
>>>
>>> -Jeremy
>>
>> Yes Jeremy. Thanks, just discovered that a moment ago. I guess others
>> can "discover" it, too. Hope so.
>>
>> Roger
>>
>
>You can prefix a search with ~ and that'll force a search but that isn't intuitive either.
>
>
>> _______________________________________________
>> MediaWiki-l mailing list
>> MediaWiki-l(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
>
>
>
>------------------------------
>
>_______________________________________________
>MediaWiki-l mailing list
>MediaWiki-l(a)lists.wikimedia.org
>https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
>
>
>End of MediaWiki-l Digest, Vol 120, Issue 8
>*******************************************
We the people are in need of a call/email list that registers people by zip code and county, to send out to help the court advocates: when court cases are going to court, we could notify the other court supporters and go in large groups.
highriskgroup101(a)yahoo.com
Greetings; I want to reverse-engineer the schema (tables.sql) which I've
loaded into MySQL. Is there also a script that creates all the foreign keys
as specified in the
http://commons.wikimedia.org/wiki/File:Mediawiki-database-schema.png diagram?
Otherwise I can create them by hand.
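As far as I know there is no such script in the tarball: MediaWiki's tables.sql deliberately declares no FOREIGN KEY constraints (historically for MyISAM compatibility), so the arrows in that diagram are documentation only and creating them by hand is the expected route. A sketch of one such constraint (column names from the standard schema; verify against your tables.sql revision, and note InnoDB is required):

```sql
-- Example only: turn one arrow from the diagram (revision.rev_page ->
-- page.page_id) into a real constraint. MediaWiki neither ships nor
-- expects these, so add them on a copy used purely for analysis.
ALTER TABLE revision
  ADD CONSTRAINT fk_rev_page
  FOREIGN KEY (rev_page) REFERENCES page (page_id);
```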
--thanks;