Hi,
you know how links break over time... That's why many wikis use services
such as webcitation.org.
Is there a non-experimental extension that automatically creates
backups of links using webcitation.org, archive.org, or similar services?
Or one that does some other kind of backup, such as taking a screenshot of
the page or storing a copy of it?
(The ArchiveLinks extension [1] looks very promising, serving exactly this
purpose, but its documentation really stresses not to use it on a production wiki.)
Cheers,
adrelanos
[1] https://www.mediawiki.org/wiki/Extension:ArchiveLinks
This is a notice that on Tuesday, September 3rd between 20:00-21:00 UTC
(1-2pm PDT) Wikimedia Foundation will release security updates for current
and supported branches of the MediaWiki software, as well as several
extensions. Downloads and patches will be available at that time, with the
git repositories updated later that afternoon.
Chad Horohoe and I have been working on a new search backend for
MediaWiki for the past few months and it is finally ready for people
to try out! Project goals:
1. Parity with or better than the MWSearch/lsearchd combination used inside WMF.
2. Easier install than the MWSearch/lsearchd combination used inside WMF.
3. Expand templates before indexing.
4. Horizontal scalability for query performance (read this as sharded indexes)
It is currently running in WMF's beta environment
(http://beta.wmflabs.org/), on test2.wikipedia.org, and as an option
on mediawiki.org (you have to add the super secret URL parameter
"srbackend=CirrusSearch").
We think we've hit goals 2, 3, and 4 already. We're far enough along
with goal 1 that any missing features are being filed as bugs
(https://bugzilla.wikimedia.org/buglist.cgi?list_id=230108&resolution=---&qu…)
We'd love for some guinea pigs to give it a shot on wikis outside of
WMF. To set it up you'd need to:
1. Install elasticsearch (http://elasticsearch.org/)
2. Set up MediaWiki from the master branch:
git clone https://gerrit.wikimedia.org/r/mediawiki/core
<the rest of setup>
3. Clone CirrusSearch:
cd mediawiki/extensions
git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/CirrusSearch
4. Update the submodule in CirrusSearch:
cd CirrusSearch
git submodule init
git submodule update
5. Turn on CirrusSearch and configure it in LocalSettings.php by adding:
require_once( "$IP/extensions/CirrusSearch/CirrusSearch.php" );
$wgCirrusSearchServers = array(
'the_server_on_which_you_installed_elasticsearch');
6. Run two scripts to set up the index (a note on where to run them follows the list):
php updateSearchConfig.php
php forceSearchIndex.php
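If you're wondering where those two scripts live, they ship with the
extension; assuming the usual maintenance-script layout, the invocation
would look something like:
cd mediawiki/extensions/CirrusSearch/maintenance
php updateSearchConfig.php
php forceSearchIndex.php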
At some point we'll update mediawiki-vagrant to handle all that
automatically....
Please let me know what you think.
Nik Everett
I want to see what is in the LocalSettings.php that runs Wikipedia.
Surely there is a public copy somewhere. I wish to see why some of my
stuff is NOT working; perhaps that will assist me in figuring out my
errors.
Thanks
John
Any help with this:
> Import failed: No handler for model 'Scribunto' registered in
> $wgContentHandlers
I have Scribunto installed, and the wiki's special pages say it is working.
The pages I tried to upload (actually several, in an .xml file) were
downloaded (exported) from Wikipedia, so there should be no issue with how
the file itself was put together; therefore, I figure I have something
misconfigured on the new site.
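For reference, a minimal Scribunto setup in LocalSettings.php (assuming the
standalone Lua engine) would look roughly like the lines below; the
require_once line should be what registers the 'Scribunto' content model in
$wgContentHandlers:
require_once( "$IP/extensions/Scribunto/Scribunto.php" );
$wgScribuntoDefaultEngine = 'luastandalone';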
Thanks!
John
Hi,
I have a big set of articles and I want to import them into a MediaWiki site.
For this I have to create a huge number of articles and upload a similar
number of images. For creating the articles I think API:Edit
<https://www.mediawiki.org/wiki/API:Edit> would be better. I
wanted to use some PHP bots and found
Wikimate <https://github.com/hamstar/Wikimate> [1]. It is easy to use and
works fine for smaller texts, but it just does not work for large texts.
My question is: is there any PHP bot framework that will help me
create new articles with large texts and that also has a file upload feature?
Or should I go for some different approach?
[1] - https://github.com/hamstar/Wikimate
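If I end up skipping a framework, what I had in mind is a plain HTTP POST to
api.php with action=edit, since POSTing the text avoids any URL length limit.
A minimal sketch (the wiki URL, page title and token are placeholders; the
edit token and login cookies would still have to be obtained through the API
first):

<?php
// Minimal sketch: create or overwrite one page via action=edit, using POST so
// that large wikitext is not limited by URL length.
$api       = 'https://example.org/w/api.php';          // placeholder wiki
$editToken = 'FETCHED-EDIT-TOKEN';                      // placeholder, fetch via API

$fields = array(
    'action' => 'edit',
    'format' => 'json',
    'title'  => 'Some new article',                     // placeholder title
    'text'   => file_get_contents( 'article.txt' ),     // the large wikitext
    'token'  => $editToken,
);

$ch = curl_init( $api );
curl_setopt( $ch, CURLOPT_POST, true );
curl_setopt( $ch, CURLOPT_POSTFIELDS, http_build_query( $fields ) );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
curl_setopt( $ch, CURLOPT_COOKIEFILE, 'cookies.txt' );  // reuse a logged-in session
curl_setopt( $ch, CURLOPT_COOKIEJAR, 'cookies.txt' );
echo curl_exec( $ch ), "\n";                            // JSON result or error
curl_close( $ch );

File uploads could go through action=upload in a similar way, but I have not
tried that yet.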
--
Nasir Khan Saikat <http://profiles.google.com/nasir8891>
www.nasirkhn.com
Awesome update.
Nikolas Everett, 28/08/2013 20:21:
> Today we threw the big lever and turned on our new search backend at
> mediawiki.org <http://mediawiki.org>. It isn't the default yet but it
> is just about ready for you to try. [...]
> 2. The relative weighting of matches is going to be different. We're
> still fine tuning this and we'd appreciate any anecdotes describing
> search results that seem out of order.
I'm not very imaginative and I have no idea what queries to test,
especially when it comes to language-specific searches (which we will probably
be able to test some time soon on the beta.wmflabs.org language subdomains?).
I know from
<http://laxstrom.name/blag/2012/02/13/exploring-the-states-of-open-source-se…>
that it's possible to get corpora of actual search queries. It would be
nice to have some extract, or whatever is easy to produce, to get some
ideas of what to test and to improve our anecdotal assessment of the
quality of the search results.
Other ideas may come from previous bugs, I guess. :S
<https://bugzilla.wikimedia.org/buglist.cgi?query_format=advanced&component=…>
> [...]
> 6. incategory:"category with spaces" isn't working. (Bug 53415)
Does it work on categories not directly mentioned on the page? The answer may
be considered an obvious yes since you are indexing expanded templates; just
checking. (You probably want to avoid exceptions in template
indexing breaking this feature.)
> What we've changed that you probably don't care about:
> 1. Updating search in bulk is much slower than before. This is the
> cost of expanding templates.
> 2. Search is now backed by a horizontally scalable search backend that
> is being actively developed (Elasticsearch) so we're in a much better
> place to expand on the new solution as time goes on.
>
> Neat stuff if you run your own MediaWiki:
> CirrusSearch is much easier to install than our current search
> infrastructure.
Yay! :) Maybe people on mediawiki-l would like to test it too.
Nemo
>
> So what will you notice? Nothing! That is because while the new search
> backend (CirrusSearch) is indexing we've left the current search
> infrastructure as the default while we work on our list of bugs. You
> can see the results from CirrusSearch by performing your search as
> normal and adding "&srbackend=CirrusSearch" to the url parameters.
>
> If you notice any problems with CirrusSearch please file bugs directly
> for it:
> https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki%20extensions…
>
> Nik Everett
Hello,
I would like to announce the release of MediaWiki language extension
bundle 2013.08
* https://translatewiki.net/mleb/MediaWikiLanguageExtensionBundle-2013.08.tar…
* sha256sum: 21b3abf3a8e19d0c746d41d246e4bc8883d0f5e179d894e1720500031c621f2c
Quick links:
* Installation instructions are at https://www.mediawiki.org/wiki/MLEB
* Announcements of new releases will be posted to a mailing list:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
* Report bugs to https://bugzilla.wikimedia.org
* Talk with us at #mediawiki-i18n @ freenode
Release notes for each extension are below.
Kartik Mistry
== Babel, CLDR and LocalisationUpdate ==
* Only localisation updates.
== Translate ==
=== Noteworthy changes ===
* The initial translation area height has been increased. A taller
translation area provides more space for suggestions.
* Bug 47861: Allow text selection in the page mode of Special:Translate so
that parts of the original message can be copied and pasted.
* Bug 46875: The tooltip highlighting the proofread action now appears
less often in incorrect places.
* Bug 36692: On language stats and message group stats, rows can no
longer get stuck in highlighted state.
* On translation pages, a "Translate" tab is shown instead of the "Edit"
tab to improve usability and discoverability of the translation
functionality.
* Bug 49850: The page mode of Special:Translate incorrectly parsed
some square brackets as external links. Now it checks for a supported
protocol before turning them into a link.
* Bug 52623: Discard changes button now works as expected in Google Chrome.
* Map 'be-tarask' language to 'be' language in Yandex machine
translation suggestion.
* Bug 52272: Assistant languages suggestions ('In other languages')
are no longer stripped of newlines.
* A step forward toward arbitrary source languages for translatable pages
was taken. The Translate extension now respects the page source
language returned by MediaWiki. There is still no user interface to
set the page source language.
* Bug 49326: Add "notify translators" link after marking page for
translation when TranslationNotifications is installed.
* Loading of message content now works correctly. It was broken with
namespaces which did not force capitalization of the first letter.
* Bug 52216: Special:AggregateGroups now has a better group selector
which supports search.
=== Changes relevant to API users and developers ===
* Check for and disallow dynamic groups in ApiQueryMessageGroupStats.
* Reimplement beforeSubmit, afterSubmit and afterRegisterFeatures
hooks to support them in the new TUX editor.
== UniversalLanguageSelector ==
=== Noteworthy changes ===
* Added support for event logging. To use it you need to install the
EventLogging extension. The support is considered experimental.
To enable event logging, add the following lines to your LocalSettings.php:
require_once "$IP/extensions/EventLogging/EventLogging.php";
$wgMainCacheType = CACHE_MEMCACHED;
$wgMemCachedServers = array( '127.0.0.1:11211' );
$wgEventLoggingBaseUri = 'http://localhost:8080/event.gif';
$wgEventLoggingFile = '/var/log/mediawiki/events.log';
To learn more about setting up the EventLogging extension, see:
[1] https://github.com/wikimedia/mediawiki-extensions-EventLogging/blob/master/…
[2] https://www.mediawiki.org/wiki/Extension:EventLogging
* The libraries are loaded on demand, when they are actually used, to
reduce traffic.
* The language settings are closed when clicking outside the ULS language
selection window.
* Bug 50564: When the user makes changes in multiple modules and then
clicks the Cancel button or closes the language settings, the changes
in all the modules are now cancelled.
* Bug 50562: Canceling font change doesn't work for system font.
* Internal code changes to improve performance and make loading faster.
* The ULS for the IME menu is now positioned at the top with reference to
the input field instead of the ... menu item, to fix weird positioning in
some places.
* Bug 51923: Use no-repeat follow url for background images to remove
possible duplicate images
* Fixed a wrong language comparison in webfonts so that explicit
font-family styles are not added to child elements.
* Use events instead of callbacks for success or no results in the ULS
search box. This allows extension users to bind to these events and
reduces callbacks.
* Use mw.hook for notifying modules that the settings window was cancelled.
* Documentation fixes.
=== Browser Blacklisting ===
* Currently, Internet Explorer < 7 is blacklisted for MediaWiki
version 1.22, and Internet Explorer 6 and 7 are explicitly blacklisted
for MediaWiki before version 1.22.
=== Fonts ===
* Added the Gentium font for Latin-script text rich with diacritics, such as
IPA, Vietnamese and Polytonic Greek.
* Added Junicode font for Old English.
* Added Phetsarath font for Lao.
* Added lklug font for Sinhala.
* Added the Nuosu SIL font for the Yi language.
* Added the Xerxes font for Old Persian.
* Added Shapour font for Pahlavi script.
* Added Nazli as a serif font for Persian script.
=== Input methods ===
* Bug fixes in Gujarati Phonetic, Gujarati Inscript 2, Punjabi
Phonetic and Oriya keyboards that didn't allow typing some characters.
* Added Kyrgyz Cyrillic keyboard.
* Added IPA X-SAMPA layout.
* Fixed the IPA-SIL layout: use the "modifier letter apostrophe" for
ejective consonants.
* Fixed ZWNJ character issues for Hindi and Marathi input methods.
* Updated Javanese keyboard.
* Removed outdated Myanmar keyboard.
--
Kartik Mistry | IRC: kart_
{0x1f1f, kartikm}.wordpress.com
Good day, dear Sirs.
Please, I need your collaboration to set up a wiki server.
Regards,
César Renteria