Dear All,
Greetings of the day.
I need to download Wikipedia data for text retrieval and evaluation
in my academic research, so I need to know how to download the content of
the following Wikipedia link:
http://ur.wikipedia.org/wiki/
I need the data in the form of text files. I have already tried the Kiwix
software, but it does not fulfill my requirements.
Kindly suggest the most suitable solution.
Warm Regards
Imran Rasheed
(Research Scholar)
Indian School of Mines
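One way to get individual articles as plain text is the MediaWiki API's TextExtracts module. A minimal sketch, assuming that module is enabled on ur.wikipedia.org (function names here are illustrative):

```python
import json
import urllib.parse
import urllib.request

API = "https://ur.wikipedia.org/w/api.php"

def build_extract_params(title):
    """Build the query string asking TextExtracts for a plain-text extract."""
    return urllib.parse.urlencode({
        "action": "query",
        "prop": "extracts",
        "explaintext": 1,   # strip HTML, return plain text
        "format": "json",
        "titles": title,
    })

def fetch_plain_text(title):
    """Fetch the plain-text extract of one article."""
    with urllib.request.urlopen(f"{API}?{build_extract_params(title)}") as resp:
        data = json.load(resp)
    # The API keys pages by page id; take the single page returned.
    page = next(iter(data["query"]["pages"].values()))
    return page.get("extract", "")
```

For bulk retrieval of a whole wiki, the database dumps at dumps.wikimedia.org (the "urwiki" dump for Urdu Wikipedia) are a better fit than per-page API calls.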
Hello all,
As the original wikibugs bot did not come back online today, the tasks of
wikibugs have now fully been taken over by a python re-implementation of
the same bot, running on Tool Labs.
A few improvements over the old bot:
- running on Tool Labs instead of the WMF mail server, so it is
community-maintainable,
- messages to multiple channels based on bug properties (e.g. product and
component),
- several minor improvements:
-> shows changes instead of the new situation,
-> adds the first line or so of a comment,
-> shows product and component for each message,
-> better URLs (with an anchor to the specific comment),
-> real names are now *always* used (they are retrieved from BZ if the
mail does not list them).
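The multi-channel routing mentioned above can be pictured as a lookup from bug properties to IRC channels. The rules and channel names below are purely illustrative, not pywikibugs' actual configuration:

```python
# Map (product, component) pairs to IRC channels; a None component acts
# as a wildcard for the whole product. Names are made up for illustration.
CHANNEL_RULES = {
    ("MediaWiki", "Parser"): ["#mediawiki-parsoid"],
    ("Wikimedia Labs", None): ["#wikimedia-labs"],
}
DEFAULT_CHANNELS = ["#wikimedia-dev"]

def channels_for(product, component):
    """Pick the channels for a bug event: exact match, then product
    wildcard, then the default channel list."""
    exact = CHANNEL_RULES.get((product, component))
    if exact:
        return exact
    wildcard = CHANNEL_RULES.get((product, None))
    return wildcard or DEFAULT_CHANNELS
```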
Source code is available at https://github.com/valhallasw/pywikibugs
Comments, suggestions for improvements and pull requests are, as always,
very welcome.
Merlijn
FYI, if you're interested in continuing to talk about this and related
issues, I strongly encourage Wikitech subscribers to join the Design list.
---------- Forwarded message ----------
From: Steven Walling <swalling(a)wikimedia.org>
Date: Mon, Apr 28, 2014 at 6:42 PM
Subject: Typography refresh, now that dust has settled
To: "A list for the design team." <design(a)lists.wikimedia.org>
Hi everyone,
For the curious, I wanted to share a quick update on the typography
refresh. Here are four recent pieces of news that should be of interest:
1). The French Wikipedia community closed their vote on most aspects of the
new design (color, size, new headers, etc.) and it was very much in favor
of keeping the new design in all aspects.[a]
2). Spanish Wikipedians also closed their vote, which was more of a simple
yes/no on whether to revert. This was also in favor of keeping the new
design.[b]
3). Jon Robson has put up a patch to allow LESS styles to be set
per-language.[c] This means that many local site hacks, like Japanese
Wikipedia removing the serif headers or Farsi Wikipedia having to set
completely different Farsi-friendly fonts, will potentially no longer be
necessary. Review and testing are needed!
4). For some time now MediaWiki.org and the test replica of English
Wikipedia (en.wikipedia.beta.wmflabs.org) have been using a new proposed
body font stack by Erwin Dokter. This puts Nimbus Sans L first, and
restores the other body font settings like Helvetica Neue for Mac users,
Arial for all Windows users etc. Please try it out, especially if you're on
Windows or Linux. We'd like to put this in Vector/core at some point, if we
can make sure it works.
Thanks!
a.
https://fr.wikipedia.org/wiki/Wikip%C3%A9dia:Prise_de_d%C3%A9cision/Afficha…
b.
https://es.wikipedia.org/wiki/Wikipedia:Votaciones/2014/Sobre_la_actualizac…
c. https://gerrit.wikimedia.org/r/#/c/125760/
--
Steven Walling,
Product Manager
https://wikimediafoundation.org/
After a long and intense selection process, Google and the GNOME Foundation
have announced the new wave of selected GSoC and FOSS OPW projects. We have
16 + 7 Wikimedia projects, 23 in total. Two more than last year, our
highest number ever!
Congratulations to all the teams of contributors and mentors. Thank you
also to the participants who haven't been selected. We encourage you to
keep contributing and to apply again in the next call for proposals.
Google Summer of Code
https://www.mediawiki.org/wiki/Google_Summer_of_Code_2014
Aaron Xiao
UniversalLanguageSelector fonts for Chinese wikis
Aditya Chaturvedi
Catalog for mediawiki extensions
Amanpreet Singh
Annotation tool that extracts information and feeds it into Wikidata
Bartosz Dziewoński
Separating skins from core MediaWiki
Deepali Jain
Book management in Wikibooks/Wikisource
Hardik Juneja
Parsoid-based online-detection of broken wikitext
Jack Phoenix
A modern, scalable and attractive skin for MediaWiki
Jatin Mehta
Switching Semantic Forms Autocompletion to Select2
konarak
LUv2: Generic, efficient localisation update service
kunalgrover05
Critical bug fixes for Translate extension
Pratik Lahoti
Tools for mass migration of legacy translated wiki content
Rainer Rillke
Chemical Markup support for Commons or MediaWiki or both
Rohit Dua
(Automation Tool) Google Books > Internet Archive > Commons upload cycle
Tony Thomas
Adding proper email bounce handling to MediaWiki (with VERP)
Vikas S Yaligar
Automatic cross-language screenshots for user documentation
wctaiwan
MassMessage page input list improvements
FOSS Outreach Program for Women
https://www.mediawiki.org/wiki/FOSS_Outreach_Program_for_Women/Round_8
https://wiki.gnome.org/OutreachProgramForWomen/2014/MayAugust
Ali King
Template matching for RDFIO
Anjali Sharma
WikiHunt the ‘Property’
Dinu Sandaru Kumarasiri
Welcoming new contributors to Wikimedia Labs and Wikimedia Tool Labs
Frances Hocutt
Evaluating and improving MediaWiki web API client libraries
Helen Halbert
Feed the gnomes - Wikidata outreach
Jaime Lyn Schatz
OpenHistoricalMaps and Wikimaps
Marielle Volz
Improving URL citations on Wikimedia
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hi,
As previously announced [1], we've been facilitating a collective review
of Wikimedia's current product management tools and development
toolchain.
The most popular idea at the moment is to consolidate Wikimedia's
product management and infrastructure tools (such as Bugzilla, Gerrit,
RT, Mingle, Trello) into the all-in-one Phabricator. We have therefore
put together a request for comment (RFC) to bring this up for wider
discussion.
This discussion affects anyone who deals with bug reports, feature
requests and code changes in Wikimedia, so it's critical that you test
Phabricator for your own use and make your voice heard in the RFC:
https://www.mediawiki.org/wiki/Requests_for_comment/Phabricator
We're compiling a list of frequently asked questions at
https://www.mediawiki.org/wiki/Requests_for_comment/Phabricator/FAQ ;
you're welcome to add more and to help answer them :)
We'll host a few IRC discussions while the RFC is running to help answer
questions, etc. Our tentative times and dates are at
https://www.mediawiki.org/wiki/Talk:Requests_for_comment/Phabricator#IRC_di…
Thank you for your input!
Guillaume and Andre
[1] http://lists.wikimedia.org/pipermail/wikitech-l/2014-March/074896.html
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
Hi everyone!
I'm using VisualEditor and TemplateData to insert template calls into
wiki pages. My users find the "Add a template" dialog pretty hard to
understand: you first have to find a template, then add all of its
fields to the call, and only after that put a value into each of the
fields. Is it possible to automatically add all the fields to the
template call once the template is selected?
Cheers,
-----
Yury Katkov
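On the TemplateData side, parameters can be marked as required (and, in newer versions, suggested), which may get them pre-added when the template is inserted in VisualEditor. A minimal illustrative TemplateData block, with made-up parameter names:

```json
{
    "description": "Example template",
    "params": {
        "name": {
            "label": "Name",
            "type": "string",
            "required": true
        },
        "date": {
            "label": "Date",
            "suggested": true
        }
    }
}
```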
Hello all,
I would like to announce the release of MediaWiki Language Extension
Bundle 2014.04. This bundle is compatible with MediaWiki 1.22.6 and
MediaWiki 1.21.9 releases.
* Download: https://translatewiki.net/mleb/MediaWikiLanguageExtensionBundle-2014.04.tar…
* sha256sum: f20631d2629e0cf80df8ca022e6eec4d6d784e0cd39799f9fd46f338f4a7381a
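The checksum above can be verified before installing. A small sketch using Python's hashlib (the function names are illustrative):

```python
import hashlib

# sha256sum published with the MLEB 2014.04 release announcement.
EXPECTED = "f20631d2629e0cf80df8ca022e6eec4d6d784e0cd39799f9fd46f338f4a7381a"

def sha256_of(path):
    """Hash the file in chunks so large tarballs need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_bundle(path):
    """True if the downloaded bundle matches the published checksum."""
    return sha256_of(path) == EXPECTED
```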
Quick links:
* Installation instructions are at: https://www.mediawiki.org/wiki/MLEB
* Announcements of new releases will be posted to a mailing list:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
* Report bugs to: https://bugzilla.wikimedia.org
* Talk with us at: #mediawiki-i18n @ Freenode
Release notes for each extension are below.
-- Kartik Mistry
== Babel, CleanChanges and CLDR ==
* Only localisation updates.
== LocalisationUpdate ==
* Make sure that older MediaWiki versions also get updates for core messages.
== Translate ==
=== Noteworthy changes ===
* Added ElasticSearch support for translation memory and translation search.
* Set JSON message format as default for MediaWiki extensions.
* Add tracking (mw.track) for primary events: translation and proofread.
* Removed SingleFileBasedMessageGroup.
* Removed classes used for supporting the old MediaWiki core format
and updated related maintenance scripts.
* Localisation updates.
== UniversalLanguageSelector ==
=== Noteworthy changes ===
* Allow always logging tofu detection in EventLogging.
* Bug 62981: Fixed RTL positioning for compact interlanguage links.
* Localize the number in the "more languages" message in the compact
links feature.
* Bug 60815: Set Arabic as the writing system for Ottoman Turkish (ota).
* Bug 63718: Allow overriding the header styles from typography refresh.
* Add loading of messages using $wgMessagesDirs
=== Input Methods ===
* Added Batak input method.
* Fixed the Odia Lekhani method, as well as the InScript methods for
Hindi, Odia and Malayalam.
Thanks!
--
Kartik Mistry/કાર્તિક મિસ્ત્રી | IRC: kart_
{kartikm, 0x1f1f}.wordpress.com
There is a new bot for wikibugs, but it doesn't seem to respond well to
basic CTCP messages and provides no link to its documentation. Where can
the documentation be found? How is the bot controlled? How do I make it
join a channel and subscribe to a feed?