Just wow... Thank you, WikiTeam and task force! Is ScraperWiki involved?
SJ
On Tue, Aug 7, 2012 at 5:18 AM, emijrp <emijrp@gmail.com> wrote:
Hi;
I think this is the first time a full XML dump of Citizendium is publicly
available[1] (CZ offers dumps, but only the last revision of each
article[2], and our previous efforts generated corrupted and incomplete
dumps). It contains 168,262 pages and 753,651 revisions (9 GB, or 99 MB in
7z). I think it may be useful to researchers, for example for quality analysis.
It was generated using WikiTeam tools.[3] This is part of our task force
to make backups of thousands of wikis around the Internet.[4]
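If you want to do the same for another wiki, the usual invocation of
WikiTeam's dumpgenerator.py looks roughly like this (exact flags may
differ slightly between versions; --xml requests the full revision
history and --images downloads the files as well):

    python dumpgenerator.py --api=http://en.citizendium.org/api.php --xml --images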
Regards,
emijrp
[1] http://archive.org/details/wiki-encitizendiumorg
[2] http://en.citizendium.org/wiki/CZ:Downloads
[3] http://code.google.com/p/wikiteam/
[4] http://code.google.com/p/wikiteam/wiki/AvailableBackups
--
Emilio J. Rodríguez-Posada. E-mail: emijrp AT gmail DOT com
Pre-doctoral student at the University of Cádiz (Spain)
Projects: AVBOT <http://code.google.com/p/avbot/> |
StatMediaWiki <http://statmediawiki.forja.rediris.es> |
WikiEvidens <http://code.google.com/p/wikievidens/> |
WikiPapers <http://wikipapers.referata.com> |
WikiTeam <http://code.google.com/p/wikiteam/>
Personal website:
https://sites.google.com/site/emijrp/
_______________________________________________
Wiki-research-l mailing list
Wiki-research-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wiki-research-l
--
Samuel Klein @metasj w:user:sj +1 617 529 4266