Hi all,

I have been meaning to ask this for a while: would it be possible for the team compiling such datasets to publish future (and, if possible, current) datasets on dumps.wikimedia.org, so that everything is easier to find in one place rather than scattered across different sites? Thanks!

On Tue, Oct 23, 2012 at 4:51 AM, Dario Taraborelli <dtaraborelli@wikimedia.org> wrote:
We've released a full, anonymized dump of article ratings (aka AFTv4) collected during the first year after the tool was deployed across the entire English Wikipedia (July 22, 2011 - July 22, 2012).

http://thedatahub.org/en/dataset/wikipedia-article-ratings

The dataset (which includes 11 million unique article ratings along 4 dimensions) is licensed under CC0 and supersedes the partial dumps originally hosted on the dumps server. Real-time AFTv4 data remains available as usual via the Toolserver. Feel free to get in touch if you have any questions about this data.

Dario
_______________________________________________
Wiki-research-l mailing list
Wiki-research-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wiki-research-l



--
Regards,
Hydriz

We've created the greatest collection of shared knowledge in history. Help protect Wikipedia. Donate now: http://donate.wikimedia.org