Sven Andersson, 24/07/2020 03:53:
> Hi everyone! I'm looking for some reading. I hope this is an acceptable use of this list.
Sure.
>
> Which are the core Wikipedia papers that shaped the understanding of
> Wikipedia and Wikipedia research?
Hard to tell, but there was an attempt to have some selected pointers
organised by topic:
https://meta.wikimedia.org/wiki/Research:Codex
Depending on your selection criteria, you might also find some help from
various literature reviews which were published in the past, or from
semi-curated databases like Wikipapers
<http://wikipapers.referata.com/wiki/Main_Page>.
Federico
Morten Wang, 18/12/2020 17:23:
> One thing I've noticed is that all the papers I'm referencing focus on the
> English Wikipedia. When it comes to studies of other language editions, or
> across multiple ones, I've struggled to come up with a key paper to point
> to.
For this I usually reference Felipe Ortega's 2009 dissertation,
"Wikipedia: A Quantitative Analysis". All the basic trends were already
visible there, and at the time it was the only cross-language study
AFAIK (apart from Erik Zachte's statistics).
https://burjcdigital.urjc.es/handle/10115/11239
It's also linked from:
https://meta.wikimedia.org/wiki/Research:Resources
> Hopefully someone else chimes in and fills that hole, as it's important
> to recognize that "Wikipedia" doesn't equal the English one.
Indeed.
Federico
Andy Mabbett, 05/05/2018 17:33:
>> Both Wikidata and DBpedia surely can, and should, coexist because we'll
>> never be able to host in Wikidata the entirety of the Wikipedias.
> Can you give an example of something that can be represented in
> DBpedia, but not Wikidata?
More simply, there's still a long way to go until Wikidata imports all
the data contained in Wikipedia infoboxes (or equivalent data from other
sources), let alone the rest.
So, as Gerard mentions, DBpedia has something more/different to offer.
(The same is true for the various extractions of structured data from
Wiktionary vs. Wiktionary's own unstructured data.)
That said, the LOD cloud is about links, as far as I understand, so
Wikidata should be a very interesting node in it.
Federico
Thomas Douillard, 31/05/2019 13:27:
> Hi all, I find it hard to know what the connections are between Wikidata
> and authority control organizations and big live databases.
That's by design, since identifiers on Wikidata are not some kind of
top-down process where every single actor's responsibility is defined
from the beginning.
This doesn't preclude good things from happening, as we've seen with LOC:
https://blogs.loc.gov/thesignal/2019/05/integrating-wikidata-at-the-library…
Federico
Denny Vrandečić, 19/09/2019 19:56:
> I had used my 20% time to support such teams. The requests became more
> frequent, and now I am moving to a new role in Google Research, akin to
> a Wikimedian in Residence.
That's very interesting! Is this the first free culture project where
something like this has happened? From what you write, I understand it
will be something separate from the Google Open Source office, right?
Federico
Nicolas VIGNERON, 03/01/2018 17:34:
> So now, I'm sending you this mail to ask whether you have had to deal
> with things like this, or if you have any ideas of what should be done.
Assuming the copyfraudster has already been kindly pointed to
<https://commons.wikimedia.org/wiki/COM:REUSE>, the next step is for the
photographer to take action. He can try the friendly
<http://photoclaim.com/en/>.
Given the strong moral rights laws in France, lex loci protectionis and
various previous court rulings, I'd expect the lawyers to threaten a
lawsuit for a few thousand euros and quickly settle for less.
I also usually go fetch suggestions from
<https://wiki.creativecommons.org/wiki/Case_Law> but I don't see
anything from France there.
Federico
Leila Zia, 06/08/2019 07:45:
> [1] In our volunteer time, we're co-presidents of a non-profit
> organization and the organization has received the books from the
> publishers (Cambridge, Springer, Morgan & Claypool).
Do the donors prohibit you from donating the books to the Internet Archive?
Federico
Reception123, 06/03/2018 08:25:
> I was wondering how one could install and use the "Page Views" tool
> that Wikimedia uses on a non-WMF wiki.
I guess you could rebuild the entire cache and analytics clusters from
Puppet (supposedly documented somewhere around
<https://wikitech.wikimedia.org/wiki/Analytics>), or write something
from scratch that exposes data in the same API format.
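For the latter, a rough sketch of what such a shim could look like
(untested; it assumes Flask, and the hardcoded COUNTS dictionary is a
hypothetical stand-in for whatever hit log your wiki actually keeps). It
returns the same JSON shape as the /metrics/pageviews/per-article/
endpoint under <https://wikimedia.org/api/rest_v1/>:

from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical stand-in for a real hit log:
# (article title, YYYYMMDDHH timestamp) -> view count.
COUNTS = {
    ("Main_Page", "2018030600"): 123,
    ("Main_Page", "2018030700"): 145,
}

@app.route("/metrics/pageviews/per-article/<project>/<access>/<agent>"
           "/<article>/<granularity>/<start>/<end>")
def per_article(project, access, agent, article, granularity, start, end):
    # Timestamps are fixed-width digit strings, so plain string
    # comparison gives the right chronological range check.
    items = [
        {
            "project": project,
            "article": title,
            "granularity": granularity,
            "timestamp": ts,
            "access": access,
            "agent": agent,
            "views": views,
        }
        for (title, ts), views in sorted(COUNTS.items())
        if title == article and start <= ts <= end
    ]
    return jsonify({"items": items})

if __name__ == "__main__":
    app.run(port=8080)

A client written against the Wikimedia API should then work after just
swapping the base URL, e.g.:

curl http://localhost:8080/metrics/pageviews/per-article/en.wikipedia/all-access/user/Main_Page/daily/2018030600/2018030700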
Federico