Hi everyone,
We are delighted to announce that Wiki Workshop 2021 will be held
virtually in April 2021 as part of The Web Conference 2021 [1].
The exact day is yet to be finalized, but it will fall between April
19 and 23.
In the past years, Wiki Workshop has traveled to Oxford, Montreal,
Cologne, Perth, Lyon, and San Francisco, and (virtually) to Taipei.
Last year, we had more than 120 participants in the workshop, and we
are particularly excited about this year's edition, as we will
celebrate Wikipedia's 20th birthday.
We encourage contributions by all researchers who study the Wikimedia
projects. We specifically encourage 1-2 page submissions of
preliminary research. You will have the option to publish your work as
part of the proceedings of The Web Conference 2021.
You can read more about the call for papers and the workshop at
http://wikiworkshop.org/2021/#call. Please note that the deadline for
the submissions to be considered for proceedings is January 29. All
other submissions should be received by March 1.
If you have questions about the workshop, please let us know on this
list or at wikiworkshop(a)googlegroups.com.
Looking forward to seeing many of you in this year's edition.
Best,
Miriam Redi, Wikimedia Foundation
Bob West, EPFL
Leila Zia, Wikimedia Foundation
[1] https://www2021.thewebconf.org/
Hi,
I am trying to get alternative names of given names in Wikidata with the
following simple query:
PREFIX ps: <http://www.wikidata.org/prop/direct/>
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
CONSTRUCT {?s rdfs:label ?o}
WHERE { ?s ps:P31 wd:Q202444. ?s rdfs:label ?o}
LIMIT 1000
Initially, the query was much more complex, but I was getting time-outs on
the public Wikidata SPARQL endpoint. I decided to use Linked Data Fragments
to offload some filtering from the server to the client:
comunica-sparql "https://query.wikidata.org/bigdata/ldf" -f query > given_names.n3
(where "query" is a file with the SPARQL query shown above). Unfortunately,
when the client tries to get output from the 3rd page, I get the
following error:
Could not retrieve
https://query.wikidata.org/bigdata/ldf?subject=http%3A%2F%2Fwww.wikidata.or…
(500: unknown error)
Following the link indeed returns an HTTP 500 error with
Error details
java.lang.IllegalStateException
The link points to the 3rd page. It works if you try to go to the second page:
https://query.wikidata.org/bigdata/ldf?subject=http%3A%2F%2Fwww.wikidata.or…
Is this a bug or a limitation of the service?
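While waiting for an answer on the LDF endpoint itself, one possible workaround is to page through the standard WDQS SPARQL endpoint with LIMIT/OFFSET instead. Below is a minimal sketch in Python; the helper names and the User-Agent string are my own, and I use the conventional wdt: prefix for the direct-property namespace (the same namespace the query above binds to ps:).

```python
import urllib.parse
import urllib.request

# Template for one page of results; the {{ }} braces are escaped for str.format.
BASE_QUERY = """\
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
CONSTRUCT {{ ?s rdfs:label ?o }}
WHERE {{ ?s wdt:P31 wd:Q202444 . ?s rdfs:label ?o }}
LIMIT {limit} OFFSET {offset}
"""

def page_query(page: int, page_size: int = 1000) -> str:
    """Build the CONSTRUCT query for the given zero-based page."""
    return BASE_QUERY.format(limit=page_size, offset=page * page_size)

def fetch_page(query: str) -> bytes:
    """Fetch one page of N-Triples from the public WDQS endpoint."""
    url = ("https://query.wikidata.org/sparql?"
           + urllib.parse.urlencode({"query": query}))
    req = urllib.request.Request(url, headers={
        "Accept": "application/n-triples",        # serialization of the CONSTRUCT result
        "User-Agent": "given-names-export/0.1",   # WDQS asks clients to send a UA string
    })
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

Note that OFFSET-based paging over an unordered result set is not guaranteed to be stable between requests; adding an ORDER BY ?s clause makes the pages deterministic, at some extra cost on the server.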
With kind regards,
Maciej Gawinecki
Hello all,
The Wikimedia Foundation Board of Trustees is organizing a call for
feedback[1] about community selection processes between February 1 and
March 14. While the Wikimedia Foundation and the movement have grown about
five times in the past ten years, the Board’s structure and processes have
remained basically the same. As the Board is designed today, we have a
problem of capacity, performance, and lack of representation of the
movement’s diversity. Our current processes to select individual volunteer
and affiliate seats have some limitations. Direct elections tend to favor
candidates from the leading language communities, regardless of how
relevant their skills and experience might be in serving as a Board member,
or contributing to the ability of the Board to perform its specific
responsibilities. It is also a fact that the current processes have favored
volunteers from North America and Western Europe. In the upcoming months,
we need to renew three community seats and appoint three more community
members in the new seats. This call for feedback is to see what processes
can we all collaboratively design to promote and choose candidates that
represent our movement and are prepared with the experience, skills, and
insight to perform as trustees?
In this regard, two rounds of feedback meetings are being hosted to collect
feedback from the technical communities in Wikimedia. Both rounds have the
same agenda, to accommodate people from various time zones across the
globe. We will be discussing ideas proposed by the Board and the community
to address the problems mentioned above. Please sign up for whichever
session is most convenient for you. You are welcome to participate in
both as well!
Round 1 - Feb 24, 4:00 pm UTC[2]
Round 2 - Mar 3, 4:00 am UTC[3]
Sign-up and meeting details:
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_of_Trustees/Call…
Please let me know if you have any questions.
Best,
Krishna Chaitanya
[1]
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_of_Trustees/Call…
[2] https://zonestamp.toolforge.org/1614182448
[3] https://zonestamp.toolforge.org/1614744041
Hi everyone,
As you may know, you can include Sitelinks (also known as interwiki links
or interlanguage links) that go from individual Items in Wikidata to pages
on other Wikimedia sites including Wikisource
<https://en.wikipedia.org/wiki/Wikisource>. Until now, it was not possible
to manage the interwiki links to/from multilingual Wikisource on Wikidata.
This was due to the unusual technical setup of multilingual Wikisource as
well as the fact that we can only link to one page for a topic on any given
wiki (multilingual Wikisource can theoretically have several pages covering
the same topic in different languages though this is very rare).
We have resolved the technical constraint, and starting today you can connect
pages on multilingual Wikisource with the other language-specific versions
of Wikisource so these interwiki links don't have to be maintained in the
wikitext anymore. After adding a sitelink, the sitelink appears on the Item
page (in the Sitelinks section) with language code mul. The restriction of
only being able to link to one page on multilingual Wikisource per Item
stays in place and was considered acceptable based on feedback by the
Wikisource community.
This feature was requested by users from the Wikisource community. We hope
that it can give them more access to Wikidata and the freedom to add
multiple links to different Items.
If you encounter any issues or want to provide feedback, feel free to use this
Phabricator ticket <https://phabricator.wikimedia.org/T138332>.
Cheers,
--
Mohammed Sadat
*Community Communications Manager for Wikidata/Wikibase*
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Thank you, Gerard. I have used the tool you alluded to as an end-user, changing the display language from English to Spanish, to traditional Chinese, or to simplified Chinese.
But the <language> for describing an entity in Wikidata is on the data-recording side. At the moment, <Chinese> (zh) appears to represent both spoken and written Chinese, but the scripts used are all over the map. I cannot figure out what the best practices are. Would you mind pointing me to the guidelines, if you are aware of any?
Thank you very much.
—Jackie
> Begin forwarded message:
>
>
> Date: Sun, 21 Feb 2021 10:23:13 +0100
> From: Gerard Meijssen <gerard.meijssen(a)gmail.com>
> To: Discussion list for the Wikidata project <wikidata(a)lists.wikimedia.org>
> Subject: Re: [Wikidata] Guidelines for entering label in various languages, e.g. Chinese
>
>
> Hoi,
> For Wikipedia we have a tool that provides to the reader the simplified or
> traditional representation based on the users preference. We could do that
> for Wikidata if we cared to. The same is true for Commons, its search
> engine works for any language based on the availability of Wikidata labels
> in other languages and we could serve more than half the world's population
> if we cared to.
> Thanks,
> GerardM
>
> On Fri, 19 Feb 2021 at 21:09, j s <aa3544bd(a)gmail.com> wrote:
>
>> Hi,
>>
>> I was alerted that there are many properties without a label in zh-Hant <https://www.wikidata.org/w/index.php?search=-haslabel:zh-hant&title=Special…>.
>> When I reviewed the search results, I was surprised to see that traditional
>> Chinese (zh-Hant) and simplified Chinese (zh-Hans) scripts were both recorded
>> under the label Chinese (zh).
>>
>> Does anyone know if there are guidelines for entering data under different
>> language tags, subtags for script and region? Thank you for your help.
>>
>> ---
>> Jackie Shieh
>> Descriptive Data Management
>> Discovery Services Division
>>
>> LibrariesArchives.si.edu
>> library.si.edu
>>
>>
> *****************************************
Hi,
I'm trying to make this query work with Wikipedia redirects:
https://w.wiki/$sF
SELECT * WHERE {
VALUES ?page {
"Aachen"@en
"Akron"@en
}
?sitelink schema:name ?page;
schema:isPartOf <https://en.wikipedia.org/>;
schema:about ?city.
}
Aachen works, but Akron does not, as it is a redirect. How can I make a query
that works with such redirects?
I'm trying to get Wikidata ids for the links from this table:
https://en.wikipedia.org/wiki/List_of_cities_by_GDP
(The HTML has an mw-redirect class on the redirect links.)
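One way around this, assuming the redirect pages simply have no schema:about sitelink in WDQS, is to resolve the titles first with the MediaWiki API (action=query with redirects=1) and then feed the canonical titles into the VALUES block of the query. A sketch in Python; the helper names (resolve_redirects, api_fetch) are mine, not a library API, and the fetch function is injectable so the resolution logic can be exercised without network access:

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def api_fetch(params: dict) -> dict:
    """Default fetcher: GET the MediaWiki API and parse the JSON reply."""
    url = API + "?" + urllib.parse.urlencode(params)
    req = urllib.request.Request(url, headers={"User-Agent": "redirect-resolver/0.1"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def resolve_redirects(titles, fetch=api_fetch):
    """Map each title to its redirect target (or to itself if not a redirect).

    The API accepts up to 50 titles per request, so longer lists would
    need to be chunked.
    """
    reply = fetch({
        "action": "query",
        "redirects": 1,              # follow redirects server-side
        "titles": "|".join(titles),
        "format": "json",
    })
    resolved = {t: t for t in titles}
    for r in reply.get("query", {}).get("redirects", []):
        resolved[r["from"]] = r["to"]
    return resolved
```

With this, resolve_redirects(["Aachen", "Akron"]) would map "Akron" to its redirect target, and the resolved titles can then be substituted into the VALUES block of the original query.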
Regards,
Zsolt
Hi everybody,
the end of February is approaching, and with it, the end of the
consultation for the UCoC - please, contain your happiness! :)
Jokes aside, in these last days of consultation there are some more
little actions that you can take to help us in the implementation of
the UCoC:
# you can take the survey[1] about the UCoC (please note that this
survey is hosted by a third-party platform, so read the Survey Privacy
Statement);[2]
# if you don't want to use the survey, you can contact me *off-list
and in private* to have a chat about the UCoC consultation and any
related issue at sannita-ctr(at)wikimedia.org;
# you can participate in the UCoC game:[3] help the young community of
the "Galactic Cookbook Database"[4] by suggesting possible methods,
tools, and solutions to implement their general principles for users to
follow;
# you can answer the previous three rounds of questions on the
consultation's talk page.[5]
As always, please let me know if you have any questions or concerns or
anything to say!
Cheers,
Luca Martinelli
UCoC facilitator for Wikidata
[1] https://docs.google.com/forms/d/e/1FAIpQLSccW9RnWZgP2YYIkBF72Wa3iGd5U4Ut6AU…
[2] https://foundation.wikimedia.org/wiki/Universal_Code_of_Conduct_Feedback_Su…
[3] https://www.wikidata.org/wiki/Wikidata:Universal_Code_of_Conduct_consultati…
[4] Yes, it's a nod to The Hitchhiker's Guide to the Galaxy.
[5] https://www.wikidata.org/wiki/Wikidata_talk:Universal_Code_of_Conduct_consu…
Hoi,
At one time, WikiCite was alive and well. Now people at Wikidata state
that, given that the WikiCite roadmap has not been updated for a long
time, the project is presumed dead. [1] As a consequence it is all too
easy to ask for the "cleanup" of the existing scholarly data, which in
my opinion misrepresents what has gone before.
In the years since the last WikiCite roadmap update, a lot has changed.
- Magnus rewrote many of his tools in Rust, including the SourceMD
tools; it made no difference for the community.
- Elsevier has opened up its references; they are now available at
Crossref.
- Scholia now knows where a paper is used as a reference, particularly
in the English Wikipedia.
- Scholia templates exist for many subjects and scientists in the English
Wikipedia.
- The information about papers used as references is now being improved
with information from Wikidata.
- There was an initial run linking books known by their ISBN from
Wikidata to Open Library.
Personally, I still add papers, one at a time, and use them as "cites work"
references. For books, I add the book and often link to Open Library. I
care about ecology and rewilding, and when I feel compelled to work on a
specific paper, I will. [2] When I come across a scientist who is in the
news, I use the Author Disambiguator to link to their papers.
The last I heard about plans for WikiCite was a discussion about what to do
next, centred on the notion that we "could" have all the papers in a
Wikibase. As far as I am aware, whatever has happened since is not generally
known; it may be a lot, but I expect nothing much. I prefer to be surprised.
When the people who have their own pet projects get their way, it will
destroy all the work that has been done. It will destroy mine. The notion
that it will be for the better can be understood from their perspective. My
problem is that it will only make Wikidata more biased. When you look at
any subject that has worldwide validity, its coverage is dominated by
what we know, and that is North American and European. You are unlikely to
find any African city with all its mayors. We do not know all the national
ministers of the twenty-first century, and obviously not of the twentieth
century, for the African countries.
For WikiCite to be alive and well, it needs a goal. For me, that goal is
for all the references to scientific papers to be known in Wikidata,
including the papers they cite and their authors. This will provide a
rabbit hole where people can find additional material on a subject. In
addition, it will show when the science referenced in a Wikipedia article
is out of date. That happens, and old ideas are jealously protected.
So what will it be? Is there life in WikiCite?
Thanks,
GerardM
[1]
https://www.wikidata.org/wiki/Wikidata:Project_chat#Cleanup/Import_of_scien…
[2] https://scholia.toolforge.org/work/Q105451449