Hello,
The Wikidata development team at Wikimedia Germany is planning a brief
survey of the core demographics that make up the Wikidata community in
order to provide us with a baseline for future diversity efforts.
To yield representative results, we want to deploy the survey to a broad
range of users via the CentralNotice
<https://meta.wikimedia.org/wiki/CentralNotice>.
We have put up a request for a CentralNotice banner
<https://meta.wikimedia.org/wiki/CentralNotice/Request#Wikidata_Community_Su…>;
the request is open for feedback and comments until February 17th, 2021.
You can also find more information about the survey, the use of the data,
and the questions asked
<https://www.wikidata.org/wiki/Wikidata:Usability_and_usefulness/2021-2-Surv…>.
Feel free to take a look and let us know if you have any questions.
Cheers,
--
Mohammed Sadat
*Community Communications Manager for Wikidata/Wikibase*
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
A new Wikifact project could be a resource for real-time collaborative fact checking. It could serve as a fact-checking resource for the editors of Wikinews and Wikipedia articles. It could also serve as a fact-checking resource for a broader set of end-users. Through an API, end-users could perform real-time fact checking via Wikifact while authoring or reviewing documents.
I have created a project proposal here: https://meta.wikimedia.org/wiki/Wikifact . I hope to improve it as discussion unfolds.
Best regards,
Adam Sobieski
Hi all,
Our host *wcqs-beta-01.eqiad.wmflabs* is running low on disk space due to
the size of its Blazegraph journal dataset. In order to free up space, we
will need to take the service down, delete the journal, and re-import from
the latest dump. The service interruption will begin at *Feb 4 18:30 UTC*
and continue until the data reload is complete.
We'll send out notifications when the downtime begins and when it ends.
*Note*: This doesn't affect WDQS, only the WCQS beta.
Please join us for the first installment of the LD4 Wikibase Working Hour
<https://www.wikidata.org/wiki/Wikidata:WikiProject_LD4_Wikidata_Affinity_Gr…>!
When: Thurs. 11 February 2021, 11AM-12PM Eastern
Registration: Please fill in this form
<https://columbiauniversity.zoom.us/meeting/register/tJUof-yspj0uE9PcSI9mOwB…>
to register
The February Working Hour will feature two presentations:
Pascal Lefeuvre: Noemi at the Bibliothèque nationale de France
In 2017, the BnF started a project to build new software for cataloguing its
documents. In 2019, the BnF decided to build the software, called Noemi, on
top of Wikibase. Wikibase will be used as an enhanced database for managing
and storing bibliographical data. The presentation describes how Wikibase is
used by Noemi and points out issues the BnF will deal with in 2021.
Stacy Allison-Cassin will talk about her use of Wikibase for describing
collections relating to Indigenous (First Nations, Métis, and Inuit)
peoples, and about recent work with IFLA.
Speaker bios:
Pascal Lefeuvre has been working in the IT department of the BnF as a
business analyst since 2012, mainly on digitization issues. Since 2017 he has
also served as the Scrum master of the IT team on the Noemi project. Anila
Angjeli and Benjamin Bober are co-project managers of the national project
"Fichier national d'entités", which aims to build a platform for jointly
creating and managing entity metadata across libraries and other cultural
and research partners. It is led by the two bibliographic agencies in
France: the BnF and Abes (the Bibliographic Agency for Higher Education). The
platform will use Wikibase as the basis for its technical infrastructure.
Stacy Allison-Cassin, PhD, Assistant Professor, Teaching Stream, Faculty of
Information, University of Toronto, Citizen of the Métis Nation of Ontario
About the LD4 Wikibase Working Hour:
The LD4 Wikibase Working Hour
<https://www.wikidata.org/wiki/Wikidata:WikiProject_LD4_Wikidata_Affinity_Gr…>
seeks to create a space for GLAM (Galleries, Libraries, Archives, and
Museums) professionals experimenting with Wikibase, the software that
Wikidata is based on, to learn collaboratively and share tips, tools, and
resources. For more details, see the project page
<https://www.wikidata.org/wiki/Wikidata:WikiProject_LD4_Wikidata_Affinity_Gr…>.
To sign up for monthly announcements, please fill out this form
<https://docs.google.com/forms/d/e/1FAIpQLScsUtWYIVOL8RF2RwBPiydMK1nZac_c7hE…>.
--
Timothy Ryan Mendenhall (he/him/his)
Metadata Librarian
Columbia University Libraries
Original and Special Materials Cataloging
102 Butler Library
535 West 114th Street
New York, NY 10027
trm2151(a)columbia.edu
(212) 851-2452
Hi folks!
This is an announcement for a change to the response of the wbeditentity
API module, which is a breaking change for MediaInfo entities (Structured
Data on Commons), a significant change for Lexeme entities (lexicographical
data), a minor change for Property entities, and a no-op for Item entities.
The wbeditentity API module, which can be used to edit any part of a
Wikibase Entity, has long included a part of the edited data in its
response. However, this response data was incomplete: it included e.g.
Labels, Statements, Sitelinks, but not the datatype of a Property or the
Lemmas of a Lexeme. Additionally, Statements were always returned under the
key "claims", even though MediaInfo entities generally use the key
"statements". On or around February 10, we will deploy a code change that
will make wbeditentity return the entity data using the standard
serialization format of an entity type, the same as is used by
wbgetentities and Special:EntityData. This means that the response will now
contain all the parts of an Entity, and also that, for MediaInfo entities,
the Statements will now be returned under "statements". (These Statements
will also be missing the "datatype", just like the MediaInfo data from
wbgetentities and Special:EntityData – see T246809
<https://phabricator.wikimedia.org/T246809>.)
To avoid breaking MediaInfo API users immediately, we are temporarily
adding Statements under the key "claims" as well – that is, the change on
February 10 is only significant, not yet breaking, and MediaInfo API users
can use either the "claims" or the "statements". On or around March 3, we
will remove this compatibility code, and MediaInfo API users will have to
use "statements" if they want to look at the Statements of the returned
entity data.
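For a concrete illustration, here is a minimal Python sketch (our own example,
not official client code) of how a MediaInfo API consumer might read
Statements from an already-parsed wbeditentity response during the transition
period:

# Prefer the new, standard "statements" key and fall back to the temporary
# "claims" compatibility key, which will be removed on or around March 3.
def get_statements(entity):
    if "statements" in entity:
        return entity["statements"]
    return entity.get("claims", {})

# Hand-written, abridged example of a MediaInfo response fragment
# (not real API output):
response = {
    "success": 1,
    "entity": {
        "id": "M123",
        "statements": {
            "P180": [{"mainsnak": {"snaktype": "somevalue", "property": "P180"}}]
        },
    },
}
print(get_statements(response["entity"]))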
It’s also worth mentioning here what the wbeditentity response data
represents. The API returns the Entity data as edited by your API request,
and the revision ID of the page that the change was saved under. This is
*not* necessarily the Entity data of that revision ID: if Wikibase patched
any edit conflicts between the base revision ID that your request specified
(baserevid parameter) and the actual latest revision ID, then those changes
are not included in the response. For example, suppose you load an Item with
revision ID *X* and labels “a” and “b” in different languages, and you make a
wbeditentity request with baserevid=*X* to change the label “a” to “A”, but in
the meantime someone else has already changed the label “b” to “B” and saved
this as revision *Y*. The API response for your request will then have a last
revision ID ("lastrevid") of *Z*; this revision *Z* will have labels “A” and
“B” if you get it from Special:EntityData, but the API response for your
request will have labels “A” and “b” (the result of applying your edit to the
base revision). If you need the actual latest
Entity data after your edit, make a separate request to Special:EntityData
or wbgetentities. (This is nothing new, and unaffected by the change being
announced here, but we thought it was still worth mentioning.)
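As a rough sketch (assuming the Python requests library and an anonymous GET;
a real client would reuse its own session and user agent), fetching the actual
latest entity data after an edit could look like this:

import requests

# Fetch the latest serialization of an entity from Special:EntityData,
# instead of relying on the entity data echoed back by wbeditentity.
def fetch_latest_entity(entity_id):
    url = f"https://www.wikidata.org/wiki/Special:EntityData/{entity_id}.json"
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()["entities"][entity_id]

# After a wbeditentity request returned lastrevid *Z* for item Q42, this
# returns the data of the actual latest revision, including any changes that
# were patched in from intervening edits by others.
latest = fetch_latest_entity("Q42")
print(latest["labels"].get("en"))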
If you have any issues or questions, feel free to leave a comment at T271105
<https://phabricator.wikimedia.org/T271105>.
Cheers,
Lucas
--
Lucas Werkmeister (he/er)
Full Stack Developer
Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de
Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us to achieve our vision!
https://spenden.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
Hi everyone,
WikiCred Demo Hour is hosting a special Q&A session with Mark Graham,
Director of the Wayback Machine at the Internet Archive, on Feb 12th EST.
This is everything you always wanted to ask about the Internet Archive and
the Wayback Machine but were afraid to ask. Mark Graham will give a quick
overview of some of the Internet Archive's projects and priorities and
engage with questions, suggestions, requests, and constructive criticism
about the work of this nearly 25-year-old non-profit. Mark is especially
interested in learning how the organization can evolve its services to be of
the most benefit to more people.
In addition to Mark's Q&A, Wikipedia editor and Wikidata expert Houcemeddine
Turki will host a hands-on demo on how to query bibliographical data from
Wikidata and extract visual graphics for academic research.
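As a small, generic illustration (not material from the demo itself), querying
bibliographical data from Wikidata programmatically can be as simple as
sending a SPARQL query to the public Wikidata Query Service endpoint, for
example with Python:

import requests

# Ask the Wikidata Query Service for a few scholarly articles (Q13442814)
# and print their English labels.
QUERY = """
SELECT ?article ?articleLabel WHERE {
  ?article wdt:P31 wd:Q13442814 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 5
"""

resp = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "demo-hour-example/0.1 (example only)"},
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["article"]["value"], row["articleLabel"]["value"])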
The call is open to the public, but we require that you sign up on this form
<https://docs.google.com/forms/d/e/1FAIpQLSeEpva_wwZA9xIjtTni7-y0tZ_VgXW8nzV…>
in order to receive the Zoom link.
Find our agenda here
<https://docs.google.com/document/d/1Blz4OAXypJe8yqMHFt4nQDufan2ZcmTWvHs5uXW…>.
*About WikiCred Demo Hour*
WikiCred Demo Hour is an initiative of the WikiCredibility Grant Initiative
<http://wikicred.org>, which supports Wikipedia and Wikimedia projects with
the goal of strengthening information reliability and credibility on the
internet. It is a regular monthly online meetup that brings together
members of the Wikimedian community, open-data and open-knowledge
enthusiasts, academics, journalists, researchers, and professionals from
the online credibility space.
Best regards,
Ahmed Medien
hackshackers.com | wikicred.org
Partners <https://www.wikicred.org/#who-we-are> / Projects
<http://wikicred.org/#projects> / WikiCred Demo Videos
<https://www.youtube.com/watch?v=pyA_FVf2mGM&list=PLQO0eq9qkTnNTtk4WIA1cx-5Y…>
A “Wikifacts” sister project is a great idea. “Wikifacts” seems distinct from both Wikinews and Wikidata. With some schemas for facts, defining their structure and interrelations, one could, however, utilize Wikidata as a backend.
Brainstorming (and I’m sure that those on these mailing lists are more familiar with what’s possible with wiki templates), with a “Wikifacts” sister project one could envision a wiki template for explicit facts. Perhaps such a template could resemble:
{{fact|User content goes here.}}
or
{{fact|F12345678|User content goes here.}}
With such a wiki template, editors could add explicit facts to Wikinews and Wikipedia articles.
End-users could hover over explicit facts in Wikinews or Wikipedia articles to view information (e.g. the number of informational messages, warnings, or errors) about the facts in tooltips and could click on a fact – or on a superscript hyperlink symbol – to navigate to the fact’s “Wikifacts” article.
The use of explicit facts in Wikinews or Wikipedia articles could, potentially, create new “Wikifacts” articles or synchronize with existing “Wikifacts” articles. Whenever a fact was updated or annotated via the “Wikifacts” project website, the editors of any dependent Wikinews or Wikipedia articles could receive emails, much as they can opt to watch articles for edits today. This would let Wikinews and Wikipedia editors revisit an article if or when any facts on which it depends change.
A “Wikifacts” project could also serve as a fact-checking resource for a set of end-users broader than the editors of Wikinews and Wikipedia articles. Through an API, end-users could perform real-time fact checking via “Wikifacts” while authoring or reviewing documents.
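As a purely hypothetical sketch (no such API exists; the endpoint, parameters, and response fields below are invented only to illustrate the idea), such a real-time fact-checking call from an authoring tool might look something like this in Python:

import requests

# Hypothetical example only: send a claim to an imagined “Wikifacts”
# fact-checking endpoint and receive informational messages, warnings, or
# errors about it. The URL and JSON fields are invented for illustration.
def check_claim(claim_text):
    resp = requests.post(
        "https://wikifacts.example.org/api/check",  # invented endpoint
        json={"claim": claim_text},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"fact_id": "F12345678", "messages": [...]}

# An editor plugin might call check_claim("...") as the author finishes a
# sentence and surface the returned messages in a tooltip.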
Best regards,
Adam
P.S.: Thank you for the information about the Wikipragmatica proposal.
From: Douglas Clark <clarkdd@gmail.com>
Sent: Thursday, February 4, 2021 6:50 PM
To: Wikimedia Mailing List <wikimedia-l@lists.wikimedia.org>
Subject: Re: [Wikimedia-l] Idea of a new project: Wikifacts ?
I proposed a project, WikiPragmatica <https://meta.wikimedia.org/wiki/Wikipragmatica>, that can support fake news detection. The retained context of the paraphrase graph can identify fake news patterns, similar to what MIT does with their detector.
On Thu, Feb 4, 2021 at 12:42 PM Galder Gonzalez Larrañaga <galder158(a)hotmail.com> wrote:
Does Wikinews cover this aspect?
From: Wikimedia-l <wikimedia-l-bounces(a)lists.wikimedia.org> on behalf of Chris Gates via Wikimedia-l <wikimedia-l(a)lists.wikimedia.org>
Sent: Thursday, February 4, 2021 8:20 PM
To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
Cc: Chris Gates <vermont(a)vtwp.org>
Subject: Re: [Wikimedia-l] Idea of a new project: Wikifacts ?
Hello,
Independent of my opinions on the validity of such a new Wikimedia project, there is currently an experiment with similar goals (and potentially a similar structure) over at Twitter:
https://blog.twitter.com/en_us/topics/product/2021/introducing-birdwatch-a-…
Best,
Verm
On Thu, Feb 4, 2021 at 2:17 PM Leinonen Teemu <teemu.leinonen(a)aalto.fi> wrote:
Hi all,
Has there been any discussion to start a new Wikimedia project focusing on fact checking?
Fact checking is of course at the core of editing Wikipedia, but I was thinking about a dedicated wiki site for fact checking of current events and news. Why would this be important?
(1) There are many fact-checking sites in the English-speaking world but far fewer elsewhere. I am afraid that there is an even greater need for fact checking in the rest of the world. {{Citation needed}}
(2) Our community is very well trained to do fact checking the wiki way. Again, internationally, many of our community members are real fact champions in their home countries and language groups. The practices of Wikipedia could be applied to fact checking of fast-moving current events and news, too.
(3) This could help us bring new young people into the movement, as getting started with editing the Wikipedias is no longer so easy (because they are already so good).
(4) In many parts of the world, fact checking can also be dangerous. With our anonymous and community-driven practices and services, we could help protect those fact checkers.
I am not sure what the state of Wikinews is, but my impression is that it is not really working. It was a good idea, but maybe a wiki, or the wiki way, is not the way to produce news. Also, the beautiful idea of citizen journalism has not really become reality. Maybe we could see whether the wiki and the wiki way work better for fact checking.
Peace,
- Teemu