Hi David and team,
In Yi Liu's tool, Wikidata Property Explorer, I noticed that query
performance could be better. The query below currently takes about 9
seconds; is there anything that could help reduce that considerably?
Refactoring the query, backend changes, anything you can think of, David?
SELECT DISTINCT ?prop ?label ?desc ?type
       (GROUP_CONCAT(DISTINCT ?alias; SEPARATOR = " | ") AS ?aliases)
WHERE {
  ?prop (wdt:P31/(wdt:P279*)) wd:Q18616576;
        wikibase:propertyType ?type.
  OPTIONAL {
    ?prop rdfs:label ?label.
    FILTER((LANG(?label)) = "en")
  }
  OPTIONAL {
    ?prop schema:description ?desc.
    FILTER((LANG(?desc)) = "en")
  }
  OPTIONAL {
    ?prop skos:altLabel ?alias.
    FILTER((LANG(?alias)) = "en")
  }
}
GROUP BY ?prop ?label ?desc ?type
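One possible direction, though only an untested sketch on my side: since
wikibase:propertyType only appears on property entities, the
wdt:P31/wdt:P279* class path may be droppable if the goal is simply "all
properties", and the label service could replace the three OPTIONAL blocks
(assuming a comma-separated ?propAltLabel is acceptable in place of the
GROUP_CONCAT aliases):

SELECT ?prop ?propLabel ?propDescription ?propAltLabel ?type
WHERE {
  # wikibase:propertyType alone already restricts ?prop to property entities.
  ?prop wikibase:propertyType ?type.
  # The label service automatically binds ?propLabel, ?propDescription and
  # ?propAltLabel (aliases as a comma-separated list) for English.
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}

That would also remove the GROUP BY entirely, which might already cut a
good part of those 9 seconds. No idea if it fits the tool, but maybe it's
a starting point.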
Thad
https://www.linkedin.com/in/thadguidry/
https://calendly.com/thadguidry/
Hi all,
I’m excited to announce that the WMF Search team has just shipped the new
Streaming Updater for Wikidata Query Service (WDQS), with the final
server’s data transfer completing earlier today (19 Oct) — a little ahead
of (the revised) schedule!
You may know WDQS as a way of querying information from Wikidata. In order
to do this, WDQS ingests data, particularly edit updates, from Wikidata to
construct and maintain a massive knowledge graph. Wikidata has grown over
the years in size and usage, and WDQS had started becoming a bottleneck,
which created update lag.
The new Streaming Updater takes WDQS from an average of 10 edits/second to
an average of 88 edits/second – nearly a 9x increase in update throughput –
giving us a more up-to-date knowledge graph, as well as a more stable and
reliable update process.
For more information about some of the technical changes that could break
existing workflows and usage, see this earlier announcement
<https://www.wikidata.org/wiki/Wikidata:Project_chat/Archive/2021/03#New_WDQ…>
.
Big thanks and congratulations to the Search team, WMDE, and everyone else
involved for making this happen!
“It’s absolutely insane how fast the new streaming updater catches up on
lag; very exciting” – Ryan Kemper
Best,
Mike
—
*Mike Pham* (he/him)
Sr Product Manager, Search
Wikimedia Foundation <https://wikimediafoundation.org/>
Hi everyone,
In 2019 we published strategy papers for Wikidata and the Wikibase
Ecosystem (https://meta.wikimedia.org/wiki/Wikidata/Strategy/2019).
They have been very helpful for clarifying where we see Wikidata and the
Wikibase Ecosystem going, and for having conversations about that direction
with editors, other chapters and user groups, outside organisations, and
within our team.
Two years have passed since then and a lot has happened. Over the past
months we have therefore sat down again and taken the time to consider
where we are and where we want to go, based on everything we have learned:
the conversations we have had with many of you, the research we have done,
and how we and the world have changed since the first strategy papers were
published. Today we are publishing the result of all of that work and are
inviting your feedback.
You can find the new strategies at
https://meta.wikimedia.org/wiki/LinkedOpenData/Strategy2021 and we
would love to hear your feedback and thoughts on the talk page at
https://meta.wikimedia.org/wiki/Talk:LinkedOpenData/Strategy2021.
Cheers
Sam, Lea, Manuel, Lydia for the development team
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as a charitable
organization by the Finanzamt für Körperschaften I Berlin, tax number
27/029/42207.
The Search Platform Team
<https://www.mediawiki.org/wiki/Wikimedia_Search_Platform> usually holds
office hours the first Wednesday of each month. Come talk to us about
anything related to Wikimedia search, Wikidata Query Service, Wikimedia
Commons Query Service, etc.!
Feel free to add your items to the Etherpad Agenda for the next meeting.
Details for our next meeting:
Date: Wednesday, November 3rd, 2021
Time: 16:00-17:00 GMT / 09:00-10:00 PDT / 12:00-13:00 EDT / 17:00-18:00 CET
& WAT
*Note: this is an hour later in the US, thanks to "daylight confusion
time".*
Etherpad: https://etherpad.wikimedia.org/p/Search_Platform_Office_Hours
Google Meet link: https://meet.google.com/vgj-bbeb-uyi
Join by phone: https://tel.meet/vgj-bbeb-uyi?pin=8118110806927
Hope to talk to you next week!
—Trey
Trey Jones
Staff Computational Linguist, Search Platform
Wikimedia Foundation
UTC–4 / EDT
Hi all,
Join the Research Team at the Wikimedia Foundation [1] for its monthly
office hours this Tuesday, 2021-11-02, at 12:00-13:00 UTC (5am PT/8am
ET/1pm CET). Please note the time change! We are experimenting with our
office hours schedule to make our sessions more globally welcoming.
To participate, join the video-call via this link [2]. There is no set
agenda - feel free to add your item to the list of topics in the etherpad
[3]. You are welcome to add questions / items to the etherpad in advance,
or when you arrive at the session. Even if you are unable to attend the
session, you can leave a question that we can address asynchronously. If
you do not have a specific agenda item, you are welcome to hang out and
enjoy the conversation. More detailed information (e.g. about how to
attend) can be found here [4].
Through these office hours, we aim to make ourselves more available to
answer research-related questions that you, as Wikimedia volunteer editors,
organizers, affiliates, staff, and researchers face in your projects and
initiatives. Here are some example cases we hope to be able to support you
with:
- You have a specific research-related question that you suspect you should
be able to answer with publicly available data, but you don’t know how to
find the answer, or you just need some more help with it. For example: how
can I compute the ratio of anonymous to registered editors in my wiki?
- You run into repetitive or very manual work as part of your Wikimedia
contributions and you wish to find out whether there are ways to use
machines to improve your workflows. These types of questions can sometimes
be harder to answer during an office hour. However, discussing them helps
us understand your challenges better, and we may find ways to work with
each other to address them in the future.
- You want to learn what the Research team at the Wikimedia Foundation does
and how we can potentially support you. Specifically for affiliates: if you
are interested in building relationships with academic institutions in your
country, we would love to talk with you and learn more. We have a series of
programs that aim to expand the network of Wikimedia researchers globally,
and we would love to collaborate more closely with those of you interested
in this space.
- You want to talk with us about one of our existing programs [5].
Hope to see many of you,
Emily on behalf of the WMF Research Team
[1] https://research.wikimedia.org
[2] https://meet.jit.si/WMF-Research-Office-Hours
[3] https://etherpad.wikimedia.org/p/Research-Analytics-Office-hours
[4] https://www.mediawiki.org/wiki/Wikimedia_Research/Office_hours
[5] https://research.wikimedia.org/projects.html
--
Emily Lescak (she / her)
Senior Research Community Officer
The Wikimedia Foundation
Hello all!
I know that posting job offers on mailing lists is somewhat controversial,
but since this one is very much about Wikidata Query Service, it would feel
weird not to send it to the Wikidata community.
The Search Platform team is looking for a consultant to help shape the
technical future of Wikidata Query Service. Have a look at the job offer
[1] and apply if you are interested. Or send it to someone who might be
interested.
Thanks all!
Guillaume
[1] https://boards.greenhouse.io/wikimedia/jobs/3546920
--
*Guillaume Lederrey* (he/him)
Engineering Manager
Wikimedia Foundation <https://wikimediafoundation.org/>
Hello all,
We’re glad to announce that WikidataCon’s program is out
<https://pretalx.com/wdcon21/schedule/#>! 🎉
This year, WikidataCon is organized by Wikimedia Deutschland and Wiki
Movimento Brasil with “a sustainable future for Wikidata” as the main theme
and is taking place online on October 29-31.
For this Friday (starting at 12:00 UTC), you can expect a program curated
by the WikidataCon 2021 organization team
<https://www.wikidata.org/wiki/Wikidata:WikidataCon_2021/Program/Day_1_-_Mai…>,
including discussions around decolonizing Wikidata and knowledge justice,
Igbo and Abstract Wikipedia, GLAMs in Brazil, what has happened around
Wikidata and Wikibase, and more. We’ll also have a social space, the
ceremony of the WikidataCon Community Awards
<https://www.wikidata.org/wiki/Wikidata:WikidataCon_2021/Contribute/Communit…>,
and the Latin America and Caribbean scholars gathering
<https://www.wikidata.org/wiki/Wikidata:WikidataCon_2021/Latin_America_and_C…>.
And for the weekend (running through October 30 and 31 to accommodate
various time zones), we’ll have a program built by the community
<https://www.wikidata.org/wiki/Wikidata:WikidataCon_2021/Program/Day_2_and_3…>
in the following tracks:
- Tips & Tools
- Sister projects
- Sustainable future
- Reusable data
- GLAM
- Reimagining Wikidata from the margins
- Asia-Pacific
- Wikibase
- Education & Science
- WikiCite
A special thanks to the track curators who gathered so many people from
different places to present new perspectives and share cool projects with
the community! We’ll also have plenty of room to celebrate Wikidata’s
9th anniversary.
You can find the complete schedule here.
<https://pretalx.com/wdcon21/schedule/#2021-10-30> Small changes can still
happen, as there are more than 100 sessions in the program across different
time zones. Make sure to take a look at the schedule in advance so you
don’t miss your favorite ones.
Important: WikidataCon will take place in a virtual conference space - don’t
forget to register here <https://pretix.eu/WDCon21/WDCon21/> to have access
to the conference platform, attend the sessions and interact with
participants!
Thanks for reading this update. Hope you’re as excited as we are for
WikidataCon this year! You can find out more about the WikidataCon on this
page <https://www.wikidata.org/wiki/Wikidata:WikidataCon_2021>, and read
the previous updates on the talk page
<https://www.wikidata.org/wiki/Wikidata_talk:WikidataCon_2021>. If you have
any questions or suggestions, feel free to reach out to us by leaving a
comment below or contacting info{{(a)}}wikidatacon.org.
On behalf of the organization team,
*Érica Azzellini | User:EAzzellini (WMB)*
*Communications Manager | **Wiki Movimento Brasil*
*wmnobrasil.org <http://wmnobrasil.org/>*
Hello!
I’d like to provide a bit more background and briefly summarize our work on
the new WDQS updater from a technical perspective.
It has been common knowledge that the old updater had its issues. The main
ones among them:
- Low throughput, which often caused huge spikes of lag that were very hard
to recover from (see [1] for a good example).
- Reliance on Blazegraph to reconcile the data - Blazegraph’s reads affect
writes and vice versa, which quite often caused a cascading failure for
both update latency and query performance.
- Ineffective handling of eventual consistency - this was one of the
reasons for missing data in WDQS. What is worse, we had very little
visibility into what went missing.
We’ll be publishing a series of blog posts that will provide a more
in-depth description of the architecture and the challenges during
development - stay tuned!
In the meantime, I want to explain a few things about the new updater:
- Higher best-case lag is the result of a deliberate decision to trade low
latency for high consistency - considering the data we lost with the old
updater, we think this approach is better in our situation. We would rather
have a complete data set than a faster, incomplete one. To make sure that
we keep the lag manageable, we introduced an SLO [2] and will introduce
alerting to keep the lag under 10 minutes (a quick way to check the current
lag yourself is sketched right after this list).
- Data is reconciled within the pipeline, which has a dramatically lower
impact on Blazegraph. This should help with the updates, which was the
goal, but it also positively affects query engine stability.
- As we previously mentioned in the general announcement, the difference
in throughput is substantial (10 edits/sec vs 88 edits/sec) - which means
a much faster catch-up and more room for Wikidata to grow. The new updater
can be scaled even further if necessary.
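As a side note, here is a quick way to check how fresh the data on a WDQS
endpoint currently is. This is only a sketch; it assumes the endpoint
maintains the schema:dateModified triple on <http://www.wikidata.org>,
which, as far as I know, the public query service does:

SELECT ?lastUpdate
WHERE {
  # Timestamp of the most recently processed update on this endpoint;
  # comparing it to the current time gives a rough estimate of the lag.
  <http://www.wikidata.org> schema:dateModified ?lastUpdate.
}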
The new Streaming Updater didn’t magically resolve all the issues; there
are still two main ones that we need to address:
- Data loss - while the reconciliation mechanism works better than in the
old updater, we used to lose updates without any way of knowing about it
other than user feedback ([3], [4]). That is a really bad way of finding
out about issues. The new Streaming Updater can still miss data, especially
due to late events or eventual consistency, as mentioned before. What has
changed, however, is that the new updater has better inconsistency/late-event
reporting, which allows us to build a subsystem around it to reconcile the
data. More information here: [5].
- Blazegraph instability - no matter how fast and stable the new updater
might be, Blazegraph is still the last node in the process. That means that
the whole update process is affected by Blazegraph’s instability and will
in turn produce lag. One of the most common reasons for that instability is
a so-called “GC death spiral”. A server in that state won’t answer any
queries (which is a problem in itself), and even after a restart the lag
will remain high for some time. We are investigating a solution that can
help us with this: [6].
I hope that answers at least some of the concerns already raised. Rest
assured that we are working on many more improvements beyond the updater,
all of which are, as always, visible on our backlog board ([7]) and
workboard ([8]).
Any and all feedback welcome!
Regards,
Zbyszko
[1]
https://grafana.wikimedia.org/d/000000489/wikidata-query-service?viewPanel=…
[2] https://grafana-rw.wikimedia.org/d/yCBd7Tdnk/wdqs-lag-slo
[3] https://phabricator.wikimedia.org/T272120
[4] https://phabricator.wikimedia.org/T291609
[5] https://phabricator.wikimedia.org/T279541
[6] https://phabricator.wikimedia.org/T293862
[7] https://phabricator.wikimedia.org/tag/wikidata-query-service/
[8] https://phabricator.wikimedia.org/project/view/1227/
--
Zbyszko Papierski (He/Him)
Senior Software Engineer
Wikimedia Foundation <https://wikimediafoundation.org/>