Hello everyone, 

I'm just sending a reminder that the Showcase below will be starting in half an hour. 

-Janna Layton 

Hi all,

The next Research Showcase, “Learning How to Correct a Knowledge Base
from the Edit History” and “TableNet: An Approach for Determining
Fine-grained Relations for Wikipedia Tables,” will be live-streamed
this Wednesday, March 20, 2019, at 11:30 AM PDT/18:30 UTC (please note
the change in the UTC time due to daylight saving changes in the U.S.).
The first presentation is about using edit history to automatically
correct constraint violations in Wikidata, and the second is about
interlinking Wikipedia tables.

YouTube stream: https://www.youtube.com/watch?v=6p62PMhkVNM

As usual, you can join the conversation on IRC at #wikimedia-research.
You can also watch our past research showcases at
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase .

This month's presentations:

Learning How to Correct a Knowledge Base from the Edit History

By Thomas Pellissier Tanon (Télécom ParisTech), Camille Bourgaux (DI
ENS, CNRS, ENS, PSL Univ. & Inria), and Fabian Suchanek (Télécom
ParisTech); WWW'19.

The curation of Wikidata (and other knowledge bases) is crucial to
keep the data consistent, to fight vandalism, and to correct
good-faith mistakes. However, manual curation of the data is costly.
In this work, we propose to take advantage of the edit history of the
knowledge base in order to learn how to correct constraint violations
automatically. Our method is based on rule mining, and uses the edits
that solved violations in the past to infer how to solve similar
violations in the present. For example, our system is able to learn
that the value [[d:Q467|woman]] of the [[d:Property:P21|sex or
gender]] property should be replaced by [[d:Q6581072|female]]. We
provide [https://tools.wmflabs.org/wikidata-game/distributed/#game=43
a Wikidata game] that suggests our corrections to users in order to
improve Wikidata. Both the evaluation of our method on past
corrections and the Wikidata game statistics show significant
improvements over baselines.
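
To make the rule-mining idea concrete, here is a minimal,
hypothetical Python sketch (not the authors' actual system): we count
how often past edits replaced a given (property, value) pair with
another value, and suggest the most frequent replacement, provided it
was used often enough. The identifiers mimic Wikidata's, but the edit
data is invented for illustration.

from collections import Counter, defaultdict

# Toy edit history: each entry records that some past edit replaced
# (property, old_value) with new_value.
edit_history = [
    ("P21", "Q467", "Q6581072"),   # sex or gender: woman -> female
    ("P21", "Q467", "Q6581072"),
    ("P21", "Q467", "Q6581097"),   # a noisy edit: woman -> male
    ("P21", "Q8441", "Q6581097"),  # man -> male
]

# Mine rules: for each (property, old_value), count the replacements
# that past editors applied.
replacements = defaultdict(Counter)
for prop, old, new in edit_history:
    replacements[(prop, old)][new] += 1

def suggest_correction(prop, value, min_confidence=0.5):
    """Suggest a replacement for a violating (property, value) pair
    if one past replacement dominates strongly enough to trust."""
    counts = replacements.get((prop, value))
    if not counts:
        return None
    best, best_count = counts.most_common(1)[0]
    if best_count / sum(counts.values()) >= min_confidence:
        return best
    return None

# A new violation: "woman" (Q467) used as the value of P21.
print(suggest_correction("P21", "Q467"))  # -> Q6581072 (female)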


TableNet: An Approach for Determining Fine-grained Relations for
Wikipedia Tables

By Besnik Fetahu

Wikipedia tables represent an important resource, where information
is organized with respect to table schemas consisting of columns. In
turn, each column may contain instance values that point to other
Wikipedia articles, or primitive values (e.g., numbers, strings). In
this work, we focus on the problem of interlinking Wikipedia tables
for two types of table relations: equivalent and subPartOf. Through
such relations, we can further harness semantically related
information by accessing related tables or the facts therein.
Determining the relation type of a table pair is not trivial, as it
depends on the schemas, the values therein, and the semantic overlap
of the cell values in the corresponding tables. We propose TableNet,
an approach that constructs a knowledge graph of tables interlinked
with subPartOf and equivalent relations. TableNet consists of two
main steps: (i) for any source table, an efficient algorithm that
finds all candidate related tables with high coverage, and (ii) a
neural approach that takes into account the table schemas and the
corresponding table data to determine, with high accuracy, the
relation for a table pair. We perform an extensive experimental
evaluation on the entire Wikipedia corpus of more than 3.2 million
tables. We show that we retain more than 88% of the relevant
candidate table pairs for alignment. Consequently, we are able to
align tables with subPartOf or equivalent relations with an accuracy
of 90%. Comparisons with existing competitors show that TableNet has
superior performance in terms of coverage and alignment accuracy.
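
As a rough illustration of the two-step structure (cheap candidate
generation, then finer-grained relation classification), here is a
hypothetical Python sketch. The schema-overlap heuristic and the
containment stub below are illustrative stand-ins, not the paper's
algorithm or its neural model, and the table data is invented.

def schema_jaccard(schema_a, schema_b):
    """Jaccard similarity between two sets of column headers."""
    a, b = set(schema_a), set(schema_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def candidate_tables(source, tables, threshold=0.25):
    """Step (i): keep tables whose schema overlaps the source's."""
    return [t for t in tables if t is not source
            and schema_jaccard(source["schema"], t["schema"]) >= threshold]

def classify_relation(source, candidate):
    """Step (ii): decide 'equivalent', 'subPartOf', or no relation.
    The paper trains a neural model over schemas and cell values;
    this stub merely compares schema containment."""
    a, b = set(source["schema"]), set(candidate["schema"])
    if a == b:
        return "equivalent"
    if a < b or b < a:
        return "subPartOf"
    return None

tables = [
    {"page": "List of lakes of Austria", "schema": ["Name", "Area", "Depth"]},
    {"page": "List of lakes of France", "schema": ["Name", "Area", "Depth"]},
    {"page": "Lakes by elevation", "schema": ["Name", "Area"]},
]
source = tables[0]
for cand in candidate_tables(source, tables):
    print(cand["page"], "->", classify_relation(source, cand))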

--
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology