Hi everyone,
I'm currently working on a concept to improve how Wikidata's data is
edited from a client (e.g. Wikipedia and its sister projects), in
collaboration with Lydia Pintscher, as part of my Bachelor's thesis.
I would greatly appreciate it if you could add your input and comments to
the page below and pass this on to the other wikis and their editors,
since their input is crucial to this project.
https://www.wikidata.org/wiki/Wikidata:Client_editing_input
Thanks!
Charlie
After 4 successful meetings in Europe, we will cross the Atlantic for the
next one: we are happy to announce that the 5th DBpedia meeting will be
held at Stanford University, Palo Alto, on November 5th, 2015.
Please read below for the different ways you can participate. We are
looking forward to meeting all the US-based DBpedia enthusiasts in person.
The event will feature talks from Yahoo!, IBM Watson, Blippar, and Stanford
amongst others.
Quick facts
- Web URL: http://wiki.dbpedia.org/meetings/California2015
- When: November 5th, 2015
- Where: MSOB x303, Stanford University, Palo Alto, CA (map
  <https://goo.gl/qOGs7Z>)
- Host: Dumontier Lab, Stanford University
- Call for Contribution: submit your proposal in our form
  <https://docs.google.com/forms/d/1e_sxoJYZcAh5pZ8Gb4T0ZKG4CcHZvvG6g3YYjlUd5f…>
- Registration: https://event.gg/1742-5th-dbpedia-meeting-california
  (limited seats)
- Looking for sponsors (catering)
Acknowledgments
If you would like to become a sponsor of the 5th DBpedia Meeting, please
contact the DBpedia Association <dbpedia(a)infai.org>.
- Dumontier Laboratory for Biomedical Knowledge Discovery
  <http://dumontierlab.stanford.edu/>, for hosting the meeting and
  helping with the organization
- Institute for Applied Informatics <http://infai.org/de/Aktuelles>, for
  supporting the DBpedia Association
- OpenLink Software <http://www.openlinksw.com/>, for the continuous
  hosting of the main DBpedia endpoint
Organisation
- Michel Dumontier <https://med.stanford.edu/profiles/michel-dumontier>,
  Stanford University
- Pablo N. Mendes
  <http://researcher.ibm.com/researcher/view_person_pubs.php?person=us-pnmende…>,
  IBM Watson Research
- Marco Fossati <https://about.me/marco.fossati>, SpazioDati
- Dimitris Kontokostas <http://aksw.org/DimitrisKontokostas>, DBpedia
  Association and AKSW, Uni Leipzig
- Sebastian Hellmann
  <http://bis.informatik.uni-leipzig.de/SebastianHellmann>, DBpedia
  Association and AKSW, Uni Leipzig
Registration
Attending the DBpedia Community meeting is free, but you need to register
<https://event.gg/1742-5th-dbpedia-meeting-california>. There are 3 types
of tickets that are booked separately:
- Ticket for the main event (50 people)
- Ticket for the pre-event (20 people)
- Optional DBpedia support ticket
Call for Contribution
Please submit your proposal through our form
<https://docs.google.com/forms/d/1e_sxoJYZcAh5pZ8Gb4T0ZKG4CcHZvvG6g3YYjlUd5f…>.
Contribution proposals include (but are not limited to) presentations,
demos, lightning talks and session suggestions. All talks are accepted by
default and will be added to the program in batches.
Location / Venue
The meeting will take place at MSOB x303, Stanford University, Palo Alto,
CA (map <https://goo.gl/qOGs7Z>).
Parking info: there is a parking structure on Pasteur Dr. near the
hospital, as well as in L-17; both are only a five-minute walk away. The
yellow [P] marks the areas for paid public parking. See the Stanford
transportation maps <https://transportation.stanford.edu/maps.shtml> for
details, especially the south-west map.
<http://transportation.stanford.edu/images/campus-parking-map-nw.png>
Schedule (draft)
16:30 Pre-event meeting (room TBA)
Separate pre-meeting; depending on the audience we can do hackathons,
tutorials, Q&A sessions, etc. (needs separate registration).
19:00 Main event (room MSOB x303)
- Introductions (5 min)
- DBpedia state of affairs (10 min)
- Invited talks (1h)
  - Michel Dumontier, Stanford
  - Anshu Jain, IBM Watson
  - Nicolas Torzec, Yahoo!
  - Karthik Gomadam, Accenture
  - Joakim Soderberg, Blippar
  - Alkis Simitsis, HP Labs
  - Yashar Mehdad, Yahoo! Labs
- Lightning talks (1h): use our form
  <https://docs.google.com/forms/d/1e_sxoJYZcAh5pZ8Gb4T0ZKG4CcHZvvG6g3YYjlUd5f…>
  to submit your talks
- Q&A session / audience introduction
- Networking and refreshments
--
Dimitris Kontokostas
Department of Computer Science, University of Leipzig & DBpedia Association
Projects: http://dbpedia.org, http://aligned-project.eu,
http://rdfunit.aksw.org
Homepage: http://aksw.org/DimitrisKontokostas
Research Group: http://aksw.org
I've been reading mostly the archives and the GitHub tickets so far, but
given the interest in the Primary Sources Tool, maybe it's time I joined
the mailing list ;-)
Following up on the discussions of the last few weeks, I added two new
features to the backend:
1. For any request listing statements, it is now possible to filter by
state, the default being "unapproved".
For example, you can select 10 random statements that have been marked
"wrong" (i.e. rejected in the UI) with

curl -i "https://tools.wmflabs.org/wikidata-primary-sources/statements/any?state=wro…"

or retrieve all approved statements with

curl -i "https://tools.wmflabs.org/wikidata-primary-sources/statements/all?state=app…"

(NB: the /all endpoint uses paging; use the offset= and limit= parameters
to control how much is returned.)
The different acceptable states are defined in
https://github.com/google/primarysources/blob/master/backend/Statement.h#L14
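
As a small illustration, here is a Python sketch of how a client might
consume these endpoints. It assumes, as the examples above suggest, that
both endpoints return a JSON list of statement objects; the exact
response shape should be checked against the backend.

import requests

BASE = "https://tools.wmflabs.org/wikidata-primary-sources"

# 10 random statements in a given state (the backend default is
# "unapproved"); "wrong" and "approved" are among the accepted values.
wrong = requests.get(BASE + "/statements/any",
                     params={"state": "wrong"}).json()

# /all pages through every matching statement; offset= and limit=
# control which slice is returned (values here are arbitrary).
approved = requests.get(BASE + "/statements/all",
                        params={"state": "approved",
                                "offset": 0, "limit": 100}).json()

print(len(wrong), len(approved))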
2. All statements that have already had some form of interaction (e.g.
have been approved or rejected) now contain a new JSON field "activities"
listing the activities acting on the statement. Even though there will
usually be at most one activity (i.e. approved or rejected), the system
stores (and has stored since we launched) a complete history, e.g. for
transitions like unapproved -> wrong -> unapproved -> approved.
You can try it out by retrieving a random selection of statements in
states other than "unapproved", e.g. as before:
curl -i "https://tools.wmflabs.org/wikidata-primary-sources/statements/any?state=wro…"
will give you results like:
{
  "activities" : [
    {
      "state" : "wrong",
      "timestamp" : "+2015-05-09T14:26:45Z/14",
      "user" : "Hoo man"
    }
  ],
  "dataset" : "freebase",
  "format" : "v1",
  "id" : 31,
  "state" : "wrong",
  "statement" : "Q1702409\tP27\tQ145\tS854\t\"http://www.astrotheme.com/astrology/Warren_Mitchell\"",
  "upload" : 0
}
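
If it helps anyone, a small Python sketch along these lines (again
assuming the /any endpoint returns a list of objects shaped like the
example above) can summarize the stored histories:

import requests

resp = requests.get(
    "https://tools.wmflabs.org/wikidata-primary-sources/statements/any",
    params={"state": "wrong"})

for stmt in resp.json():
    # Rebuild the state history from the "activities" list; each entry
    # records a state change with user and timestamp.
    history = " -> ".join(a["state"] for a in stmt.get("activities", []))
    # The "statement" field is tab-separated: subject, property, value,
    # optionally followed by reference property/value pairs.
    subject, prop, value = stmt["statement"].split("\t")[:3]
    print(stmt["id"], subject, prop, value, "| history:", history or "(none)")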
Hope it is useful ;-)
Let me know if you are interested in other analysis data. I'll try to add
more features as time permits.
Cheers!
--
Dr. Sebastian Schaffert | GMail Site Reliability Manager |
schaffert(a)google.com | +41 44 668 06 25
Hi Denny, Thomas,
I would like to thank you both for your support in making the StrepHit
soccer dataset available! I owe you some hectolitres of beer :-)
There is one thing that was mentioned during our summer discussions and
that I sadly forgot: shall the Freebase ontology mappings be added to
Wikidata?
If a Freebase endpoint still exists, it may make sense to proceed as with
the DBpedia mappings.
See for instance the "equivalent class" claim on astronaut:
https://www.wikidata.org/wiki/Q11631
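
For reference, those existing mappings can be inspected programmatically.
A quick Python sketch, assuming P1709 is the "equivalent class" property
used for such claims, via the standard wbgetclaims API:

import requests

# Fetch the "equivalent class" (P1709) claims on astronaut (Q11631).
resp = requests.get("https://www.wikidata.org/w/api.php",
                    params={"action": "wbgetclaims",
                            "entity": "Q11631",
                            "property": "P1709",
                            "format": "json"})
for claim in resp.json().get("claims", {}).get("P1709", []):
    print(claim["mainsnak"]["datavalue"]["value"])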
Cheers!
On 10/2/15 14:00, wikidata-request(a)lists.wikimedia.org wrote:
> Date: Thu, 01 Oct 2015 18:09:21 +0000
> From: Denny Vrandečić<vrandecic(a)google.com>
> To: "Discussion list for the Wikidata project."
> <wikidata(a)lists.wikimedia.org>
> Subject: [Wikidata] Freebase to Wikidata: Results from Tpt internship
> Message-ID:
> <CAFXBQpHhAUEeKn1XDC=xb0urwiryJq2DpQL0XzdC=gU6jp-_Xg(a)mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
> First, thanks to Tpt for his amazing work! I had not expected to see such
> rich results. He has exceeded my expectations by far, and produced much
> more transferable data than I expected. Additionally, he also worked on
> the primary sources tool directly and helped Marco Fossati to upload a
> second, sports-related dataset (you can select it by clicking on the
> gears icon next to the Freebase item link in the sidebar on Wikidata,
> when you switch on the Primary Sources tool).
--
Marco Fossati
http://about.me/marco.fossati
Twitter: @hjfocs
Skype: hell_j
Hey folks :)
Wikidata is turning 3 years old on October 29th. This is a reason to
celebrate for all of us. We'll be having a big party in Berlin and I'd
love to see as many of you there as possible.
You can find out more at
https://www.wikidata.org/wiki/Wikidata:Third_Birthday/Party
We also have a few more things planned for which I'd love to have videos,
images and other input from you. More on that at
https://www.wikidata.org/wiki/Wikidata:Third_Birthday/Party as well.
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
Hey everyone :)
We'll be doing the next Wikidata office hour on September 23rd at
17:00 UTC. See http://www.timeanddate.com/worldclock/fixedtime.html?hour=17&min=00&sec=0&d…
for your timezone. We'll be meeting in #wikimedia-office on Freenode
IRC.
As usual I'll start with an overview of what's been happening around
Wikidata since the last office hour and then we'll have time for
questions and discussions.
If there is a particular topic you'd like to have on the agenda please
let me know.
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
I had to clean up this entity, whose "sister city" property was filled
with lots of erroneous statements, which I removed:
https://www.wikidata.org/wiki/Q185684
How can I figure out where the import went wrong, how it happened, and how
to ensure it doesn't happen again? How does one examine Wikidata bots and
their accuracy or incorrectness?
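
So far the only lead I have is the item's revision history, which shows
which account (bot or human) added each statement, e.g. via the standard
MediaWiki API (a sketch; the limit is arbitrary):

import requests

# Pull recent revisions of Q185684, with user and edit summary, to see
# which bot(s) added the statements in question.
resp = requests.get("https://www.wikidata.org/w/api.php",
                    params={"action": "query",
                            "prop": "revisions",
                            "titles": "Q185684",
                            "rvlimit": 50,
                            "rvprop": "user|timestamp|comment",
                            "format": "json"})
for page in resp.json()["query"]["pages"].values():
    for rev in page.get("revisions", []):
        print(rev["timestamp"], rev["user"], rev.get("comment", ""))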
Trying to learn more,
Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>