Dear Wikidata,
I just wanted to let you know that I have spent a couple of hours
translating the Wikidata FAQ into French.
=> See http://meta.wikimedia.org/wiki/Wikidata/FAQ/fr
I also went through the first Wikidata documentation and published an
introduction to the project, as proof that I did my due diligence :)
=> http://zecdata.com/2012/04/09/wikidata-one-common-source-of-structured-data…
I would be happy to help and collaborate if needed.
(I have worked on various related topics at Yahoo! Labs and other
places over the past 10+ years. So I may be of some help. Feel free to
look me up on LinkedIn)
Best,
Nicolas.
[@nicolastorzec on Twitter]
The Wikidata project sounds like it can be an amazing source of
information over the long term. Key to that is interoperable data with
known formats that will stand the test of time. If you want an
electronic copy of my book (Silver Bullets) on this topic, email me and
I'll send along gratis.
One standard format I'd highly recommend is the Common Alerting Protocol
(CAP), which has been driven by OASIS (www.oasis-open.org). CAP is
becoming a worldwide standard for notification and alerting - used by
FEMA, NOAA, and the USGS in the US government, and by Australia, Canada,
Sri Lanka, and many other countries. Google uses CAP alerts as a basis
for its new public safety alerting, and many commercial products and
companies are incorporating CAP.
These alerts/notifications are also useful for a historical look at
past events - you can accumulate all the earthquake data for an area
over a period of years.
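As a sketch of how such accumulated CAP data might be consumed programmatically, here is a minimal Python example that extracts a few fields from a CAP 1.2 alert. The alert XML itself is a hypothetical, heavily trimmed example for illustration, not an official OASIS sample:

```python
# Parse a minimal (hypothetical) CAP 1.2 alert with Python's stdlib.
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"  # CAP 1.2 XML namespace

sample_alert = """<alert xmlns="urn:oasis:names:tc:emergency:cap:1.2">
  <identifier>example-quake-001</identifier>
  <sender>quake-feed@example.org</sender>
  <status>Actual</status>
  <msgType>Alert</msgType>
  <info>
    <category>Geo</category>
    <event>Earthquake</event>
    <severity>Moderate</severity>
  </info>
</alert>"""

def summarize(cap_xml):
    """Return (identifier, event, severity) from a CAP alert string."""
    root = ET.fromstring(cap_xml)
    ident = root.findtext(f"{{{CAP_NS}}}identifier")
    info = root.find(f"{{{CAP_NS}}}info")
    event = info.findtext(f"{{{CAP_NS}}}event")
    severity = info.findtext(f"{{{CAP_NS}}}severity")
    return ident, event, severity

print(summarize(sample_alert))
```

Real CAP alerts carry many more elements (areas, geocodes, effective/expiry times), which is exactly what makes them useful for the historical accumulation described above.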
Thanks, JFC. Often those that will change the world tomorrow are those that
are not understood today.
Cheers,
Denny
2012/4/6 JFC Morfin <jefsey(a)jefsey.com>
> At 12:45 05/04/2012, Denny Vrandečić wrote:
>
>> In short, we have for Wikidata two pragmatic goals:
>> * Wikidata's first aim is to support the Wikipedias with their language
>> links
>> * Wikidata's second aim is to support the Wikipedias with the infoboxes
>>
>> Out of the support for these tasks, other interesting use cases might
>> arise, and are expected to.
>>
>> Until I manage to understand how your comments relate to one of these
>> goals, I will personally take the liberty to ignore your comments.
>>
>
> Fair enough :-)
>
> Your assessments are correct. As I first documented, our (iucg(a)ietf.org)
> target in this area is the Internet+ (smart fringe-to-fringe Internet) MDRS
> (metadata registry multilingual distributed referential system). The
> MDRS is to the Internet+ and to the Semiotic Internet (Intersem) that we
> explore what the IANA is to the legacy Internet, and what Wikidata might
> be to Wikimedia.
>
> Our "use case" is the Internet+ distributed operations (I documented the
> IETF Draft references). The MDRS will most probably be a datawiki and/or a
> DDDS (the DNS is a DDDS) of some sort. Today's IANA and wikis are fed
> and read by humans; datawikis will be more and more fed and read by
> intelligent processes. This intelligence leads to additional opportunities
> and constraints.
>
> Our targets are the same; however, you have to have conceptual limits,
> while by essence I must have none. This is why I tried to probe our possible
> common interest areas. Your two confirmed documents now give us your
> current limits (the more people understand what the "revolution" (as per
> Wikimedia) datawikis are going to be, the more they may expect from them).
>
> My own target is to internally review these documents, assess their
> possible evolution, strive to stay interoperable, and permit users and
> applications to take better advantage of your project (
> wikidata.iucg.org). We will alert you if we fear possible architectural
> conflicts through our work and tests. This seems to be in line with what
> Lydia responded today.
>
> Best
> jfc
>
>
--
Project director Wikidata
Wikimedia Deutschland e.V. | Eisenacher Straße 2 | 10777 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/681/51985.
Hey guys,
what about setting up a wiki with a summary of the project? Mailing
lists with lots of traffic are hard to follow... For a start, we could
move the couple of pages from Meta there...
Kozuch
On 4/8/12 9:25 PM, JFC Morfin wrote:
> Is there an objection to the concept of, or cooperation with,
> "datawiki" Wikidata compatible projects? I would define a "datawiki"
> (as there are databases) as a JSON oriented NoSQL DBMS using an
> enhanced wiki as a human user I/O interface.
Why are you turning the term "DataWiki" into something so narrow?
A DataWiki is simply the application of wiki concepts to data, i.e.,
Create, Update, and Delete operations.
There are quite a number of DataWikis that existed in the Linked Data
realm long before Wikidata. It just so happens that Wikidata is
potentially another fine example of the DataWiki concept at Web scale.
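To make the "wiki concepts applied to data" idea concrete, here is a toy Python sketch; it is purely illustrative and not any real system. Every create, update, and delete is kept as a revision, so earlier states remain recoverable, which is the core wiki property carried over to data:

```python
# A toy illustration of "wiki concepts applied to data": every
# create/update/delete appends a revision, so any prior state of a
# record can be recovered, just like a wiki page's history.
class ToyDataWiki:
    def __init__(self):
        self._history = {}  # key -> list of revisions (None = deleted)

    def update(self, key, value):
        """Create or update a record by appending a new revision."""
        self._history.setdefault(key, []).append(value)

    def delete(self, key):
        """Delete by recording a tombstone revision (history survives)."""
        self._history.setdefault(key, []).append(None)

    def current(self, key):
        """Return the latest value, or None if absent/deleted."""
        revs = self._history.get(key, [])
        return revs[-1] if revs else None

    def revisions(self, key):
        """Return the full revision list for a key."""
        return list(self._history.get(key, []))

dw = ToyDataWiki()
dw.update("Berlin", {"population": 3400000})
dw.update("Berlin", {"population": 3500000})
print(dw.current("Berlin"))
print(len(dw.revisions("Berlin")))
```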
> This would permit BigData, specialized data, and graph sources to feed
> Wikidata along their own data philosophy and collection/update policy.
It doesn't need to permit anything re. BigData. Eventually, folks will
understand that BigData agility is inextricably linked to the concept of
DataWikis.
> I suppose that the main point would be an inter-datawiki interchange
> protocol (RFC?) matching the datawiki authoritative operators' (the
> first of them being Wikidata) requirements.
No, you already have HTTP, REST patterns, the SPARQL Protocol for
updates, inserts, and deletions, the SPARQL Graph Store HTTP Protocol, etc.
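For illustration, here is a minimal Python sketch of what a SPARQL 1.1 Update request looks like on the wire. The endpoint URL and the inserted triple are hypothetical placeholders, and the request is built but deliberately not sent:

```python
# Build (but do not send) a SPARQL 1.1 Update request. The endpoint
# URL and the triple being inserted are hypothetical placeholders.
import urllib.parse
import urllib.request

ENDPOINT = "https://example.org/sparql"  # hypothetical update endpoint

update = """
PREFIX ex: <http://example.org/>
INSERT DATA { ex:Berlin ex:population 3500000 . }
"""

# Per the SPARQL 1.1 Protocol, an update is an HTTP POST whose body
# carries the operation, e.g. form-encoded under the 'update' key.
body = urllib.parse.urlencode({"update": update}).encode("utf-8")
request = urllib.request.Request(
    ENDPOINT,
    data=body,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    method="POST",
)

print(request.get_method(), request.full_url)
```

Sending it would be a single `urllib.request.urlopen(request)` call against a real endpoint; no new inter-datawiki protocol is needed for this.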
> I would permit projects at different stages of R&D or with different
> main purposes in order to cooperate with Wikidata.
The Web isn't about permissions; it is really about the ability to "just
do it!"
:-)
Links:
1. http://dig.csail.mit.edu/2007/tab/tutorial/editing.mov -- a 2007
DataWiki screencast by TimBL
> jfc
> _______________________________________________
> Wikidata-l mailing list
> Wikidata-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
--
Regards,
Kingsley Idehen
Founder& CEO
OpenLink Software
Company Web: http://www.openlinksw.com
Personal Weblog: http://www.openlinksw.com/blog/~kidehen
Twitter/Identi.ca handle: @kidehen
Google+ Profile: https://plus.google.com/112399767740508618350/about
LinkedIn Profile: http://www.linkedin.com/in/kidehen
Hello!
What about the engine of Wikidata?
Do you think MediaWiki is good for structured data?
I think MediaWiki would have to be greatly modified, at the least, to be
a proper engine for Wikidata.
The best thing would be to create a new engine specifically for
structured data. It would also be better for Wiktionary.
Just remember what MediaWiki was created for: storing marked-up text
pages. MediaWiki is good for an encyclopedia, but not for Wiktionary and
Wikidata purposes.
Sincerely,
Soslan Khubulov
At 11:55 09/04/2012, Soslan Khubulov wrote:
>The best thing would be to create a new engine specifically for
>structured data. It would also be better for Wiktionary.
>Just remember what MediaWiki was created for: storing marked-up text
>pages. MediaWiki is good for an encyclopedia, but not for Wiktionary
>and Wikidata purposes.
Soslan,
you are most probably correct. However, I feel that every different
need that can be discussed about Wikidata may lead to this
conclusion, but with different requirements. This is why I suggest
uncoupling the storage architecture (there might be several) from
the project and making its interchange protocol central. In doing so
I suggest referring to a typical NoSQL storage system as being, by
essence, the most complex context, since it can be format-independent.
Such a protocol is more complex than a simple use of JSON. It should
support concepts such as structure characteristics, confidence
levels, IP protection, plagiarism filtering, authority authentication,
encryption, langtags, mandatory information, locale files, time,
embargoes, acknowledgments, etc. Possibly we may want to specify
datawiki agents (DWAs) for the capture of data; some of them could
be automated processes (e.g. weather observations, scientific
experiment reporting, stock exchanges, etc.)
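Purely as an illustration of the kind of record such an interchange protocol might carry: every field name below is invented for the sketch, since no such protocol is actually specified here, but it shows a few of the concepts listed above (confidence level, langtag, timestamp, capturing agent) expressed as round-trippable JSON.

```python
# A hypothetical interchange record sketching some of the concepts
# above. All field names are invented for illustration only.
import json

record = {
    "subject": "Berlin",
    "property": "population",
    "value": 3500000,
    "langtag": "de",                      # BCP 47 tag for the label
    "confidence": 0.95,                   # asserted confidence level
    "source": "dwa:stats-office-feed",    # capturing datawiki agent
    "timestamp": "2012-04-09T12:00:00Z",  # time of capture
}

# Serialize for interchange and verify it survives a round trip.
wire = json.dumps(record, sort_keys=True)
round_trip = json.loads(wire)
print(round_trip == record)
```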
jfc