At 09:20 09/04/2012, Gerard Meijssen wrote:
Hoi,
First things first ... that is getting Wikidata to work for its initial
purposes. Automated updates from elsewhere are nice but introduce a
completely new set of issues, including reliability.
I agree: first things first. It seems to me that, in a network-centric
world, the holistic aspects should come first. New projects necessarily
arrive in a context they depend on and are networked with. They have to be
in osmosis with that context and its possible futures, and therefore
designed for it. Foreign batch updates (which do not necessarily have to
be automated) are part of their environment, just as individual user
updates are. The interest of a networked-datawikis approach is that
requirements can be distributed, allowing the Wikidata specifications to
be simpler, as long as they are supported by a common generic basis and an
interchange protocol. Dublin Core resulted from OCLC networking in the
late 70s. The W3C did not start by thinking of semantic registries but of
a semantic web. IRIs are universal. JSON is open and universal. Denny does
not even understand what my own project's basics mean; however, we can
easily meet on a JSON-based protocol. Why, for example, enter geographic
coordinates or linguistic tables manually?
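As a purely illustrative sketch (the envelope and every field name here
are my own invention, not any agreed specification), such a JSON
interchange message for a batch of coordinates could look like:

    {
      "datawiki": "geo.example.org",
      "action": "batch-update",
      "authority": "geo.example.org/update-policy",
      "items": [
        { "subject": "Paris",
          "property": "coordinate",
          "value": { "latitude": 48.8567, "longitude": 2.3508 },
          "source": "national geographic institute" }
      ]
    }

The point is not the exact fields, but that the receiving datawiki can
accept or refuse the whole batch under its own collection/update policy.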
For example, I look in vain for a single table giving me the name, value,
characteristics, and 32x32-bit graphic of every ISO 10646 code point. If
someone makes it, it should result in an easy batch transfer, supported
(both ways) by an authoritative decision, not by millions of error-prone
manual human entries.
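To make that concrete, one record of such a table could be exchanged as
something like this (an illustrative sketch only; the field names and the
base64 bitmap encoding are my assumptions):

    {
      "codepoint": "U+00E9",
      "name": "LATIN SMALL LETTER E WITH ACUTE",
      "value": 233,
      "characteristics": { "category": "Ll", "script": "Latin" },
      "glyph-32x32": "<base64-encoded 32x32 bitmap>"
    }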
For the time being, I have not seen any discussion of the status of the
huge number of new entries that have not yet been validated. If I enter
that ice melts at 5°C, will that be disseminated immediately, or will it
sit on standby somewhere until approved?
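In protocol terms, the question is whether each entry carries a validation
status, for instance (again an invented illustration, not a proposal):

    {
      "subject": "ice",
      "property": "melting point",
      "value": { "amount": 5, "unit": "degree Celsius" },
      "status": "pending-review"
    }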
jfc
Thanks,
   Gerard
On 9 April 2012 03:25, JFC Morfin <jefsey@jefsey.com> wrote:
- Is there an objection to the concept of, or cooperation with,
"datawiki" Wikidata-compatible projects? I would define a "datawiki" (as
there are databases) as a JSON-oriented NoSQL DBMS using an enhanced wiki
as its human-user I/O interface. This would permit BigData, specialized
data, and graph sources to feed Wikidata along their own data philosophy
and collection/update policy. I suppose that the main point would be an
inter-datawiki interchange protocol (RFC?) matching the requirements of
the datawiki authoritative operators (the first of them being Wikidata).
It would permit projects at different stages of R&D, or with different
main purposes, to cooperate with Wikidata.
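For concreteness, a minimal sketch of what one exchange of such a
protocol could look like (both messages entirely invented for
illustration, not an RFC):

    request:  { "from": "unicode.example.org", "to": "wikidata",
                "action": "offer-batch", "schema": "iso10646-table",
                "items": 110000 }
    response: { "decision": "accepted", "authority": "wikidata",
                "effective": "on-approval" }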
- jfc
_______________________________________________
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l