Dear all,
I am wondering whether the Wikimedia Commons data structure (ideally in XML), along with its documentation and sample data, is something one could find online.
A team at ICS FORTH has developed a mapping technology called X3ML, which allows declarative mappings between two data structures. The idea would be to map the Wikimedia Commons data structure to the CIDOC CRM for the benefit of heritage content users.
Where could I try to find the Wikimedia Commons data structure? Or whom might I ask further on this matter?
Thank you very much in advance for any tips! Best, Trilce
Trilce Navarrete, 17/07/2018 14:52:
Where could I try to find the Wikimedia Commons data structure? or who may I ask further on this matter?
There is no such thing as a "data structure" really. Some concise background: https://commons.wikimedia.org/wiki/Commons:Structured_data/About/FAQ
Federico
Hi Trilce,
There is a new set of dumps for every Wikimedia wiki at least once a month. Among those files are several database dumps in XML format: one with the most recent version of every article, one with metadata but no article text ('stub dumps'), and one with the full text of every revision of every article. Here is the latest set for Commons: https://dumps.wikimedia.org/commonswiki/20180701/
I hope this helps. Cheers, Erik
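For anyone following along, here is a minimal sketch of reading pages out of such a dump with Python's standard library. The XML namespace below follows the MediaWiki export schema (the exact schema version varies between dump runs), and the inline sample document is a hand-made stand-in for a real dump file, which you would stream from disk instead:

```python
import xml.etree.ElementTree as ET
from io import StringIO

# Namespace used by MediaWiki XML export files (version may differ per dump).
NS = "{http://www.mediawiki.org/xml/export-0.10/}"

# Tiny hand-made sample in the shape of a Commons pages dump; a real file
# from dumps.wikimedia.org would be opened from disk (and decompressed).
SAMPLE = """<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/">
  <page>
    <title>File:Example.jpg</title>
    <ns>6</ns>
    <revision>
      <text>{{Information|description=An example image}}</text>
    </revision>
  </page>
</mediawiki>"""

def iter_pages(source):
    """Yield (title, wikitext) pairs, streaming so huge dumps fit in memory."""
    for event, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == NS + "page":
            title = elem.findtext(NS + "title")
            text = elem.findtext(f"{NS}revision/{NS}text")
            yield title, text
            elem.clear()  # free memory held by already-processed pages

for title, text in iter_pages(StringIO(SAMPLE)):
    print(title, "->", text)
```

The streaming `iterparse` approach matters here because the full-history Commons dumps are far too large to load into memory as a single tree.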
This is very kind, Erik! Thanks for the link! I will let you know if this works :)
Best, T
On Thu, Jul 19, 2018 at 4:04 PM Erik Zachte ezachte@wikimedia.org wrote:
wiki-research-l@lists.wikimedia.org