I believe that the QLever service does not work this way. Instead, the QLever service is based on the Wikidata RDF dumps, and its data is reloaded in a batch process each week. The data in the service is thus between a few days and two weeks behind the data in Wikidata.
There are several reasons for this, some on the QLever side and some on the Wikidata infrastructure side.
peter
On 10/2/24 05:05, Denny Vrandečić wrote:
The most promising show of "we are ready" for the Virtuoso Open Source edition would be what QLever has been doing for a while: to provide a public endpoint with the data loaded, and kept up to date using the public edit stream. That would be an undeniably strong argument for "just use this!"
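The "public edit stream" mentioned above is Wikimedia's EventStreams service, which publishes recent changes as Server-Sent Events. As a minimal sketch of what keeping an endpoint current would involve, the helpers below parse one SSE message and filter for Wikidata edits; the function names are hypothetical, and a real updater would of course go on to fetch and re-index the changed entities:

```python
import json

def parse_sse_data(raw_lines):
    """Join the `data:` lines of one Server-Sent Events message and
    decode them as JSON. EventStreams delivers each recent-change
    event in this format. Returns None for messages with no data."""
    payload = "".join(
        line[len("data:"):].strip()
        for line in raw_lines
        if line.startswith("data:")
    )
    return json.loads(payload) if payload else None

def is_wikidata_edit(event):
    """Keep only edit events on Wikidata itself -- the events an
    up-to-date SPARQL endpoint would need to re-index."""
    return event.get("wiki") == "wikidatawiki" and event.get("type") == "edit"
```

To follow the live stream, one would issue a GET request to https://stream.wikimedia.org/v2/stream/recentchange with any SSE-capable client and feed each message's lines through `parse_sse_data`, passing matching events to an indexer.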
On Thu, Sep 26, 2024 at 4:32 PM Samuel Klein <meta.sj@gmail.com> wrote:
An updated benchmark eval (or self-evals) for the top db candidates seems called for :) we would all love to see it. Ideally with a canonical hardware or VM spec that all can use... I don't know if the QLever self-eval from the spring is the right place to start, but I believe these <https://ad-research.cs.uni-freiburg.de/benchmarks/wdqs-queries/> are the ~300 queries they used for a Wikidata benchmark.
_______________________________________________
Wikidata mailing list -- wikidata@lists.wikimedia.org
Public archives at https://lists.wikimedia.org/hyperkitty/list/wikidata@lists.wikimedia.org/message/YJOUPRZPOIS6QMAPUJVYM65IQOM3DKIP/
To unsubscribe send an email to wikidata-leave@lists.wikimedia.org