Hi all -
We've been using a locally installed Wikidata stand-alone service (https://www.mediawiki.org/wiki/Wikidata_query_service/User_Manual#Standalone...) for several months now. We're becoming increasingly plagued by performance issues and are wondering whether one way to address them might be to adopt the Blazegraph multi-GPU architecture (https://www.blazegraph.com/product/gpu-accelerated/).
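For context, the kind of request we issue against the local service looks roughly like the sketch below (the endpoint URL assumes the standalone service's default namespace on localhost:9999, and the query is just an illustrative placeholder; adjust both to match your install):

```python
from urllib.parse import urlencode

# Default SPARQL endpoint of the standalone service (an assumption;
# change host/port/namespace to match your local installation).
ENDPOINT = "http://localhost:9999/bigdata/namespace/wdq/sparql"

# An illustrative placeholder query: fetch a handful of triples.
query = "SELECT * WHERE { ?s ?p ?o } LIMIT 5"

# The service accepts GET requests with the query URL-encoded,
# per the SPARQL 1.1 Protocol.
url = ENDPOINT + "?" + urlencode({"query": query, "format": "json"})
print(url)
```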
Could anyone provide guidance as to how much pain would be involved in making such a transition?
Thanks,
Eric Scott
Hi
Add boolean axo prompt to built from word as key base word example wiki will be wik the denyfen works out the code opening a new gibral code to new pace and place maker to next form or key parser has now phased we in second stage technology and states builts build from state of consonants and constanims to bruild or new language called ushen soon,try my name for example with algorth and rithm to algorithm builds neushensoun to neusohsim or maxi factor answers,simply the address bar we know at the top will soon have a point and pointer to me I'm the developer and creator,build from my full name UshenKowlessarBhinbahadur and see as they run just add some good key facts and its turning code to zibielst and no more ,kanks thanks
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
Is someone trying to train a generative language model on this mailing list?
Stas: can you please have a look?
Cheers
Lydia
Hi!
Could anyone provide guidance as to how much pain would be involved in making such a transition?
I'd ask on the Blazegraph list: Bigdata-developers@lists.sourceforge.net