On Sun, Dec 25, 2022 at 1:00 AM Anders Wennersten mail@anderswennersten.se wrote:
> For me the only question is whether Google comes first (having better knowledge of how to interface with the backend knowledge repositories that Wikipedia will become) or whether ChatGPT will learn this
No speech interface as far as I can tell, but FYI, there is now at least one search engine that already integrates a language-model-based chatbot into search: https://you.com/, which has backing from Salesforce founder and billionaire Marc Benioff (a bit more: https://www.protocol.com/you-dot-com-benioff). Unlike ChatGPT, it tries to cite web sources directly. When that source is Wikipedia, you'll notice it is essentially rewriting or summarizing the Wikipedia article. I don't know whether it uses GPT underneath or its own language model; Salesforce has certainly funded the creation of models of its own.
When I asked You.com if it uses GPT-3, it said yes. When I asked it to provide a source, it generated a URL that does not exist.
I also observed other failure modes, such as merging multiple people with the same name into one, or giving directly contradictory answers when the same question is asked repeatedly. All of these failure modes are characteristic of language models, which are a bit like pinball machines in that they generate results nondeterministically from the training data.
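To make the "pinball machine" point concrete, here is a minimal Python sketch of temperature-based sampling, the common mechanism behind that nondeterminism. The vocabulary and scores below are invented purely for illustration; this is not You.com's or OpenAI's actual model, just the general sampling technique:

```python
import math
import random

def sample_next(logits, temperature=1.0, rng=random):
    """Sample a token index from softmax(logits / temperature).

    Higher temperature flattens the distribution (more varied output);
    temperature near zero approaches a deterministic argmax.
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    weights = [math.exp(x - m) for x in scaled]
    return rng.choices(range(len(logits)), weights=weights, k=1)[0]

# Hypothetical next-token candidates for some prompt, with made-up scores:
vocab = ["GPT-3", "its own model", "a nonexistent URL"]
logits = [1.2, 1.0, 0.8]

# Repeated runs on the *same* input can pick different continuations,
# which is why the same question can get contradictory answers:
runs = {vocab[sample_next(logits)] for _ in range(50)}
```

Because the scores here are close together, `runs` will usually contain more than one answer; at a very low temperature the model would instead pick the top-scoring token almost every time.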
Of course, this is the technology as it exists today, and even with those limitations it can prove useful (though marketing it as part of a search engine in its current form seems irresponsible).
Warmly, Erik