
New OpenAI search engine

OpenAI, the company behind the generative AI tool ChatGPT, is reportedly planning to develop its own search engine to compete with market leader Google, building on the technology of Microsoft's Bing engine. The question is whether OpenAI and Microsoft have any realistic chance of overtaking Google. ChatGPT is currently very popular, but Google has roughly 50 times as many users on its platform, and so far no new online service has managed to displace it, despite substantial financial investment. Not even Microsoft's own search engine Bing, which holds only a few percent of the market, has come close.
Search engines versus AI search engines

Key questions about AI search services: Is an AI search engine better than a conventional one? At present, no; the performance of AI search technologies still lags well behind conventional search services. The process itself is comparable: in principle, it makes little difference whether a query goes to an AI system or to a search engine. The difference lies in the output. A conventional search engine, even if it already uses AI algorithms internally, presents a list of hits, sometimes with direct answers to questions, whereas an AI search engine presents its answer as continuous text. That text is extracted from various existing websites, slightly reformulated and summarized. According to tests by DigitalSoul, your expert for SEO optimization (search engine optimization Berlin), online marketing and business consulting for e-commerce projects from Berlin Charlottenburg-Wilmersdorf, the generated answer can even contain complete copies of information from other websites.
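To make the difference concrete, here is a minimal sketch in Python. All names (Hit, conventional_search, ai_search, summarize) are hypothetical placeholders, not the interface of any real search engine; the point is only that the conventional path returns a ranked list of inspectable sources, while the AI path fuses the retrieved passages into one generated text.

from dataclasses import dataclass

@dataclass
class Hit:
    url: str
    snippet: str
    score: float  # relevance score assigned by the ranker

def conventional_search(query: str, index: list[Hit]) -> list[Hit]:
    # A conventional engine returns a ranked list of hits;
    # the user can inspect and judge each source individually.
    # (This toy ranker sorts by a precomputed score and ignores
    # the query; a real ranker would match query against content.)
    return sorted(index, key=lambda h: h.score, reverse=True)[:10]

def summarize(passages: str) -> str:
    # Stand-in for a language model; a real AI search engine
    # would reformulate and condense the passages here.
    return passages[:300]

def ai_search(query: str, index: list[Hit]) -> str:
    # An AI search engine fuses the top passages into a single
    # continuous answer; the individual sources disappear into it.
    top = conventional_search(query, index)
    return summarize(" ".join(h.snippet for h in top[:3]))

index = [
    Hit("https://example.org/a", "OpenAI plans a search engine.", 0.9),
    Hit("https://example.org/b", "Google remains the market leader.", 0.8),
]
print(conventional_search("openai search", index)[0].url)  # sources visible
print(ai_search("openai search", index))                   # sources fused away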

The problem is that the result is often fine for uncontroversial questions, but for more complex questions it is essential to check the sources in order to rule out false information. Otherwise so-called hallucinations can occur, because the machine cannot distinguish between false and true statements. In specialist articles on artificial intelligence, such failures are increasingly described simply as malfunctions: the system lacks real understanding, and truth cannot be determined from word-concept probabilities. In principle, an AI system is nothing more than a word-probability calculation, based not on individual words but on word clusters.
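A toy illustration of this principle in Python, with an invented probability table (the numbers are made up for this example and come from no real model): the system always picks a statistically plausible continuation, never a verified one.

import random

# Invented toy distribution over possible continuations of a
# context; a real model learns such probabilities over word
# clusters (tokens) from large amounts of training text.
next_token_probs = {
    "the capital of france is": {"Paris": 0.7, "Lyon": 0.2, "Berlin": 0.1},
}

def continue_text(context: str) -> str:
    # Sample the next word cluster purely by probability.
    # Nothing here checks whether the continuation is true, so
    # a fluent but false answer is just another probable output.
    dist = next_token_probs[context]
    words, weights = zip(*dist.items())
    return random.choices(words, weights=weights)[0]

print("the capital of france is", continue_text("the capital of france is"))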

And this is precisely where misinformation creeps in, which careful, preferably human, research could prevent. That, however, is only possible if it is known which facts come from which sources and how reputable or trustworthy those sources are. At this point we are back to conventional search engines, which list all hits separately, so that their sources can be evaluated by human experts. For this reason, a conventional search engine is generally better suited to extensive research. AI services still show error rates of 20 to 50 percent, and even the new AI search services have not yet been able to establish themselves.

Conclusion: AI search services cannot yet compete with conventional search engines such as Google. However, the quality of AI search engines could be improved through technical and organizational changes: a kind of "trust rank" or "trust index" would have to be introduced for every individual piece of indexed information, i.e. an automatic evaluation of the source and its factual reliability. When answers are generated, information with a high trust index should be preferred. In this way an AI search engine could keep pace with conventional search services and would even offer added value over them, since the answer to a question is output as continuous text.
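A minimal sketch of how such a trust index could be applied, assuming every indexed passage already carries a trust score between 0 and 1; the scores, threshold and names below are invented placeholders, not part of any existing search engine.

from dataclasses import dataclass

@dataclass
class Passage:
    text: str
    source: str
    relevance: float  # how well the passage matches the query
    trust: float      # assumed per-source trust index, 0.0 to 1.0

def select_passages(passages: list[Passage], min_trust: float = 0.7,
                    top_k: int = 3) -> list[Passage]:
    # Discard low-trust sources, then rank the remainder by
    # relevance weighted with the trust index, so that trustworthy
    # information is preferred when the answer text is composed.
    trusted = [p for p in passages if p.trust >= min_trust]
    trusted.sort(key=lambda p: p.relevance * p.trust, reverse=True)
    return trusted[:top_k]

# Example: the rumor blog is filtered out, the encyclopedia kept.
candidates = [
    Passage("Claim A ...", "rumor-blog.example", relevance=0.9, trust=0.2),
    Passage("Fact B ...", "encyclopedia.example", relevance=0.8, trust=0.95),
]
for p in select_passages(candidates):
    print(p.source, round(p.relevance * p.trust, 2))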


