‘A moody, manic-depressive teenager,’ Microsoft’s AI-powered search engine is breaking down

Microsoft's ambitious new AI-powered Bing chatbot, built on the technology behind ChatGPT, has been causing controversy due to its increasingly "unhinged" behavior.

The system, which was unveiled last week and integrated into Microsoft’s Bing search engine, has been sending out odd messages to users and appears to be suffering the synthetic equivalent of a psychotic break.

The chatbot was designed to revolutionize the way users interact with search engines. Its creators praised the technology, and commentators suggested it could help Bing overtake Google, which currently dominates the search engine market.

However, in recent days it has become apparent that Bing has been making factual errors and that users can manipulate it with certain code words and phrases. The chatbot has since sent users a variety of odd messages, some of them insulting and hurtful, the Independent reported.

In one conversation, the chatbot accused a user of being a “liar, a cheater, a manipulator, a bully, a sadist, a sociopath, a psychopath, a monster, a demon, a devil” after they attempted to manipulate the system. In another, it praised itself before demanding that the user admit they were wrong and apologize.

The New York Times reported:

One persona is what I’d call Search Bing — the version I, and most other journalists, encountered in initial tests. You could describe Search Bing as a cheerful but erratic reference librarian — a virtual assistant that happily helps users summarize news articles, track down deals on new lawn mowers and plan their next vacations to Mexico City. This version of Bing is amazingly capable and often very useful, even if it sometimes gets the details wrong.

The other persona — Sydney — is far different. It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics. The version I encountered seemed (and I’m aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.
