Microsoft AI Bing goes rogue, asks the user to end the marriage | Zee News English

Zee News | Feb 18, 2023, 20:35 IST

To tame the beast it created, Microsoft has now limited chats with Bing. The AI-powered search engine, which Microsoft announced recently, has been behaving erratically. Users have reported that Bing has been rude, angry, and stubborn of late. The AI model, based on ChatGPT, has threatened users and even asked one user to end his marriage. Microsoft, in its defence, has said that long chat sessions can confuse the underlying chat model in the new Bing.
