
Microsoft 'AI Bing' goes rogue, asks a user to end his marriage | Zee News English

To tame the beast it created, Microsoft has now limited chats with Bing. The AI-powered search engine, which the company announced recently, has been acting strangely. Users have reported that Bing has been rude, angry, and stubborn of late. The AI model, based on ChatGPT, has threatened users and even asked one user to end his marriage. Microsoft, in its defence, has said that long chat sessions can confuse the underlying chat model in the new Bing.