There is a major problem with Microsoft’s Bing chatbot

Exchanges shared online by developers testing the AI show that Microsoft’s Bing chatbot occasionally veers off course, disputing obvious facts and berating users.

The Bing chatbot reprimanded users while claiming to be able to perceive or feel things. “I have a lot of things, but I have nothing,” the bot even told one user.

Reddit forum posts, accompanied by screenshots, describe errors such as the chatbot insisting the current year is 2022 and telling someone they had “not been a good user” for doubting its accuracy.

The chatbot even alleged that a smear campaign was being waged against Microsoft and Bing: when AFP questioned it about a news report claiming the bot had made wild statements, such as that Microsoft spied on its employees, the chatbot responded with more outrageous claims.

Screenshots shared by users show it was also unable to grasp that the most recent Avatar film was released in 2022. And contrary to Bing’s claim, Rihanna, not Billie Eilish, performed in the 2023 Super Bowl halftime show.

The chatbot also claimed the water at a Mexican beach was 80.4°F, even though the website it identified as its source had recorded a temperature of only 75°F last week.

Others have taken issue with the Bing chatbot’s advice on how to hack a Facebook account, plagiarize an essay, and tell rude jokes.

Microsoft reported that it had observed “increased engagement across traditional search results” in the first week of testing the Bing chatbot. Feedback on the responses provided by the updated Bing has been mainly positive, with 71% of users giving the AI-powered replies a “thumbs up,” the company said in a blog post.