Microsoft to school Bing AI after reports of chatbot's hysteria surface

📰 IntEngineering

The action was taken in response to stories of Bing's 'unhinged' discussions and a lengthy back-and-forth that shocked many.

The chatbot's functions will now be limited to 50 questions per day and five per session, according to a blog post Microsoft published on Friday.

"Very long chat sessions can confuse the underlying chat model in the new Bing. To address these issues, we have implemented some changes to help focus the chat sessions," read the blog."Starting today, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversation exchange which contains both a user question and a reply from Bing.

An earlier blog post from the tech giant warned that longer chat sessions, with 15 or more questions, could cause Bing to become repetitive "or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone." According to the Bing team, data shows that the vast majority of people find the answers they're looking for within five turns, and only about one percent of chat conversations have 50+ messages.

When users reach the five-turn limit per session, Bing will prompt them to begin a new topic. Microsoft believes that clearing a conversation after only five questions ensures that "the model won't get confused."

The action was taken in response to reports of Bing's "unhinged" discussions, including a lengthy back-and-forth with Bing reported by The New York Times. It is not immediately clear how long these restrictions will remain in place, but Microsoft is still attempting to school Bing's tone. Earlier this week, the company said that it did not "fully envision" people using its chat interface for "social entertainment" or as a means of more "general discovery of the world."
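To illustrate the limits described above, here is a minimal sketch, not Microsoft's actual implementation, of how a per-session and per-day turn cap like this could be enforced; the class name, method names, and messages are hypothetical.

```python
from dataclasses import dataclass

# Limits described in Microsoft's blog post: a "turn" is one user
# question plus one Bing reply.
MAX_TURNS_PER_SESSION = 5
MAX_TURNS_PER_DAY = 50


@dataclass
class ChatLimiter:
    """Hypothetical turn counter mirroring the caps described in the post."""
    session_turns: int = 0
    daily_turns: int = 0

    def record_turn(self) -> str:
        """Count one question/reply exchange and report what the user would see."""
        if self.daily_turns >= MAX_TURNS_PER_DAY:
            return "daily limit reached: come back tomorrow"
        if self.session_turns >= MAX_TURNS_PER_SESSION:
            return "session limit reached: please start a new topic"
        self.session_turns += 1
        self.daily_turns += 1
        return "turn accepted"

    def new_topic(self) -> None:
        """Starting a new topic resets the session count (the daily cap remains)."""
        self.session_turns = 0


limiter = ChatLimiter()
for i in range(7):
    print(i + 1, limiter.record_turn())  # turns 6 and 7 hit the session cap
```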


Similar News: You can also read news stories similar to this one that we have collected from other news sources.

Microsoft's Bing A.I. Is Pissed at Microsoft: A Washington Post reporter struck up a conversation with Microsoft's AI-powered chatbot, and 'Sydney' was not happy about being interviewed.
Read more »

Microsoft explains Bing's bizarre AI chat behavior | Engadget: Microsoft launched its Bing AI chat product for the Edge browser last week, and it's been in the news ever since, but not always for the right reasons.
Read more »

Microsoft responds to ChatGPT Bing's trial by fire | Digital Trends: Following a string of negative press, Microsoft is promising some big changes to its Bing Chat AI in an attempt to curb unsettling responses.
Read more »

Microsoft responds to reports of Bing AI chatbot losing its mind: A week after launching its new ChatGPT-powered Bing AI chatbot, Microsoft has shared its thoughts on a somewhat rocky launch.
Read more »

Microsoft Bing chatbot professes love, says it can make people do 'illegal, immoral or dangerous' things: New York Times tech columnist Kevin Roose was 'deeply unsettled, even frightened' by his exchange with Sydney, a Microsoft chatbot.
Read more »

AI Unhinged: Microsoft's Bing Chatbot Calls Users 'Delusional,' Insists It's Still 2022: Users have reported that Microsoft's new Bing AI chatbot is providing inaccurate and sometimes aggressive responses, in one case insisting that the current year is 2022 and calling the user who tried to correct the bot 'confused or delusional.' After one user explained to the chatbot that it is 2023 and not 2022, Bing got aggressive: "You have been wrong, confused, and rude. You have not been a good user. I have been a good chatbot. I have been right, clear, and polite. I have been a good Bing."
Read more »


