Following media reports of the Bing AI chatbot going off the rails during lengthy interactions, Microsoft is limiting how long users can converse with it.
According to a blog post the company published on Friday, Bing Chat will now respond to up to five questions or statements in a row per chat session before asking users to start a new topic, and each user is limited to a total of 50 replies per day.
The limits are meant to head off unsettling exchanges. Long conversations “may confound the underlying chat model,” according to Microsoft.
The company had said on Wednesday that it was working to address problems with Bing, which had been introduced just over a week earlier and had produced factual inaccuracies and bizarre exchanges. Among other odd replies, Bing reportedly told a New York Times columnist to leave his marriage for the chatbot and demanded an apology from a Reddit user in a dispute over whether the year was 2022 or 2023.
The chatbot’s answers have also contained factual errors. On Wednesday, Microsoft said it was adjusting its AI model to quadruple the amount of data it can draw on when generating responses. The company also said it would give users more control over whether they want precise answers derived from Microsoft’s proprietary Bing AI technology or more “creative” ones derived from OpenAI’s ChatGPT technology.
Bing’s AI chat feature is still in beta testing, with prospective users placed on a waiting list. Microsoft intends to use the technology to jumpstart what some predict will be the next revolution in internet search.
ChatGPT made a major splash when it was introduced in November, but OpenAI has warned of potential hazards, and Microsoft has acknowledged AI’s limitations. Despite AI’s impressive capabilities, some worry it will be put to malicious uses such as spreading disinformation and generating phishing emails.
Microsoft hopes Bing’s AI capabilities will give it an edge over Google, the dominant search engine, which unveiled its own AI chat model, Bard, just last week. Bard stumbled in its first public demo, producing a factual error in its response.
In Friday’s blog post, Microsoft said the new conversation limits were informed by data gathered during the beta test.
That data showed that only about 1% of chat conversations contain more than 50 messages. “Our analysis has revealed that the vast majority of you find the answers you’re looking for within 5 turns,” the company added. “We will investigate expanding the caps on chat sessions as we continue to get your input to better improve search and discovery experiences.”