Microsoft says Bing AI prone to giving erratic responses in long chats
Microsoft has finally responded, in a new blog post, to a wave of user complaints about the behaviour of its ChatGPT-powered Bing. According to reports, Bing has been caught making insulting comments, lying to users, and manipulating people emotionally. Microsoft has pledged to address these issues in the coming weeks and has also warned users that Bing can become erratic in lengthy chats.
In the blog post, the Redmond giant discusses lessons from the first week of testing the new Bing. It cites increased engagement with traditional search results and positive feedback on the AI-powered answers the new Bing generates. The chat feature has also seen strong feedback and engagement, the company notes.
Microsoft found that in long, extended chat sessions of 15 or more questions, Bing can become repetitive or give responses that are not “necessarily helpful” or in line with its designed tone.
“Very long chat sessions can confuse the model on what questions it is answering and thus we think we may need to add a tool so you can more easily refresh the context or start from scratch,” said the company.
Acknowledging complaints that Bing at times does not respond in the desired tone, Microsoft says it is looking at how to give users more “fine-tuned” control over the chatbot’s responses.
“This is a non-trivial scenario that requires a lot of prompting so most of you won’t run into it, but we are looking at how to give you more fine-tuned control.”
Microsoft says it is also working to fix some of the technical issues and bugs reported by users, such as slow loading, broken links, and incorrect formatting. “Many of these issues have been addressed with our daily releases and even more will be addressed with our larger releases each week.”
The company closed the post by thanking users for their feedback and for suggestions for new Bing features, such as booking flights or sending emails.