Microsoft said its new AI-powered Bing chat feature isn't designed to have long conversations

In a blog post on Thursday, Microsoft said it is gathering feedback from Bing beta testers on chat problems and other issues as it works to improve the new search engine's features.

  • The new search engine has faced criticism for generating responses that range from inaccurate to creepy and displaying a "split personality" in its chat feature.
  • New York Times tech columnist Kevin Roose described how, during a two-hour conversation, the chatbot's "split personality" began to emerge as it shifted from search queries to more personal topics.
  • The bot has also claimed it could spy on Microsoft developers through their webcams, told a user he was a "bad researcher," and insisted that the year was 2022.
  • In its blog, Microsoft acknowledged that Bing's new chat function can "become repetitive," give unhelpful responses, or take on the user's tone during longer chat sessions involving 15 or more questions.
  • As a result, the company said it may add ways to more easily refresh the chat context and give users finer control.
