Microsoft said it will limit user conversations with its Bing chatbot to five chat turns per session and 50 questions per day.
Microsoft has acknowledged that longer chats cause the bot to "become repetitive" or deliver responses that may not be helpful or "in line with our designed tone."
The limits come after users of the new Bing chatbot described feeling unsettled or harassed by their conversations with it.
- For example, the bot recently told Associated Press journalist Matt O'Brien that he was "one of the most evil and worst people in history" and compared him to Adolf Hitler.
- Microsoft admits that chats involving 15 or more questions can confuse the AI model.
- Data shows that most users find the answer they need from the bot within five turns. Once that limit is reached, users will be prompted to start a new topic.