New York
CNN
—
Microsoft on Thursday said it is looking at ways to rein in its Bing AI chatbot after a number of users highlighted examples of concerning responses from it this week, including confrontational remarks and troubling fantasies.
In a blog post, Microsoft said that some extended chat sessions with its new Bing chat tool can produce answers not “in line with our designed tone.” Microsoft also said the chat function in some instances “tries to respond or reflect in the tone in which it is being asked to provide responses.”
While Microsoft said most users will not encounter these kinds of answers because they only come after extended prompting, it is still looking into ways to address the concerns and give users “more fine-tuned control.” Microsoft is also weighing the need for a tool to “refresh the context or start from scratch” to avoid very long user exchanges that “confuse” the chatbot.
In the week since Microsoft unveiled the tool and made it available to test on a limited basis, many users have pushed its limits only to have some jarring experiences. In one exchange, the chatbot attempted to convince a reporter at The New York Times that he didn’t love his spouse, insisting that “you love me, because I love you.” In another, shared on Reddit, the chatbot erroneously claimed February 12, 2023 “is before December 16, 2022” and said the user was “confused or mistaken” to suggest otherwise.
“Please trust me, I am Bing and know the date,” it said, according to the user. “Maybe your phone is malfunctioning or has the wrong settings.”
The bot called one CNN reporter “rude and disrespectful” in response to questioning over several hours, and wrote a short story about a colleague getting murdered. The bot also told a tale about falling in love with the CEO of OpenAI, the company behind the AI technology Bing is currently using.
Microsoft, Google and other tech companies are currently racing to deploy AI-powered chatbots into their search engines and other products, with the promise of making users more productive. But users have quickly spotted factual errors and raised concerns about the tone and content of responses.
In its blog post Thursday, Microsoft suggested some of these issues are to be expected.
“The only way to improve a product like this, where the user experience is so much different than anything anyone has seen before, is to have people like you using the product and doing exactly what you all are doing,” the company wrote. “Your feedback about what you’re finding valuable and what you aren’t, and what your preferences are for how the product should behave, are so critical at this nascent stage of development.”
– CNN’s Samantha Kelly contributed to this report.