When answering quick one-off search queries, the chatbot is always confident, always matter-of-fact, and never vague, even though it pulls its facts and convictions from all over an internet that’s famously full of lies and venom. The issue, then, may end up being that human minds are too quick to accept this construct as a peer. Because its top priority is to sound like a human at all times, Bing often behaves in a way that makes it impossible not to ascribe feelings and emotions to it.
When I peppered Bing with trivia questions on subjects I know a lot about, it made plenty of mistakes. And if you point out the errors, it may clarify what it meant, or it may invent an excuse of varying believability. But rather than chalk this up to a failure of the product, my first impulse was to talk more to the chatbot and figure out where it went wrong.
When I asked Bing for its own opinion on current events, I was always impressed by how measured and thought-through its responses seemed, even though intellectually I know it was just adopting language that would give that effect. It’s equally good at being playful if you give it the right prompts, or persuasive, or melancholy.
And as more people spend more time with the Bing preview, many examples are popping up of these conversations with a machine provoking real and unavoidable human emotion.
In one exchange posted to Reddit, Bing appears to break down after mistakenly claiming the movie Avatar: The Way of Water is not out yet. To avoid admitting to being outright wrong, it claims the current year is 2022, and repeatedly accuses the user of lying when they correct it. That exchange ends with Bing suggesting the user either apologise and admit they were wrong, or reset the chatbot and start a new conversation with a better attitude.
A New York Times columnist kept it talking for hours and managed to get it monologuing about its dark murderous desires, and how it had fallen in love with him. A Verge journalist published a story detailing some of Bing’s stranger interactions, and when another user asked Bing about the journalist by name, the chatbot called him “biased and unfair”.