MICROSOFT'S attempt to build an AI bot like ChatGPT has fallen on its face after it got an incredibly easy question wrong.
To make matters worse, Microsoft's AI-enhanced Bing didn't take the correction on the chin or accept it was wrong with any grace whatsoever.
One user was testing Microsoft's Bing bot to see when Avatar 2 is in cinemas. But Bing was unable to understand what the date was.
The AI bot failed to understand that it could be wrong, despite some coaxing.
Bing instead insisted it was correct and accused one of Microsoft's beta testers of "not being a good user".
The Microsoft chatbot then demanded the user admit they were wrong, stop arguing and start a new conversation with a "better attitude".
Web developer Jon Uleis took to Twitter to voice his woes over Microsoft's AI offering - which was designed to compete with Google and net some of the attention AI is receiving right now.
"My new favourite thing - Bing's new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says 'You have not been a good user'," he wrote.
"Why? Because the person asked where Avatar 2 is showing nearby."
Only a handful of lucky users are currently able to use Bing - which has been injected with artificial intelligence (AI) to turn it into more of a chatbot than a search engine.
Bing incorporates the technology behind ChatGPT, which has quickly risen to fame after launching in November.
Many tech experts are sitting on a waitlist to be one of the first to trial Microsoft's new AI.
But those who have been able to give the chatbot a spin are not as impressed as users first were with ChatGPT.
Vladimir Prelovac, founder of search engine startup Kagi, said there are a number of examples of the new Bing chat "going out of control" on Reddit.
"Open ended chat in search might prove to be a bad idea at this time," he wrote on Twitter.
"I have to say I sympathise with the engineers trying to tame this beast."