
AI 'can be manipulated' as expert reveals scheme that lets chatbots defraud you

24 June 2024, 14:09
There's even a "romance" scam you should beware of

ARTIFICIAL intelligence chatbots could be "manipulated" by cyber-criminals to defraud you.

That's the stark warning from a leading security expert who says that you should be very cautious when speaking to chatbots.

AI is extremely powerful and can change your life for the better – but it has big risks. Credit: Getty

Specifically, avoid handing over any personal information to online chatbots if you can help it.

Chatbots like OpenAI's ChatGPT, Google Gemini, and Microsoft's Copilot are used by tens of millions of people around the world.

And there are dozens of other alternatives, each capable of improving your life through humanlike conversation.


But cybersecurity expert Simon Newman told The U.S. Sun that chatbots also pose a hidden danger.

"The technology used in chat bots is improving rapidly," said Simon, an International Cyber Expo Advisory Council Member and the CEO of the Cyber Resilience Centre for London.

"But as we have seen, they can sometimes be manipulated to give false information.

"And they can often be very convincing in the answers they give!"

TECH PAUSE

For a start, artificial intelligence chatbots might be confusing for people who aren't tech-savvy.

It's easy to forget – even if you're a computer whiz – that you're talking to a robot.

And that can lead to difficult situations, Simon told us.

"Many companies, including most banks, are replacing human contact centres with online chat bots that have the potential to improve the customer experience while being a big money saver," Simon explained.

"But, these bots lack emotional intelligence which means they can answer in ways that may be insensitive and sometimes rude.

"This is a particular challenge for people suffering from mental ill-health, let alone the older generation who are used to speaking to a person on the other end of a phone line."


For instance, chatbots have already "mastered deception".

And they can learn to "cheat us" even when they haven't been asked to.

The U.S. Sun worked with cyber-experts to reveal "subtle signs of AI manipulation" in conversations that you should look for.

BAD CHAT

They are not immune to being hacked by cyber-criminals.

Simon Newman, International Cyber Expo Advisory Council

But the big danger isn't a chatbot misspeaking – it's when cyber-criminals can compromise the AI to target you.

A criminal might be able to break into the chatbot itself, or trick you into downloading a hacked AI that is set up for malicious purposes.

And this chatbot can then work to extract your personal info for the criminal's gain.

"As with any online service, it’s important for people to take care about what information they provide to a chatbot," Simon warned.

"They are not immune to being hacked by cyber-criminals.

"And potentially can be programmed to encourage users to share sensitive personal information, which can then be used to commit fraud."

The U.S. Sun recently revealed the things you must never say to AI chatbots.

And be very wary about believing what chatbots tell you.

A security expert recently told us that we need to adopt a "new way of life" where we double- and even triple-check everything we see online.

Sean Keach
