
AI chatbots are a 'treasure trove' for criminals, expert warns

5 June 2024, 20:02
There are some chatting rules you must follow

CHATBOTS can change your life for the better – but they come with a serious danger too.

Cyber-experts have told The U.S. Sun how chatbots are now a "treasure trove" just waiting for criminals to break in.

There are countless chatbots out there now – but be careful what you trust them with. Credit: Getty

Artificial intelligence chatbots are getting smarter, faster, and easier to access.

And because these AI systems have learned to talk like humans, it can be tempting to be overly trusting and tell them sensitive things.

We spoke to Jake Moore, Global Cybersecurity Advisor at ESET, who revealed that the AI "models" that chatbots are based on are likely safe – but there's a hidden danger.


"With OpenAI and Microsoft as the front runners of chatbots, they are closely guarding their networks and algorithms," Jake told us.

"If they were to become compromised, they would have little to maintain as a business being their future."

WELCOME TO THE DANGER PHONE?

Instead, Jake revealed that the big risk is that what you say to a chatbot ends up being exposed.

Ultimately, the things you tell a chatbot during a conversation are stored somewhere.

And like texts, emails, or backed-up files, your messages to chatbots are only as safe as how they're being stored.

So pouring your heart out to a chatbot – or sharing sensitive information with it – is a big mistake.

“Rather than editing the algorithms to change output direction, the input and output data is essentially more at risk as this will be stored on a server somewhere," Jake explained.

“Although encrypted, this in time will become like a personal search history and very valuable to cyber thieves.

“There is a lot of personal information already being handed over.

“And as soon as OpenAI releases its own search engine, there will be even more sensitive data floating around in a relatively new space that will be like a treasure trove to criminals."


LOOSE LIPS SINK SHIPS

Jake said it's important to be particularly mindful when you're using a chatbot that isn't encrypting your chats.

Encryption garbles conversations so they can't be read by anyone who doesn't have the "key" to unlock them.

The good news is that OpenAI says ChatGPT chats are encrypted in transit and when stored – whether you're a paying member or not.

But some apps might charge you for encryption – or not offer it at all.

And even with encrypted chats, your conversations might be used by the chatbot to train its models.

ChatGPT lets you opt out of this, and you also have the option to delete your data.

"People need to remain mindful of what they input into chatbots, especially into free accounts which don’t anonymise or encrypt the data," Jake warned.

WHAT NOT TO SAY!

Just last month, The U.S. Sun revealed what you shouldn't say to chatbots.

We spoke to a security expert to find out where you need to be careful.

"Never share any sensitive information with a chatbot," said Dr. Martin J. Kraemer, security awareness advocate at KnowBe4.

"You might have to share your flight booking code or parts of your address with an airline chatbot, but that should be the exception.

"You can always call instead of using the chatbot. Generally, never ever share your password or other authentication credentials with a chatbot.

"Do not share your personal thoughts and intimate details either. It is safe to assume that someone else will gain access to them."

Sean Keach
