Stop talking to a chatbot if you see this eerie sign – only a few clues left as expert warns it’s ‘almost impossible’ to spot evil AI
CHATBOTS can help you get things done faster – but they can be dangerous too.
Security experts have warned that criminals can use chatbots powered by artificial intelligence to their advantage.
Make sure you’re talking to an authentic chatbot – and not a fake
There are now plenty of powerful AI apps like OpenAI’s ChatGPT and Google Gemini that can serve as your digital assistants.
And many customer support systems – at your bank, for instance – also use AI to answer your questions.
Sadly criminals could trick you into speaking to fake chatbots, or compromise the AI itself, putting you in serious danger.
The U.S. Sun spoke to cybersecurity expert Simon Newman who revealed that there’s a typo clue you need to look out for.
“This is a tricky one for consumers as it can be almost impossible to tell the difference between a chatbot developed for a legitimate company and one that is used by a criminal,” said Simon, Co-Founder of Cyber London and International Cyber Expo Advisory Council Member.
“The best advice to give to consumers is to be cautious. If you are using a chat bot through a bank, phone them up and ask them about it.
“How secure is it and what information will it ask from me?
“Remember to make sure that you are using the real website and not a fake one.
“Look for typos in the address and if it doesn’t look right, stop.”
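To see why one wrong letter matters, here is a minimal illustrative sketch in Python – not something Simon or The U.S. Sun provides – of the kind of comparison that catches a look-alike web address. The domain examplebank.com and the function looks_like_typosquat are made-up names used purely for illustration.

```python
# A rough sketch, assuming a hypothetical bank domain "examplebank.com":
# flag addresses that are very close to, but not exactly, the real one.
from difflib import SequenceMatcher
from urllib.parse import urlparse

REAL_DOMAIN = "examplebank.com"  # hypothetical genuine domain

def looks_like_typosquat(url: str, real_domain: str = REAL_DOMAIN) -> bool:
    """Return True if the link's domain is suspiciously similar to the
    genuine domain without matching it exactly (e.g. 'examplebnak.com')."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    if domain == real_domain:
        return False  # exact match: looks genuine
    similarity = SequenceMatcher(None, domain, real_domain).ratio()
    return similarity > 0.8  # close but not identical, so likely a fake

print(looks_like_typosquat("https://www.examplebnak.com/login"))  # True
print(looks_like_typosquat("https://www.examplebank.com/login"))  # False
```

The point of the sketch is the same as the expert advice: a fake site usually differs from the real one by only a character or two, which is easy for a hurried eye to miss.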
AI, EVERYWHERE!
The problem is that it’s going to be increasingly difficult to get by without embracing AI.
Like smartphones or the internet, AI looks likely to become a part of everyday life.
“This is a really good question. I’d like to think that we could live in a world without AI chat bots,” Simon told us.
What is a retail chatbot?
Retailers are turning to artificial intelligence to personalize and improve the shopping experience.
Retail chatbots are automated virtual customer assistants that can drive sales and help shoppers by answering questions and providing suggestions.
Chatbots use generative AI and typically have natural language processing capabilities so they can respond to human speech.
Using AI, a chatbot can learn over time: the more interactions it has, the better it becomes (see the simple sketch below this box).
Chatbots are great for:
- Increasing personalization
- Reducing costs and improving return on investment
- Improving customer service
They can also be used to extend customer support hours, track orders, collect feedback, and find brick-and-mortar locations for shoppers.
Source: Freshworks
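To make that description concrete, here is a minimal illustrative sketch in Python – not Freshworks’ software or any real retailer’s bot – of the simplest possible retail chatbot: a lookup that matches keywords in a shopper’s message to canned answers. Every name and reply below is a made-up example; real retail bots layer generative AI and natural language processing on top of this kind of routing.

```python
# A rough sketch of keyword-to-intent routing, the bare-bones core that
# real retail chatbots build on with NLP and generative AI.

INTENTS = {
    "order": "You can track your order under 'My Orders' in your account.",
    "hours": "Our customer support chat is available 24/7.",
    "store": "Tell me your ZIP code and I'll find the nearest store for you.",
    "return": "Returns are free within 30 days. Shall I start one for you?",
}

def reply(message: str) -> str:
    """Return the canned answer whose keyword appears in the message."""
    text = message.lower()
    for keyword, answer in INTENTS.items():
        if keyword in text:
            return answer
    return "Sorry, I didn't catch that. Could you rephrase your question?"

print(reply("Where is my order?"))         # order-tracking intent
print(reply("Is there a store near me?"))  # store-locator intent
```

A production system replaces the keyword lookup with a language model, but the flow is the same: whatever the shopper types is processed, and often stored, by someone else’s software.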
“But it’s already becoming harder to shop or bank online without coming across them.
“There are some potential risks with chatbots, not least around their security, as we have already seen cyber criminals develop fake chatbots.
“We also need to be confident about how our personal information is being used and stored when we use a chatbot, which isn’t often clear.”
There are ways to protect yourself, however.
AI ROMANCE SCAMS – BEWARE!
Watch out for criminals using AI chatbots to hoodwink you...
The U.S. Sun recently revealed the dangers of AI romance scam bots – here’s what you need to know:
AI chatbots are being used to scam people looking for romance online. These chatbots are designed to mimic human conversation and can be difficult to spot.
However, there are some warning signs that can help you identify them.
For example, if the chatbot responds too quickly and with generic answers, it’s likely not a real person.
Another clue is if the chatbot tries to move the conversation off the dating platform and onto a different app or website.
Additionally, if the chatbot asks for personal information or money, it’s definitely a scam.
It’s important to stay vigilant and use caution when interacting with strangers online, especially when it comes to matters of the heart.
If something seems too good to be true, it probably is.
Be skeptical of anyone who seems too perfect or too eager to move the relationship forward.
By being aware of these warning signs, you can protect yourself from falling victim to AI chatbot scams.
The U.S. Sun recently spoke to a cyber-expert who warned over giving away too much info to a chatbot.
“Never share any sensitive information with a chatbot,” said Dr. Martin J. Kraemer, security awareness advocate at KnowBe4.
“You might have to share your flight booking code or parts of your address with an airline chatbot, but that should be the exception.
“You can always call instead of using the chatbot. Generally, never ever share your password or other authentication credentials with a chatbot.
“Do not share your personal thoughts and intimate details either. It is safe to assume that someone else will gain access to them.”