Don’t be duped by a scam made with artificial intelligence tools this holiday season. The FBI issued a public service announcement earlier this month warning that criminals are exploiting AI to run bigger frauds in more believable ways.
While AI tools can be helpful in our personal and professional lives, they can also be used against us, said Shaila Rana, a professor at Purdue Global who teaches cybersecurity. “[AI tools are] becoming cheaper [and] easier to use. It’s lowering the barrier of entry for attackers so scammers can create really highly convincing scams.”
There are some best practices for protecting yourself against scams in general, but with the rise of generative AI, here are five specific tips to consider.
Beware of sophisticated phishing attacks
The most common AI-enabled scams are phishing attacks, according to Eman El-Sheikh, associate vice president of the Center for Cybersecurity at the University of West Florida. Phishing is when bad actors attempt to obtain sensitive information to commit crimes or fraud. “[Scammers are using] generative AI to create content that looks or seems authentic but in fact is not,” said El-Sheikh.
“Before we would tell people, ‘look for grammatical errors, look for misspellings, look for something that just doesn’t sound right.’ But now with the use of AI … it can be extremely convincing,” Rana told NPR.
However, you should still check for subtle tells that an email or text message could be fraudulent. Check for misspellings in the domain name of email addresses and look for variations in the logo of the company. “It’s very important to pay attention to those details,” said El-Sheikh.
Create a code word with your loved ones
AI-cloned voice scams are on the rise, Rana told NPR. “Scammers just need a few seconds of your voice from social media to create a clone,” she said. Combined with personal details found online, scammers can convince targets that they are their loved ones.
Family emergency scams or “grandparent scams” involve calling a target, creating an extreme sense of urgency by pretending to be a loved one in distress, and asking for money to get them out of a bad situation. One common scheme is telling the target their loved one is in jail and needs bail money.
Rana recommends coming up with a secret code word to use with your family. “So if someone calls claiming to be in trouble or they’re unsafe, ask for the code word and then [hang up and] call their real number [back] to verify,” she said.
Scammers are using generative artificial intelligence tools to create more convincing fake text and voices to commit fraud, according to a recent FBI warning to the public. Olivier Morin/AFP via Getty Images