The emergence of artificial intelligence (AI) can be a force for good: it helps businesses increase productivity, make better decisions, and improve customer service. However, it also has a dark side, particularly when it comes to fraud and the exploitation of seniors.
Watch my video about AI-powered fraud
I’ve noticed some of this in online advertising. Catchy ads featuring celebrities promoting weight-loss products or supplements can easily entice you to click to learn more. Usually, they lead to a long-winded video with a sales pitch at the end. The problem is that most of those celebrities, like Oprah, Hoda, or Kelly Clarkson, have nothing to do with the products.
These ads are deepfakes created with AI to make you think the celebrities have had success with the product. Their voices are cloned to sound real. The product itself is most likely useless.
One example I found was the “Himalayan salt trick” diet, supposedly endorsed by Oprah. It is a “natural” mixture that claims to produce the same results as Ozempic. Imagine drinking almost a teaspoon of salt mixed with lemon juice, apple cider vinegar, or baking soda.
High sodium intake can lead to high blood pressure, heart disease, and kidney problems, and consuming baking soda daily can cause digestive issues that may lead to serious complications. The mixture is bunk; it simply does not work.
Fake products generated by AI
Scammers use AI to mass-generate five-star reviews for products like supplements or anti-aging creams. Seniors, and just about anyone else for that matter, trust these reviews and buy ineffective or harmful products, such as supplements that claim to cure arthritis or memory loss.
Chatbots pretending to be customer support
AI chatbots are designed to impersonate real support agents on scam websites. Seniors seeking help may be tricked into revealing personal information or credit card numbers.
They may also encounter a fake customer service agent. I once called a fake Facebook support number after my Facebook page was taken over. The customer service agent turned out to be another scammer who stole money from my Venmo account. Fortunately, I was able to stop him from infiltrating more of my accounts.
AI voice-cloning scams, known as grandparent scams
A scammer will use AI to mimic a loved one’s voice and call or leave voicemails asking for urgent money. Vulnerable seniors, hearing the voice of their relative, will send funds out of fear. An example is: “Grandma, I’m in jail in Mexico. Don’t tell Mom. Please send $2,000 immediately.”
Fake financial advisor chatbots
A fraudulent financial advisor may appear in an AI-generated ad or website and offer to handle a person’s retirement savings. The customer might be convinced to transfer personal financial data or money to scammers posing as AI-powered advisors. These ads often offer a free portfolio checkup and then drain your bank account after you log in.
Subscription traps using AI-personalized shopping
AI systems create tailored product recommendations and then trick customers into signing up for monthly subscriptions. This has happened to me several times. Recently, I bought a package of mushroom-infused coffee. The next month, I was charged again. I thought I had made a one-time purchase and didn’t realize it was an auto-ship subscription. After jumping through hoops, I was finally able to cancel it.
Free trials are another way to trick seniors who may forget when the trial ends. All of a sudden, they see a charge of $79 per month for an overpriced supplement.
Romance scams enhanced by AI
AI can generate realistic texts, photos, and video calls that simulate a romantic relationship. Seniors seeking companionship or a committed relationship may be emotionally manipulated into sending money to a person they think they know.
One example is a scammer who posed as a retired doctor in Europe and flirted with a woman using AI-generated messages. After gaining her trust, “he” asked her to help with travel costs and to pay for an emergency surgery.
Phishing emails
AI creates personalized, grammatically correct emails with letterheads from well-known banks and other legitimate companies. A common one to look out for is a fake Amazon email asking about a missed delivery. NEVER click a link in that email. ALWAYS go directly to the company’s website in your browser to check whether the message is legit.
Health device frauds generated by AI
Scammers prey on seniors with health-related issues. You may see an ad with multiple testimonials for a “smart” hearing aid or miracle blood pressure monitor. Many of these ads originate in China, and the products, which often take a lengthy time to arrive, are badly made or dangerous.
Fake Medicare or Social Security agents
AI phone bots or texts may claim to be from the government, asking to “verify” information. Seniors are easily fooled into providing their Social Security or Medicare numbers. These phone messages or texts may read, “Your benefits have been suspended. Call now to reactivate.” Block these texts and calls immediately.
YouTube videos generated completely by AI
As a voiceover actor, I get annoyed listening to the same female or male voices narrating videos, Instagram posts, and ads. The voices all sound alike: stilted and fake.
Recently, I watched a fashion video for women over 60. The clothing featured was fine, but because it’s difficult to find stock footage of an older woman wearing fashionable clothing, an AI-generated senior woman’s head with gray hair was inserted on a younger woman’s body. The video was narrated by an AI-generated voice.
Political advertising using cloned voices
With the current divisive political atmosphere in the United States, AI-generated political ads are on all social media platforms. A political figure’s voice is cloned and made to say things the person in real life never said. I spent decades dubbing foreign language cartoons into English, and can see that many of the cloned voices are out of sync with the mouth movements.
Many of these fake videos are on TikTok, Instagram, and YouTube, and they are spreading propaganda, so beware. AI-generated memes (photos, videos, or text) are everywhere on the Internet and are used to sway voters. AI-generated videos, texts, screenshots, and photos are spreading fake news on both sides of the aisle, so before you get emotionally charged and share something, do your research and make sure it’s legitimate.
Cryptocurrency scams
Seniors are often targeted by fraudsters online with sophisticated tactics to defraud them of their savings. Cryptocurrency is “a digital, encrypted, and decentralized medium of exchange.” If you can figure that out, you are much smarter than I am. You can’t hold cryptocurrency or feel it like real money. There is also no regulating body to protect you.
Unless you completely understand how cryptocurrency works, avoid it and NEVER transfer your money into crypto assets.
Secure your accounts without passwords
Keeping track of passwords and setting up two-factor authentication can be annoying and confusing. Hardware security keys, small USB devices you plug into your computer, can simplify this. Once you register a key with an account that supports it, you simply plug it in and tap it to sign in, which also protects you from phishing. Two security-key brands that I like are Yubico and SecuX.
Conclusion
AI is here to stay, and it will only become more sophisticated. You can’t stop progress, but you can refuse to be fooled by scammers who want to make money off you or steal your information. Do your research before acting on anything that may have come from an AI-generated source, and learn to recognize what is real and what is not. It becomes more difficult to tell the difference every day.
Have you been a victim of AI-generated fraud? Please leave a comment below and tell us about it.