
Generative AI is playing an increasing role in phone scams, and fake calls related to Amazon accounts or purchases are among the leading scams in 2023. Those are two of the takeaways from the latest study on phone fraud and spam by Seattle startup Hiya.

Spam is defined as unwanted calls from non-contacts, and includes both fraud calls and nuisance calls.

In its report Thursday, Hiya said that in the first half of 2023, the phone security startup observed more than 98 billion incoming calls worldwide. Twenty percent of unidentified calls were flagged as spam, 5% as fraud, and 75% were not flagged. Worldwide, the average mobile phone user receives approximately 14 spam calls per month.

For the first time, Hiya’s study includes a list of the most common phone scams of the year, with calls related to Seattle-based e-commerce giant Amazon leading the way.

“Amazon impersonators may say they suspect an unauthorized purchase, or that the credit card linked to the account needs to be updated,” Hiya said in its report. “Often the scammers start with a robocall, and if the recipient falls for the scam and presses a number, a live operator will come on and try to get the victim to reveal their Amazon login information or a credit card number.”


Other top phone scams, as tracked by Hiya, include those related to insurance, Medicare, credit cards, cryptocurrency, loved ones, payment apps, and auto warranties.

The rise of artificial intelligence and generative AI is also being noticed in phone scams worldwide, with voice clones being used to fool victims, Hiya reported.

AI-powered voice scams can take various forms, such as impersonating government officials, medical professionals, or service providers. By using AI-generated voices, scammers can manipulate emotion, tone, and intonation, making it difficult for victims to detect that the call is illegitimate.

Axios previously reported that “generative AI has lowered the bar for cybercriminals looking to clone someone’s voice and use it in their schemes. Cybercriminals now need as little as three seconds of someone’s voice to successfully clone it and make it usable in a scam call.”

Hiya said scammers can easily obtain a voice clip with audio from a person’s YouTube or other social media site, or even from a recorded greeting on a person’s voicemail. Often these voices are used in conjunction with scams directed at loved ones.

Earlier this year the King County Prosecuting Attorney’s Office warned residents in the Seattle area about a phone scam in which callers claimed to be law enforcement officers or King County Sheriff’s Office deputies.
