Starling Bank, a UK-based digital bank, has issued a major warning about the growing threat of artificial intelligence (AI) voice-cloning scams. Using AI tools, scammers can replicate a voice from as little as three seconds of audio, which is easy to harvest from videos or voice clips shared online. Once a voice is cloned, fraudsters use it to impersonate the speaker in phone calls to that person's family and friends, often asking for money.
A cloned voice lets scammers craft phone calls convincing enough that victims believe they are speaking to someone they trust. The method is particularly effective because it exploits the trust and familiarity people have with their loved ones, making the deception difficult to detect.
According to Starling Bank, this type of fraud has the potential to affect millions of people as the risk continues to rise. In a recent survey conducted with Mortar Research, the bank found that more than a quarter of 3,000 respondents had encountered an AI voice-cloning scam in the past year. Yet awareness remains low: 46% of participants said they had never heard of AI voice-cloning fraud.
One of the most concerning findings was that 8% of respondents said they would still send money to a friend or relative over the phone, even if the request seemed unusual. This illustrates how convincing AI-cloned voices can be: they imitate not only the sound of a person's voice but also the subtle emotional cues that make a call seem genuine.
With AI technology continuing to advance at a rapid pace, the potential for misuse is increasing. Starling Bank is urging people to take steps to protect themselves against these types of scams. One recommended measure is to establish a “safe phrase” with friends and family members. This unique phrase can be used to verify a person’s identity during a phone call, offering an extra layer of security against scammers who may be using a cloned voice.
However, Starling Bank advises against sharing the safe phrase via text message, where it could be intercepted by fraudsters. If it must be sent by text, the bank recommends deleting the message as soon as the other person has seen it, to reduce the risk of it being compromised.
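For readers curious about the principle behind a safe phrase, the short sketch below shows how a shared secret can be verified without ever being stored or transmitted in plaintext. This is purely illustrative: Starling's actual advice is simply to agree the phrase in person and speak it aloud on a suspicious call, and every name in the code is hypothetical rather than anything the bank provides.

```python
import hashlib
import hmac
import secrets

# Illustrative only: a safe phrase between family members is spoken aloud,
# not run through software. This sketch shows the general idea of checking
# a shared secret without keeping or sending the secret itself.

def enroll_phrase(phrase: str) -> tuple[bytes, bytes]:
    """Store a salted hash of the agreed phrase instead of the phrase itself."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", phrase.encode(), salt, 100_000)
    return salt, digest

def verify_phrase(candidate: str, salt: bytes, digest: bytes) -> bool:
    """Compare a candidate phrase against the stored hash in constant time."""
    candidate_digest = hashlib.pbkdf2_hmac("sha256", candidate.encode(), salt, 100_000)
    return hmac.compare_digest(candidate_digest, digest)

# Agree on a phrase once, then verify it when a suspicious call comes in.
salt, digest = enroll_phrase("purple elephants dance at noon")
print(verify_phrase("purple elephants dance at noon", salt, digest))  # True
print(verify_phrase("phrase a scammer might guess", salt, digest))    # False
```

The design point mirrors the bank's advice: the phrase itself is never written down or transmitted, so there is nothing for a fraudster to intercept.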
As AI voice-cloning fraud becomes more common, the bank's warning reflects broader concerns about how AI can be misused. Beyond impersonation scams, the technology raises risks of identity theft, bank fraud, and the spread of false information. Earlier this year, OpenAI unveiled Voice Engine, a voice-replication tool, but chose not to release it widely because of the potential for abuse.
In light of the rising threat of AI-enabled scams, it’s crucial for individuals to stay alert and take necessary precautions to safeguard themselves and their loved ones from falling victim to these increasingly sophisticated forms of fraud.