You receive a call from an unknown number, answer it, and there is only silence on the other end. You repeat “Hello, who is it?” a couple of times, then hang up assuming it was a wrong number, and quickly forget about it. Yet those few seconds can be unexpectedly valuable to cybercriminals: they can be enough to confirm that your number is active and, much worse, to capture audio samples that can be used to clone your voice with artificial intelligence. This technique, which combines old phishing strategies with new voice-cloning tools, is increasingly used for online fraud and identity theft. Security experts warn that as little as three seconds of recording can be enough to recreate a voice very similar to the original. From there, scammers can use the cloned voice for their own purposes, with very serious consequences both for the person whose voice was cloned and for his or her acquaintances and contacts. Given the danger of this particular attack, let’s look at how the silent-call voice-cloning scam works and how to defend yourself.
How the voice cloning scam works with silent calls
It all begins with a silent phone call. Scammers use automated systems that dial thousands of numbers a day. When you answer, even simple background noise or a cough is enough to reveal that the number is active and belongs to a real person. At that point, your number is “marked” as active and added to databases that circulate among criminal networks. Some groups use it for further phishing attempts; others sell the information to robocalling operations (automated calls for fraudulent purposes) or to those who intend to create voice clones.
The most insidious risk arises when your voice is recorded. AI voice-cloning technologies are now so advanced that they can reproduce tone, rhythm and inflection with impressive realism, starting from just a few seconds of recording. According to research conducted in 2023 by MSI-ACI in collaboration with McAfee, based on a sample of 7,000 people across 9 countries, just 3 seconds of audio is enough to generate a clone with an 85% similarity to the original voice; even more alarmingly, with a little more recorded material the accuracy can exceed 95%. With these tools available, a scammer can create a voice message that sounds like a family member asking for help after an accident, an employee asking for a salary advance, and so on.
These techniques are part of a larger phenomenon known as “spear phishing”: an attack aimed at a specific person that uses real information collected online, which makes the scam particularly credible and maximizes its effectiveness. Criminals often obtain their victims’ personal details from the traces they leave online: posts, comments, tags or the location shared via the “Instagram Map”. This data makes the message more believable, for example by referring to a recent trip or a real family member.
According to McAfee data, 1 in 4 people have already had direct or indirect experience with voice-clone scams, and 77% of the victims lost money as a result. Of those interviewed, 36% reported losses between $500 and $3,000, and 7% suffered damage of up to $15,000.
What makes the picture even darker is how easy it now is to access effective voice-cloning tools. Researchers have found more than a dozen free voice-cloning programs online, many of which require minimal technical skill to use effectively. Some can also reproduce accents from different languages and regions, broadening the potential reach of the attacks.
Once they have the clone, the next step is fraud. Fraudsters can call a bank impersonating their victim, for example to request a credential reset or to authorize bank transfers, perhaps combining voice cloning with other techniques such as spoofing (which disguises the caller’s number). Scammers can also use cloned voices to contact friends or family members and ask for money for made-up emergencies, or present the cloned voice as “audio evidence” in blackmail attempts or romance scams.
How to defend yourself from online scams based on voice cloning
Given how insidious these attacks are, let’s look at the defensive steps you can take to protect yourself from online scams based on voice cloning. Here are some points to keep in mind.
- Do not answer unknown numbers: if someone is trying to reach you for legitimate reasons, they can leave a voicemail or contact you another way. If you do decide to answer, stay silent and don’t say a word until the caller identifies themselves.
- Enable automatic blocking of unknown numbers: these features are now present on all recent smartphones and help mitigate the problem of scam calls, even if they do not eliminate it completely.
- Never share personal information over the phone: not even with those who claim to represent a trusted company, or with “friendly voices” belonging to colleagues, family members and acquaintances, since those voices may themselves have been cloned. If a caller insists, end the conversation and call the organization or person back by manually dialing their number on your smartphone. Bonus tip: to protect your loved ones, consider setting up a safe word, a code word agreed on in advance to verify each other’s identity in a real emergency.
- Limit the voice content you share online: in the age of social media and instant messaging, this advice may not be practical for everyone. If you really can’t avoid sending or publishing material that contains your voice, at least set your social profiles to private.
