This is what a deepfake voice clone used in a failed fraud attempt sounds like

One of the stranger applications of deepfakes (AI technology used to manipulate audiovisual content) is the audio deepfake scam. Hackers use machine learning to clone someone's voice and then combine that clone with social engineering techniques to convince people to move money where it shouldn't go. Such scams have been successful in the past, but how good are the voice clones being used in these attacks?

Security consulting firm NISOS has released a report analyzing one such attempted fraud and shared the audio with Motherboard. The clip below is part of a voicemail sent to an employee at an unnamed tech firm, in which a voice that sounds like the company's CEO asks the employee for “immediate assistance to finalize an urgent business deal.”

The quality is certainly not great. Even under the cover of a bad phone signal, the voice is a little robotic. But it's passable. And if you were a junior employee, worried after receiving a supposedly urgent message from your boss, you might not be thinking too hard about audio quality. “It definitely sounds human. They checked that box as far as: does it sound more robotic or more human? I would say more human,” Rob Volkert, a researcher at NISOS, told Motherboard. “But it doesn't sound like the CEO enough.”

Earlier this year, the FTC warned about the rise of such scams, but experts say there's one simple way to beat them. As Patrick Traynor of the Herbert Wertheim College of Engineering told The Verge in January, all you need to do is hang up the phone and call the person back. In many scams, including the one reported by NISOS, the attackers are using a burner VOIP account to contact their targets.

“Hang up and call them back,” says Traynor. “Unless it's a state actor who can reroute phone calls, or a very, very sophisticated hacking group, chances are that's the best way to figure out if you were talking to who you thought you were.”

The attack was ultimately unsuccessful, as the employee who received the voicemail “immediately thought it suspicious” and flagged it to the firm's legal department. But such attacks will become more common as deepfake tools grow increasingly accessible.

The best-known and first reported example of an audio deepfake scam took place in 2019, when the chief executive of a UK energy firm was tricked into sending €220,000 ($240,000) to a Hungarian supplier after receiving a phone call supposedly from the CEO of his company's parent firm in Germany. The executive was told that the transfer was urgent and that the funds had to be sent within the hour.

All you need to create a voice clone is access to lots of recordings of your target. The more data you have, and the better the audio quality, the better the resulting voice clone will be.
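To give a sense of how low that barrier is, here is a minimal sketch of a cloning pipeline, assuming the open-source Coqui TTS library and its XTTS v2 model; nothing in the NISOS report says the attackers used this particular tool, and the file paths and sample text here are hypothetical.

```python
# Minimal voice-cloning sketch using the open-source Coqui TTS library
# (pip install TTS). Illustrative assumption: the NISOS report does not
# identify the attackers' tooling. XTTS v2 can clone a voice from a
# short reference recording of the target speaker.
from TTS.api import TTS

# Download and load the multilingual XTTS v2 voice-cloning model.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# "target_speaker.wav" is a hypothetical reference clip; longer and
# cleaner recordings yield a more convincing clone, per the point above.
tts.tts_to_file(
    text="I need your immediate assistance to finalize an urgent business deal.",
    speaker_wav="target_speaker.wav",   # recording of the voice to clone
    language="en",
    file_path="cloned_voicemail.wav",   # synthesized audio output
)
```

A clone built from only a few seconds of reference audio tends to sound slightly robotic but passable, which matches the quality NISOS describes.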