Will AI voice cloning appear in email scams?
This is already emerging: attackers clone executive voices from public recordings and attach the resulting audio messages to **phishing** emails. An "urgent voice message from the CEO" adds pressure and perceived authenticity.
Integration scenarios include voicemail attachments in **BEC**, phone follow-ups using cloned voices, and hybrid attacks that combine email with voice calls. Multi-channel attacks are more effective because each channel appears to corroborate the other.
Defense: train users that voices can be faked, establish verification procedures that do not rely on voice recognition (such as callbacks to known numbers or agreed code words), and treat unusual voice-message delivery with suspicion (why send a voicemail attachment rather than simply call?).
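That "why a voicemail attachment?" suspicion can also be encoded as a gateway filter. Below is a minimal sketch using only Python's standard-library email parsing; the keyword list and the `flag_voice_phish` name are illustrative assumptions, not a vetted detection rule.

```python
from email import policy
from email.parser import BytesParser

# Hypothetical urgency cues; a real deployment would tune these.
URGENCY_KEYWORDS = {"urgent", "immediately", "wire", "confidential", "asap"}


def flag_voice_phish(raw_bytes: bytes) -> bool:
    """Return True if a message pairs an audio attachment with urgency language."""
    msg = BytesParser(policy=policy.default).parsebytes(raw_bytes)

    # Audio attachments are unusual in business email; check MIME type
    # and common audio file extensions on the attachment filename.
    has_audio = any(
        part.get_content_maintype() == "audio"
        or (part.get_filename() or "").lower().endswith((".mp3", ".wav", ".m4a"))
        for part in msg.iter_attachments()
    )
    if not has_audio:
        return False

    # Scan subject and body text for pressure language.
    subject = (msg["subject"] or "").lower()
    body_part = msg.get_body(preferencelist=("plain", "html"))
    body = body_part.get_content().lower() if body_part else ""
    return any(keyword in subject + " " + body for keyword in URGENCY_KEYWORDS)
```

A rule like this would typically quarantine the message or stamp a warning banner; it supplements, rather than replaces, the out-of-band verification procedures above.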