Part 4/14:
An AI researcher demonstrated how convincingly a voice can be replicated from minimal data. In a simulated conversation, they cloned a person's voice using only a 30-second recording. The result was an eerily accurate replica that could fool even close family members, illustrating the grave risks of voice-based scams. In the wrong hands, this technology becomes a powerful weapon for scammers to exploit emotional vulnerabilities.
The Vulnerability of Indian Populations
The survey, conducted across seven countries, found that India is particularly vulnerable to AI voice scams: about 47% of Indian adults reported having experienced such a scam or knowing someone who had, nearly double the global average of 25%. Several factors contribute to this susceptibility: