Think Before You Answer: Officials Warn of Voice Cloning Scam



Scammers are always looking for new ways to defraud their victims, and with increasingly sophisticated technology, they have more tools at their disposal than ever before. One of the latest techniques used to carry out emergency scams is voice cloning, which uses artificial intelligence (AI) to replicate a person’s voice. Scammers need only a few seconds of audio to create a recording and carry out these AI-powered scams, which typically involve a fraudster impersonating a loved one in distress and urging the victim to send money immediately.

Fraudsters gather information about their target’s family members through social media or other online sources, and may even hack into email accounts to gain access to contact lists and other sensitive data. Once they have enough information, they can use voice cloning to make a convincing fake recording.

You can avoid these AI-powered emergency scams by staying informed and taking the following precautions:

  1. Create a family code word that only you and your loved ones know. This word can be used to verify that the person on the other end of the line is who they say they are.
  2. If you receive an unexpected call or message from someone claiming to be a loved one, take the time to verify the person’s identity before sending money. Hang up and call the person directly.
  3. Consider making your social media profiles private so that only your friends and family can see your posts.