Phishing Alert - Public should be vigilant against fraudulent video conference scam using AI Deepfake technology
Type: Phishing
Current Status and Related Trends
Recently, the media reported the first known case of fraudsters using deepfake technology to create counterfeit video conference footage impersonating senior executives of a multinational corporation. In this case, an employee of the corporation was deceived into transferring HK$200 million to five local accounts.
Police believed that the attackers obtained publicly available footage and voice recordings of the company's senior executives from YouTube, then used deepfake technology to fabricate counterfeit video conference segments impersonating those executives. During the fake meeting, the attackers instructed the victim to transfer large sums of money to designated accounts and then swiftly ended the meeting under various pretexts, giving the victim no opportunity to ask questions and leading them to believe the deepfake video was a genuine live video conference.
Deepfake is a portmanteau of "deep learning" and "fake", referring to the use of artificial intelligence (AI) to fabricate content such as images and voices. It is most commonly seen in videos in which one person's face is replaced with another's. The technology can also fabricate voices: given only text input, it can mimic the voice of the person being imitated, with no need for voice actors. In this case, the attackers used facial images and voices found on online platforms to create the deepfake video conference footage.
HKCERT urges the public to be vigilant against scams that utilise AI Deepfake technology and recommends that users should:
- Verify the source of conference invitations and links before joining video conferences (a simple link check is sketched after this list);
- Be cautious regarding any images and voices, verifying the authenticity of the information from multiple sources;
- Be attentive to discrepancies between images and voices to discern whether they have been produced using deepfake technology. Most deepfake images and voices will exhibit subtle defects in audio-visual consistency compared to genuine ones;
- Ask the other party to perform actions such as nodding, waving, covering their face, or moving the camera when a video conference seems suspicious;
- Refrain from disclosing sensitive personal information such as passwords and bank account numbers during unfamiliar conversations;
- Avoid answering video calls from unknown sources as scammers may use this to collect your facial images for deepfake videos;
- Limit the sharing of personal information on social media platforms, especially facial and voice recognition data;
- Be cautious when receiving messages with an urgent tone or offers of benefits, as online scammers often employ these tactics to lure more victims into their traps;
- Verify the identity of participants in video conferences, for example, by confirming information known only to you and the participant;
- Beware of phishing attacks and refrain from clicking on or opening any suspicious links or attachments.
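
To make the first recommendation above more concrete, the minimal sketch below shows one way an organisation might pre-check whether a meeting invitation link points at a conferencing domain it already trusts before anyone joins. The allowlist entries, function name, and example URLs are illustrative assumptions, not part of this alert or of any particular product's API.

```python
from urllib.parse import urlparse

# Hypothetical allowlist of conferencing domains an organisation trusts;
# the entries below are illustrative assumptions, not an official list.
TRUSTED_MEETING_DOMAINS = {
    "zoom.us",
    "teams.microsoft.com",
    "meet.google.com",
}

def is_trusted_meeting_link(url: str) -> bool:
    """Return True only if the link's host matches a trusted domain exactly
    or is a subdomain of one (e.g. 'us05web.zoom.us' under 'zoom.us')."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in TRUSTED_MEETING_DOMAINS)

# A look-alike domain such as 'zoom.us.meeting-join.example' fails the check.
print(is_trusted_meeting_link("https://us05web.zoom.us/j/123456789"))         # True
print(is_trusted_meeting_link("https://zoom.us.meeting-join.example/j/123"))  # False
```

A check like this only confirms that a link leads to a known conferencing service; it does not verify who is inside the meeting, so the identity checks listed above are still needed.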