
Experts warn of deepfake voice cloning technology

19 May 2023, Hi-network.com

Threat actors are showing increasing interest in Voice Cloning-as-a-Service (VCaaS) offerings on the dark web that facilitate deepfake-based fraud, according to Recorded Future's latest report, 'I Have No Mouth and I Must Do Crime', based on threat intelligence analysis of chatter in the cybercrime underground.

Deepfake audio technology can be used to impersonate a target's voice to circumvent multi-factor authentication, spread misinformation and disinformation, and increase the effectiveness of social engineering in Business Email Compromise (BEC)-style attacks. Recorded Future warned that the barrier to entry for cybercriminals is being lowered by the growing availability of ready-made voice cloning platforms on the dark web. The chatter Recorded Future observed frequently mentions impersonation, callback scams, and voice phishing in connection with such tools. In some cases, cybercriminals misuse legitimate tools intended for audiobook narration, film and TV dubbing, voice acting, and advertising.

Recorded Future argues that an industry-wide approach is needed to tackle the threat before it escalates: many current deepfake voice technologies are still limited to generating one-off samples and cannot yet sustain extended real-time conversations, leaving a window in which to act. Strategies to mitigate the risk must be multi-disciplinary and address the root causes of social engineering, phishing and vishing, disinformation, and more.

Hot tags: Artificial Intelligence, cybercrime, cybersecurity

Copyright © 2014-2024 Hi-Network.com | HAILIAN TECHNOLOGY CO., LIMITED | All Rights Reserved.