Facebook announced new measures to identify and remove non-consensual intimate images (also known as revenge porn) shared via the social media platform, as well as to support victims of such abuse. The company will use new detection technology, powered by machine learning and artificial intelligence (AI), to 'proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram'. Once the AI tool identifies such content, a member of Facebook's Community Operations Team reviews it and decides whether to remove the image or video. In most cases, removal will also be accompanied by disabling the account from which the content was shared without permission. Facebook has also launched the Not Without My Consent hub, where victims of revenge porn can find organisations and resources to support them.