Facebook on Friday announced a new tool to detect revenge porn on its platforms, including Instagram. The company also launched a new online resource hub to help users respond to the abuse.
Facebook says the new tool is driven by machine learning and artificial intelligence, which allow it to detect “near nude images or videos that are shared without permission on Facebook and Instagram.”
The company adds that the tool can detect non-consensual intimate images without anyone having to report them. If an image violates the company’s Community Standards, Facebook moderators will remove it, and in some cases the company will also disable the account that shared the content without permission. Facebook will, however, allow users to appeal if they believe the company has made a mistake.
“This programme gives people an emergency option to securely and proactively submit a photo to Facebook. We then create a digital fingerprint of that image and stop it from ever being shared on our platform in the first place. After receiving positive feedback from victims and support organisations, we will expand this pilot over the coming months so more people can benefit from this option in an emergency,” Antigone Davis, Global Head of Safety, wrote in a blog post.
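Facebook has not disclosed the exact fingerprinting technique. Matching systems of this kind commonly use perceptual hashing (Microsoft’s PhotoDNA is a well-known example), which maps visually similar images to similar bit strings, so a re-upload matches the stored fingerprint even after resizing or re-encoding. Below is a minimal Python sketch of one such scheme, an average hash; the function names, file names and 64-bit hash size are illustrative assumptions, not Facebook’s actual implementation.

```python
from PIL import Image

def average_hash(path, hash_size=8):
    """Downscale to a hash_size x hash_size grayscale grid and set one bit
    per pixel: 1 if the pixel is brighter than the grid's mean, else 0."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(h1, h2):
    """Number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")

# Illustrative usage: a small distance (say, <= 5 of 64 bits) suggests the
# new upload is a near-duplicate of a previously submitted image, even
# after resizing or re-encoding. File names here are hypothetical.
if hamming_distance(average_hash("submitted.jpg"), average_hash("upload.jpg")) <= 5:
    print("Likely match: flag the upload for review.")
```

Comparing fingerprints rather than raw files is what lets such a system block altered copies of an image without ever storing the image itself.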
Facebook has also launched “Not Without My Consent”, a hub for victims of revenge porn within its Safety Center. It lets victims reach out to organisations and resources for assistance, including guidance on getting non-consensual content removed from the platform.
“And over the coming months, we’ll build a victim support toolkit to give people around the world more information with locally and culturally relevant support,” added Davis.

Source: hindustantimes.com