Facebook has announced new measures to protect intimate photos and prevent their non-consensual circulation on the social network.
We’ve taken a careful, research-based approach that concentrates on the victims – Facebook Newsroom.
After a year of research with numerous international safety organizations, and discussions with victim-support advocates and victims themselves to understand how victims are affected socially, mentally, economically, and even professionally, Facebook has decided to improve its tools for reporting non-consensual intimate images (NCII) and preventing their spread on the platform.
Facebook developed its new detection technology after consulting victims and experts alike:
1. Supporting victims in reporting a violation
There have been cases where people felt uncomfortable with the impersonal, automated response Facebook used to give to NCII reports, which merely removed the offending image without acknowledging the victim’s trauma.
Facebook has re-evaluated its reporting tools and processes to make them easier, more accessible, faster, and more empathetic. Moreover, anyone can now report NCII.
2. Advanced prevention methods
Facebook’s reporting tools and proactive blocking measures now include an emergency option, something most victims and organizations wanted built into the reporting process. Under the pilot program, a user can securely submit a photo they do not want shared; Facebook then creates a digital fingerprint of the image and proactively ensures it never appears on the platform:
We built a proactive reporting tool in partnership with international safety organizations, survivors, and victim advocates to provide an emergency option for people to provide a photo proactively to Facebook, so it never gets shared on our platforms in the first place. – Facebook Newsroom
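The “digital fingerprint” described here is generally understood to be a perceptual hash of the image; Facebook’s actual implementation is proprietary. As a minimal sketch of the idea, assuming the open-source Python imagehash library (not Facebook’s system), the flow looks roughly like this:

```python
# Sketch of hash-based image fingerprinting using the open-source
# "imagehash" library. Illustrative only; the store and function
# names are hypothetical, not Facebook's internal API.
from PIL import Image
import imagehash

# Fingerprints of photos that victims have submitted (hypothetical store).
blocked_hashes = set()

def register_blocked_photo(path: str) -> None:
    """Fingerprint a submitted photo so future uploads can be matched."""
    blocked_hashes.add(imagehash.phash(Image.open(path)))

def is_blocked(path: str, max_distance: int = 5) -> bool:
    """Return True if an upload is perceptually close to a blocked photo."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two perceptual hashes yields their Hamming distance,
    # so matches survive re-encoding, resizing, and minor edits.
    return any(candidate - h <= max_distance for h in blocked_hashes)
```

Matching on a small Hamming distance, rather than exact equality, is what makes a perceptual fingerprint more robust than a cryptographic hash: a slightly cropped or recompressed copy still matches.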
3. A safe online space for victims
Discussions with victims suggested that additional information and resources are always welcome, which prompted the new ‘Not Without My Consent’ victim-support hub in Facebook’s Safety Centre, designed to help people respond to the non-consensual sharing of intimate images.
Facebook has started a new pilot program run jointly with victim-advocate organizations. Facebook had already been using photo-matching technology to prevent the resharing of NCII; going forward, Facebook and Instagram will pair that with new machine-learning-based detection technology and an online resource hub, so that non-consensual intimate images can be detected even before they are reported. This is particularly helpful when a victim is apprehensive about reporting, or is not yet aware that the violation has occurred.
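To illustrate how such proactive ML detection might sit alongside fingerprint matching, here is a hedged sketch. The routing decisions, threshold, and function names are hypothetical illustrations under stated assumptions, not Facebook’s actual pipeline:

```python
# Sketch of a two-stage screen for uploads: match known fingerprints
# first, then fall back to a trained classifier that flags likely NCII
# before anyone reports it. The classifier interface is an assumption;
# real systems would use proprietary models and human review.
from typing import Callable, Set

def review_upload(
    image_path: str,
    fingerprint: Callable[[str], str],   # e.g. a perceptual hash function
    blocked: Set[str],                   # fingerprints of reported NCII
    classifier: Callable[[str], float],  # hypothetical model, score in [0, 1]
    threshold: float = 0.9,              # hypothetical review cutoff
) -> str:
    """Route an upload before it becomes visible to anyone."""
    if fingerprint(image_path) in blocked:
        return "block"                   # matches a previously reported image
    if classifier(image_path) >= threshold:
        return "queue_for_human_review"  # proactive ML detection path
    return "allow"
```

The design point the sketch captures is ordering: cheap, high-precision fingerprint matching runs first, and the statistical model only handles images that no one has reported yet.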
Facebook has rolled out this new development in Australia first and will soon make it available in other countries as well.
Today, Facebook is holding an event with Dubravka Šimonović, the U.N. Special Rapporteur on violence against women, where participating victim advocates, industry representatives, and nonprofit organizations will convene to “discuss how this abuse manifests around the world; its causes and consequences; the next frontier of challenges that need to be addressed; and strategies for deterrence.”
Facebook also plans to build a victim-support toolkit that provides locally and culturally relevant information, partnering with the Revenge Porn Helpline (UK), Cyber Civil Rights Initiative (US), Digital Rights Foundation (Pakistan), SaferNet (Brazil), and Professor Lee Ji-yeon (South Korea).