By Radha Iyengar, PhD, Head of Product Policy Research and Karuna Nain, Global Safety Policy Programs Manager
The sharing of intimate images online can have serious emotional and physical consequences for the people whose photographs are posted. Sometimes called “revenge porn,” it is really a form of sexual violence that can be motivated by an intent to control, shame, humiliate, extort and terrorize victims.
Discovering intimate images of yourself online when you didn’t consent to them being shared is devastating. That’s why we’ve taken a careful, research-based approach that concentrates on the victims — what they experience and how we can better protect them.
Over the last year, we’ve conducted our own research and partnered with many international safety organizations to review and improve our response to the sharing of what we call non-consensual intimate images (NCII) anywhere on Facebook, Messenger or Instagram. We worked to understand the experience of victims: how they reported their experience, what barriers arose when they made a report and what support or tools they needed to feel safe on our platform. We interviewed victim support advocates and victims themselves from around the world, including Kenya, Denmark and the U.K. Last summer we brought together over 20 academics and non-profit leaders from 10 countries to improve our tools and our understanding of how to support victims. That work included developing educational information about NCII, resources on where victims can go for help, psychosocial support for those who had been victimized, and guidance on the precautions people can take on Facebook and other platforms to reduce their chances of being victimized.
Across the board it was clear that victims whose images were shared, or who were threatened with having them shared, feel violated, angry and embarrassed. They are scared and worried that their family, friends and co-workers will see the images. In fact, the harm continues long after the images are removed. The mental health consequences include anxiety, depression, suicidal thoughts and sometimes post-traumatic stress disorder (PTSD). There can also be economic and professional consequences, including lost jobs, fewer professional connections, and colleagues who tease or avoid them. Finding new employment can be difficult as well. Unquestionably, in many cases the costs to victims are serious and long-lasting.
Our discussions also made clear that the consequences victims face vary with cultural context. Victims in more traditional communities may be shunned and exiled from their communities. Organizations we’ve worked with reported cases in which victims were forced to run away from home to avoid persecution and even physical harm. And many countries lack established support organizations or viable law enforcement options.
Moving forward, our overall approach will build on the work we’ve already done and focus on three key principles we heard repeatedly from victims and experts:
- Build clear, accessible tools to support victims in reporting a violation.
- Develop prevention methods such as tools to report and proactively block someone from sharing non-consensual images.
- Give victims the power they need over their online space to feel safe.
Our research indicates we can improve our reporting tools to ensure they are easier to use and don’t frustrate those who use them. Victims we spoke with, for example, said they were put off by what they felt was Facebook’s robotic response to NCII — simply taking the images down without finding a way to acknowledge the trauma that victims endure. Other victims weren’t familiar with online reporting and weren’t sure how to navigate our processes in the first place. We also heard loud and clear that the time between when a victim reports to Facebook and when they hear a response from us is filled with anxiety and helplessness. Victims want fast but personalized responses, because the damage they experience increases the longer the images remain online.
Based on this research, we are re-evaluating our reporting tools and processes to ensure they are straightforward, clear and empathetic. Anyone, not just the victim, can report NCII and we are working to better educate people who use Facebook on how to do that.
Beyond reporting, we want to help potential victims who fear their intimate images may be posted, and stop those images before they spread. In partnership with international safety organizations, survivors and victim advocates, we built an emergency option for people to proactively provide a photo to Facebook so it never gets shared on our platforms in the first place. You can learn more about this pilot here.
While this pilot has been subject to some criticism, our research with victims and feedback from organizations indicate this was an option victims generally wanted, and that they wanted it built more directly into the reporting process. To date, use of the program has been relatively low, in part because many victims don’t know it exists. And understandably, many victims are concerned about sending images to people they don’t know, so we are also working to better explain the process and the safeguards in place. We will also expand this pilot to additional countries in the coming months.
Almost all victims we interviewed said they would have appreciated more information and resources. These findings led us to design “Not Without My Consent,” which we launched today as a victim-centered site in our Safety Center to help people respond to intimate images shared without permission.
The goal in everything we’re doing is to improve our support and our response so that we meet the needs of victims as we learn what those needs are.