When you receive calls at all hours from women desperate to get intimate photos that were shared without their consent taken offline, it’s a relief to hear about Facebook’s latest move to address the distribution of non-consensual intimate images. Finally! Technical solutions to social problems seldom make a good fit, even in a digitally layered world, so here is our take on some pluses and limitations of this move, as well as a lot of questions. Will this initiative that attempts to address harm be misused to curb sexual expression, education, or pleasure?
A photo will no longer need to be reported each and every time it appears on Facebook, nor do you have to know which other accounts have uploaded it. It doesn’t matter if the photo has been uploaded a thousand times on Facebook; one report not only takes down the picture, but the image’s hash value will also be used to identify and eliminate duplicates of the photo on Facebook, Messenger and Instagram. A report will still have to be filed for each different photo, but only once. This spells enormous relief for people – most frequently women or LGBTQI persons – trying to control the spread of their intimate photos.
As we all know, the creation of multiple accounts to share these photos is a common strategy, especially clone accounts of the person whose photos are being shared without consent. The photo-matching technology will also alert Facebook to a non-consensual image upload in progress and stop it in its tracks.
This raises a question for us, however. A frequent strategy in “sextortion” and other threats (demands for more intimate photos, videos or sexual acts to avoid public distribution of sexually explicit material) is to publish a post and then erase it and/or the account after the target of the threat has seen it. It’s a way for the perpetrator to gain greater control over the subject. So, by the time the report is reviewed, the post is gone. Will such photos still make it into the hash system?
The hash value is what is used for photo-matching, not the actual photo, which means that intimate images are not stockpiled in a database ripe for hacking or sharing among those with privileged access. This provides further confidence for reporting. Facebook stated that photos are stored in their original format “for a limited time” only. It would be good to know how limited that really is.
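To make the hashing approach concrete, here is a minimal sketch of how hash-based photo-matching can work, using the open-source imagehash library as a stand-in. Facebook has not disclosed its actual algorithm, so the library, threshold and file names below are our illustrative assumptions, not its implementation.

```python
from PIL import Image  # pip install Pillow imagehash
import imagehash

# Maximum Hamming distance between 64-bit perceptual hashes for two
# images to count as near-duplicates (illustrative threshold, our choice).
MATCH_THRESHOLD = 5

# Only the hash values of reported photos are kept for matching;
# the intimate images themselves do not need to be stored.
flagged_hashes = [imagehash.phash(Image.open("reported_photo.jpg"))]

def upload_is_flagged(path: str) -> bool:
    """Return True if an upload matches a previously reported image."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(candidate - known <= MATCH_THRESHOLD for known in flagged_hashes)

if upload_is_flagged("new_upload.jpg"):
    print("Upload blocked: matches a previously reported intimate image.")
```

The point of a perceptual hash is that it survives minor edits such as resizing or recompression, which is what lets a single report catch re-uploads; a plain cryptographic hash would match only byte-identical copies.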
It’s positive to see that Facebook’s emphasis has been on “protecting intimate images” and not “revenge porn” (although media coverage of Facebook’s innovation, unfortunately, prefers this mistaken concept). If we keep branding the problem as “revenge porn”, responses invariably entail calls for abstention, censorship of sexual expression, and victim-blaming, rather than recognising that people’s rights to privacy and bodily autonomy have been violated. The decision to deactivate accounts that have uploaded non-consensual images can send a strong message to people about Facebook’s stance on consent and intimate images. People who attempt to upload a flagged photo will also get an advisory message that it violates Facebook’s community standards. Facebook could go a long way towards educating its user community if it is serious about helping users understand this issue.
NEEDS MORE WORK TO GET A “LIKE”
Lack of awareness raising
News blasts and platform alerts regarding Facebook’s fake news tool, released just one week later, far surpassed publicity around its new approach to protecting intimate images. Women and others affected still don’t know this new solution exists, and the Facebook community doesn’t know the platform has a clear position on more than women’s nipples.
Warning messages about community standard violations are necessary, but Facebook could also take advantage of a thwarted upload to raise awareness regarding consent criteria. Could Facebook have a targeted public message campaign on sexual rights, expression and consent in partnership with civil society organisations? Or how about a simple ad announcing how to report a non-consensual intimate image?
In some countries, Facebook points people to organisations that can provide support when sexually explicit photos are leaked. Facebook should broaden its resource network to reflect the global geographic and multilingual diversity of its community. It should ensure that recommended organisations defend women’s rights from a human rights framework rather than a moralistic or protectionist point of view which further polices and shames sexuality. Facebook could deepen the collaborative process with the safety roundtables initiated in 2016 to help rights organisations raise awareness on its platform about bodily autonomy and why the non-consensual distribution of intimate images is a violation of people’s freedoms.
Reporting system is not intuitive
Facebook has had an option to report non-consensual sharing of intimate images for a while – what changes with this innovation is what happens after a report. Unfortunately, it’s still not as straightforward as it could be, something we’ve pointed out to Facebook consistently over the years. If you are naked in a photo, and those likes are climbing exponentially, you don’t have time to figure out which rabbit hole of drop-down options you should go down. Most women we accompany immediately choose the logical “I’m in this photo and I don’t like it”, which only takes you to seemingly petty (by comparison) options about the way your hair looks and letting your buddy know you want the pic taken down. There is no “It’s a private sexual image of me being shared without my consent” option there. Why not? Why not include the option to report a non-consensual photo in any logical place a user might go?
This is something you will like: where to go to report intimate images shared without consent

To report, select “This should not be on Facebook” > “What’s wrong with this photo?” > “This is nudity or pornography (for example, sexual acts, people soliciting sex, photos of me naked)”. This option is more intuitively selected by a bystander – a Facebook citizen concerned about community standards – than by someone directly affected. You can also use this form to report, even if you are not a Facebook user.

Lack of clarity around account banning and creation
It’s not clear how long Facebook plans to keep an account deactivated. Will people be banned for life if they have posted intimate content without consent? “Banned for life” may sound extreme, but users can usually create a new account easily. They won’t be able to post the same photo, but they can get back on Facebook. The new measure does not address this problem.
It’s also not clear if every single account that shared the photo will also be deactivated. What will happen to those accounts that attempt to upload a previously flagged photo? Will Facebook examine other information, such as IP address or browser fingerprint, to determine repeat offenders, similar to Twitter’s attempts to curb abusive accounts by comparing phone numbers or looking at who is being targeted for attack by new accounts? Will repeat offenders face immediate deactivation of new accounts?
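Facebook has said nothing about whether it links new accounts to banned ones this way, so the sketch below is purely hypothetical: it only illustrates the kind of signal comparison such a repeat-offender check could involve. All the names, fields and logic are our assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class AccountSignals:
    """Signals that could tie a new account to a banned one (hypothetical)."""
    phone_number: str | None = None
    browser_fingerprint: str | None = None
    ip_addresses: set[str] = field(default_factory=set)

def likely_same_operator(new: AccountSignals, banned: AccountSignals) -> bool:
    """Flag a new account if it shares a strong signal with a banned one.

    A real system would weigh many weaker signals probabilistically;
    a single shared IP (e.g. a campus network) is not proof on its own.
    """
    return (
        (new.phone_number is not None and new.phone_number == banned.phone_number)
        or (new.browser_fingerprint is not None
            and new.browser_fingerprint == banned.browser_fingerprint)
        or bool(new.ip_addresses & banned.ip_addresses)
    )

# Example: a new account reusing a banned account's phone number is flagged.
banned = AccountSignals(phone_number="+15550100", ip_addresses={"203.0.113.7"})
new = AccountSignals(phone_number="+15550100")
assert likely_same_operator(new, banned)
```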
Context matters
Leaving such nuanced decision making up to algorithms or artificial intelligence would guarantee disaster. Rather, Facebook says that a team of specially trained community operators will vet the images. Details of what this special training entails remain murky. At a minimum, multilingual, multicultural staff trained in understanding gender, sexual rights and expression, victim-blaming and unintentional censorship are needed to properly understand the context in which such photos are being shared and the harm that can result if they are not taken down.
Even with special training, context is everything. What if the photo is non-consensual but not exactly in violation of Facebook’s nudity terms? Is the fact that it is non-consensual sufficient to tag it as a violation? What if it puts the subject at risk because the image content goes against cultural or societal norms or reveals someone’s identity?
Many people post photo teasers that don’t quite violate the Facebook nudity policy, linking off-site to other online spaces after hooking in their Facebook community. What if the user is located in a country where state and community policing of sexuality can put their bodily integrity at risk if such photos are viewed? How will Facebook’s community operators respond in these context-specific situations? Is their ultimate goal with this policy to “prevent harm”, ensure consent, or enforce nudity standards?
Community operators are under pressure to make decisions in the blink of an eye, given the number of reports they must address. A problem this widespread calls for more specially trained support team members, not faster decision making.
Accountability
Any decision to take down content and close accounts in a service as broadly used as Facebook demands special considerations around accountability, transparency and appeal. We hope Facebook will share and continue to consult with rights-based organisations about how its criteria for take-down can evolve, the type of training for community operators, and how it will evaluate the effectiveness of this measure. Transparency regarding the number of take-downs and accounts affected, and analysis of the scope of the problem, would be invaluable for developing better solutions on Facebook and beyond.
A lot of specific questions arise depending on one’s advocacy work. For example, those supporting women facing non-consensual distribution of intimate images want to know more about cooperation with legal proceedings: will relevant information such as the extent of photo duplication and the accounts responsible for or attempting distribution still be available if requested by a court order, especially if accounts are being closed and photos eliminated from the system?
Freedom of expression advocates will have a lot of questions about proportional response and about monitoring whether Facebook has overstepped. For example, how can Facebook ensure that an account ban is proportional to the offence? Any woman who’s been deluged with harassing comments or lost her job (to name just a few consequences) due to non-consensual sharing of an intimate photo will not question whether banning is proportional to this violation of Facebook community standards, but such questions are important to ask from a rights-based point of view. As banning is Facebook’s last and most extreme option, how does harm or intent play into account suspension? For example, what if you are a kid randomly sharing sexy photos, possibly without knowing the subjects, versus someone targeting a woman’s employer, colleagues, family and friends, or linking the photo to identifying and locational information? What if a user is tried under some sort of civil law or penal code, pays damages or even serves time – should they never be allowed to have a Facebook account again?
Women’s agency
It is essential to ensure that women’s consensual sexual expression is not being censored as a result of these new measures, especially when photos are reported by someone other than the subject (a positive feature but one that can be abused).
The right to appeal
The appeals process is crucial because if there is one thing women’s rights defenders know, it is that policies made to defend people whose rights are marginalised are frequently taken advantage of by those with power to further marginalise those at risk. We must all be alert to how this system might get played to attack consensual sexual expression and any expression in favour of women’s and LGBTQI rights.
Collaboration
For years Take Back the Tech! and many other women’s rights activists have been asking internet intermediaries to take some responsibility for the online gender-based violence that their platforms help facilitate, including the distribution of sexually explicit images without consent. We even had to do a campaign about it in 2014: What are you doing to end violence against women? A key demand was consulting with women’s rights activists and women Facebook users, especially those based in the “global South”, to gain a deeper understanding of women’s realities on Facebook and how the harm they experience has been magnified by the platform’s personalised, networked nature and historical lack of privacy by default.
For this innovation, Facebook collaborated with the Cyber Civil Rights Initiative, which has advised survivors and legislators in the US on this issue since 2013. It also finally held safety roundtable discussions in 2016, with some participation from women’s rights organisations based in Africa, Asia and, to a lesser extent, Latin America. While collaboration does take time, it clearly produces valuable, nuanced solutions. If this had been a true priority for Facebook, such a solution could have been viable sooner, especially within Facebook’s own platform.
Non-consensual distribution of intimate images is clearly a problem that is much larger than Facebook. If this measure stays limited to Facebook, people will simply increase this activity on other sites. An important next step will entail cross-platform collaboration throughout the tech industry. Collaboration with rights-based organisations to avoid unintentional censorship and the system being abused to limit LGBTQI and women’s rights, including their right to sexual expression, is also necessary. Non-consensual distribution of intimate images is not a legal violation in most countries and may be a civil rather than criminal offence. It is frequently poorly defined; some countries have even tried to outlaw sexting, the consensual sharing of intimate images. Experiences in assessing violent extremist images and establishing a shared database of hash values, although also controversial, can help inform how this initiative can grow.
If Facebook is able to cultivate a gender-aware, rights-informed support staff who can vet questions of consent and context – not just nudity – when making calls on intimate images, the resulting database will provide extensive, credible content of invaluable potential if later pooled in a shared, independent cross-sector database. Such an initiative has to be focused on addressing harm, not curbing sexual expression, education, or pleasure.