End Cyber Abuse

The Pornhub controversy that brought image-based sexual abuse into the public conversation

By Josie Gleave

After years of victims and activists trying to raise actionable awareness about image-based sexual abuse (IBSA) and child sexual abuse material (CSAM) online, the issue went viral this month. It began with a New York Times op-ed by the Pulitzer Prize-winning journalist Nicholas Kristof, who pointed the finger at Pornhub for hosting and profiting from non-consensual and illegal content. The outcry that followed triggered a series of unprecedented events. 

First, Visa and Mastercard said they were investigating the allegations against Pornhub, and in a tactical move to preserve their payment services, Pornhub responded by granting the three wishes detailed in Kristof’s article: allow only verified users to upload content, remove the download feature, and expand content moderation. 

Despite this virtue signaling, Visa and Mastercard both independently confirmed the allegations against Pornhub and subsequently stopped processing payments on the adult site. In their last-ditch effort to preserve their public image, Pornhub removed over ten million videos from unverified users, which turned out to be more than 75 percent of their content library. 

It was a whirlwind of a week in the porn world.

Pornhub’s new year’s resolutions

Pornhub’s policy changes are overdue, but hopefully they will benefit victims of IBSA and prevent abuse in the future. Much depends on how these changes are implemented and on whether the current scrutiny is maintained.

The first change allows only verified users to upload content to Pornhub. The platform’s previous verification process would be laughable except that it has allowed for videos of sexual assault to be uploaded to the site and appear legitimate. For example, a 15-year-old trafficking victim was discovered in 58 videos on Pornhub that came from a verified account. 

Pornhub has said their verification process will be more thorough in the new year, but it is unclear what will change. Previously, users only had to take a selfie while holding up a piece of paper with their username handwritten to apply for verification.  

Removing the download function is perhaps the biggest win. At the moment, if a victim of IBSA finds and reports a video and it is successfully removed, the same video is often reposted and shared within days or hours by other users. Of course, users can still take screenshots or download videos through alternative software, but removing the easy-access download creates a greater barrier for users and could significantly reduce the copying and recirculation of non-consensual content online. 

Finally, Pornhub has promised to expand moderation, but we cannot measure that progress when we know so little about their current content moderation practices. One Pornhub moderator told Kristof that the company employed about 80 human moderators worldwide in comparison to Facebook’s 15,000. Considering the enormous amount of content uploaded to the site each year – so much so it is recorded in years instead of hours – Pornhub’s claim that they review every upload with fewer than 100 moderators is impossible to believe. 

In addition to the platform’s supposedly “extensive” moderation team, Pornhub announced a newly established “Red Team” that will sweep existing content for violations. It is not known how many people will be on this team, and it seems unlikely that the site will hire thousands of moderators to reach Facebook’s numbers. Even if they did, a legion of human moderators has not stopped IBSA and other harmful content from spreading on Facebook either. 

Moderation needs to be a multi-step process and should not rely on human moderators, who suffer trauma due to the nature of their work. Both Facebook and Pornhub report utilizing additional tools like PhotoDNA, which uses hashing technology to fingerprint images and compare them against confirmed images of sexual abuse. The challenge is identifying new non-consensual images, a task that too often falls to the victims themselves. 
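To make the hash-matching idea concrete, here is a minimal sketch of how a platform might screen an upload against a database of known abuse imagery. PhotoDNA itself uses a proprietary *perceptual* hash that tolerates resizing and re-encoding; this simplified illustration uses an exact cryptographic hash instead, and the function names and sample data are hypothetical.

```python
import hashlib

def file_fingerprint(data: bytes) -> str:
    # Exact cryptographic hash of the file bytes. PhotoDNA's real
    # fingerprint is perceptual (robust to edits); this exact-match
    # version is only a simplified stand-in for illustration.
    return hashlib.sha256(data).hexdigest()

def is_known_abuse_image(data: bytes, known_hashes: set[str]) -> bool:
    # Compare the upload's fingerprint against a database of hashes
    # of previously confirmed abuse images, e.g. lists maintained by
    # clearinghouses such as NCMEC (hypothetical source here).
    return file_fingerprint(data) in known_hashes

# Hypothetical usage: in practice the hash list comes from a clearinghouse.
known = {file_fingerprint(b"previously-confirmed image bytes")}
print(is_known_abuse_image(b"previously-confirmed image bytes", known))  # True
print(is_known_abuse_image(b"a brand-new, never-seen image", known))     # False
```

The sketch also shows why the approach only catches *recirculated* material: a genuinely new image has no entry in the database, which is exactly the gap the article describes.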

In a tweet, Kristof reported that these changes will apply not only to Pornhub, but to all sites under MindGeek, the umbrella company that owns Pornhub as well as other major porn sites like RedTube and YouPorn. These changes are a positive step forward, provided they are implemented effectively and actually reduce IBSA across these websites. 

No praise for Pornhub

For those victims, activists, and sex workers who have been asking Pornhub for these basic improvements, the policy announcement was met with a mixture of optimism and skepticism. Some victims voiced their frustration that they were not listened to years ago and that these changes should not have required the New York Times article or Visa and Mastercard to pressure Pornhub to change. It may be progress, but the taste is bittersweet. 

Sex workers are often left out of this story, but they too have been asking Pornhub for years to give preference to verified users and remove the download feature. Previous policies have worked against their livelihoods by encouraging pirated content to be reuploaded to Pornhub, thereby cutting out the original performer’s ability to profit from their own work. 

Now that Visa and Mastercard have suspended services, Pornhub is shielding itself by saying this will negatively impact sex workers. There is some truth here, as porn performers have said losing those payment services may affect their livelihood. Yet, if Pornhub were truly concerned about sex workers, they could have taken steps earlier to respond adequately to the requests of IBSA victims, which may have prevented the loss of Visa and Mastercard’s services in the first place. 

“Mastercard & Visa’s decision will not hurt Pornhub, who always have and always will continue to make money off of stolen content,” wrote Erika Lust, adult film director. “This decision hurts sex workers – the people that Pornhub has never cared for.” 

If the adult industry and efforts to prevent non-consensual content are to reach any compromise, it should not force a choice between helping victims of IBSA and harming sex workers’ livelihoods. Our next move in directing this conversation is crucial. 

What comes next?

For those working in the field of fighting IBSA and non-consensual content, Kristof’s article was not much of a revelation, but it did help demonstrate to the general public the severity of the harms victims experience, including psychological distress, personal and social life disruption, challenges at school, and job loss. Victims suffer in-person and online bullying, post-traumatic stress disorder, and suicide attempts. What may seem like an unfortunate nude leak is actually a seriously distressing and life-altering form of sexual abuse. 

This kind of material does not belong on Pornhub or any other adult site; however, calling for Pornhub to be shut down is not the right move either. This would not solve the problem of IBSA but simply send it elsewhere. It is time to expand the conversation to other adult sites and beyond. 

“Instead of praising PornHub, let’s bury non-consensual image abuse once and for all,” wrote Dr. Ann Olivarius. “Let’s force MindGeek, WGCZ Holdings (owner of XVideos) and all other porn suppliers to put harm prevention before exploitative profit.” 

The porn world is full of stigma, and the public may be less surprised to read that non-consensual content exists on those platforms, but IBSA is spread all over the internet, perhaps even more so on mainstream websites like Facebook, Instagram, Twitter, and WhatsApp. Let’s add these sites to the list and hold them accountable just as Pornhub is now expected to respond to IBSA victims and remove non-consensual content. The world is now aware, and these sites can no longer pretend image-based sexual abuse does not exist on their platforms.

Author Bio

Josie Gleave is a journalist and writer. Her work has appeared in Scarleteen, Restless Magazine, Matters Journal, and Fight the New Drug.