Cloudflare’s Response to CSAM Online

RSS Feed

Participant
Joined
Dec 23, 2018
Messages
94
RSS Feed submitted a new Article:

Cloudflare’s Response to CSAM Online


Responding to incidents of child sexual abuse material (CSAM) online has been a priority at Cloudflare from the beginning. The stories of CSAM victims are tragic, and bring to light an appalling corner of the Internet. When it comes to CSAM, our position is simple: We don’t tolerate it. We abhor it. It’s a crime, and we do what we can to support the processes to identify and remove that content.

In 2010, within months of Cloudflare’s launch, we connected with the National Center for Missing and Exploited Children (NCMEC) and started a collaborative process to understand our role and how we could cooperate with them. Over the years, we have been in regular communication with a number of government and advocacy groups to determine what Cloudflare should and can do to respond to reports about CSAM that we receive through our abuse process, or how we can provide information supporting investigations of websites using Cloudflare’s services.

Recently, 36 tech companies, including Cloudflare, received this letter from a group of U.S. Senators asking for more information about how we handle CSAM content. The Senators referred to influential New York Times stories published in late September and early November that conveyed the disturbing number of images of child sexual abuse on the Internet, with graphic detail about the horrific photos and how the recirculation of imagery retraumatizes the victims. The stories focused on shortcomings and challenges in bringing violators to justice, as well as efforts, or lack thereof, by a group of tech companies, including Amazon, Facebook, Google, Microsoft, and Dropbox, to eradicate as much of this material as possible through existing processes or new tools like PhotoDNA that could proactively identify CSAM.

We think it is important to share our response to the Senators (copied at the end of this blog post), talk publicly about what we’ve done in this space, and address what else we believe can be done.

How Cloudflare Responds to CSAM


From our work with NCMEC, we know that they are focused on doing everything they can to validate the legitimacy of CSAM reports and...

Read more about this article here...
 

Alpha1

Administrator
Joined
May 28, 2007
Messages
4,268
It's all well and good that Cloudflare is now taking action against child porn websites, but I do wonder whether this is just one step towards becoming the internet police?
 

MagicalAzareal

Magical Developer
Joined
Apr 25, 2019
Messages
758
I'm surprised they weren't doing this before, tbh. Or maybe they were, and it just wasn't as big a marketing statement as it is now, with the NYTimes constantly running smear pieces accusing tech companies of "looking the other way" on child abuse.

https://www.engadget.com/2019/05/31/sex-lies-and-surveillance-fosta-privacy/
Thorn and NCMEC are really, really shady. They have a tendency to flag content that is not actually child pornography, and they are prone to exaggerating how much of it there actually is and making it sound worse than it is (not that the real content isn't absolutely abhorrent).

By definition, it is impossible to audit what they're blocking or getting platforms to block, because child pornography is illegal and they wouldn't want those devious and omnipresent paedophiles to see those images.

The Internet Watch Foundation once blocked the entirety of Wikipedia in the U.K. for three days because a famous album cover from the 50s depicted child nudity (the 50s were a very different time than today). It wasn't enough to just block that one page; they blocked the entire site until they were forced to back down on the decision.

Different countries also operate under different laws. In some, like the United Kingdom, cartoon depictions are illegal; in others, they are not. But the IWF is U.K.-based and likely treats the two the same. This is getting awfully close to world-police territory, where another country tells you what you're allowed to do.

I also would not be surprised if a lot of the images the NYTimes loses its mind over are sexts between teenagers (I don't particularly like it, but it's not the same thing as paedophilia) and other such things. Facts like this tend to come out only after the big wave of panic has passed and whatever measures it demanded are already in place.

Policing this stuff has become a bit of an unhealthy obsession for some people.
There are other articles that go deeper into this, but I've just woken up and am still a little sleepy, so I'll leave that up to you.
 

zappaDPJ

Moderator
Joined
Aug 26, 2010
Messages
8,450
The Internet Watch Foundation once blocked the entirety of Wikipedia in the U.K. for three days because a famous album cover from the 50s depicted child nudity (the 50s were a very different time than today). It wasn't enough to just block that one page; they blocked the entire site until they were forced to back down on the decision.

The album cover in question was actually released in 1976, and it was quite shocking even then. That said, it's often the case that attempts at censorship backfire or worse, as they did in this instance.
 

MagicalAzareal

Magical Developer
Joined
Apr 25, 2019
Messages
758
Another thing to mention is that running filters is not free. You pay good money for Cloudflare Workers, and now you're going to have hash filters running in the background every time, *just in case* you are spreading contraband. The costs of this will ultimately be passed on to the customer, who is being treated like a criminal over something they have nothing to do with.
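
To make that concrete, here is a rough sketch of what an exact-hash image filter in a Worker might look like. This is purely illustrative: the BLOCKLIST KV namespace is made up, and real scanners use perceptual ("fuzzy") hashes along the lines of PhotoDNA rather than exact digests. But it shows the kind of per-request overhead I'm talking about:

```ts
// Hypothetical sketch of an exact-hash image filter in a Cloudflare Worker.
// BLOCKLIST is an assumed KV namespace keyed by SHA-256 hex digests; real
// scanners use perceptual (fuzzy) hashing, not exact matches like this.
export default {
  async fetch(request: Request, env: { BLOCKLIST: KVNamespace }): Promise<Response> {
    const upstream = await fetch(request);
    const type = upstream.headers.get("content-type") ?? "";
    if (!type.startsWith("image/")) return upstream; // only scan images

    // Buffer and hash the body: CPU and memory spent on every image served.
    const body = await upstream.arrayBuffer();
    const digest = await crypto.subtle.digest("SHA-256", body);
    const hex = [...new Uint8Array(digest)]
      .map((b) => b.toString(16).padStart(2, "0"))
      .join("");

    if (await env.BLOCKLIST.get(hex)) {
      // 451 Unavailable For Legal Reasons
      return new Response("Blocked", { status: 451 });
    }
    return new Response(body, upstream);
  },
};
```

Every image response gets buffered and hashed whether or not anything is ever found, and that's the cost that ends up on someone's bill.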

These filters don't even work against a determined criminal: they can just change a few bits (that is the whole point of a cryptographic hash: any change produces a completely different digest) and their contraband will make its way through. Cloudflare is a terrible choice for distributing contraband anyway, as things can easily be traced back to you, especially if you're paying for the service.
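
For anyone wondering why exact hashes are so easy to evade, here's a tiny hypothetical Node demo (the file name is a placeholder; any file works). Flip a single bit and the SHA-256 digest shares nothing with the original, which is exactly why real systems lean on perceptual hashes that tolerate small edits:

```ts
// Demo: a one-bit change yields a completely different cryptographic hash.
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

const sha256 = (buf: Buffer): string =>
  createHash("sha256").update(buf).digest("hex");

const original = readFileSync("image.jpg"); // placeholder: any file will do
const tweaked = Buffer.from(original);      // copy, so the original is untouched
tweaked[tweaked.length - 1] ^= 0x01;        // flip the lowest bit of the last byte

console.log("original:", sha256(original));
console.log("tweaked: ", sha256(tweaked));
// The two digests are unrelated, so an exact-match blocklist misses the copy.
```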

Amazon rightfully gave the NYTimes the middle finger in this regard by telling them they prioritise the privacy of their customers over such theoretical concerns. I hope they don't do a 180.
The album cover in question was actually released in 1976, and it was quite shocking even then. That said, it's often the case that attempts at censorship backfire or worse, as they did in this instance.
That is interesting. I'm not too familiar with things from back then, but that's true.
 

MagicalAzareal

Magical Developer
Joined
Apr 25, 2019
Messages
758
Now that I think about it, this is a great way to DoS a site.

Think about it.

Someone who hates you just has to upload illicit imagery; it works its way through Workers (for caching purposes) and the request gets blocked, right? How long is the page blocked for? How many "incidents" need to occur before Cloudflare decides you are a liability and cuts you off entirely? What if someone does this silently in PMs? What if you're running a large image site like imgur? Incidents are bound to occur at any scale.

You can do everything right and still end up in a world of pain.

Even if they don't do that, all it takes is the media screaming hard enough about them "not doing enough" again for them to start cracking down. Just some thoughts.

One good method might be to run a sting: run all the scanners, but secretly have the police gather up the financial paper trails (since the offenders have to pay for Workers), gain their trust, and put all these people away (so they don't just hop to the next place).
 

mysiteguy

Fanatic
Joined
Feb 20, 2007
Messages
3,619
The album cover in question was actually released in 1976, and it was quite shocking even then. That said, it's often the case that attempts at censorship backfire or worse, as they did in this instance.

^^^ Agreed. One of the most impactful photos from the Vietnam War was the Pulitzer Prize-winning "napalm girl" photo of the naked girl crying and running away from the napalm bombing behind her: the fear and agony in her eyes as this child ran with a burned back. It needed to be seen by the world, not hidden away.
 

MagicalAzareal

Magical Developer
Joined
Apr 25, 2019
Messages
758
^^^ Agreed. One of the most impactful photos from the Vietnam War was the Pulitzer Prize-winning "napalm girl" photo of the naked girl crying and running away from the napalm bombing behind her: the fear and agony in her eyes as this child ran with a burned back. It needed to be seen by the world, not hidden away.
That very image is getting censored in some places. I do agree that it is important to expose horrific war atrocities, although I certainly would not have the stomach for a lot of it.
 