Unfortunately, companies like Google turning in completely ethical people for doing things that should be completely legal and uncontroversial will do drastically more damage than any good that comes from those same companies turning in the poorest of the pedophiles.
deleted by creator
Example please
The company is literally building death camps, installing statues of genociders, is run by the RICH pedophiles (who have ZERO interest in seeing pedophiles prosecuted), and is using Palantir and Flock cameras to monitor everything, meanwhile having secret police disappear people and just openly slaughter them.
The United States Government is well beyond deserving the benefit of the doubt.
Great, do you have a single example of what you’re claiming? lol. Google turning in a perfectly ethical person for doing something that should be legal and uncontroversial.
You’re moving the goal posts and changing your argument.
Try reading the thread
https://www.cbc.ca/lite/story/9.7115031
This was posted 9 hours before your whinge
deleted by creator
I am so confused. Did you read the article that you posted??? Are you just straight up defending pedophilia and rape?

The Toronto detective alleges that after the alerts were passed to the RCMP and then Toronto police, she looked at three of the images and found they depicted naked prepubescent girls. The images included an explicit sex act and exposed genitals.

“…depicted who I believe to be David Edward-Ooi Poon without a shirt, taking a selfie of himself while sticking out his tongue over an unconscious adult female,” the search-warrant application states. The document goes on to describe the woman in the photo as naked below the waist and wearing a dark-coloured eye mask over her eyes.

The detective alleges that that photograph and others she examined appeared to be stored in a folder on the iPhone titled “Girls I Drugged And Raped.”

The images included adult females with breasts and genitals exposed “who appeared to be unconscious,” the ITO says. “The body positioning of the females appeared to be limp and did not significantly change throughout the images taken.”

Police allege they found other files on the iPhone that appeared to be “upskirt” images or photographs focusing on the buttocks of females, in folders with names suggesting they were underage girls.

Detectives laid 41 more charges in December, including making and possessing child pornography, sexual assault, voyeurism for a sexual purpose and drugging someone to facilitate sexual assault.

Either you can’t read, or you are an incredibly disgusting person.
Nah, I copied the wrong link. There’s one about a Swedish dude, but go ahead. Did you not notice you were reading the same article the thread is about, or did you skip reading it the first time?
Lol yes I did notice.
“The wrong link”
“There’s one about a Swedish dude”
The gymnastics you’re going through to avoid actual facts are hilarious.
so you skipped reading it both times, huh
Here
Lol
“Do your own research”
Ok Karen, sure. It’s up to me to prove other people’s random claims that they make on social media. Um, no.
Yes, but also it’s the 4th result on that page.
Seems like reading 4 entries is a problem, so here is a direct link
Yea I have zero issue with the fact that accounts with pictures of children’s genitals on them should be referred to the authorities.
If people want privacy, host the pictures locally.
When you’re storing images with a cloud provider, they become responsible for the images that they store. If it’s a photo of a child’s genitals, it’s illegal for them to have those images on their servers, and they need to protect themselves.
Ah, this is probably my fault.
I’m not the person you were replying to, so I wasn’t really arguing any of these points; I just saw the request and knew of an example, so I provided it.
Just in case this was for me specifically, I’ll answer:

Yea I have zero issue with the fact that accounts with pictures of children’s genitals on them should be referred to the authorities.
Pictures of children’s genitals aren’t inherently CSAM; there are plenty of parents and family members with entirely innocent pictures of their kids on their phones.
There are reported cases of exactly this kind of false positive leading to bad outcomes (for example, the widely covered 2022 case of a father whose Google account was terminated, with police notified, after he photographed his son’s groin infection for a doctor); this is easily searchable.
I’m not saying to do nothing; I’m saying blanket reporting is an ineffective brute-force approach.
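To put rough numbers on “ineffective brute force”: here’s a back-of-the-envelope sketch where every figure is a made-up assumption purely for illustration, showing how low prevalence turns even a very accurate scanner into mostly false alarms.

```python
# Back-of-the-envelope base-rate check. All numbers are illustrative
# assumptions, not measurements from any real scanner.
uploads_per_day = 1_000_000_000   # photos scanned daily (assumed)
prevalence = 1e-7                 # fraction that is actually CSAM (assumed)
true_positive_rate = 0.99         # scanner catches 99% of real CSAM (assumed)
false_positive_rate = 1e-5        # 0.001% of innocent photos flagged (assumed)

actual_csam = uploads_per_day * prevalence
innocent = uploads_per_day - actual_csam

true_hits = actual_csam * true_positive_rate
false_hits = innocent * false_positive_rate

precision = true_hits / (true_hits + false_hits)
print(f"flags per day: {true_hits + false_hits:,.0f}")
print(f"fraction of flags that are real: {precision:.1%}")
# With these assumptions: ~10,099 flags/day, and only ~1% are real,
# so ~99% of the accounts referred for review are innocent.
```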
If people want privacy, host the pictures locally.

In theory yes; in practice, not so much.
On-device scanning exists and is (or has been) in use on phones; examples of this are also easily searchable. Apple announced on-device CSAM hash matching for iCloud uploads in 2021 before shelving it, and on-device nudity detection ships today in Messages’ Communication Safety feature.
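For a feel of how that hash matching works, here’s a toy sketch (assumes Pillow is installed). Real systems like PhotoDNA and Apple’s NeuralHash use proprietary perceptual hashes; this uses a simple “average hash” just to show the general shape: hash the image, compare bit-distance against an opaque blocklist of known material. Note this only catches known images; novel family photos are reportedly flagged by separate ML classifiers, which is where false positives like the case above come from.

```python
# Toy illustration of hash-based image matching (NOT any vendor's
# actual algorithm). The blocklist value and threshold are made up.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to 8x8 grayscale, then set one bit per pixel that is
    brighter than the mean. Similar images -> similar bit patterns."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist of hashes of known illegal images, shipped to
# the scanner as opaque numbers (the scanner never sees the images).
BLOCKLIST = {0x8F3C_0A91_44D2_107E}   # made-up value
THRESHOLD = 5                         # max differing bits to count as a match

def scan(path: str) -> bool:
    h = average_hash(path)
    return any(hamming(h, known) <= THRESHOLD for known in BLOCKLIST)
```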
When you’re storing images with a cloud provider, they become responsible for the images that they store. If it’s a photo of a child’s genitals, it’s illegal for them to have those images on their servers, and they need to protect themselves.

The need for legal protection is valid; scanning cloud-uploaded photos is a user-privacy nightmare, but an expected one.
End-to-end encryption (where only the user’s device can decrypt and see the photos) would probably stand up legally, but then they wouldn’t be able to use the cloud photos to make money.
The problem comes with how “illegal” gets recognized and with the way matches are handled.
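For anyone unfamiliar, “end-to-end” here just means encrypting on the device before upload, so the provider stores ciphertext it can neither scan nor monetize. A minimal sketch, assuming the cryptography package; upload_blob() is a hypothetical stand-in for whatever blob-storage API a real client would use.

```python
# Minimal client-side encryption sketch: the key is generated and kept
# on the device, so the cloud provider only ever sees ciphertext.
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()   # stays on the device, never uploaded
f = Fernet(key)

with open("photo.jpg", "rb") as fh:
    ciphertext = f.encrypt(fh.read())

# upload_blob("photos/photo.jpg.enc", ciphertext)  # hypothetical API;
# the provider stores opaque bytes it cannot scan or mine for ads.

# Later, on any device holding the key:
plaintext = f.decrypt(ciphertext)
```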