Apple plans to scan your images for child porn
-
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
To me it's clear from the white paper that if you don't upload images to iCloud, then this doesn't work. However, since I'm not an iPhone user, I don't know if you have any control over whether or not the photos stored on your phone are uploaded to your iCloud account.
Basically, when images are uploaded to your iCloud account, by your doing or automatically, your phone first does this hashing magic with CSAM results and then packages it along with your photo that is stored in your iCloud account. At that point, the iCloud scanners simply read the results that are packaged with your photo. I think of it like the photo having an attached health certificate that the iCloud scanner can pick up.
The issue is that they are scanning without the upload, and based on that can be forced to report on you. The upload limitation is an option on their end that can't be enforced. So it's moot.
Another confirmation:
Noting that this is known to be false. The statement, which contradicts the official statement from Apple, is a lie because in order to do what they say, they have to scan everything just to know which data comes from those sources. And notice it is "set to be" uploaded, not uploaded or uploading. That is a pretty ambiguous phrase that amounts to "can scan anytime."
The second part is "only for certain files." You have to scan everything to find the things that you are looking for. It's physically impossible to search only files that are on a certain list, because you can't know whether a file is on that list until after you have already scanned it.
So whatever source you have for this was already stated as being false previously. That all files must be scanned has been said since the first moments of this discussion.
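To make that concrete, here is a minimal sketch of what hash-list matching looks like. This is an illustration only: Apple's actual system uses a perceptual "NeuralHash" and a private set intersection protocol, not plain SHA-256, and the directory, placeholder digest, and function names below are invented for the example. The point it shows is that every single file has to be read and hashed before the matcher can decide whether that file is "on the list":

```python
# Illustrative sketch only -- not Apple's implementation. SHA-256, the paths,
# and the placeholder digest below stand in for the real mechanism.
import hashlib
from pathlib import Path

# Opaque list of "known bad" hashes supplied by a third party. The scanner
# has no way of knowing what images these hashes actually represent.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # placeholder digest
}

def scan_library(photo_dir: str) -> list:
    """Return every photo whose hash appears on the supplied list."""
    matches = []
    for photo in Path(photo_dir).iterdir():
        if not photo.is_file():
            continue
        # Every file must be read and hashed; "is this file on the list?"
        # can only be answered after the hash has already been computed.
        digest = hashlib.sha256(photo.read_bytes()).hexdigest()
        if digest in KNOWN_HASHES:
            matches.append(photo)
    return matches

if __name__ == "__main__":
    print(scan_library("/tmp/photos"))  # hypothetical photo directory
```

Whatever restriction exists lives in the list contents and the upload policy, not in the scanning step itself, which necessarily touches every file.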
Whoever posted this clearly thought that people were pretty gullible and wouldn't think about this at all. That people are trying to defend Apple with such obvious lies makes it so much more obvious how bad it is and how bad they all know it is.
-
Any claim from Apple that says that they "won't" do something demanded by a government that they "can" do is a lie. Apple has no power to say no to that. It's not within their power, the government can seize the entire company if they want. Denying that kind of action is like saying you won't die if hit by a bullet, you promise.
Apple is basically resorting to just saying any crap hoping that their fanboy culture will protect them. But it isn't working. People are pointing out that it's obviously false and has no possibility of being true. Even Apple's own teams are speaking out now.
You can't rely on claims. If you want to say that there is some protection, you have to show that there is a technology that makes it impossible for Apple to do these things. If such a thing existed, Apple would be touting it all over the place. They are not, because there is not. Apple isn't claiming that they can't be forced to do these things, just that they will somehow refuse the government, which is a claim anyone can make but no one can make honestly.
-
Apple claims that they will additionally scan iCloud itself. Not just files as they are uploaded, but all files, including those already there. Not a big deal, as iCloud has never been private unless encrypted, obviously. But this is something most people are also missing. It's a lot of scanning.
-
https://www.washingtonpost.com/opinions/2021/08/13/apple-csam-child-safety-tool-hashing-privacy/
Great point in this article. By scanning, Apple is required by law to report. The only protection that Apple has is keeping the data private, even from themselves. So there is no logic to Apple opening themselves to prosecution for looking but not reporting, unless there is something far bigger afoot.
Keep in mind, the CSAM laws already exist: if Apple scans, they have to report, even if they only scan on your device. They aren't allowed to wait for the iCloud upload to "maybe" happen. This is existing law, and Apple is obviously aware of it and ignoring it in the hopes that people don't bring it up.
-
Let's break down a bit from Apple's FAQ:
"Can the CSAM detection system in iCloud Photos be used to detect
things other than CSAM?Our process is designed to prevent that from happening."
This is an absolute "yes". In no way do they say "no", because they can't. Just as "using a password" is designed to prevent the wrong people from accessing a system, we know that "the process is designed to..." is anything but "the process will only...".
" CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations."
So we know that private companies in the US can be forced to do anything by the government, and we know that non-profits tend heavily towards corruption, and we know that non-profits can trivially be hacked. So claiming that "likely totally corrupt, insecure, private companies with zero oversight or security or skill" will provide the list of data is, to me, tantamount to advertising that far more than just the government will be in a position to control what is being scanned for. Could it be any more obvious that there is a giant gaping security hole in the system as designed?
"There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities."
The reporting is not automatic, and yet by law it has to be. If Apple believes that something has been found, they must report. "Automatic" here might not mean that it is done by a computer system, but their process, by law, must be automatic.
-
More lies from the FAQ:
"Could governments force Apple to add non-CSAM images to the
hash list?Apple will refuse any such demands."
So Apple is claiming that they will break the law, just go rogue, and somehow not be coerced? This is the same as saying all agreements, contracts, and promises with them are void. If they are above the law, and it is the law that makes their agreements with the public valid, then Apple is announcing here that they are under no obligation to do anything that they have promised us. So all bets are off, and they've said so.
"Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. "
This is known to be computationally impossible and they are clearly insulting the intelligence of their users to make such an impossible and ridiculous claim. It's impossible to make technology that can only scan for one thing in this manner and they know it, and so does anyone who reads this.
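As a rough illustration of why "built solely to detect known CSAM" cannot be a property of the code itself (invented file names and helper functions, same caveats as the sketch earlier in the thread): the matcher only ever sees opaque digests, so whatever gets appended to the list gets flagged, with no code change at all.

```python
# Illustration only, with invented names: the matching code cannot know or
# verify what an entry on the hash list depicts. Whoever supplies the list
# decides what gets flagged.
import hashlib
from pathlib import Path

def load_hash_list(path: str) -> set:
    """Load an opaque list of hex digests; nothing here validates their origin."""
    return {line.strip() for line in Path(path).read_text().splitlines() if line.strip()}

def is_flagged(image_path: str, hash_list: set) -> bool:
    """Flag any image whose digest is on the list, whatever the image actually shows."""
    digest = hashlib.sha256(Path(image_path).read_bytes()).hexdigest()
    return digest in hash_list

# Appending one more digest to the (hypothetical) hash list file -- a protest
# poster, a meme, a leaked document -- makes this identical code flag that
# content too. Nothing in the scanner constrains the list to CSAM.
```

The "solely CSAM" guarantee lives entirely in the list, which Apple says comes from third parties it does not audit.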
"We have faced demands to build and deploy government-man-dated changes that degrade the privacy of users before, and have steadfastly refused those
demands. We will continue to refuse them in the future."Sure, this is likely true. But have their also caved to demands? We will never know. This isn't something they, or we, can prove. We just have to take the words of someone bold faced lying to us says.
"Let us be clear, this technology is limit-ed to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. "
So our only protection is that liars might this one time be telling the truth?
-
And this doesn't even begin to cover the fact that Apple has now officially created a mechanism by which they can simply claim that a number of false positives existed and manually access iCloud accounts to poke around. Of course, since iCloud is not encrypted, they've always had the ability to do this, or to be coerced into doing it. But there has always been the threat of discovery, which kept them from doing so casually in any way that could ever be discovered and exposed.
But now Apple can openly peer into, and share with the government, anything that they want, and make the unprovable claim that a threshold was detected, the account was made accessible, and they were looking for CSAM images as part of their process and "just happened to find" data exposing a government official's misconduct, or whatever.
Very handy how the controls, both technical and social, are being dismantled here.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
More lies from the FAQ:
"Could governments force Apple to add non-CSAM images to the
hash list?Apple will refuse any such demands."
So Apple is claiming that they will break the law, just go rogue, and somehow not be coerced? This is the same as saying all agreements, contracts, and promises with them are void. If they are above the law, and it is the law that makes their agreements with the public valid, then Apple is announcing here that they are under no obligation to do anything that they have promised us. So all bets are off, and they've said so.
"Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. "
This is known to be computationally impossible and they are clearly insulting the intelligence of their users to make such an impossible and ridiculous claim. It's impossible to make technology that can only scan for one thing in this manner and they know it, and so does anyone who reads this.
"We have faced demands to build and deploy government-man-dated changes that degrade the privacy of users before, and have steadfastly refused those
demands. We will continue to refuse them in the future."Sure, this is likely true. But have their also caved to demands? We will never know. This isn't something they, or we, can prove. We just have to take the words of someone bold faced lying to us says.
"Let us be clear, this technology is limit-ed to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. "
So our only protection is that liars might this one time be telling the truth?
TL;DR: The government has some dirt on us, and in order to not let that cat out of the bag, we're selling you bitches out.
Thanks for all the fish, bye.
-
https://appleinsider.com/articles/21/08/17/germany-writes-to-tim-cook-to-reconsider-csam-plans
The quotes from the German government are priceless.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
https://appleinsider.com/articles/21/08/17/germany-writes-to-tim-cook-to-reconsider-csam-plans
The quotes from the German government are priceless.
That's actually a really good response from what appears to be a competent government.
-
The man from the cover of Nevermind is suing to have the album classified as child pornography. So, does owning this album count toward your Apple flag threshold? Major problems with this kind of scanning are already coming to light.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
The man from the cover of Nevermind is suing to have the album classified as child pornography. So, does owning this album count toward your Apple flag threshold? Major problems with this kind of scanning are already coming to light.
That sounds like scamming, but it's interesting to see that the BBC cropped the image from the Nevermind album in the article, meaning they chose not to show the entire image from the album cover.
However, he also said that there was no model release signed. And that is a whole different matter, because it means the record company didn't have the right to use the image on the album cover.
-
@pete-s said in Apple plans to scan your images for child porn:
However, he also said that there was no model release signed.
He claims. Not his parents. He has been making a fuss about it for years and nothing has ever come of it.
-
@jaredbusch said in Apple plans to scan your images for child porn:
@pete-s said in Apple plans to scan your images for child porn:
However, he also said that there was no model release signed.
He claims. Not his parents. He has been making a fuss about it for years and nothing has ever come of it.
True. But as the issue is over pornography, no release can be valid even if signed. But yes, he claims. If there was one, you'd think that there would be public pushback as that's a big deal given the circumstances.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@jaredbusch said in Apple plans to scan your images for child porn:
@pete-s said in Apple plans to scan your images for child porn:
However, he also said that there was no model release signed.
He claims. Not his parents. He has been making a fuss about it for years and nothing has ever come of it.
True. But as the issue is over pornography, no release can be valid even if signed.
True. But pornography is not equal to nudity. No matter what the Puritanical fucking United States tries to say.
-
@jaredbusch said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@jaredbusch said in Apple plans to scan your images for child porn:
@pete-s said in Apple plans to scan your images for child porn:
However, he also said that there was no model release signed.
He claims. Not his parents. He has been making a fuss about it for years and nothing has ever come of it.
True. But as the issue is over pornography, no release can be valid even if signed.
True. But pornography is not equal to nudity. No matter what the Puritanical fucking United States tries to say.
But it's puritanical non-profits of unknown provenance and no listed oversight who get to determine what is used by Apple. Apple itself has said it will apply no oversight and just blindly accept what some unlisted non-profits give it.
So considering that there is no standard even suggested to be applied by Apple, a simple claim like this could easily be enough to meet their requirements, given what Apple has publicly stated. It might not be, as well. We have no means of knowing, as the requirements are not Apple's but a third party's, and any oversight of those is unlisted and unknown.
-
@pete-s said in Apple plans to scan your images for child porn:
That sounds like scamming,
You are not the only one that thinks that.
he was happy to dine out on having been Nirvana Baby until about 10 seconds ago. In 2016, in a 25th anniversary recreation of the photo shoot, he even volunteered to do it naked again, before thinking better of the idea and posing in swim trunks.
-
@jaredbusch said in Apple plans to scan your images for child porn:
@pete-s said in Apple plans to scan your images for child porn:
That sounds like scamming,
You are not the only one that thinks that.
he was happy to dine out on having been Nirvana Baby until about 10 seconds ago. In 2016, in a 25th anniversary recreation of the photo shoot, he even volunteered to do it naked again, before thinking better of the idea and posing in swim trunks.
Certainly that's a popular, and potentially valid, opinion. And, haha, "poster child".
But it doesn't change the fact that a child was photographed naked and made into a commercial enterprise. I don't see any world where that should be okay. That someone feels his parents should have been allowed to sign a release for that is, I think, a huge problem. That's no different from how parents here look the other way in child marriages, where girls far too young to have any decision-making ability are given away to men in their 30s and 40s. Illegal... unless the parents agree.
The "unless the parents agree" part sounds not so bad when you grow up in wealthy western Europe, the USA, or similar, and live at average or above-average income levels. But in many cultures, and in many places with poverty, parents having their children's interests at heart is often not the case. They don't want to hurt them, but there is a price on everything about them. That's why parents routinely sell their children to slavers.
So no matter what, in my opinion, what was done was wrong. No amount of him being okay with it now matters. No amount of his parents agreeing matters. It was porn, it was wrong, it will always be wrong.
It's like a priest molesting a child and then the child repressing it or coming to terms with having been violated when they had no control. Even worse, the parents were involved! That kind of psychological trauma can never be overlooked. You have a lifetime of having to "deal with" the fact that it happened. You can't also be forced to be traumatized every day just to prove that it was bad "for you".
In this case, he's a victim. That's real. Calling it victim culture is victim shaming. That's also real. There's no way to say he was "okay with it" in the past. It is who he is, he HAS to live with it. It's like saying an amputee's pain and anguish isn't valid because they come to terms with their injury by laughing about it or being happy. The person who chopped off their arm shouldn't be excused from responsibility just because the victim isn't suffering even more than necessary. He can't go back and make it not happen. He is forced to make do with the situation that he is in.
Is it the worst thing that has ever happened? Obviously not. Has it crippled his life? We have no way to know. Was it sick, pointless, and wrong? Yes, yes it was, and I see no grey area on this. I think that using child nudity for commercial gain is a serious problem, and that the fame and popularity of an artist should not be grounds for looking the other way or for attempting to shift blame onto someone not given a choice in the matter.
-
And the pile-on officially begins.
https://9to5mac.com/2021/09/09/csam-scan-encrypted-messages/