Apple plans to scan your images for child porn
-
@stacksofplates said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
Ah, that's very different from what he said, completely.
It's not? He said this:
The scan results are private (until uploaded to iCloud).
Which would be true if uploading photos to iCloud is disabled and they aren't forcing you to back up photos with iCloud. Those mean the same thing.
Uploading photos to iCloud would be a different operation from uploading the scan results (which are not photos) to iCloud. Photo uploads are controlled by the end user; the scan uploads are not (they have no setting in the OS).
It is the uploading of the scan results to iCloud that is the issue at hand. It's not a step along the way; it is the very problem. So "until" doesn't apply, since that is the end result we are concerned about.
-
And I think that we all automatically agree that the entire process logically only works for its claimed intent if it does not require the iCloud piece, so logically it does not (and Apple has made no such claim, only Carnival has.) If someone was doing something truly awful on their phone and was able to block being scanned (or the utility of such a scan) by disabling iCloud, then they would just do so. That's trivial. Almost to the point of "opting in" to being scanned.
Obviously that would totally defeat the claimed purpose of the tool and if there was even a suggestion that that might be true then Apple fanbois would be all over pushing that framing of the situation. But it's totally nonsensical as it would completely undermine both the claimed purpose and the assumed purpose here (of enabling governments to keep tabs on political dissidents and discourage journalism.) Even if the government never actively uses the tool, it serves to create fear in those that might oppose the status quo - no matter what, that component is already serving its purpose by scaring people.
So from "what Apple says", from "what the lawyers have said", and from the technical "what has to be true for any of it to make sense", it all lines up: it has to be "scan on device", not "voluntarily uploaded".
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@stacksofplates said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
Ah, that's very different from what he said, completely.
It's not? He said this:
The scan results are private (until uploaded to iCloud).
Which would be true if uploading photos to iCloud is disabled and they aren't forcing you to back up photos with iCloud. Those mean the same thing.
Uploading photos to iCloud would be a different operation from uploading the scan results (which are not photos) to iCloud. Photo uploads are controlled by the end user; the scan uploads are not (they have no setting in the OS).
It is the uploading of the scan results to iCloud that is the issue at hand. It's not a step along the way; it is the very problem. So "until" doesn't apply, since that is the end result we are concerned about.
Where do you see that? You're making assumptions. The scan results would have to include the photo. That doesn't make any sense. What is the human verification for if the photo isn't uploaded?
Again, these are all assumptions on your part about how this works. No one here knows how it works currently, so telling them they're wrong is infantile, because you can't prove you're right.
The whole thing is bad, but don't get into arguments about things when you can't possibly understand how they work yet.
-
@stacksofplates said in Apple plans to scan your images for child porn:
The scan results would have to include the photo.
Actually no, the on-device scan creates a hash record (MD5 or SHA256, probably), which is then compared against a known database of CSAM hashes.
Anything that matches would start sending up red flags.
The actual photo may never get uploaded to iCloud.
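Roughly, the kind of lookup being described would look like the sketch below. This assumes a plain cryptographic digest and a made-up hash list; as later posts point out, Apple's actual system uses a perceptual hash, not a checksum.
```python
import hashlib

# Hypothetical known-hash database; the real list comes from child-safety
# organizations and ships to devices only in a blinded, unreadable form.
KNOWN_CSAM_HASHES = {
    "0" * 64,  # placeholder hex digest
}

def flag_if_known(image_bytes: bytes) -> bool:
    """Hash the image bytes and check for an exact match in the database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_CSAM_HASHES
```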
-
@stacksofplates said in Apple plans to scan your images for child porn:
What is the human verification for if the photo isn't uploaded?
The human verification happens only once an account has passed a threshold of known CSAM hash records being discovered on an individual Apple device.
Once that threshold is hit, someone at Apple has to check and confirm that the content is CSAM (subject to the reviewer's judgment and training), and then, if it is, they lock your account and notify the authorities.
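In other words, the gating logic amounts to something like this sketch; the threshold value here is hypothetical, since Apple hasn't published the real number:
```python
MATCH_THRESHOLD = 30  # hypothetical; Apple has not published the real value

def handle_account(match_count: int) -> str:
    """Nothing is reviewable below the threshold; above it, a human
    confirms the matches before any account action or report."""
    if match_count < MATCH_THRESHOLD:
        return "no action"
    return "human review, then account lock and report if confirmed"
```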
-
@stacksofplates said in Apple plans to scan your images for child porn:
The whole thing is bad, but don't get into arguments about things when you can't possibly understand how they work yet.
My responses to you (granted, you're talking to @scottalanmiller) are based on what Apple posted in the announcement itself, taken directly from their own material.
We can make a well-educated guess at how this will work, even though it hasn't been deployed yet.
-
@dustinb3403 said in Apple plans to scan your images for child porn:
@stacksofplates said in Apple plans to scan your images for child porn:
The scan results would have to include the photo.
Actually no, the on-device scan creates a hash record (MD5 or SHA256, probably), which is then compared against a known database of CSAM hashes.
Anything that matches would start sending up red flags.
The actual photo may never get uploaded to iCloud.
That's a joke, right? You didn't read the article. They're using a neural network to compare an image to a database of checksummed images, presumably by features like faces, EXIF data, etc. Then a human verifies it's a match to content in the existing checksummed image.
A 4-year-old could get around comparing two images by checksum, so that's clearly not what's happening here. Just change a single pixel and the checksum is different. You don't need a neural net to compare checksums.
By the explanation in the article, they have to have the photo to compare.
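The single-pixel point is easy to demonstrate; here's a small self-contained example using a toy "image" of raw pixel bytes:
```python
import hashlib

# A toy "image": 64 identical grayscale pixels.
original = bytes([120] * 64)

# Change a single pixel by the smallest possible amount.
tweaked = bytearray(original)
tweaked[0] ^= 1

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(tweaked)).hexdigest())
# The two digests are completely unrelated, which is why a plain checksum
# can't be what the matching is built on.
```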
-
@dustinb3403 said in Apple plans to scan your images for child porn:
@stacksofplates said in Apple plans to scan your images for child porn:
What is the human verification for if the photo isn't uploaded?
The human verification happens only once an account has passed a threshold of known CSAM hash records being discovered on an individual Apple device.
Once that threshold is hit, someone at Apple has to check and confirm that the content is CSAM (subject to the reviewer's judgment and training), and then, if it is, they lock your account and notify the authorities.
So to get around the checksum method you're describing, you just crop the picture a tiny bit, and it would never catch any new photos that aren't part of that database. Again, you hardly need a neural net for that. You could do that on a Raspberry Pi.
-
@stacksofplates said in Apple plans to scan your images for child porn:
@dustinb3403 said in Apple plans to scan your images for child porn:
@stacksofplates said in Apple plans to scan your images for child porn:
The scan results would have to include the photo.
Actually no, the on-device scan creates a hash record (MD5 or SHA256, probably), which is then compared against a known database of CSAM hashes.
Anything that matches would start sending up red flags.
The actual photo may never get uploaded to iCloud.
That's a joke, right? You didn't read the article. They're using a neural network to compare an image to a database of checksummed images, presumably by features like faces, EXIF data, etc. Then a human verifies it's a match to content in the existing checksummed image.
A 4-year-old could get around comparing two images by checksum, so that's clearly not what's happening here. Just change a single pixel and the checksum is different. You don't need a neural net to compare checksums.
By the explanation in the article, they have to have the photo to compare.
Wrong, the on-device code is creating a hash, and that hash record is what gets compared. Read the announcement again from Apple.
The machine learning comparison doesn't come in until the image is in iCloud. That's where the comparison happens, and then, if a threshold is hit, a human compares the images/hashes.
-
@stacksofplates said in Apple plans to scan your images for child porn:
@dustinb3403 said in Apple plans to scan your images for child porn:
@stacksofplates said in Apple plans to scan your images for child porn:
What is the human verification for if the photo isn't uploaded?
The human verification happens only once an account has passed a threshold of known CSAM hash records being discovered on an individual Apple device.
Once that threshold is hit, someone at Apple has to check and confirm that the content is CSAM (subject to the reviewer's judgment and training), and then, if it is, they lock your account and notify the authorities.
So to get around the checksum method you're describing, you just crop the picture a tiny bit, and it would never catch any new photos that aren't part of that database. Again, you hardly need a neural net for that. You could do that on a Raspberry Pi.
Exactly, and pedophiles can easily do this, so this is just a backdoor to eavesdrop on Apple users.
-
@stacksofplates said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@stacksofplates said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
Ah, that's very different from what he said, completely.
It's not? He said this:
The scan results are private (until uploaded to iCloud).
Which would be true if uploading photos to iCloud is disabled and they aren't forcing you to back up photos with iCloud. Those mean the same thing.
Uploading photos to iCloud would be a different operation from uploading the scan results (which are not photos) to iCloud. Photo uploads are controlled by the end user; the scan uploads are not (they have no setting in the OS).
It is the uploading of the scan results to iCloud that is the issue at hand. It's not a step along the way; it is the very problem. So "until" doesn't apply, since that is the end result we are concerned about.
Where do you see that? You're making assumptions. The scan results would have to include the photo. That doesn't make any sense. What is the human verification for if the photo isn't uploaded?
Again, these are all assumptions on your part about how this works. No one here knows how it works currently, so telling them they're wrong is infantile, because you can't prove you're right.
The whole thing is bad, but don't get into arguments about things when you can't possibly understand how they work yet.
You are talking about a later step. I've not even addressed Apple employees getting access to your files. That's yet another problem. I'm only dealing with the issues prior to that point.
-
@dustinb3403 said in Apple plans to scan your images for child porn:
@stacksofplates said in Apple plans to scan your images for child porn:
@dustinb3403 said in Apple plans to scan your images for child porn:
@stacksofplates said in Apple plans to scan your images for child porn:
The scan results would have to include the photo.
Actually no, the on-device scan creates a hash record (MD5 or SHA256, probably), which is then compared against a known database of CSAM hashes.
Anything that matches would start sending up red flags.
The actual photo may never get uploaded to iCloud.
That's a joke, right? You didn't read the article. They're using a neural network to compare an image to a database of checksummed images, presumably by features like faces, EXIF data, etc. Then a human verifies it's a match to content in the existing checksummed image.
A 4-year-old could get around comparing two images by checksum, so that's clearly not what's happening here. Just change a single pixel and the checksum is different. You don't need a neural net to compare checksums.
By the explanation in the article, they have to have the photo to compare.
Wrong, the on-device code is creating a hash, and that hash record is what gets compared. Read the announcement again from Apple.
The machine learning comparison doesn't come in until the image is in iCloud. That's where the comparison happens, and then, if a threshold is hit, a human compares the images/hashes.
The AI is running on device. Not sure where you read that it's not. It's the same on-device AI they're using for the sexually explicit content checks in iMessage.
-
@dustinb3403 said in Apple plans to scan your images for child porn:
Wrong, the on-device code is creating a hash, and that hash record is what gets compared. Read the announcement again from Apple.
The machine learning comparison doesn't come in until the image is in iCloud. That's where the comparison happens, and then, if a threshold is hit, a human compares the images/hashes.
The official statement doesn't even mention AI/neural in any way. Here's what's in their technical paper:
NeuralHash
NeuralHash is a perceptual hashing function that maps images to numbers. Perceptual hashing bases this number on features of the image instead of the precise values of pixels in the image. The system computes these hashes by using an embedding network to produce image descriptors and then converting those descriptors to integers using a Hyperplane LSH (Locality Sensitivity Hashing) process. This process ensures that different images produce different hashes.
Before an image is stored in iCloud Photos, the following on-device matching process is performed for that image against the blinded hash table database. The device computes the image NeuralHash and looks up the entry in the blinded hash table at the position pointed by the NeuralHash. The device uses the computed NeuralHash to compute a cryptographic header. It also uses the blinded hash that the system looked up to obtain a derived encryption key. This encryption key is then used to encrypt the associated payload data.
The AI is running on the phone and doing image verification based on features, not just a checksum.
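To make the quoted pipeline concrete: the paper describes an embedding network followed by Hyperplane LSH. Below is a toy sketch of just the LSH step, assuming the embedding (descriptor) is already computed; the real NeuralHash network and its parameters are not public.
```python
import random

def hyperplane_lsh(descriptor: list, n_bits: int = 16, seed: int = 42) -> int:
    """Toy Hyperplane LSH: project the descriptor onto random hyperplanes
    and pack the sign bits into one integer. Similar descriptors land on
    the same side of most hyperplanes, so they tend to get the same hash."""
    rng = random.Random(seed)
    bits = 0
    for _ in range(n_bits):
        plane = [rng.gauss(0.0, 1.0) for _ in descriptor]
        side = sum(p * d for p, d in zip(plane, descriptor)) >= 0.0
        bits = (bits << 1) | int(side)
    return bits

# Two slightly different descriptors usually map to the same integer,
# unlike a cryptographic checksum of the underlying pixels:
print(hyperplane_lsh([0.90, -0.20, 0.50, 0.10]))
print(hyperplane_lsh([0.88, -0.22, 0.49, 0.12]))
```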
-
Also, if you look at their diagram in their white paper, the photo is part of the safety voucher, which is what is uploaded to iCloud.
So this is what I was getting at earlier.
This voucher is uploaded to iCloud Photos along with the image.
Is that separate from iCloud backup, or is the voucher sent along with the image when it's backed up? By their process description the photo has to be sent as well, because they can't verify otherwise.
This is why it's not straightforward and why I think @Carnival-Boy was making those statements.
-
To me it's clear from the white paper that if you don't upload images to iCloud, then this doesn't work. However, since I'm not an iPhone user, I don't know if you have any control over whether or not the photos stored on your phone are uploaded to your iCloud account.
Basically, when images are uploaded to your iCloud account, whether by your doing or automatically, your phone first does this hashing magic to produce the CSAM scan results and then packages them along with the photo that is stored in your iCloud account. At that point, the iCloud scanners simply read the results that are packaged with your photo. I think of it like the photo having an attached health certificate that the iCloud scanner can pick up.
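As a sketch of that "health certificate" idea (every name and format below is an illustrative stand-in; the real voucher structure and crypto are Apple-internal):
```python
import hashlib
import os

def fake_neuralhash(image_bytes: bytes) -> str:
    # Stand-in for the on-device perceptual hash; not the real algorithm.
    return hashlib.sha256(image_bytes).hexdigest()[:16]

def package_for_upload(image_bytes: bytes) -> dict:
    """Bundle the photo with a 'safety voucher' carrying the on-device
    scan result, so the voucher rides along with the iCloud upload."""
    voucher = {
        "neural_hash": fake_neuralhash(image_bytes),
        "encrypted_payload": os.urandom(32).hex(),  # placeholder ciphertext
    }
    return {"photo": image_bytes, "safety_voucher": voucher}

upload = package_for_upload(b"example image bytes")
```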
-
Looks like Apple has gone fully off the rails. Full-on spying on their workers at home...
-
@obsolesce said in Apple plans to scan your images for child porn:
To me it's clear from the white paper that if you don't upload images to iCloud, then this doesn't work. However, since I'm not an iPhone user, I don't know if you have any control over whether or not the photos stored on your phone are uploaded to your iCloud account.
Basically, when images are uploaded to your iCloud account, whether by your doing or automatically, your phone first does this hashing magic to produce the CSAM scan results and then packages them along with the photo that is stored in your iCloud account. At that point, the iCloud scanners simply read the results that are packaged with your photo. I think of it like the photo having an attached health certificate that the iCloud scanner can pick up.
The issue is that they are scanning without the upload and, based on that, can be forced to report on you. The upload limitation is an option on their end that can't be enforced. So it's moot.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
Looks like Apple has gone fully off the rails. Full-on spying on their workers at home...
Not surprised - this is the exact mentality that some had around here... If I can't see you working/not working, then I assume you're not working. /sigh.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
The issue is that they are scanning without the upload, and can based on that be forced to report on you.
It seems like this whole thing is only put into action via a trigger, which is the uploading of the image to your iCloud account. Before then, there's no way for any reporting. The reporting is done via iCloud servers based on the image voucher that's with your image in iCloud.
This whole thing is all about the iCloud Photos account accumulating enough CSAM matching vouchers. Without your photos being uploaded to your iCloud Photos account, this whole thing is moot.
For reference: https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf