    Apple plans to scan your images for child porn

• scottalanmiller @marcinozga

      @marcinozga said in Apple plans to scan your images for child porn:

      @carnival-boy said in Apple plans to scan your images for child porn:

      @scottalanmiller said in Apple plans to scan your images for child porn:

      @carnival-boy said in Apple plans to scan your images for child porn:

      But the software doesn't have the ability. We're going round in circles.

But it does, Apple themselves said that it does. It scans the whole device looking for whatever third-party non-profits (and the government) tell it to search for. They could not possibly be more up front and clear about that. They aren't hiding this. Your claims don't seem to be that Apple won't do something bad, but that Apple is lying to make itself look bad. Why are you taking the stance that Apple is a good company, but lying? It's a very weird position to take without any reason to do so.

      You are saying that the government could force Apple to provide them with data held on my phone. Apple can't do this, they don't have access to the data that this software gets and holds privately on my phone. The scan results are private (until uploaded to iCloud). Apple simply don't have the means to access the scan results.

      That's my last post on this, I can't discuss with someone who just calls me weird.

How do you know they don't? Because they said so? Lol. Apple explicitly stated that this software will upload results to iCloud, so there you have it. The conditions that trigger the upload are irrelevant at this point; the fact that it can upload anything is what matters. Scott explained perfectly above that a single warrant will force them to fork any data over.

      Right, and you said it there... THIS software initiates the upload to iCloud! That's like saying a mugger will never shoot you until they've pulled the trigger. Um, sure, you are just saying the same thing twice. If this software initiates the upload to iCloud, then of course they "never get your data until it connects", it's that very connection that we are discussing! Carnival might as well have said "but it never steals your data until it steals it!"

      Um.. duh.

• stacksofplates @scottalanmiller

@scottalanmiller said in Apple plans to scan your images for child porn:

Right, and you said it there... THIS software initiates the upload to iCloud! That's like saying a mugger will never shoot you until they've pulled the trigger. Um, sure, you are just saying the same thing twice. If this software initiates the upload to iCloud, then of course they "never get your data until it connects", it's that very connection that we are discussing! Carnival might as well have said "but it never steals your data until it steals it!"

Um.. duh.

I don't want to get in the middle of this at all, but what I think @Carnival-Boy is saying is that if you don't use iCloud to back up photos, the results won't be uploaded to iCloud. They would have to force you to use iCloud for it to work that way, so if they are, then that answers the question. If you have the option to not use iCloud, then it would never be sent there.

        I think this is bad overall, but I believe that's his point.

• scottalanmiller @stacksofplates

@stacksofplates said in Apple plans to scan your images for child porn:

I don't want to get in the middle of this at all, but what I think @Carnival-Boy is saying is that if you don't use iCloud to back up photos, the results won't be uploaded to iCloud. They would have to force you to use iCloud for it to work that way, so if they are, then that answers the question. If you have the option to not use iCloud, then it would never be sent there.

I think this is bad overall, but I believe that's his point.

Ah, that's very different from what he said, completely. And it's not at all what Apple has said (from anything that I've seen). They claim that they don't report you until stuff goes to iCloud, but that's very different from not having scanned or not having uploaded the data. And the upload to iCloud, like many aspects of iOS, is automated.

People have pointed out that WhatsApp will automatically download images sent to you and place them on iCloud, so the entire process could happen via a third party, end to end, for a person who no longer has the phone (or could even be dead), and trigger the whole thing without anyone even having access to the phone (though the phone would have to be powered on and unencrypted). I'm sure that there is a way to stop that, but at least with default settings everything is automatic, and since the scanning uses iCloud, it triggers the requirement simply by existing.

• scottalanmiller

            It's worth pointing out that iCloud cannot be disabled in iOS. You can disable it for individual apps or features, but not entirely. And, of course, as the scanning "app" is part of the OS, it always has iCloud access no matter what setting you choose.

• stacksofplates @scottalanmiller

              @scottalanmiller said in Apple plans to scan your images for child porn:

              Ah, that's very different from what he said, completely.

              It's not? He said this:

              The scan results are private (until uploaded to iCloud).

Which would be true if uploading photos to iCloud is disabled and they aren't forcing you to back up photos with iCloud. Those mean the same thing.

• scottalanmiller @stacksofplates

@stacksofplates said in Apple plans to scan your images for child porn:

Which would be true if uploading photos to iCloud is disabled and they aren't forcing you to back up photos with iCloud. Those mean the same thing.

Uploading photos to iCloud is a different operation than uploading the scan results (which are not photos) to iCloud. Photo uploads are controlled by the end user; the scan uploads are not (as they have no setting in the OS).

It is the uploading of the scan results to iCloud that is the issue at hand. It's not a step along the way; it is the very problem. So "until" doesn't apply, since that is the end result we are concerned about.

• scottalanmiller

And I think that we can all agree that the entire process logically only works for its claimed intent if it does not require the iCloud piece, so logically it does not (and Apple has made no such claim; only Carnival has). If someone were doing something truly awful on their phone and were able to block being scanned (or the utility of such a scan) by disabling iCloud, then they would just do so. That's trivial, almost to the point of "opting in" to being scanned.

Obviously that would totally defeat the claimed purpose of the tool, and if there were even a suggestion that that might be true, Apple fanbois would be all over pushing that framing of the situation. But it's totally nonsensical, as it would completely undermine both the claimed purpose and the assumed purpose here (of enabling governments to keep tabs on political dissidents and discourage journalism). Even if the government never actively uses the tool, it serves to create fear in those who might oppose the status quo; no matter what, that component is already serving its purpose by scaring people.

So from "what Apple says," from "what the lawyers have said," and from the technical "what has to be true to make any sense at all," it all lines up: it has to be "scan on device" and not "voluntarily uploaded."

• stacksofplates @scottalanmiller

@scottalanmiller said in Apple plans to scan your images for child porn:

Uploading photos to iCloud is a different operation than uploading the scan results (which are not photos) to iCloud. Photo uploads are controlled by the end user; the scan uploads are not (as they have no setting in the OS).

It is the uploading of the scan results to iCloud that is the issue at hand. It's not a step along the way; it is the very problem. So "until" doesn't apply, since that is the end result we are concerned about.

Where do you see that? You're making assumptions. The scan results would have to include the photo, so that distinction doesn't make any sense. What is the human verification for if the photo isn't uploaded?

Again, these are all assumptions on your part about how this works. No one here knows how it works currently, so telling them they're wrong is infantile because you can't prove you're right.

The whole thing is bad, but don't get into arguments about things when you can't possibly understand how they work yet.

• DustinB3403 @stacksofplates

                      @stacksofplates said in Apple plans to scan your images for child porn:

                      The scan results would have to include the photo.

Actually no, the scan on-device creates a hash record (MD5 or SHA-256, probably), which is then compared against a known database of CSAM hashes.

                      Anything that matches would start sending up red flags.

                      The actual photo may never get uploaded to iCloud.
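For reference, the exact-match model being described here would amount to something like the sketch below; the digest value and names are placeholders, and (as later replies argue) Apple's system actually uses a perceptual hash rather than a cryptographic one.

```python
import hashlib

# Placeholder digest standing in for a known-CSAM database entry;
# this is NOT a real value, just an illustration of exact-match checking.
KNOWN_CSAM_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Exact-match check: flag only if the file's digest is already known."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_CSAM_HASHES
```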

• DustinB3403 @stacksofplates

                        @stacksofplates said in Apple plans to scan your images for child porn:

                        What is the human verification for if the photo isn't uploaded?

The human verification only happens once an account has passed a threshold of known CSAM hash records being discovered on an individual Apple device.

Once that threshold is hit, someone at Apple has to check and confirm that the content is CSAM (subjective to the person and their training), and then if it is, they lock your account and notify the authorities.
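As a minimal sketch of that threshold gate (the threshold value here is invented; Apple had not published the real number):

```python
# Hypothetical threshold; Apple did not disclose the actual value in the
# announcement being discussed here.
MATCH_THRESHOLD = 30

def escalate_for_human_review(account_match_count: int) -> bool:
    """An account is only surfaced to a human reviewer after it crosses
    the match threshold; below that, nothing is flagged."""
    return account_match_count >= MATCH_THRESHOLD
```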

• DustinB3403 @stacksofplates

                          @stacksofplates said in Apple plans to scan your images for child porn:

The whole thing is bad, but don't get into arguments about things when you can't possibly understand how they work yet.

My responses to you (granted, you're talking to @scottalanmiller) are taken directly from what Apple posted in the announcement itself.

We can make a well-educated guess at how this will work, even with it not being deployed yet.

• stacksofplates @DustinB3403

@dustinb3403 said in Apple plans to scan your images for child porn:

Actually no, the scan on-device creates a hash record (MD5 or SHA-256, probably), which is then compared against a known database of CSAM hashes.

Anything that matches would start sending up red flags.

The actual photo may never get uploaded to iCloud.

That's a joke, right? You didn't read the article. They're using a neural network to compare an image to a database of checksummed images, presumably by features like faces, EXIF data, etc. Then a human verifies it's a match to content in the existing checksummed image.

A 4-year-old could get around comparing two images by checksum. That's clearly not what's happening here. Just change a single pixel and it's different. You don't need a neural net to compare checksums.

                            By the explanation in the article, they have to have the photo to compare.
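The single-pixel point is easy to demonstrate with a cryptographic hash. In this sketch a synthetic byte buffer stands in for image data; flipping one bit yields a completely unrelated digest:

```python
import hashlib

original = bytes(1024)             # stand-in for raw image bytes
tweaked = bytearray(original)
tweaked[0] ^= 1                    # "change a single pixel": flip one bit

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(tweaked)).hexdigest())
# The two digests share nothing, which is why a plain checksum comparison
# is trivially defeated by tiny edits.
```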

• stacksofplates @DustinB3403

@dustinb3403 said in Apple plans to scan your images for child porn:

The human verification only happens once an account has passed a threshold of known CSAM hash records being discovered on an individual Apple device.

Once that threshold is hit, someone at Apple has to check and confirm that the content is CSAM (subjective to the person and their training), and then if it is, they lock your account and notify the authorities.

So to get around the checksum method you are describing, you just crop the picture a tiny bit, and it would never catch any new photos that aren't part of that database. Again, you hardly need a neural net for that. You could do that on a Raspberry Pi.

• DustinB3403 @stacksofplates

@stacksofplates said in Apple plans to scan your images for child porn:

That's a joke, right? You didn't read the article. They're using a neural network to compare an image to a database of checksummed images, presumably by features like faces, EXIF data, etc. Then a human verifies it's a match to content in the existing checksummed image.

A 4-year-old could get around comparing two images by checksum. That's clearly not what's happening here. Just change a single pixel and it's different. You don't need a neural net to compare checksums.

By the explanation in the article, they have to have the photo to compare.

Wrong, the on-device code is creating a hash, and that hash record is what's getting compared. Read the announcement from Apple again.

The machine learning comparison doesn't come in until the image is in iCloud. That's where the comparison happens, and then if a threshold is hit, a human compares the images/hashes.

• DustinB3403 @stacksofplates

@stacksofplates said in Apple plans to scan your images for child porn:

So to get around the checksum method you are describing, you just crop the picture a tiny bit, and it would never catch any new photos that aren't part of that database. Again, you hardly need a neural net for that. You could do that on a Raspberry Pi.

Exactly, and pedophiles can easily do this, so this is just a backdoor to ease drop on Apple users.

• scottalanmiller @stacksofplates

@stacksofplates said in Apple plans to scan your images for child porn:

Where do you see that? You're making assumptions. The scan results would have to include the photo, so that distinction doesn't make any sense. What is the human verification for if the photo isn't uploaded?

Again, these are all assumptions on your part about how this works. No one here knows how it works currently, so telling them they're wrong is infantile because you can't prove you're right.

The whole thing is bad, but don't get into arguments about things when you can't possibly understand how they work yet.

                                    You are talking about a later step. I've not even addressed Apple employees getting access to your files. That's yet another problem. I'm only dealing with the issues prior to that point.

• scottalanmiller

                                      https://appleinsider.com/articles/21/08/07/epic-games-ceo-slams-apple-government-spyware

• stacksofplates @DustinB3403

@dustinb3403 said in Apple plans to scan your images for child porn:

Wrong, the on-device code is creating a hash, and that hash record is what's getting compared. Read the announcement from Apple again.

The machine learning comparison doesn't come in until the image is in iCloud. That's where the comparison happens, and then if a threshold is hit, a human compares the images/hashes.

The AI is running on the device. Not sure where you read that it's not. It's the same on-device AI they're using for the iMessage sexually explicit content verification.

• stacksofplates @DustinB3403

@dustinb3403 said in Apple plans to scan your images for child porn:

The machine learning comparison doesn't come in until the image is in iCloud.

The official statement doesn't even mention AI or neural networks in any way. Here's an excerpt from their technical paper:

NeuralHash

NeuralHash is a perceptual hashing function that maps images to numbers. Perceptual hashing bases this number on features of the image instead of the precise values of pixels in the image. The system computes these hashes by using an embedding network to produce image descriptors and then converting those descriptors to integers using a Hyperplane LSH (Locality Sensitivity Hashing) process. This process ensures that different images produce different hashes.

Before an image is stored in iCloud Photos, the following on-device matching process is performed for that image against the blinded hash table database. The device computes the image NeuralHash and looks up the entry in the blinded hash table at the position pointed by the NeuralHash. The device uses the computed NeuralHash to compute a cryptographic header. It also uses the blinded hash that the system looked up to obtain a derived encryption key. This encryption key is then used to encrypt the associated payload data.

                                          The AI is running on the phone and doing image verification based on features, not just a checksum.
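To illustrate the Hyperplane LSH step the paper names, a sketch along these lines shows why similar descriptors collide where checksums would not; the dimensions, bit count, and random planes here are invented for illustration, not Apple's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, BITS = 128, 96                        # invented sizes
planes = rng.standard_normal((BITS, DIM))  # stand-in for fixed LSH hyperplanes

def hyperplane_lsh(descriptor: np.ndarray) -> int:
    """Map an embedding-network descriptor to an integer, one bit per
    hyperplane, based on which side of the plane the descriptor falls."""
    bits = planes @ descriptor > 0
    return sum(1 << i for i, b in enumerate(bits) if b)

# Nearby descriptors fall on the same side of most hyperplanes, so visually
# similar images tend to collide into the same integer, unlike a checksum.
```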

                                          Also it's eavesdrop.

• stacksofplates

Also, if you look at the diagram in their white paper, the photo is part of the safety voucher, which is what is uploaded to iCloud.

[Image: safety voucher diagram from Apple's white paper]

                                            So this is what I was getting at earlier.

                                            This voucher is uploaded to iCloud Photos along with the image.

Is that separate from iCloud backup, or is the voucher sent along with the image when it's backed up? By their process description, the photo has to be sent as well, because they can't verify otherwise.

                                            This is why it's not straightforward and why I think @Carnival-Boy was making those statements.
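Taking the quoted description literally, the voucher could be modeled roughly as below; the field names are guesses from the white paper's wording, not Apple's actual schema:

```python
from dataclasses import dataclass

@dataclass
class SafetyVoucher:
    # Guessed fields based on the quoted description, not Apple's schema.
    crypto_header: bytes      # computed from the image's NeuralHash
    encrypted_payload: bytes  # locked with a key derived from the blinded
                              # table entry, recoverable only on a true match

@dataclass
class ICloudPhotosUpload:
    image: bytes              # per the quote, the voucher travels with the
    voucher: SafetyVoucher    # image itself when iCloud Photos is in use
```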
