Apple plans to scan your images for child porn
-
@carnival-boy said in Apple plans to scan your images for child porn:
Scanned, but the info is private. Apple explicitly don't have access to the data until it is uploaded to iCloud. That's my point.
So they say.
-
@carnival-boy said in Apple plans to scan your images for child porn:
Which bits. I don't find the article that clear. It initially suggests the data is uploaded automatically from the phone, but ends with these statements which clearly say it is only photos that are uploaded to iCloud that are affected:
Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system
Let me translate this through the eyes of a contract lawyer.
Statement one, read as a singular statement unrelated to the rest of the sentence:
"Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone."
Statement two, read as a singular statement, with the only reference to the first statement being the linking verb "have", which connects the "scanning" to the "pictures in the iCloud":
"and [will continuously scan photos that] have also been uploaded to its iCloud back-up system"
This is read as two different objects, not a single object with two prerequisites.
In other words, "any picture that is on your phone OR any picture that is on your iCloud"
NOT read as "any picture that is both on your phone AND in your iCloud."
Remember, you can install iTunes, or the iCloud application on your Windows PC and send pictures to your iCloud backup system without ever even owning an iPhone.
Back to the confusing sentence above... your mind probably has you thinking of what is meant if you place the words in this order: "... also have been uploaded..." This still doesn't change the legal meaning to suddenly require BOTH, but the altered word order does reconcile better with a meaning of both. Sometimes there are large gaps between American English grammar and legal meaning.
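The two readings can be expressed as boolean predicates. This is purely an illustrative sketch of the grammatical dispute, not anything from Apple's documentation or the article:

```python
def lawyer_reading(on_phone: bool, in_icloud: bool) -> bool:
    # Two independent statements joined together: scan photos that are
    # on the phone, OR photos that have been uploaded to iCloud.
    return on_phone or in_icloud

def casual_reading(on_phone: bool, in_icloud: bool) -> bool:
    # One statement with two prerequisites: scan only photos that are
    # BOTH on the phone AND uploaded to iCloud.
    return on_phone and in_icloud
```

A photo that exists only on the phone and was never uploaded is in scope under the first reading but out of scope under the second, which is the entire disagreement in this thread.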
-
@obsolesce said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
I bet 100% of parents have pictures of their naked children.
Definitely not. Including your child's genitals in a photo is a conscious decision you don't need to make.
We have photos of our children playing in the bathtub for example, but also made the conscious effort to not include their genitals in the photo. There's no reason to include that in the photo regardless of intentions.
Genitalia are not a requirement to classify a picture as child pornography.
New Jersey State Statute Subsection N.J. Stat. Ann. § 2C:24-4(b)(1).
Says:
"to otherwise depict a child for the purpose of sexual stimulation or gratification of any person who may view the depiction where the depiction does not have serious literary, artistic, political, or scientific value."
Subsection (c) defines child pornography, not in terms of whether it depicts some sort of sexual activity, but rather in terms of the reaction the image is meant to evoke. The statute is thus remarkably broad, and the prohibition is based entirely on subjective (rather than objective) criteria.
An example may help: source
Imagine an individual who is sexually attracted to children, and who finds photographs of children bundled up in winter coats to be sexually stimulating. If that individual takes a picture of a child walking down the street who is wearing a winter coat, then the New Jersey statute would classify that picture as child pornography. All that matters is whether the image ‘‘depict[s] a child,’’ whether the individual who created the image had ‘‘the purpose of sexual stimulation,’’ and whether the resulting image had no ‘‘serious literary, artistic, political, or scientific value.’’ All three of those factors are met by the example.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
And the Apple link clearly says it isn't doing what the article initially says will happen.
Actually Apple's link completely confirms the concerns. Not sure what you read, but you missed Apple's statements.
"Instead of scanning images in the cloud, the system performs on-device matching using a database"
You were claiming that they only scanned cloud, but Apple says exactly the opposite, twice.
No I wasn't. I said that the scanning was done on the phone, but that Apple can't access this data. The data is private until it is uploaded to the cloud.
-
@carnival-boy said in Apple plans to scan your images for child porn:
No I wasn't. I said that the scanning was done on the phone, but that Apple can't access this data. The data is private until it is uploaded to the cloud.
So how do they know if the content of a photo is suspect? Unless you mean a "human" can't access the data?
-
@jasgot said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
I bet 100% of parents have pictures of their naked children.
Definitely not. Including your child's genitals in a photo is a conscious decision you don't need to make.
We have photos of our children playing in the bathtub for example, but also made the conscious effort to not include their genitals in the photo. There's no reason to include that in the photo regardless of intentions.
Genitalia are not a requirement to classify a picture as child pornography.
New Jersey State Statute Subsection N.J. Stat. Ann. § 2C:24-4(b)(1).
Says:
"to otherwise depict a child for the purpose of sexual stimulation or gratification of any person who may view the depiction where the depiction does not have serious literary, artistic, political, or scientific value."
Subsection (c) defines child pornography, not in terms of whether it depicts some sort of sexual activity, but rather in terms of the reaction the image is meant to evoke. The statute is thus remarkably broad, and the prohibition is based entirely on subjective (rather than objective) criteria.
An example may help: source
Imagine an individual who is sexually attracted to children, and who finds photographs of children bundled up in winter coats to be sexually stimulating. If that individual takes a picture of a child walking down the street who is wearing a winter coat, then the New Jersey statute would classify that picture as child pornography. All that matters is whether the image ‘‘depict[s] a child,’’ whether the individual who created the image had ‘‘the purpose of sexual stimulation,’’ and whether the resulting image had no ‘‘serious literary, artistic, political, or scientific value.’’ All three of those factors are met by the example.
Yes, context plays a big part of it, and that's definitely understandable.
But I wasn't saying what was and was not considered child pornography. I was disagreeing with what I quoted specifically.
-
@carnival-boy said in Apple plans to scan your images for child porn:
No I wasn't. I said that the scanning was done on the phone, but that Apple can't access this data.
How can they not access the phone? They must be keeping a copy of the hash records somewhere, and then comparing it with what is on the phone. Maybe they are keeping a record of the hashes on the phone as well, but this seems unlikely.
-
Here is Apple's official statement:
https://www.apple.com/child-safety/
Maybe if some of you read it and stopped believing all the click bait news articles you are reading you would know what's really going on.
-
These are the important bits. To summarize: Apple is scanning your local files to generate hashes of the content on your phone, which, if it's uploaded to iCloud, starts "counting against you" (tinfoil hat bit there). Then, if you hit a threshold, a human at Apple reviews the potential CSAM content, confirms it, locks your account, and calls the authorities.
AKA Apple is surfing the web for child porn for their own kicks.
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices. Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image. Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.
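The flow Apple describes above can be sketched in a few lines. To be clear, this is a heavily simplified illustration: the real system uses a perceptual hash ("NeuralHash") rather than a cryptographic one, the hash database is stored on the device in a blinded, unreadable form, and the match bit inside each voucher is encrypted via private set intersection. All names and the threshold value here are placeholders, not Apple's:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for the on-device perceptual hash; SHA-256 only matches
    # exact bytes, whereas NeuralHash is designed to match near-duplicates.
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder standing in for the NCMEC-provided database of known hashes.
KNOWN_HASHES = {image_hash(b"known-bad-image")}

THRESHOLD = 3  # illustrative only; Apple has not published the real value

def make_voucher(image_bytes: bytes) -> dict:
    # One "safety voucher" accompanies each image uploaded to iCloud Photos.
    # In Apple's design the match result is encrypted and unreadable.
    return {"match": image_hash(image_bytes) in KNOWN_HASHES}

def account_crosses_threshold(vouchers: list) -> bool:
    # Only once the number of matching vouchers crosses the threshold can
    # Apple interpret them and send the account to human review.
    return sum(v["match"] for v in vouchers) >= THRESHOLD
```

The key design point is that no single match is visible to anyone; only the aggregate count, once past the threshold, unlocks the vouchers for review.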
-
@dustinb3403 said in Apple plans to scan your images for child porn:
These are the important bits, to summarize, Apple is scanning your local files, to generate Hash files of the content on your phone, which then if its uploaded to iCloud, starts "counting against you" (tinfoil hat bit there) and then if you hit a threshold, then a human at apple reviews the potential CSAM content and confirms it, locks your account and calls the authorities.
AKA Apple is surfing the web for child porn for their own kicks.
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.
I'm sure law enforcement is providing the hashes.
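The "threshold secret sharing" named in the quote is a real cryptographic primitive: a secret is split into n shares such that any k of them reconstruct it, while fewer than k reveal nothing. Here is a minimal textbook sketch of Shamir's scheme over a prime field; Apple's actual construction differs, and the parameters below are illustrative only:

```python
import random

P = 2**127 - 1  # Mersenne prime; defines the field for illustration

def make_shares(secret: int, k: int, n: int) -> list:
    """Split `secret` into n shares; any k of them reconstruct it."""
    # A random degree-(k-1) polynomial with the secret as its constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares: list) -> int:
    """Lagrange interpolation at x=0 recovers the secret from k shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret
```

In Apple's stated use, each matching image contributes a share; only once enough shares accumulate (the threshold) can the vouchers be decrypted at all, which is what makes individual matches unreadable below the threshold.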
-
@irj said in Apple plans to scan your images for child porn:
@dustinb3403 said in Apple plans to scan your images for child porn:
These are the important bits, to summarize, Apple is scanning your local files, to generate Hash files of the content on your phone, which then if its uploaded to iCloud, starts "counting against you" (tinfoil hat bit there) and then if you hit a threshold, then a human at apple reviews the potential CSAM content and confirms it, locks your account and calls the authorities.
AKA Apple is surfing the web for child porn for their own kicks.
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices. Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image. Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.
I'm sure law enforcement is providing the hashes.
I haven't read the article or official announcement. Assuming things like that is where most people go wrong. I'd imagine law enforcement is providing some of those hashes, but do we actually know the source?
-
@irj said in Apple plans to scan your images for child porn:
@dustinb3403 said in Apple plans to scan your images for child porn:
These are the important bits, to summarize, Apple is scanning your local files, to generate Hash files of the content on your phone, which then if its uploaded to iCloud, starts "counting against you" (tinfoil hat bit there) and then if you hit a threshold, then a human at apple reviews the potential CSAM content and confirms it, locks your account and calls the authorities.
AKA Apple is surfing the web for child porn for their own kicks.
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices. Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image. Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.
I'm sure law enforcement is providing the hashes.
Apple said very clearly that it is private companies of which they have listed one and only said "there are others". It is not law enforcement at this time.
-
@travisdh1 said in Apple plans to scan your images for child porn:
@irj said in Apple plans to scan your images for child porn:
@dustinb3403 said in Apple plans to scan your images for child porn:
These are the important bits, to summarize, Apple is scanning your local files, to generate Hash files of the content on your phone, which then if its uploaded to iCloud, starts "counting against you" (tinfoil hat bit there) and then if you hit a threshold, then a human at apple reviews the potential CSAM content and confirms it, locks your account and calls the authorities.
AKA Apple is surfing the web for child porn for their own kicks.
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices. Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image. Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.
I'm sure law enforcement is providing the hashes.
I haven't read the article or official announcement. Assuming things like that is where most people go wrong. I'd imagine law enforcement is providing some of those hashes, but do we actually know the source?
I did. The official announcement is as bad as can be imagined.
-
@travisdh1 said in Apple plans to scan your images for child porn:
I'd imagine law enforcement is providing some of those hashes, but do we actually know the source?
Private non-profits, of which one is listed, and we are only told that there are "more". I already mentioned above that anyone who can pay off, hack, or infiltrate one of these non-profits can then scan your phone for anything they want.
-
@voip_n00b said in Apple plans to scan your images for child porn:
Here is Apple's official statement:
https://www.apple.com/child-safety/
Maybe if some of you read it and stopped believing all the click bait news articles you are reading you would know what's really going on.
Um, you just posted the official link, that we are discussing, that we've been quoting. LOL
-
@jasgot said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
No I wasn't. I said that the scanning was done on the phone, but that Apple can't access this data. The data is private until it is uploaded to the cloud.
So how do they know if the content of a photo is suspect? Unless mean a "human" can't access the data?
You are correct. They even SAID that a human would verify it in the text.
But the great thing about the hash... they don't need to pull it off of your phone because they already know what it is. So they can look at it remotely without getting access to your phone because they already had access to your phone to know what it is!
-
@carnival-boy said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
And the Apple link clearly says it isn't doing what the article initially says will happen.
Actually Apple's link completely confirms the concerns. Not sure what you read, but you missed Apple's statements.
"Instead of scanning images in the cloud, the system performs on-device matching using a database"
You were claiming that they only scanned cloud, but Apple says exactly the opposite, twice.
No I wasn't. I said that the scanning was done on the phone, but that Apple can't access this data. The data is private until it is uploaded to the cloud.
That's not what Apple has said. The scanning is done on the phone, and it can alert people. They may claim that they won't use that capability, but once served with a warrant, Apple's statement of what they will or won't do means nothing. They don't need to have any process for being willing to do this to do it. Anything that they CAN do, they've agreed TO DO when they operate in the US. Period. No grey area.
That's why all security has to be "non-vendor reversible" because only the lack of ability to access your data protects it from the US government.
-
@jasgot said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
Scanned, but the info is private. Apple explicitly don't have access to the data until it is uploaded to iCloud. That's my point.
So they say.
The problem is, the scanning tool CAN upload it. The phone has that power. So since the government can force them to do something that they CAN do, what they say doesn't matter.
That's the thing about the US legal system. Anything you "say" you won't do is nullified if you have the ability to do said thing. Making yourself able to do something is the same as choosing to potentially do it in this context.
That's the problem. Apple has chosen to enable governments to censor journalists and seek out political dissidents and isn't even making the slightest attempt to hide it.
-
@jasgot said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
I bet 100% of parents have pictures of their naked children.
Definitely not. Including your child's genitals in a photo is a conscious decision you don't need to make.
We have photos of our children playing in the bathtub for example, but also made the conscious effort to not include their genitals in the photo. There's no reason to include that in the photo regardless of intentions.
Genitalia are not a requirement to classify a picture as child pornography.
New Jersey State Statute Subsection N.J. Stat. Ann. § 2C:24-4(b)(1).
Says:
"to otherwise depict a child for the purpose of sexual stimulation or gratification of any person who may view the depiction where the depiction does not have serious literary, artistic, political, or scientific value."
Subsection (c) defines child pornography, not in terms of whether it depicts some sort of sexual activity, but rather in terms of the reaction the image is meant to evoke. The statute is thus remarkably broad, and the prohibition is based entirely on subjective (rather than objective) criteria.
An example may help: source
Imagine an individual who is sexually attracted to children, and who finds photographs of children bundled up in winter coats to be sexually stimulating. If that individual takes a picture of a child walking down the street who is wearing a winter coat, then the New Jersey statute would classify that picture as child pornography. All that matters is whether the image ‘‘depict[s] a child,’’ whether the individual who created the image had ‘‘the purpose of sexual stimulation,’’ and whether the resulting image had no ‘‘serious literary, artistic, political, or scientific value.’’ All three of those factors are met by the example.
And that's just the US. Every country has equally bizarre rules. Australia is a great example. In Australia they classify girls as children based on breast size, not age.
So technically a picture of a skinny sixty year old woman with A cups is pornography. So what is just totally boring grandma porn in most countries is super illegal in Australia (and massively sexist that they actually have laws that say that women can't be fully considered adults unless they are curvy enough to meet the government's porn standards!)
Imagine the problems that will arise as every country's totally bizarre, "nothing to do with what we claim to be searching for" rules start to be enforced through this. And then add to it people traveling between these insane jurisdictions.
There's no possibility of this system functioning for what they claim to intend it for. But there are so many terrible things it could be used for.
-
Good timing on this, as I was about to buy my first iPad in a really long time for when I go to the US in a few weeks. F that.