Apple plans to scan your images for child porn
-
@carnival-boy said in Apple plans to scan your images for child porn:
Which bits? I don't find the article that clear.
Very clear, first line: "Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans"
Installing software. On your device. Unless the sources are incorrect, this makes it crystal clear that they can access YOUR DATA and that the iCloud component is a red herring. This is the part that the researchers, and the people here, are concerned about.
This is like Pegasus. Sure, in theory it's used to stop terrorism. In the real world it is used to threaten reporters.
Anything that can scan your pictures can also steal them or plant new ones, at will. Will they? Sure, they say that they won't. But you cannot, ever, under any circumstances, trust a company that would do this to your phone. So, if the article is real, it's a given that they can no longer be trusted.
-
According to the article: "The proposals are Apple’s attempt to find a compromise between its own promise to protect customers’ privacy...."
The article states that this is Apple compromising on the promise of privacy. Compromising on privacy is, obviously, the same as going back on it. If the article is correct, Apple has flat out decided that their privacy stance is no longer their policy (in the USA.)
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
If you don't use iCloud I don't think you have anything to be concerned about. They cannot access your phone.
The claim of the news release is that they are accessing the phones. That's the entire concern. If the entire thing is fake, then of course it's not a problem. The concern is not Apple scanning data that THEY host, it's scanning data that WE host.
Not fake, confirmed by Apple, https://www.apple.com/child-safety/
These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
and further
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
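Mechanically, the on-device step Apple describes can be sketched roughly like this. This is a minimal illustration, not Apple's actual code: `image_hash` is a cryptographic stand-in for a perceptual hash like NeuralHash, and the blocklist entry is made up.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash such as Apple's NeuralHash; the real
    # system tolerates small image edits, which SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()

# Hash database shipped to the device (hypothetical entry).
known_csam_hashes = {image_hash(b"known-bad-image")}

def on_device_match(photo: bytes) -> bool:
    # Matching happens locally, before any upload: the device compares
    # hashes on the phone itself rather than scanning in the cloud.
    return image_hash(photo) in known_csam_hashes

print(on_device_match(b"known-bad-image"))  # matches the database
print(on_device_match(b"bathtub-photo"))    # no match
```

The point of the sketch is simply that the comparison loop runs on the device, which is exactly what the quoted passage says.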
-
I was planning on getting an iPhone and this is definitely enough for me to go look at Xiaomi again who, by the way, has SO MUCH BETTER cameras anyway.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
I was planning on getting an iPhone and this is definitely enough for me to go look at Xiaomi again who, by the way, has SO MUCH BETTER cameras anyway.
I might actually go back to flip phones. Once Apple implements this, Google and others will follow in no time.
-
@marcinozga said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
I was planning on getting an iPhone and this is definitely enough for me to go look at Xiaomi again who, by the way, has SO MUCH BETTER cameras anyway.
I might actually go back to flip phones. Once Apple implements this, Google and others will follow in no time.
Exactly - we're just kinda stuck.
I now need a non-connected thin device for camera purposes... don't want to go to Point and Shoot.
-
@marcinozga said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
If you don't use iCloud I don't think you have anything to be concerned about. They cannot access your phone.
The claim of the news release is that they are accessing the phones. That's the entire concern. If the entire thing is fake, then of course it's not a problem. The concern is not Apple scanning data that THEY host, it's scanning data that WE host.
Not fake, confirmed by Apple, https://www.apple.com/child-safety/
These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
and further
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
Yup, just found it. Yeah, Ars Technica was completely correct.
-
@dashrender said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
I was planning on getting an iPhone and this is definitely enough for me to go look at Xiaomi again who, by the way, has SO MUCH BETTER cameras anyway.
I might actually go back to flip phones. Once Apple implements this, Google and others will follow in no time.
Exactly - we're just kinda stuck.
I now need a non connected thin device for camera purposes... don't want to go to Point and Shoot.
Why not? Some point and shoots are fantastic! I just bought the Olympus E-PL9!
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@dashrender said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
I was planning on getting an iPhone and this is definitely enough for me to go look at Xiaomi again who, by the way, has SO MUCH BETTER cameras anyway.
I might actually go back to flip phones. Once Apple implements this, Google and others will follow in no time.
Exactly - we're just kinda stuck.
I now need a non connected thin device for camera purposes... don't want to go to Point and Shoot.
Why not? Some point and shoots are fantastic! I just bought the Olympus E-PL9!
Cause it won't fit in my back pocket, that's all.
-
@carnival-boy said in Apple plans to scan your images for child porn:
And the Apple link clearly says it isn't doing what the article initially says will happen.
Actually Apple's link completely confirms the concerns. Not sure what you read, but you missed Apple's statements.
"Instead of scanning images in the cloud, the system performs on-device matching using a database"
You were claiming that they only scanned the cloud, but Apple says exactly the opposite, twice.
-
@dashrender said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@dashrender said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
I was planning on getting an iPhone and this is definitely enough for me to go look at Xiaomi again who, by the way, has SO MUCH BETTER cameras anyway.
I might actually go back to flip phones. Once Apple implements this, Google and others will follow in no time.
Exactly - we're just kinda stuck.
I now need a non connected thin device for camera purposes... don't want to go to Point and Shoot.
Why not? Some point and shoots are fantastic! I just bought the Olympus E-PL9!
Cause it won't fit in my back pocket, that's all.
I would never put a phone in my back pocket, either. It'll fit in a normal pocket, though.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@dashrender said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@dashrender said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
I was planning on getting an iPhone and this is definitely enough for me to go look at Xiaomi again who, by the way, has SO MUCH BETTER cameras anyway.
I might actually go back to flip phones. Once Apple implements this, Google and others will follow in no time.
Exactly - we're just kinda stuck.
I now need a non connected thin device for camera purposes... don't want to go to Point and Shoot.
Why not? Some point and shoots are fantastic! I just bought the Olympus E-PL9!
Cause it won't fit in my back pocket, that's all.
I would never put a phone in my back pocket, either. It'll fit in a normal pocket, though.
Well, I guess that's good for you. I have my phone in my back pocket all day, nearly every day.
I definitely won't be carrying around a camera in my front jeans pocket.
If I were like you, wearing cargo shorts every day, everywhere, it wouldn't be an issue, but I don't - I wear jeans 90% of the time.
-
Here's a real world example how badly AI can screw up:
If Tesla's AI thinks the Moon is a traffic light, what are the odds of the AI on your phone mistaking a child in a bathtub for an actual porn image?
-
@marcinozga said in Apple plans to scan your images for child porn:
Here's a real world example how badly AI can screw up:
If Tesla's AI thinks the Moon is a traffic light, what are the odds of the AI on your phone mistaking a child in a bathtub for an actual porn image?
More importantly, directly from Apple... the list comes from "other organizations." Not even a fixed list of organizations. If you want to detect something on someone's phone, all you need to do is convince, social engineer, or pay off any of those random third parties to add something you want into their database and, voila, Apple is now searching for anything you want them to. Apple won't even know that they are looking for people promoting voter rights, reporting news about police brutality, or exposing a new concentration camp inside our own borders! Apple is handing the government tools to spy on you, while providing no mechanism to ensure, or even suggest, that the system sticks to its supposed topic. Their own press release tells us it is open to anything an unlisted number of third parties (that are private companies) decides to search for.
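The opacity Apple advertises (an "unreadable set of hashes") is exactly what makes this concrete: given only the hash set, the matcher cannot tell a CSAM hash from any other hash someone slipped in. A toy sketch, with hypothetical data and a stand-in hash function:

```python
import hashlib

def h(data: bytes) -> str:
    # Stand-in hash; on the device the database is just opaque hash values.
    return hashlib.sha256(data).hexdigest()

# What the database is supposed to contain (hypothetical entry)...
database = {h(b"known CSAM sample")}
# ...and what any contributing organization could quietly add.
database.add(h(b"banned political leaflet"))

# The matcher treats both entries identically; nothing in a hash reveals
# what kind of content it was derived from.
print(h(b"banned political leaflet") in database)  # flagged just the same
```

Nothing here is specific to CSAM: whoever controls the database controls what gets flagged, and the device has no way to audit it.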
-
@carnival-boy said in Apple plans to scan your images for child porn:
Scanned, but the info is private. Apple explicitly don't have access to the data until it is uploaded to iCloud. That's my point.
So they say.
-
@carnival-boy said in Apple plans to scan your images for child porn:
Which bits? I don't find the article that clear. It initially suggests the data is uploaded automatically from the phone, but ends with these statements which clearly say it is only photos that are uploaded to iCloud that are affected:
"Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system"
Let me translate this through the eyes of a contract lawyer.
Statement one; and it is read as a singular statement unrelated to the rest of the sentence:
"Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone."
Statement two, read as a singular statement, with the only reference to the first statement being the linking verb "have", which connects the "scanning" to the "pictures in the iCloud":
"and [will continuously scan photos that] have also been uploaded to its iCloud back-up system"
This is read as two different objects, not a single object with two prerequisites.
In other words: "any picture that is on your phone OR any picture that is on your iCloud."
NOT read as: "any picture that is both on your phone AND in your iCloud."
Remember, you can install iTunes, or the iCloud application on your Windows PC and send pictures to your iCloud backup system without ever even owning an iPhone.
Back to the confusing sentence above... your mind probably fills in the intended meaning if you place the words in this order: "... also have been uploaded..." This still doesn't change the legal meaning to suddenly require BOTH, but this altered word order does reconcile better with a meaning of both. Sometimes there are large gaps between American English grammar and legal meaning.
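The two legal readings above boil down to OR versus AND. Stated as code (the photo flags are hypothetical, purely to make the contrast explicit):

```python
def scanned_or_reading(on_phone: bool, in_icloud: bool) -> bool:
    # "Two different objects": either location triggers scanning.
    return on_phone or in_icloud

def scanned_and_reading(on_phone: bool, in_icloud: bool) -> bool:
    # "One object with two prerequisites": both must hold.
    return on_phone and in_icloud

# A photo that lives only on the phone and was never uploaded:
print(scanned_or_reading(True, False))   # True under the OR reading
print(scanned_and_reading(True, False))  # False under the AND reading
```

For a photo that was never uploaded to iCloud, the two readings give opposite answers, which is why the sentence's parsing matters so much.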
-
@obsolesce said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
I bet 100% of parents have pictures of their naked children.
Definitely not. Including your child's genitals in a photo is a conscious decision you don't need to make.
We have photos of our children playing in the bathtub for example, but also made the conscious effort to not include their genitals in the photo. There's no reason to include that in the photo regardless of intentions.
Genitalia are not a requirement to classify a picture as child pornography.
New Jersey State Statute Subsection N.J. Stat. Ann. § 2C:24-4(b)(1).
Says:
"to otherwise depict a child for the purpose of sexual stimulation or gratification of any person who may view the depiction where the depiction does not have serious literary, artistic, political, or scientific value."
Subsection (c) defines child pornography, not in terms of whether it depicts some sort of sexual activity, but rather in terms of the reaction the image is meant to evoke. The statute is thus remarkably broad, and the prohibition is based entirely on subjective (rather than objective) criteria.
An example may help: source
Imagine an individual who is sexually attracted to children, and who finds photographs of children bundled up in winter coats to be sexually stimulating. If that individual takes a picture of a child walking down the street who is wearing a winter coat, then the New Jersey statute would classify that picture as child pornography. All that matters is whether the image ‘‘depict[s] a child,’’ whether the individual who created the image had ‘‘the purpose of sexual stimulation,’’ and whether the resulting image had no ‘‘serious literary, artistic, political, or scientific value.’’ All three of those factors are met by the example.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
And the Apple link clearly says it isn't doing what the article initially says will happen.
Actually Apple's link completely confirms the concerns. Not sure what you read, but you missed Apple's statements.
"Instead of scanning images in the cloud, the system performs on-device matching using a database"
You were claiming that they only scanned the cloud, but Apple says exactly the opposite, twice.
No I wasn't. I said that the scanning was done on the phone, but that Apple can't access this data. The data is private until it is uploaded to the cloud.
-
@carnival-boy said in Apple plans to scan your images for child porn:
No I wasn't. I said that the scanning was done on the phone, but that Apple can't access this data. The data is private until it is uploaded to the cloud.
So how do they know if the content of a photo is suspect? Unless you mean a "human" can't access the data?
-
@jasgot said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
I bet 100% of parents have pictures of their naked children.
Definitely not. Including your child's genitals in a photo is a conscious decision you don't need to make.
We have photos of our children playing in the bathtub for example, but also made the conscious effort to not include their genitals in the photo. There's no reason to include that in the photo regardless of intentions.
Genitalia are not a requirement to classify a picture as child pornography.
New Jersey State Statute Subsection N.J. Stat. Ann. § 2C:24-4(b)(1).
Says:
"to otherwise depict a child for the purpose of sexual stimulation or gratification of any person who may view the depiction where the depiction does not have serious literary, artistic, political, or scientific value."
Subsection (c) defines child pornography, not in terms of whether it depicts some sort of sexual activity, but rather in terms of the reaction the image is meant to evoke. The statute is thus remarkably broad, and the prohibition is based entirely on subjective (rather than objective) criteria.
An example may help: source
Imagine an individual who is sexually attracted to children, and who finds photographs of children bundled up in winter coats to be sexually stimulating. If that individual takes a picture of a child walking down the street who is wearing a winter coat, then the New Jersey statute would classify that picture as child pornography. All that matters is whether the image ‘‘depict[s] a child,’’ whether the individual who created the image had ‘‘the purpose of sexual stimulation,’’ and whether the resulting image had no ‘‘serious literary, artistic, political, or scientific value.’’ All three of those factors are met by the example.
Yes, context plays a big part of it and that's definitely understandable.
But I wasn't saying what was and was not considered child pornography. I was disagreeing with what I quoted specifically.