Apple plans to scan your images for child porn
-
@jasgot said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
No I wasn't. I said that the scanning was done on the phone, but that Apple can't access this data. The data is private until it is uploaded to the cloud.
So how do they know if the content of a photo is suspect? Unless you mean a "human" can't access the data?
You are correct. They even SAID, in the text, that a human would verify it.
But that's the great thing about the hash... they don't need to pull it off of your phone, because they already know what it is. They can "look at it remotely" without getting access to your phone only because they already had access to your phone to compute it in the first place!
-
@carnival-boy said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
And the Apple link clearly says it isn't doing what the article initially says will happen.
Actually Apple's link completely confirms the concerns. Not sure what you read, but you missed Apple's statements.
"Instead of scanning images in the cloud, the system performs on-device matching using a database"
You were claiming that they only scanned the cloud, but Apple says exactly the opposite, twice.
No I wasn't. I said that the scanning was done on the phone, but that Apple can't access this data. The data is private until it is uploaded to the cloud.
That's not what Apple has said. The scanning is done on the phone, and it can alert people. They may claim that they won't use that capability, but once served with a warrant, Apple's statements about what they will or won't do mean nothing. They don't need to have any process for being willing to do this in order to do it. Anything that they CAN do, they've agreed TO DO when they operate in the US. Period. No grey area.
That's why all security has to be "non-vendor reversible" because only the lack of ability to access your data protects it from the US government.
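The "non-vendor reversible" idea above means the key lives only on the user's device, so the vendor stores ciphertext it cannot decrypt even under a warrant. A toy sketch of that design, assuming a passphrase-derived key (the XOR keystream here is a demo construction, not a real cipher; a production system would use a vetted AEAD):

```python
# Toy illustration of "non-vendor reversible" storage: the key is derived
# on-device from the user's passphrase, so the vendor only ever holds
# ciphertext. DEMO ONLY -- the keystream below is not a real cipher.
import hashlib

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2 from the standard library; the vendor never sees the passphrase
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy keystream: SHA-256 in counter mode, XORed with the data.
    # Applying it twice with the same key recovers the plaintext.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

key = derive_key("correct horse battery staple", b"per-user-salt")
ciphertext = xor_stream(key, b"private photo bytes")  # what the vendor stores
plaintext = xor_stream(key, ciphertext)               # only the device can do this
print(plaintext)  # → b'private photo bytes'
```

Since the server side only ever receives `ciphertext`, there is nothing meaningful for it to hand over; that lack of ability is the protection being described.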
-
@jasgot said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
Scanned, but the info is private. Apple explicitly don't have access to the data until it is uploaded to iCloud. That's my point.
So they say.
The problem is, the scanning tool CAN upload it. The phone has that power. So since the government can force them to do something that they CAN do, what they say doesn't matter.
That's the thing about the US legal system. Anything you "say" you won't do is nullified if you have the ability to do said thing. Making yourself able to do something is the same as choosing to potentially do it in this context.
That's the problem. Apple has chosen to enable governments to censor journalists and seek out political dissidents and isn't even making the slightest attempt to hide it.
-
@jasgot said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
I bet 100% of parents have pictures of their naked children.
Definitely not. Including your child's genitals in a photo is a conscious decision you don't need to make.
We have photos of our children playing in the bathtub, for example, but we also made a conscious effort to not include their genitals in the photo. There's no reason to include that in the photo regardless of intentions.
Genitalia are not a requirement to classify a picture as child pornography.
New Jersey statute N.J. Stat. Ann. § 2C:24-4(b)(1) says:
"to otherwise depict a child for the purpose of sexual stimulation or gratification of any person who may view the depiction where the depiction does not have serious literary, artistic, political, or scientific value."
Subsection (c) defines child pornography, not in terms of whether it depicts some sort of sexual activity, but rather in terms of the reaction the image is meant to evoke. The statute is thus remarkably broad, and the prohibition is based entirely on subjective (rather than objective) criteria.
An example may help: source
Imagine an individual who is sexually attracted to children, and who finds photographs of children bundled up in winter coats to be sexually stimulating. If that individual takes a picture of a child walking down the street who is wearing a winter coat, then the New Jersey statute would classify that picture as child pornography. All that matters is whether the image "depict[s] a child," whether the individual who created the image had "the purpose of sexual stimulation," and whether the resulting image had no "serious literary, artistic, political, or scientific value." All three of those factors are met by the example.
And that's just the US. Every country has equally bizarre rules. Australia is a great example: in Australia they classify girls as children based on breast size, not age.
So technically a picture of a skinny sixty-year-old woman with A cups is child pornography there. What is just totally boring grandma porn in most countries is super illegal in Australia (and it's massively sexist that they actually have laws saying women can't be fully considered adults unless they are curvy enough to meet the government's porn standards!)
Imagine the problems that will arise as every country's totally bizarre definitions, which have "nothing to do with what we claim to be searching for," start to become enforceable. And then add to it people traveling between these insane jurisdictions.
There's no possibility of this system functioning for what they claim to intend it for. But there are so many terrible things it could be used for.
-
Good timing on this, as I was about to buy my first iPad in a really long time when I go to the US in a few weeks. F that.
-
Something that those outside of the US need to remember...
- The US government can and does force companies to turn over data of all sorts all the time.
- It is standard to issue gag orders so that neither the people being spied on nor the media can ever be alerted. These generally also forbid the courts from being alerted, so even if something is massively criminal, the company has no way to get a court to intervene. There is no law protecting the company or those being spied on in these cases.
So by enabling the technical ability to do this, it could be happening broadly on day one, and other than someone risking their life to report on a corrupt government, there's no means to find out how the technology is being used.
The Patriot Act is a standard example of how these "gag orders" are used. In the early 2000s it was used to force bookstores to report people buying certain books, and it made it illegal for the bookstores to seek legal protection, tell anyone, or alert those whose private data was being taken.
-
But the software doesn't have the ability. We're going round in circles.
-
@carnival-boy said in Apple plans to scan your images for child porn:
But the software doesn't have the ability.
Can you finish this sentence? Not sure what you are saying.
-
EFF’s take on this. Instead of click bait shit.
-
@carnival-boy said in Apple plans to scan your images for child porn:
But the software doesn't have the ability. We're going round in circles.
But it does, Apple themselves said that it does. It scans the whole device looking for whatever third-party non-profits (and the government) tell it to search for. They could not possibly be more up front and clear about that. They aren't hiding this. Your claims don't seem to be that Apple won't do something bad, but that Apple is lying to make itself look bad. Why are you taking a stance that Apple is a good company, but lying? It's a very weird position to take without any reason to do so.
-
The on device scan to compare hashes is honestly not even a concern to me. Inside the OS, nothing is encrypted anyway.
The concern to me is the back door to messaging. They are 100% breaking end to end encryption.
Only doing it for accounts "flagged as minors" is ducking pointless. There is no way in hell for it to be secure once the encryption is broken.
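The on-device hash comparison mentioned above can be sketched with a toy "average hash" (Apple's actual system uses NeuralHash, a perceptual neural-network hash; the function names and the 8x8 grid here are illustrative assumptions, not Apple's implementation):

```python
# Toy sketch of on-device hash matching: compute a 64-bit "average hash"
# from an 8x8 grayscale grid and check it against a set of known hashes.
# NOT Apple's NeuralHash -- just the same shape of idea.

def average_hash(pixels):
    """pixels: 8x8 list of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # Each bit records whether the pixel is above the image's mean
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_database(pixels, known_hashes, threshold=4):
    """Flag the image if its hash is within `threshold` bits of any known hash."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# Example with a made-up image and a database containing its own hash
img = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
db = {average_hash(img)}
print(matches_database(img, db))  # → True (exact hash match)
```

The point being argued in the thread follows directly from this shape: the matching happens on the device, so whoever supplies `known_hashes` controls what the device searches for.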
-
@jaredbusch said in Apple plans to scan your images for child porn:
The concern to me is the back door to messaging. They are 100% breaking end to end encryption.
At least Messaging can be disabled or ignored. Not excusing it, that's still a problem that there is anything on there spying on you (and your kids) but at least it is essentially optional.
-
@jaredbusch said in Apple plans to scan your images for child porn:
Inside the OS, nothing is encrypted anyway.
Sure, but it was at least private before.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@jaredbusch said in Apple plans to scan your images for child porn:
Inside the OS, nothing is encrypted anyway.
Sure, but it was at least private before.
Just being specific, because privacy on the device and encryption are different things.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
But the software doesn't have the ability. We're going round in circles.
But it does, Apple themselves said that it does. It scans the whole device looking for whatever third-party non-profits (and the government) tell it to search for. They could not possibly be more up front and clear about that. They aren't hiding this. Your claims don't seem to be that Apple won't do something bad, but that Apple is lying to make itself look bad. Why are you taking a stance that Apple is a good company, but lying? It's a very weird position to take without any reason to do so.
You are saying that the government could force Apple to provide them with data held on my phone. Apple can't do this, they don't have access to the data that this software gets and holds privately on my phone. The scan results are private (until uploaded to iCloud). Apple simply don't have the means to access the scan results.
That's my last post on this, I can't discuss with someone who just calls me weird.
-
@carnival-boy said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
But the software doesn't have the ability. We're going round in circles.
But it does, Apple themselves said that it does. It scans the whole device looking for whatever third-party non-profits (and the government) tell it to search for. They could not possibly be more up front and clear about that. They aren't hiding this. Your claims don't seem to be that Apple won't do something bad, but that Apple is lying to make itself look bad. Why are you taking a stance that Apple is a good company, but lying? It's a very weird position to take without any reason to do so.
You are saying that the government could force Apple to provide them with data held on my phone. Apple can't do this, they don't have access to the data that this software gets and holds privately on my phone. The scan results are private (until uploaded to iCloud). Apple simply don't have the means to access the scan results.
That's my last post on this, I can't discuss with someone who just calls me weird.
How do you know they don't? Because they said so? Lol. Apple explicitly stated that this software will upload results to iCloud, so there you have it. The conditions that trigger the upload are irrelevant at this point; the fact that it can upload anything at all is what matters. Scott above explained perfectly that a single warrant will force them to fork over any data.
-
@carnival-boy said in Apple plans to scan your images for child porn:
You are saying that the government could force Apple to provide them with data held on my phone.
If you are in the US, yes. That is the law. If Apple makes it possible to scan the phone, the government has the right to obtain data as to what is on the phone. The ability to do it under the law makes it accessible to the government.
-
@carnival-boy said in Apple plans to scan your images for child porn:
The scan results are private (until uploaded to iCloud). Apple simply don't have the means to access the scan results.
Can you find any source for this? Why are you the only one aware of this totally unknown, unannounced, and very dubious claim? This sounds completely made up. Nothing from any source suggests any such limitation and Apple certainly has not claimed to have created a limit like this.
There are two things that sound made up here:
- That Apple does not use the data until connected to iCloud (but don't all phones always connect to iCloud anyway, so isn't this a moot point?)
- That they have created a means by which the software cannot be forced to upload otherwise.
Because both points must be true for your claim to be possible. And from what Apple has said, it appears that both points are totally from your imagination.
-
@marcinozga said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
But the software doesn't have the ability. We're going round in circles.
But it does, Apple themselves said that it does. It scans the whole device looking for whatever third-party non-profits (and the government) tell it to search for. They could not possibly be more up front and clear about that. They aren't hiding this. Your claims don't seem to be that Apple won't do something bad, but that Apple is lying to make itself look bad. Why are you taking a stance that Apple is a good company, but lying? It's a very weird position to take without any reason to do so.
You are saying that the government could force Apple to provide them with data held on my phone. Apple can't do this, they don't have access to the data that this software gets and holds privately on my phone. The scan results are private (until uploaded to iCloud). Apple simply don't have the means to access the scan results.
That's my last post on this, I can't discuss with someone who just calls me weird.
How do you know they don't? Because they said so? Lol. Apple explicitly stated that this software will upload results to iCloud, so there you have it. The conditions that trigger the upload are irrelevant at this point; the fact that it can upload anything at all is what matters. Scott above explained perfectly that a single warrant will force them to fork over any data.
Right, and you said it there... THIS software initiates the upload to iCloud! That's like saying a mugger will never shoot you until they've pulled the trigger. Um, sure, you are just saying the same thing twice. If this software initiates the upload to iCloud, then of course they "never get your data until it connects", it's that very connection that we are discussing! Carnival might as well have said "but it never steals your data until it steals it!"
Um.. duh.