Apple plans to scan your images for child porn
-
@popester said in Apple plans to scan your images for child porn:
@scottalanmiller yes it is!!! I hope it is not used as a tool to incarcerate enemies of whoever is in power. How hard would it be to plant a porn mine?
Exactly. It takes no effort to go from "scanning your private devices" to "controlling your private devices."
-
Then don't use iCloud. iCloud has been doing this for years and no-one has complained?
-
@carnival-boy said in Apple plans to scan your images for child porn:
If I understand it correctly, it's still only notifying anyone when it is uploaded to iCloud. It's really just moving the processing of the images from iCloud servers to local devices, but the end result is the same. I'm not sure this is an issue.
Only NOTIFYING, maybe. We just have to trust them on that, if that's even what they are saying. The concern is that they are putting something on your device that scans your data (any data, it has to scan everything to look for one thing) and then sometimes reports what it finds to the government.
In a make believe world where both companies and governments can be trusted to be lawful and trusted to act in the public interest, we'd still have the problem that we have to trust both to be insanely secure on top of that.
There's no way that this works as advertised. It sounds like a tool for authoritarian regimes that lets them plant material on people, not like something designed to find actual porn.
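The mechanism being debated here, whether it runs on Apple's servers or on the device, boils down to hashing each image and comparing it against a database of known material, reporting only on a match. A rough sketch (the function and set names are hypothetical, and Apple's real system uses a perceptual "NeuralHash" rather than an exact digest, so visually similar images also match):

```python
import hashlib

# Hypothetical blocklist of known-bad image digests shipped to the device.
# An exact SHA-256 digest keeps this sketch simple; a perceptual hash would
# also catch resized or re-encoded copies of the same image.
KNOWN_BAD_HASHES = {
    "0" * 64,  # placeholder digest, not a real entry
}

def flagged_on_upload(photo_bytes: bytes) -> bool:
    """On-device check: True means the photo would be reported at upload time."""
    return hashlib.sha256(photo_bytes).hexdigest() in KNOWN_BAD_HASHES
```

Moving this check from the server to the phone leaves the logic identical; the dispute in this thread is about where it runs and who controls the blocklist.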
-
@carnival-boy said in Apple plans to scan your images for child porn:
Then don't use iCloud. iCloud has been doing this for years and no-one has complained?
That doesn't change anything at all though. None of the concerns are with iCloud.
-
@marcinozga said in Apple plans to scan your images for child porn:
What if we elect someone like trump again and he/she decides to go after whistleblowers, journalists, immigrants, minorities, LGBTQ, muslims, jews, etc?
Where "what if" is "we will" in this case. Just a matter of time.
-
@marcinozga said in Apple plans to scan your images for child porn:
What Apple is doing is a job for law enforcement, not a tech company.
The problem is, in America, the line between public utilities, government agencies, and private companies doesn't exist. Healthcare, telecommunications, and power distribution, utilities that are necessarily monopolies and part of the public space, are all run by private companies given government-level power over people. So it's no surprise that private companies with no oversight or ethical requirements are being handed the power of the FBI.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
We just have to trust them on that, if that's even what they are saying. The concern is that they are putting something on your device that scans your data (any data, it has to scan everything to look for one thing) and then sometimes reports what it finds to the government.
So the concern is about Apple doing something that they have explicitly said they won't do, rather than any concern about what they are actually saying they are doing. You could go tinfoil-hat over every tech company in that case.
-
Fundamentally, I don't understand why you think there is such a big difference between scanning a photo in the cloud, and scanning one on a device - in both cases the photo is getting scanned.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
We just have to trust them on that, if that's even what they are saying.
HAHAHAHAHAHAHHAHAHHAHAHA!!!!! nope......
I don't care who you are, don't trust ya. Keep yer booger hooks away from my info.
-
@carnival-boy said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
We just have to trust them on that, if that's even what they are saying. The concern is that they are putting something on your device that scans your data (any data, it has to scan everything to look for one thing) and then sometimes reports what it finds to the government.
So the concern is about Apple doing something that they have explicitly said they won't do, rather than any concern about what they are actually saying they are doing. You could go tinfoil-hat over every tech company in that case.
The concern is that Apple is bringing 1984 into reality. This isn't happening only on their servers (iCloud, Google Photos, Facebook, etc.) anymore; your own devices are being used as surveillance tools against you.
If you don't see an issue with a non-zero chance of being locked up because some machine-learning model that can't tell the difference between a moon and a traffic light decided your pictures are child porn, then you're lost.
Governments wanted backdoors in encryption, and now they have one. From a company that championed end-to-end encryption and refused to create such backdoors in the past.
-
@carnival-boy said in Apple plans to scan your images for child porn:
Fundamentally, I don't understand why you think there is such a big difference between scanning a photo in the cloud, and scanning one on a device - in both cases the photo is getting scanned.
The difference is you don't have to upload the photos to the cloud. You could also encrypt them before uploading. Now your photos will always be scanned on YOUR device.
On top of that, are you ok with their employees looking at pictures of your children in case there's a match with the database?
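The "encrypt them before uploading" option mentioned above is worth spelling out, because it shows why server-side scanning can be opted out of while on-device scanning cannot. A minimal sketch (a hypothetical example using the third-party `cryptography` package, not any specific app's code):

```python
import hashlib
from cryptography.fernet import Fernet  # third-party: pip install cryptography

def encrypt_for_upload(data: bytes, key: bytes) -> bytes:
    """Encrypt photo bytes locally before handing them to any cloud service."""
    return Fernet(key).encrypt(data)

key = Fernet.generate_key()           # stays on the user's device
photo = b"...raw image bytes..."      # placeholder for a real photo
blob = encrypt_for_upload(photo, key)

# The server only ever sees ciphertext, whose hash shares nothing with the
# plaintext's hash, so server-side hash matching cannot flag the photo.
assert hashlib.sha256(blob).digest() != hashlib.sha256(photo).digest()
```

On-device scanning runs before this step, against the plaintext, which is exactly the difference being argued about in this thread.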
-
Ok, here's a real life example that caused me to lose quite a few nights of sleep.
A couple of years ago my son had an ear infection. We were in the doctor's office, and my wife took a picture of him while he was undressed to send to her parents. I was chatting on Discord with a friend who had just had her own baby. I tried to send that picture to her, but Discord would constantly delete the upload, and not because the picture was too big. My guess was that my son's nipple was visible. I was really scared and was ready for the feds to knock on my door. I spent hours with Discord tech support and kept screenshots of the chat with my friend just in case I was flagged for sending child porn.
Now with Apple scanning all images, it's easy to imagine false positives happening quite frequently, and some people will have their lives ruined.
-
@marcinozga said in Apple plans to scan your images for child porn:
The difference is you don't have to upload the photos to the cloud. You could also encrypt them before uploading. Now your photos will always be scanned on YOUR device.
Scanned, but the info is private. Apple explicitly don't have access to the data until it is uploaded to iCloud. That's my point.
-
@carnival-boy said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
The difference is you don't have to upload the photos to the cloud. You could also encrypt them before uploading. Now your photos will always be scanned on YOUR device.
Scanned, but the info is private. Apple explicitly don't have access to the data until it is uploaded to iCloud. That's my point.
Now they do. Do you really think they will only scan images that are being uploaded to iCloud? How many pedos are dumb enough to upload pictures to a 3rd-party cloud service? They will scan pictures on the phone; otherwise this entire surveillance framework is useless.
Actually, this is all useless already. Everyone knows about it, so real pedos will simply not use Apple devices.
-
@carnival-boy said in Apple plans to scan your images for child porn:
Fundamentally, I don't understand why you think there is such a big difference between scanning a photo in the cloud, and scanning one on a device - in both cases the photo is getting scanned.
Because there is no expectation of privacy concerning your data on their servers, unless they told you there was. Even so, they can change the rules about their storage facilities as they wish.
The issue is that they are looking at your data stored on your device. This is where the line gets crossed.
-
@carnival-boy said in Apple plans to scan your images for child porn:
Scanned, but the info is private. Apple explicitly don't have access to the data until it is uploaded to iCloud.
It is no accident that totalitarian systems in which there is no freedom whatsoever also tolerate no privacy.
In recent history, within my lifetime, privacy has eroded. I don't care what side of the aisle you are on: if someone says your information is private, they are lying to you. It doesn't matter who is leading the ship; you have to protect your privacy like you protect your own life.
It is a slippery slope to "Find me the person and I will find a crime that suits your needs".
They are all lying, until such time as their actions prove otherwise.
-
@jasgot said in Apple plans to scan your images for child porn:
Because there is no expectation of privacy concerning your data on their servers, unless they told you there was. Even so, they can change the rules about their storage facilities as they wish.
It is unfortunate but true. Caveat emptor.
-
@jasgot said in Apple plans to scan your images for child porn:
The issue is that they are looking at your data stored on your device. This is where the line gets crossed.
No they're not, unless I've misunderstood. Apple are not looking at my phone. The data is private until I upload it to iCloud. My phone is analysing the data, but my phone and Apple are not the same thing.
-
@carnival-boy said in Apple plans to scan your images for child porn:
@jasgot said in Apple plans to scan your images for child porn:
The issue is that they are looking at your data stored on your device. This is where the line gets crossed.
No they're not, unless I've misunderstood. Apple are not looking at my phone. The data is private until I upload it to iCloud. My phone is analysing the data, but my phone and Apple are not the same thing.
Not your phone, a piece of software written by Apple. And said software will upload results to Apple without your consent. And it just goes downhill from there.
-
@marcinozga said in Apple plans to scan your images for child porn:
And said software will upload results to Apple without your consent. And it just goes downhill from there.
I don't think it will.