Apple plans to scan your images for child porn
-
@carnival-boy said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
We just have to trust them on that, if that's even what they are saying. The concern is that they are putting something on your device that scans your data (any data, it has to scan everything to look for one thing) and then sometimes reports what it finds to the government.
So the concern is about Apple doing something that they have explicitly said they won't do rather than any concerns about what they are actually saying they are doing. You could get tinhat with every tech company in that case.
The concern is that Apple is bringing 1984 into reality. This isn't happening on their servers (iCloud, Google Photos, Facebook, etc.) anymore; your own devices are being used as surveillance tools against you.
If you don't see an issue with the non-zero chance of being locked up because some machine learning that can't tell the difference between the moon and a traffic light decided your pictures are child porn, then you're lost.
The government wanted backdoors in encryption; now they have one. From a company that championed end-to-end encryption and refused to create such backdoors in the past.
-
@carnival-boy said in Apple plans to scan your images for child porn:
Fundamentally, I don't understand why you think there is such a big difference between scanning a photo in the cloud, and scanning one on a device - in both cases the photo is getting scanned.
The difference is you don't have to upload the photos to the cloud. You could also encrypt them before uploading. Now your photos will always be scanned on YOUR device.
On top of that, are you ok with their employees looking at pictures of your children in case there's a match with the database?
-
Ok, here's a real-life example that caused me to lose quite a few nights of sleep.
A couple of years ago my son had an ear infection. We were in the doctor's office, and my wife took a picture of him while he was undressed to send to her parents. I was chatting on Discord with a friend who had just had her own baby. I tried to send that picture to her, but Discord would repeatedly delete the upload, and not because the picture was too big. My guess was that it was because my son's nipple was visible. I was really scared and was ready for the feds to knock on my door. I spent hours with Discord tech support and kept screenshots of the chat with my friend just in case I was flagged for sending child porn.
Now with Apple scanning all images, it's easy to imagine false positives happening quite frequently, and some people will have their lives ruined.
-
@marcinozga said in Apple plans to scan your images for child porn:
The difference is you don't have to upload the photos to the cloud. You could also encrypt them before uploading. Now your photos will always be scanned on YOUR device.
Scanned, but the info is private. Apple explicitly don't have access to the data until it is uploaded to iCloud. That's my point.
-
@carnival-boy said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
The difference is you don't have to upload the photos to the cloud. You could also encrypt them before uploading. Now your photos will always be scanned on YOUR device.
Scanned, but the info is private. Apple explicitly don't have access to the data until it is uploaded to iCloud. That's my point.
Now they do. Do you really think they will only scan images that are being uploaded to iCloud? How many pedos are dumb enough to upload pictures to a 3rd-party cloud service? They will scan pictures on the phone; otherwise this entire surveillance framework is useless.
Actually, this is all useless already. Everyone knows about it, so real pedos will simply not use Apple devices.
-
@carnival-boy said in Apple plans to scan your images for child porn:
Fundamentally, I don't understand why you think there is such a big difference between scanning a photo in the cloud, and scanning one on a device - in both cases the photo is getting scanned.
Because there is no expectation of privacy concerning your data on their servers, unless they told you there was. Even so, they can change the rules about their storage facilities as they wish.
The issue is that they are looking at your data stored on your device. This is where the line gets crossed.
-
@carnival-boy said in Apple plans to scan your images for child porn:
Scanned, but the info is private. Apple explicitly don't have access to the data until it is uploaded to iCloud.
It is no accident that totalitarian systems in which there is no freedom whatsoever also tolerate no privacy.
In recent history, within my lifetime, privacy has eroded. I don't care what side of the aisle you are on: if someone says your information is private, they are lying to you. It doesn't matter who is leading the ship; you have to protect your privacy like you protect your own life.
It is a slippery slope to "Find me the person and I will find a crime that suits your needs".
They are all lying, until such time as their actions prove otherwise.
-
@jasgot said in Apple plans to scan your images for child porn:
Because there is no expectation of privacy concerning your data on their servers, unless they told you there was. Even so, they can change the rules about their storage facilities as they wish.
It is unfortunate but true. Caveat emptor.
-
@jasgot said in Apple plans to scan your images for child porn:
The issue is that they are looking at your data stored on your device. This is where the line gets crossed.
No they're not, unless I've misunderstood. Apple are not looking at my phone. The data is private until I upload it to iCloud. My phone is analysing the data, but my phone and Apple are not the same thing.
-
@carnival-boy said in Apple plans to scan your images for child porn:
@jasgot said in Apple plans to scan your images for child porn:
The issue is that they are looking at your data stored on your device. This is where the line gets crossed.
No they're not, unless I've misunderstood. Apple are not looking at my phone. The data is private until I upload it to iCloud. My phone is analysing the data, but my phone and Apple are not the same thing.
Not your phone, a piece of software written by Apple. And said software will upload results to Apple without your consent. And it just goes downhill from there.
-
@marcinozga said in Apple plans to scan your images for child porn:
And said software will upload results to Apple without your consent. And it just goes downhill from there.
I don't think it will.
-
@marcinozga said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
We just have to trust them on that, if that's even what they are saying. The concern is that they are putting something on your device that scans your data (any data, it has to scan everything to look for one thing) and then sometimes reports what it finds to the government.
So the concern is about Apple doing something that they have explicitly said they won't do rather than any concerns about what they are actually saying they are doing. You could get tinhat with every tech company in that case.
The concern is that Apple is bringing 1984 into reality. This isn't happening on their servers (iCloud, Google Photos, Facebook, etc.) anymore; your own devices are being used as surveillance tools against you.
If you don't see an issue with the non-zero chance of being locked up because some machine learning that can't tell the difference between the moon and a traffic light decided your pictures are child porn, then you're lost.
The government wanted backdoors in encryption; now they have one. From a company that championed end-to-end encryption and refused to create such backdoors in the past.
Talk about a tinfoil hat!
This isn't anything new. They are simply TELLING you about it now. Hell, since cellphones came out they have been tracking us anywhere and everywhere.
Is it bad - oh hell yeah it's bad - is there a damned thing we can do about it? Nope, not if you want to live in the modern world.
Scott seems to be indicating he might be immune to this stuff because he's not in the USA - and of course, that's not true, not strictly speaking anyway. The gov't there would LOVE to do this as much as the US gov't does; they simply can't afford it.
But again, the original post is about Apple scanning to keep child porn off their platform/devices (I'm guessing they are now claiming you don't own your iPhone, you only rent it); otherwise I'm not sure of the legality of what they are doing...
-
@carnival-boy said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
And said software will upload results to Apple without your consent. And it just goes downhill from there.
I don't think it will.
Believe me - they'll get your consent on the next software update on page 3645 of the new EULA that almost no one reads.
-
@carnival-boy said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
And said software will upload results to Apple without your consent. And it just goes downhill from there.
I don't think it will.
Did you bother to read the article? If a match is found, it will be uploaded to Apple for manual verification. This alone should creep you out: some random person looking at your pictures, deciding whether they're child porn or not. And then they're passed on to some non-profit set up by the government. Zero transparency, no way to audit the whole process. What could possibly go wrong there?
-
I don't think so. I don't think Apple are too bad with privacy since their business model is still based on selling hardware rather than selling data. I trust them more than others.
If you don't use iCloud I don't think you have anything to be concerned about. They cannot access your phone.
-
@carnival-boy said in Apple plans to scan your images for child porn:
I don't think so. I don't think Apple are too bad with privacy since their business model is still based on selling hardware rather than selling data. I trust them more than others.
If you don't use iCloud I don't think you have anything to be concerned about. They cannot access your phone.
Again, read the article. You will get the software on the phone in iOS 15, and it will phone home. And unless you cut off internet access completely, there's not a thing you can do about it.
-
@marcinozga said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
And said software will upload results to Apple without your consent. And it just goes downhill from there.
I don't think it will.
Did you bother to read the article? If a match is found, it will be uploaded to Apple for manual verification. This alone should creep you out: some random person looking at your pictures, deciding whether they're child porn or not. And then they're passed on to some non-profit set up by the government. Zero transparency, no way to audit the whole process. What could possibly go wrong there?
I wonder why Apple is doing this? Why now?
Child porn is horrific - but damn... any time anyone wants to trample on your rights, it's the first thing they trot out: we gotta save the children... /sigh - Is child porn really this prolific?
I say the same thing about gun violence... more people still die from car crashes every single day - why isn't the gov't mandating driverless cars? The rate won't be zero, but it would be significantly less than it is today. OK, that was a tangent.
-
@marcinozga said in Apple plans to scan your images for child porn:
Again, read the article. You will get the software on the phone in iOS 15, and it will phone home. And unless you cut off internet access completely, there's not a thing you can do about it.
Which bits? I don't find the article that clear. It initially suggests the data is uploaded automatically from the phone, but it ends with these statements, which clearly say that only photos uploaded to iCloud are affected:
Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system
According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a “safety voucher” saying whether it is suspect or not. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.
The article is also based on speculation and they haven't got Apple to comment.
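To make the mechanism the article describes concrete, here is a minimal sketch of threshold-based matching. Everything in it is an assumption for illustration: the hash function, the Hamming-distance cutoff, and the threshold value are all invented, since the actual NeuralMatch algorithm and parameters have not been published. The idea is only that each photo gets a per-photo "suspect" verdict (the "safety voucher"), and nothing is escalated for review until the count of suspect photos crosses a threshold.

```python
# Hypothetical sketch of "safety voucher" threshold matching, as described
# in the article. The hash scheme, distance cutoff, and threshold are
# invented for illustration; Apple's real system is not public.

def hamming(a: int, b: int) -> int:
    """Count differing bits between two (toy) 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def voucher(photo_hash: int, database: set[int], max_distance: int = 4) -> bool:
    """Per-photo verdict: 'suspect' if the hash is near any known entry.

    Near-match (small Hamming distance) rather than exact equality is what
    makes false positives possible.
    """
    return any(hamming(photo_hash, known) <= max_distance for known in database)

def review_needed(photo_hashes: list[int], database: set[int],
                  threshold: int = 10) -> bool:
    """Account-level decision: escalate only once enough photos are suspect."""
    suspects = sum(voucher(h, database) for h in photo_hashes)
    return suspects >= threshold
```

Note that in this sketch a single near-match does nothing on its own; only the aggregate count triggers review. Whether that materially reduces the false-positive risk discussed earlier in the thread depends entirely on how accurate the per-photo matching is, which is exactly the part nobody outside Apple can audit.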
-
@dashrender said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
And said software will upload results to Apple without your consent. And it just goes downhill from there.
I don't think it will.
Did you bother to read the article? If a match is found, it will be uploaded to Apple for manual verification. This alone should creep you out: some random person looking at your pictures, deciding whether they're child porn or not. And then they're passed on to some non-profit set up by the government. Zero transparency, no way to audit the whole process. What could possibly go wrong there?
I wonder why Apple is doing this? Why now?
Child porn is horrific - but damn... any time anyone wants to trample on your rights, it's the first thing they trot out: we gotta save the children... /sigh - Is child porn really this prolific?
I say the same thing about gun violence... more people still die from car crashes every single day - why isn't the gov't mandating driverless cars? The rate won't be zero, but it would be significantly less than it is today. OK, that was a tangent.
It always starts with children. And it's really not about that, because now that it's out, any pedophile with two brain cells will simply stop using Apple devices.
-
And the Apple link clearly says it isn't doing what the article initially says will happen.