How would Apple roll this back?
Seems unlikely, but stepping into "what if", what are people's thoughts on what it would look like for Apple? Does it mean Cook resigning, or firing a team? How do we get from here to there? Is CSAM on iPhone the next AirPower, where it just gets cancelled?
I keep hearing the same argument from people, and I'm mostly in agreement, but to summarize the argument:
>"I am not afraid of personally being found with child pornography, but I don't want Apple searching my files looking for criminal intent"
But I would argue that you SHOULD be afraid of personally being found with child pornography.
First and foremost, this feature could be exploited for harm. If anyone ever wants to hurt you, all they have to do is get access to your unlocked Apple device long enough to drop CSAM into a backed-up folder, and you get reported to the feds.
Or, way more realistically, maybe you downloaded some amateur porno that you didn't know had underage participants in it, and you saved it where it got backed up to the cloud, and two years later the government has added the video to their known CSAM database and you get reported to the feds for having it.
Given how the mere accusation of child sex crimes is enough to destroy careers, marriages, and lives, I see absolutely no reason anyone should trust their Apple devices with anything should this become normal practice.
And if this becomes normal practice, it follows that CSAM won't be the only thing they end up looking for.
Why won't you buy the new iPhone Billy? Are you a pedophile? Guilty until proven innocent.
There are two things that make me think this will be walked back.
Firstly, and most importantly, this kind of backdoor is the kind of thing that makes big corps prohibit the use/purchase of devices.
Secondly, it seems ripe for abuse: don't like someone who uses an iOS device? Message them some CP and destroy their entire life.
I don't get it, I thought my data was safely in the clouds...
You're not telling me they can access the clouds right? /s
I'm still baffled by this whole issue.
Snooping around on users' devices goes against everything regarding privacy they seemed to stand for. It makes me sick to think this might even be legal. It is also sad that they completely ignore any valid concerns about this intrusion.
What's the next step if we as a society accept this behavior? HomePod listening in until hashes of conversation snippets match some prehashed terms and sounds potentially related to child abuse, and then alerting the authorities?
This was such a weird way for them to announce that they’re no longer pursuing privacy as a differentiation strategy.
Out of principle, I've wiped my iPhone and moved my primary SIM card to a Pixel 5 running https://calyxos.org
This CSAM scanning will do effectively zero to stop children from being abused. It will only serve to breach the client side of E2E across the industry. Apple's privacy credibility is shot. We need to push hard for true open source mobile.
Today's mobile duopoly (including the look don't touch that is Android), is the digital yoke of our times.
Support the Calyx Institute: https://calyxinstitute.org/
A bit of stretched speculation: maybe Apple is just ahead of the curve, getting ready (and thus naturally bringing it closer - dialectics, though) for a future where "common carrier" protections get eroded even more. We already have FOSTA laws against websites, and it isn't a big stretch to imagine that Apple-style scanning would become "reasonable care" under FOSTA-style laws.
"The ark of the covenant is open and now all your faces are going to melt" - to paraphrase "The Thick of It".
Prior to this, when a government tapped Apple on the shoulder asking to scan for what they (the government) deem illicit material, Apple could have feasibly said "we cannot do this, the technology does not exist".
Now the box is open, the technology does exist, publicly and on the record - Now we are but a national security letter, or a court order with an attached super-injunction, away from our computers and phones ratting us out for thought crime, government critique and embarrassing the establishment. At first it's the photos, but what about messages and browsing history or the legion other things that people use their devices for, when do they follow?
CSAM is beyond abhorrent, morally reprehensible and absolutely should be stopped. This is not in question. And I have no reason to doubt that there is sincerity in their goal to reduce and eliminate it from Apple's perspective.
But they have root; this system and its implementation are a fundamental betrayal of the implicit trust that their users have placed in them (however foolish it was, in hindsight). It just is not possible to regain that trust now that it is gone, when the level of control is so one-sided and the systems involved are so opaque.
CSAM scanning, like anything else premised on "protecting the children", is not mainly about actually protecting the children.
I feel like this is a good thing given the circumstances we're already in. It might catch some predators while still keeping everyone's photos private. Sure, it's ripe for abuse, but when has it not been that way? Haven't we always had to trust Apple to do the right thing with our iDevices?
This could marginally reduce Apple's leverage against evil governments seeking to pry into iPhones since "if you can do it for CP, you can do it for tank man". But with Apple already being the dictator of all OS-level code running on your device, this capability isn't anything fundamentally new. Do we have any guarantee that certain devices in China haven't received a "special" version of some software update that introduces a backdoor?
Personally, I have an Android and hope that my next phone will run some flavor of Linux.
Why on earth is Apple, a good tech company with a culture of secrecy, using Slack? Can it not build its own secure version of Slack?
I think Apple actually thought they were doing the right thing. Maybe they are, and we're overreacting. Things have to calm down a bit first before that will be clear. But either way, it was [yet another] PR miscalculation. A very Apple thing to do...
I sure am happy I don't work at their retail stores any more; I shudder to imagine the abuse the folks in the stores are getting over this.
Stay tuned! On the next episode:
"Apple bravely scans your text messages. Users greeted by 'Messages noticed this website contains misleading content, are you sure you want to share it?' in iOS 16." -MacRumors (2022)
0. I was initially ambivalent or even supportive of Apple on this, because much of the initial criticism was uninformed and easily refuted, or hysterical. But I'm coming around to this being a momentous blunder.
1. I understand Apple doesn't like to discuss its plans publicly, but rather present the final product when it is ready to ship. But if anything was a good candidate for one of the rare exceptions, it was this, no?
2. Why announce unambiguously that this feature is "available" in the US only at first, in other words, can be enabled/disabled/configured per region? What is the plan when country X comes and says "fine feature you've built there, here's our list of perceptual hashes of illegal imagery, and the contact data of our competent authority to notify in case of matches."? Replying "ah, sorry, we can't" is not going to fly anymore, ever (!).
Even if they walk this back, what defence will they have left in the future if any government requires them to implement this odious feature and scan all images on-device and notify some authority if anything untoward (in that jurisdiction) is found?
Question for Android users: do you have Google Photos backup enabled?
I’d say most people have iCloud Photos enabled, so I’m trying to gauge whether that’s true of Google Photos too.
Google Photos also does CSAM scanning, I believe.
Now I know what the "i" in iPhone stands for: intelligence.
Message to Apple should they care:
The next Apple iPhone upgrade is on hold until all plans for on-device CSAM scanning are cancelled.
I have a story to share:
When my daughter was three years old, we started giving her some iPad time.
She learned about Kids YouTube, and wanted to become a “Youtuber.”
Unbeknownst to us, she began making extremely long, detailed videos teaching other kids how to perform her latest accomplishment: how to use the potty.
She then pushed a series of icons which, she believed, uploaded the videos to YouTube for the world to see.
She then offhandedly informed us that she had been teaching other kids how to use the potty. Via YouTube.
Much screaming and shouting ensued. Ultimately we determined she wasn't uploading anything, and couldn't even if she knew how.
She did, however, have several hours' worth of videos showing the world how to use the potty.
She was quite upset when we deleted them.
Leaving aside debate on the “scanning photos against a database of hashes”, I hope that the parent-enabled features are useful.
> But Apple was surprised its stance then was not more popular, and the global tide since then has been toward more monitoring of private communication.
Personally speaking, Apple refusing to break into a phone in 2016 was indeed one of the reasons I believed them when they said they stand up for privacy. This is now developing a foul aftertaste: when it really counts, they don't publicly stand up against governments.
Our only option is minority screech.
Those outraged about this decision haven't yet accepted that one can't solve legal issues around encryption with technology - not even with open source.
The UK for example can throw people in prison if they don’t hand over their passwords/keys.
Admitting this, however, would by necessity lead to admitting that citizens of the US/EU have almost zero say in how their governments legislate encryption. All these countries have wanted to subvert encryption for decades already, and they have all the time in the world, unlimited budgets, and every other possible tool in their arsenal.
Apple is already scanning photos for faces and other objects. Just search in the photos app for something like dog or baby and you’ll see (assuming you’ve taken a photo of a dog or baby before).
How is this much different?
Yeah this makes sense, scan everyone's phones...
Because if there's one thing the whole Jeffrey Epstein thing taught us, is that there's nothing more important to the authorities than children, and they'll do anything to protect them, no matter the cost, even if an actual prince or a current president is involved.
Oh wait, hang on. Actually nothing happened and they don't give a shit about kids.
So why do they want this again?
I stopped my Apple Music subscription and downgraded iCloud to free, then donated a year of those fees to the EFF.
$13/month is nothing to Apple, and I'll genuinely miss it since photos in iCloud is really convenient, but it feels unethical to contribute to the Apple services system when it will inevitably be used to hurt people like dissidents and whistleblowers.
It is arrogant to blow off security and human rights experts outside of Apple as simply confused or mistaken, when there are sufficiently informed and imaginative people within Apple to understand the practical implications of preemptively investigating users. Such contempt for users is also a sign of worrisome complacency.
On the plus side, Apple has seemed to crowd out interest in third party OS user interfaces. Knowing that Apple believes you're a pedophile until they've run image analysis software on your stuff is one way to motivate people. Maybe 2021 will be the year of Linux on the desktop.
Anyway I hope that more users sacrifice the convenience of Apple services now that its hostility is apparent, and I hope that more people tell their political representatives that it is time for meaningful updates to antitrust legislation. I know that I was complacent when I thought that having a choice, where at least one of the companies didn't place ads in the start menu, was a tenable state of the market.
People will just go back to using normal cameras for photos, where there is no scanning; the bad guys will use those anyway. It makes no sense for Apple to do this, unless they are preparing for something bigger.
I’m more mad that they made a tool that will clearly be abused by totalitarian governments that need to squash protests in HK for example.
I understand they are going to make perceptual hashes of images, but what are they going to do to videos?
Did they not see Fight Club, where frames were spliced into videos? Are they going to hash each frame of videos as well? All 60 fps 4K videos? I'm sure that a lot of people saw that movie, and some of them might be pedophiles. It really looks like a half-assed solution.
It makes no sense to me to create technology that incentivizes the creation of fresh, unindexed CSAM.
Makes me wonder if the management at apple feels the current stock of CSAM content is getting stale? /s
At a technological level, this highlights the difference between perceptual hashing and image classification, which (ostensibly) can extrapolate from the training data. Regardless of how you feel about running this thing on your phone, the approach they chose has some significant perverse incentives.
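A toy sketch of the limitation this comment describes, using a simple 8x8 average hash as a stand-in for Apple's proprietary NeuralHash (the real algorithm, images, and thresholds are all placeholders here). A perceptual-hash matcher only flags near-copies of already-indexed images; a genuinely new image sails through, which is the perverse incentive named above:

```python
# Toy 8x8 average hash: visually similar images map to similar bit patterns.
# This is NOT Apple's algorithm -- just an illustration of the hash-matching
# class of system, using made-up 64-pixel grayscale "images".

def average_hash(pixels):
    # pixels: 64 grayscale values (an 8x8 thumbnail), row-major.
    avg = sum(pixels) / len(pixels)
    return tuple(p > avg for p in pixels)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def matches(h, index, threshold=4):
    # Flag only if within a few bits of some known hash.
    return any(hamming(h, known) <= threshold for known in index)

# An "indexed" image: left half dark, right half bright.
indexed = ([10] * 4 + [200] * 4) * 8
index = {average_hash(indexed)}

# A slightly edited copy (mild brightness shift) still matches...
edited = [p + 3 for p in indexed]
assert matches(average_hash(edited), index)

# ...but a brand-new, never-indexed image (top dark, bottom bright)
# does not, regardless of what it depicts.
novel = [10] * 32 + [200] * 32
assert not matches(average_hash(novel), index)
```

A classifier trained on labeled data could, in principle, flag the novel image too; a hash list by construction cannot, and that is the distinction the comment draws.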
The only reason Apple would be so cavalier with CSAM is knowing that the data is already accessible using other means. I guess that's why they have a disregard for the privacy aspect of it.
It's a terrible move for their business; I am already looking for an alternative.
CSAM is such a blatant excuse to me it blows my mind how such a huge mindshare on here believes this is actually about child protection or is a valid argument. Since this is public information do you think pedophiles will just get caught by this? Will this curb their appetite?
If somehow I found myself in an authoritarian state where porn was illegal just on apple devices do you think I wouldn't just switch to a different OS?
This is a bullshit excuse and we've seen it countless times. It is manipulation 101, get the victim to agree to a small request first then ask more and more.
I'm glad I never used apple devices and I'm not caught in their ecosystem.
A bit off topic, but I'm amazed about the huge media attention this is getting, while a far more serious precedent is being set in the EU, completely banning all private communications between individuals:
The short version is: all communication providers may include back doors and analyse messages of users with the same excuse. The follow-up is to make this mandatory. In practice, this will be a ban on any end-to-end encrypted communications.
Cancelled my Apple TV+, iCloud. In the process of selling my iPhone and Apple watch. I know it seems crazy but I feel betrayed and this is the only way I can protest this. Will I have less privacy on android? Yes.
I think this has been developed by Apple attempting to balance a complex issue, e.g. a government request asking them to report with certainty that their cloud storage is not used for illegal activity (CP in this particular case, but each country/government may end up with different requirements).
Apple thought they found a clever way to satisfy a government request of "Can you guarantee your storage is not used to house XYZ". Apple then continues to be able to advertise 'privacy' while retaining E2E encryption (or at least have E2E across all services eventually).
What they didn't foresee is the slippery-slope backlash that the IT community has become concerned about.
The scanning feature could be used in more nefarious ways (if modified in future). For example, the hash checking could be altered to hash and check each photo's metadata fields instead of the photo itself.
Now we can find who took photos in a particular location at a particular point in history while still retaining E2E!
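To make the hypothetical concrete: here's a minimal sketch of what that modification could look like. Everything here is invented for illustration (the function names, the grid precision, the coordinates) - it just shows that hashing coarse location metadata instead of pixels turns the same matching machinery into a location tracker:

```python
import hashlib

def location_bucket_hash(lat, lon, precision=2):
    # Round coordinates into a coarse grid cell, then hash the cell,
    # so the matcher never sees the raw coordinates -- only hashes.
    cell = f"{round(lat, precision)},{round(lon, precision)}"
    return hashlib.sha256(cell.encode()).hexdigest()

# The watcher pre-hashes a place of interest (made-up coordinates)...
watchlist = {location_bucket_hash(22.28, 114.16)}

# ...and any photo whose EXIF coordinates fall in the same grid cell
# matches, without the image content ever being inspected.
photo_hash = location_bucket_hash(22.2805, 114.1598)
assert photo_hash in watchlist
```

The point of the sketch is that the transport and matching layers wouldn't need to change at all; only what gets hashed would.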
Would it go that far? Or can we trust Apple to stop moving the line in the sand?
There is no way this doesn't get out of hand. This program is now the number one hacking target for every nation on earth. Currently it scans for CSAM, but can 'Free Hong Kong' images get added? What about Q or anti-Putin literature? What if the US government believes incredibly sensitive documents were photographed and lives depend on finding out who did it? Is Apple going to start reporting teens for selfies? This is a stone's throw from escalating to Minority Report. I know this seems a little far fetched, but it is very rare for governments or companies to willingly become less intrusive; it always escalates.
The problem is that, as much as this capability is disturbing, Apple chose its first target well: it's difficult to be too vocal about this without the risk of getting labelled weak on underage sex trafficking and kiddie porn.
So, no matter the criticism, Apple won't seriously be pressured to reverse course, and after that the door is open on incremental increases in content monitoring.
(Not to mention the problems that could arise from false positives.)
Has someone built a guide to self host on iOS/MacOS so the CSAM issue is moot? Or what are the other platform options, like on Android?
I’m concerned that Apple’s stance on security has been compromised and it is time to dump the platform or find an acceptable modification.
I’m surprised there’s no gist being linked on HN about how to get people bootstrapped for secure device backups as an alternative to iCloud (and without a jailbreak).
If it's on device, is it possible to reverse engineer the hashing algorithm? I'm assuming some clever people can then brute force some matching images with nonsense in them that test positive.
I have advice for Apple employees. Think Different.
You've become Microsoft. I might as well buy a Facebook phone.
What worries me most is that hackers and others with ulterior motives will engineer a situation creating a false positive, or worse. If you want to absolutely destroy someone, accuse them of possessing pornographic material of children. Even if it turns out to be absolutely false, the stain is still there; the bell cannot be unrung. Damage to reputations, careers, and lives, both in the short and long term, will be devastating. None of us are safe from this.
“yeah but there was that one time Apple thought she had kid porn on her phone.. she said it was a false positive, police investigated never prosecuted etc., but…better to not hire, etc..you know..just in case”
As it should be sparked! Apple is asking its engineers to create spyware and deploy it on millions of devices. "Concern" is majorly understating my feelings right now.
I'm not an expert here, but it may be time for white-box, no-name mobile phones that come with nothing, that we have to set up ourselves. My second computer was the first time I downloaded from both kernel and kerneli and rolled my first Linux-kernel-based GNU system....
As I see people straining to defend Apple, coming up with all types of reasons why on-device image checking will not be abused or expanded, and how people shouldn't really expect not to be scanned, I can't help but think of Apple's own words and current public position on privacy.
Here's their privacy page: https://www.apple.com/privacy/
"Privacy is a fundamental human right."
Apple has adopted this as a "core value". Their executives use this exact phrasing.
It is an extraordinary claim. They don't say, "We think privacy is important". They claim it as a fundamental human right (i.e. applying to everyone, unconditionally) and wrap themselves in the banner of it.
How anyone can square that statement, that self-described "core value", with their recent on-device image checking announcement, I really have no idea.
How is Apple's CSAM implementation not a violation of the Fourth Amendment?
Can we talk about the irony that the NCMEC's database of CSAM (two abbreviations I didn't know before this week) is literally a huge child porn collection, or is that too immature? The whole story sounds like bad science fiction to me, I gotta say. But sadly I think we have to beware of anything that's packaged as anti-child-porn just like whatever's anti-terrorism, anti-crime etc. etc. You should be thinking "What is the thing that's slightly less heinous than child porn, that needed to be wrapped in the magic cloak of comparative heinousness for anyone to be horrified enough to accept it?"
> A fundamental problem with Apple's new plan on scanning child abuse images, critics said, is that the company is making cautious policy decisions that it can be forced to change, now that the capability is there, in exactly the same way it warned would happen if it broke into the terrorism suspect's phone.
This is the most important part that seems to be missing in every denial that Apple won't work with governments to spy on various types of content. _It's not Apple's choice to make._ They have no control over whether they're forced to or not.
Apple 2019: ‘what happens on your iPhone, stays on your iPhone’
Apple 2021: but we will scan your photos anyway
I don't understand something: they are scanning "using a database of known CSAM image hashes".
This means new photos will not be in that database, so what is the point of scanning for old photos, as it will not prevent new crimes!
I'd prefer server-side over local on-device scanning.
What prevents the CSAM system from being compromised on the device by a rogue state actor to smear and false-flag opponents?
"Apple says it will scan only in the United States and other countries to be added one by one, only when images are set to be uploaded to iCloud, and only for images that have been identified by the National Center for Missing & Exploited Children and a small number of other groups."
This is the first reference I've read about adding other countries as a done deal, and especially about an incredibly opaque "small number of other groups" being involved as well.
Man, after all they've said about being obsessed with privacy they're really not doing themselves any favors here. What a tragedy.
I happened to speak to an employee of the police here in Germany. He is in charge of backing up data from suspects. It is mostly drug related, but the second most common category is child porn. The amount of data he has worked through in the last week is really frightening. And that is only for a county of approx. 200k inhabitants.
This is the ultimate lock-in strategy. If you switch from iPhone now, everyone will suspect you of being a pedo and you'll have to endure the social consequences.
> It's a complete change of narrative and there's no easy way to explain it and still defend Apple's Privacy narrative, without doing extreme mental gymnastics.
Everyone who took Apple at their word was already doing extreme mental gymnastics because Apple's privacy stance was a farce on borrowed time to begin with. Now it's just blatantly obvious to everyone.
I want to own the devices I buy, hardware and software. Apple has no business doing what is the role of a state.
I, quite simply, don't trust them.
> Apple was surprised its stance then was not more popular, and the global tide since then has been toward more monitoring of private communication
Huh. That’s a big problem. If the American public doesn’t support privacy, we won’t have it.
Good. I would be furious if I worked there. After the San Bernardino case, I viewed them as the paragon of privacy and security, to the extent I ignored most criticisms, including their lack of support for right-to-repair and concerns over App Store rejections. All of that is back on the table for me after this decision.
It is out-of-step with everything they've been doing up to this point, and it makes me wonder who has something over the head of Apple that we aren't hearing about. The stated benefits of this tech are far outweighed by the potential for harm, in my view.
If Apple pushes forward on this, I want to hear a lot more from them, and I want it to be a continuing conversation that they bear until we all understand what's going on.
What’s going to be sad to note over the next few months is yet another record breaking year and another record breaking first quarter for Apple (this is as per Apple’s financial calendar).
All these concerns and outrage don't seem to matter to most people, and a few boycotts here for Apple, a few more boycotts there for Facebook, and some more for <name-another-for-profit-bad-company> aren't making any difference to their bottom line.
To say I’m seriously disappointed and frustrated is putting it very mildly.
States need to put laws on the books, with fines, to prevent faux encryption and warrantless scans unauthorized by the device owner; blanket consent should not cover this, nor should electronics suppliers be able to lease or license equipment to circumvent these protections.
I'm putting the golden gate bridge up for sale.
Is there really anyone who does not see this for what it is, a really transparent effort to introduce surveillance infrastructure under the guise of "saving the children"?
If so, I would love to hear what assures you so much about this.
A couple reasons I don't believe a single word:
* After years of the surveillance state begging and pleading about compromising encryption, they went silent after talks with Apple and Google, over roughly the time it can reasonably be expected to take to develop this "feature".
* The number of pedos this could potentially catch is supposedly so small that it is a compromise of everyone's rights … an inherent contradiction of not only constitutional law, but the very concept of innocence until proven guilty. Not to mention that we have long been told there are no pedo rings, in spite of the fact that even without this "feature" the last administration broke up more of these pedo rings than ever before. Why do we need this "feature" now, then?
* Any actual pedo rings will easily work around this feature simply by altering images and changing the hash in various ways too numerous to list right now.
* Innocent people could easily be swept up in this if they happen to have images that hash to something the feds are comparing against, within the threshold of tolerance. What happens then, a secret court provides a secret court order to penetrate your network and devices to confirm whether you are a pedo because you took pictures of your grandchildren in a bathing suit?
* Oh, did I mention the mountain of history of nothing but abuse and lies, in every single way possible, to the point that the people exposing the not just illegal but evil acts are deliberately targeted by the state?
Why anyone would believe Apple, let alone the government that is clearly suspiciously standing in the background, looking away, whistling into the sky trying to act as if it has nothing to do with this, is beyond me. It's all just lies, front to back, top to bottom, left to right and in every other dimension.
What this clearly will be misused for is to identify or fingerprint dissidents and wrongthinkers of any kind, including those who think they are now rightthinkers, who will find themselves on the other side of the divide when they've expended their usefulness.
This is what Apple's policy USED to sound like. - https://www.youtube.com/watch?v=BZmeZyDGkQ0
"...You can't have a backdoor that's only for the good guys. That any back door is something that bad guys can exploit."
"No one should have to choose between privacy and security. We should be smart enough to do both."
"You're assuming they're all good guys. You're saying they're good, and it's okay for them to look. But that's not the reality of today."
"If someone can get into data, it's subject to great abuse".
"If there was a way to expose only bad people.. that would be a great thing. But this is not the world. .... It's in everyone's best interest that everybody is blacked out."
"You're making the assumption that the only way to security is to have a back door.....to be able to view this data. And I wouldn't be so quick to make that judgement."
No matter the mental gymnastics now, it's still a pretty drastic departure from what it used to be.
If they’re going to be scanning on the client, wouldn’t it be better to scan when someone is downloading something, the way viruses and malware are scanned? This way they can -prevent- crime.
It is the first time that I have a reason to leave the Apple ecosystem. And I will if this goes through, even though I do not live in the US.
On the contrary, it would be weird if everyone inside unanimously agreed.
I just do not want Apple scanning my phone for the purpose of finding something they can send to the police.
I’m not even talking about any “slippery slope” scenarios and I’ll never have any of the material they are looking for. And right or wrong, I don’t really fear a false identification, so this isn’t about a worry that they will actually turn me in. I just don’t want them scanning my phone for the purpose of turning me in. I definitely do not want to take an OS update or new phone that does this.
(False positives are the huge flaw in this system, though. There are human judgements by anonymous people using opaque processes without appeal throughout this. A lot of damage can be done before anything goes to court, not to mention courts themselves are susceptible to the illusion of infallible technology. Others have well laid out the scenarios that will result in serious consequences for innocent people.)
How long before journalists/activists in different countries get jailed/killed for taking pictures of some event that the government did not like, after that government requested its own hashes be added to Apple's CSAM list, unrelated to "child safety"?
Imagine some state decides to focus the "war on crime" efforts on scanning people's phones for selfies with handguns and/or white powder.
That's what the CSAM initiatives sound like to me.
Many people take the following maximalist stance: "Apple can't save users' privacy in China, nor can it push back against laws, so therefore it's unreasonable to ask them to hold the privacy line." But for every China there is a Brazil, Hungary, or India.
For every "completely authoritarian" regime, you have handfuls of (historically democratic) governments on their way toward authoritarianism. In these countries, it's often not the laws calling the shots but the politicians that happen to be in charge at that moment.
Even in democracies with laws as guardrails, presidents will push for data on political opposition, at times threatening techies along the way (see: Brazil jailing WhatsApp execs for access to data, Twitter/India raids, or, in the USA: the Trump DOJ asking Apple to turn over info on House Dems).
Simply "doing what you're told" around the world (no-questions-asked, aka "being in the arena" as Tim Cook says) will turn you into an enabler of authoritarianism, and not just in totally FUBAR countries like China but in many nations with a lot of freedom at stake.
I assume this is some sort of machine learning algorithm. How did they get the data set to train it on? That seems really strange. How did they test it? Seems like any engineers involved on this would come out with PTSD. Whatever they used to test it and train it with is illegal to possess and could only be used with some sort of massive government cooperation.
They must comply with Chinese law. So they put iCloud data in China. So users with "harmful" images of Free Hong Kong and Winnie the Pooh should be reported to the authorities. Apple complies with local laws, right?
Regarding smartphones, I wonder if we’ve just lost our grip on what it means to have privacy. Even our expectations for privacy have degraded and eroded over time. We just carry this device with our life encoded in a flash chip, with keys to cloud access. Life. Everything from emails to our walking gait, photos, texts, history, health records, etc. is stored on this one device. Dictators of the past would have a field day with this infrastructure.
“Wouldn’t it be wonderful if the people of future would willingly carry a surveillance device that they depend on everyday to communicate to others and carry with them everywhere?”
We need to move towards OSS and OSH phones with extreme urgency. We also need to get more people involved in Ham Radio.Reply
Personally I don't see on device scanning as significantly different than cloud scanning. I think the widespread acceptance of scanning personal data stored on the cloud is a serious mistake. Cloud storage services are acting as agents of the user and so should not be doing any scanning or interpreting of data not explicitly for providing the service to the end user. Scanning/interpreting should only happen when data is shared or disseminated, as that is a non-personal action. But that cat is out of the bag already.
Sure, scanning a personal device vs. data stored in the cloud feels different because you supposedly own the device. But you don't own the software that runs on it, so the distinction seems artificial to me.Reply
Can I take photos with 3rd party camera app + save it within its app sandbox (not to camera roll) to avoid scanning?Reply
I don't know - somehow, since Apple decided it was going to force spyware onto its most dedicated (all-in-on-iCloud) users, my excitement about the M1 and all the cool things I was going to do in the Apple ecosystem has withered away. I am looking around for alternatives, the way I was in the '00s, when I abandoned Microsoft, initially embracing Linux, but then happily transitioning to macOS and later iOS. Unless Apple does an about-face, I know I will be going back to Linux, on hardware from some privacy-friendly vendor. We have cross-platform communication tools and decent alternatives to most of what Apple has to offer. Obviously it is not as pleasant an experience, but there are limits to what I can tolerate for the sake of comfort.Reply
Can you imagine the beautiful New World we are building under the wise leadership of corporations and politicians motivated by unstoppable greed, control and power?
Can you imagine "the screeching voices of minorities with critical thinking"?
The big picture of hyperconnected future in which automated systems will decide your fate is in place and it's running well. It is not perfect. Yet. But it will be. Soon.
You will find a way to cope. As usual. As a collective. As an "exemplary citizen with nothing to hide".
"A blood black nothingness began to spin. Began to spin. Let's move on to system. System."
The system will grow. Powered by emotional reasoning created by the social engineers, paid well to package and sell "The New Faith" for the masses.
You are not consumers anymore. You are not the product anymore. You are the fuel. Your thoughts and dreams, emotions and work are the blood of the system. You will be consumed.
"Do you get pleasure out of being a part of the system? System. Have they created you to be a part of the system? System."
Yes. The pleasure of collective truism. In the name of the "common good" - the big, the bold and the brave brush, built to remove any form of suspicion or oversight.
"A blood black nothingness. A system of cells. Within cells interlinked. Interlinked"
Wake up, Neo...Reply
If you recently purchased an iPhone with your credit card, now is the time to file a dispute. I called my company (the American one), and the person I was connected to was instantly aware when I mentioned my Apple phone was now spying on me. 5 minutes later I had a dispute filed and some semblance of hope. YMMV of course, but what other choice do we have than to hit them in the pocketbook on our way out of their walled garden?Reply
The damage is already done.Reply
Imho, Apple is not so privacy-focused.
From my network logs, my iPhone is always connected to Apple servers.
Moving the locked iPhone even a little on the table very often makes it establish a connection to Apple. Why?
If I block all Apple servers, my iPhone continuously tries to connect to them anyway - which seems to drain the battery very fast.
Some years ago, I asked Apple whether it is possible to switch off these 'connections'. Answer: no, I have to trust Apple.
I’m a little uncomfortable about my iCloud photos being scanned, but OK with Apple doing it, because I’ll give them the benefit of the doubt that they can make it work and, if not, reverse it. If Google were to do that, that’s a hard no.Reply
Seems to me like an instance of the 'bundling problem'. What the government actually wants is a yes/no answer to the question 'does this photo contain child abuse', but with this 'solution' they get much more information than a simple yes/no answer, making abuse of power possible.
Is it possible to just scan photos locally for child pornography, encrypt those photos, and upload those encrypted photos and the 'yes/no' answer that the scan returns plus some cryptographic proof that the actual scan (some calculation) was performed on the uploaded encrypted photo?
If this was possible it would enable privacy and also help prevent child abuse (or at least prevent child pornography from being stored on iCloud).Reply
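The local-matching half of that proposal can be sketched in a few lines. This uses an ordinary cryptographic hash purely for illustration - Apple's actual system reportedly uses a perceptual "NeuralHash" plus private set intersection, and nothing here provides the cryptographic proof-of-scan the comment asks for; the blocklist entries are made up:

```python
import hashlib

# Illustrative blocklist of known-bad image hashes (made-up entries,
# not any real database)
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def matches_blocklist(image_bytes: bytes) -> bool:
    """Return True if this exact image appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

Note that exact cryptographic matching breaks on any re-encoding or resizing, which is why deployed systems use perceptual hashes instead - and perceptual hashes admit collisions, which is part of the false-positive worry raised elsewhere in this thread.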
Don't use cloud services or closed-source services if you want your stuff to be safe and you want your privacy to be maintained.Reply
Perhaps the fundamental flaw in thinking behind this tactical error is that Apple thinks of your iPhone as their iPhone.Reply
NCMEC is a private charity. Why isn’t this being handled by law enforcement?
Is this the beginning of the age of vigilante justice?Reply
Good, I hope Apple will die. Not much of value will be lost.Reply
In their attempt to make this extra private by scanning 'on device', I think they've managed to make it feel worse.
If they scan my iCloud photos in iCloud, well lots of companies scan stuff when you upload it. It's on their servers, they're responsible for it. They don't want to be hosting CSAM.
It feels much worse when they turn your own, trusty iPhone against you.
I know that isn't how you should look at it, but that's still how it feels.Reply
So it boils down to this: companies make most of the OSes for devices.
Governments can compel companies to do stuff.
The only option for the truly privacy conscious is to not use a company provided OS. The future is in buying a phone and flashing the ROM you want.
Until even owning a non-standard phone becomes illegal.Reply
I always knew the new valium-lgbtq-ceo is going to fuck the company over.Reply
I maintain that Apple actually got told to do this by the Gov't, the same way AT&T was told - and gagged - to spy on Americans.
Look at it from the point of view of the majority non-abusive parents: This can only increase the risk that their children will be taken away from them. This includes Apple employees.Reply
I actually really enjoy the loop of “we can trust them to police us, they are trustworthy” to “oh no, they abused that power; why did they do that” that society goes through.
All this HN rebellion is a storm in a teacup. Some innocent dude will get in trouble and everyone will talk about how “tech companies need to be regulated” and then they’ll go back to “we should trust the government to police us” after a few weeks of burning through their outrage. Always exceptions. Always surprised. Always trusting.Reply
First this and then it'll be gradually expanded to offensive memes against official political narratives.Reply
One thing I don’t understand about this debate is that one of the bigger concerns folks who are against the measure have is that Apple might one day add non-CSAM photos to the set of photos they scan for.
As far as I understand it, the CSAM hash database is part of the OS, which Apple can update in any way they like, including to read your messages or surreptitiously compromise the encryption of your photos (and they can force your device to install this code via a security update). We trust them not to do these things (they already have a track record of resisting the creation of backdoors), so I’m not sure why we don’t trust them to also use this capability only for CSAM.
Sure, it would be technically easier for them to add the hash of the tank man photo (an example of something an oppressive government might be interested in) to their database after something like this is implemented, but it’s also not very hard for them to add scanning-for-the-tank-man-photo to their OS as it currently exists. Indeed, if the database of hashes lives on your device it makes it easier for researchers to verify that politically sensitive content is not present in that database.Reply
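On that last point: if the hash database ships as an opaque on-device blob, one thing researchers could do is fingerprint it and compare across devices, regions, and OS versions. A minimal sketch (the idea of the database being a single readable file is an assumption here; path and format are hypothetical):

```python
import hashlib

def database_fingerprint(path: str) -> str:
    """SHA-256 digest of the raw on-device hash-database file.

    If two devices (or a device and a published reference value)
    report the same fingerprint, they carry the same database;
    a per-user or per-region variant would show up as a
    differing digest.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()
```

Identical fingerprints everywhere would be evidence against targeted, per-user databases - though it still says nothing about what images the hashes correspond to.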
Apple's track record on user protection compared to Google or Samsung has been very good. For example, resisting wide net Federal "because terrorism" warrants as much as they legally can.
There's a reason why Apple 0 day exploits sell for way more on the black market than Android exploits.
Trust is hard to earn, easy to lose and even harder to regain. User trust is such a huge part of Apple's business and success, it's shocking to me they'd damage it so much and so fast.
A system that exists but has "protections" can and will be abused. A system that doesn't exist can't be. It's really that simple.
Here's the big question though: did Apple executives not realize there would be a backlash against this? Or did they just not care? Either is hard to fathom. Apple's board and shareholders should be asking some pretty tough questions of Tim Cook at this point.Reply
The road to hell is paved with good intentions.
I'm going to start rallying for my employer to dump apple as a vendor if they stay the course.Reply
Can someone explain how Apple being coaxed or coerced into searching all of our personal devices for illegal files by federal law enforcement is not an unconstitutional warrantless search?Reply
This CSAM-prevention initiative by Apple is a 180-degree change from their general message around privacy. Imagine investing hundreds of millions of dollars in pro-privacy programs, privacy features, privacy marketing, etc... just to pull this reverse card.
Of course this is going to spark concern within their own ranks. It's like working for a food company that claims to use organic, non-processed, fair-trade ingredients and in just a day deciding that you're going to switch to industrial farming sourcing and ultra-processed ingredients.
It's a complete change of narrative, and there's no easy way to explain it and still defend Apple's privacy message without doing extreme mental gymnastics.Reply