Common good trumps individual privacy concerns.
Apple created the lock that helped the bad guy ... Now as the locksmith they must produce the key.
It is common sense.
-20 for Apple.
It's almost certain that Apple helped the American government violate customers' privacy. This looks to me like a marketing stunt for the post-Snowden era.
If Apple cared so much about customers' privacy and security, how could they sell non-free software that is hard to audit, computers with baseband processors, and products that rely on a central server, which creates a single point of failure?
My understanding is that Apple customers don't much care about their own privacy and security, but they do have a weakness for marketing.
> Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The scary part here is that the iPhone data is really not that secure. If Apple can overwrite the OS and get access to the data, this means the keys are stored on the phone somewhere, and are not password protected or "fingerprint" protected.
Well, no one is protected from thermo-rectal cryptanalysis. The only difference is that the government guys want to keep it hidden from the target.
Wouldn't one assume that once the phone is powered up, there is some code at startup, or on a schedule, that queries an Apple update server for updates and fixes? At that point, isn't it reasonable that a company such as Apple could force certain updates onto the phone whether the customer wanted them or not? All Apple would have to do is direct the phone to a phoney update site containing code that dumps RAM to an outside server. No other phones would be affected, and the data would be retrieved.
If the UK record on anti-terror scope creep is anything to go by, not creating this backdoor is a very good idea.
In the UK, laws originally intended for surveilling terrorists were/are routinely used by local councils (similar to districts I think) to monitor whether citizens are putting the correct rubbish/recycling into the correct bin. 
This is a Pandora's box, and the correct answer is not to debate whether we should open it just this once; it's to encase it in lead and throw it into the nearest volcano. Good on Apple for "wasting" shareholders' money and standing up for this.
http://www.telegraph.co.uk/news/uknews/3333366/Half-of-counc... - and lest the source be questioned, this is one of the more reactionary newspapers in the UK.
"In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession."
Someone who believes in conspiracy theories would say that "now it is official" :)
HN is the first place I came to for discussion on this and I just wanted to thank you all for keeping it civil, intelligent, and objective.
What's the potential fallout of this case? I assume Apple is appealing the ruling - what happens if the ruling is upheld and Apple refuses to comply (unlikely IMO, but what if)? Could the DOJ target individual Apple engineers and order them to do it or face contempt of court charges?
I can't recall any previous instance of a megacorporation opposing the tyrannical US government. I fully expect Apple to lose here, but it is a valiant and rare effort.
Good for them. Freedom comes with a price; sometimes that price is protecting the privacy of the worst of us to protect all of us.
While basically being on Apple's side here, as I understand it, jailbroken devices are unofficial builds of iOS that have some security features removed (e.g. limits on which apps can be installed).
Is it not possible for law enforcement to get what they want from that, if all they want is a custom build of iOS that can be hacked around? And why is it even possible for that to work if the data is supposed to be kept secure?
If you oppose this, please let President Obama know. The FBI is part of the executive branch of the government, and as such, directly reports to the president. In other words, if he tells them to stop, they must comply. Please register your complaint here:
Here is my letter to them:
Dear President Obama,
I've voted for you in both elections, and have been a firm supporter of all your causes (the Affordable Care Act, and more). However, your FBI has clearly overstepped its authority by demanding that Apple spend engineering resources building a software product that can break the encryption of a terrorist's iPhone.
Seriously, you need to stop this. You are the head of the executive branch of the government, and the FBI falls directly under your jurisdiction. Director James Comey is directly within your chain of command.
What the FBI is asking for is a master key to be created that can decrypt any iPhone. This makes all Americans with Apple devices insecure in the face of threats to our personal security and privacy. I hope you can understand that this is clearly unacceptable, and needs to be stopped.
I want to register my complete opposition to the FBI in this circumstance. Please stop this.
Huge respect to Tim Cook for standing up for the personal information security of Apple's users around the world. When a non-technical person demands something as stupid as a backdoor, they do not acknowledge how much it weakens data security.
I'm sorry but "Smartphones, led by iPhone"? Bit presumptuous.
If I understand correctly, any piece of software that would be used would need to be signed by Apple. Furthermore, the FBI's warrant(?) says specifically that it would only need to work for one device ID. Thus it would be relatively straightforward to create an update for the FBI that could pretty clearly only be used on the phone in question. Unless the FBI had Apple's signing key they could not reuse the software (assuming they couldn't break the bootloader chain of trust, which they apparently cannot if this is the route they are taking). Capability is not the grounds on which they are arguing.
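The argument above can be sketched in code. This is a hypothetical illustration, not Apple's actual update format: real iOS updates use asymmetric signatures, and the device identifier and key names here are made up. The point is that if the target device's unique ID is baked into the signed blob, the signature only verifies on that one handset.

```python
import hashlib
import hmac

# Stand-in for Apple's private signing key (hypothetical; real updates
# use asymmetric crypto, not an HMAC secret).
APPLE_SIGNING_KEY = b"stand-in-for-apples-signing-key"

def sign_update(firmware: bytes, device_id: str) -> bytes:
    # The target device ID is mixed into the signed data, so the
    # signature is bound to exactly one device.
    return hmac.new(APPLE_SIGNING_KEY, firmware + device_id.encode(),
                    hashlib.sha256).digest()

def bootloader_accepts(firmware: bytes, signature: bytes, my_device_id: str) -> bool:
    # Each phone verifies the update against its OWN device ID; a blob
    # signed for another phone fails verification here.
    expected = hmac.new(APPLE_SIGNING_KEY, firmware + my_device_id.encode(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

fw = b"hypothetical-unlock-build"
sig = sign_update(fw, "IMEI-0001")
print(bootloader_accepts(fw, sig, "IMEI-0001"))  # True: the targeted phone
print(bootloader_accepts(fw, sig, "IMEI-0002"))  # False: any other phone
```

This is why, mechanically, the software could be single-use; the comment's point stands that the objection is political, not technical.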
This is a very clearly political refusal. Apple is saying that about as explicitly as they can in this message. Whether or not they can do it, Apple doesn't want to be caught in the game of being a government surrogate or having to determine for themselves if government requests are legitimate (imagine, say, if the Chinese government asked for data from a dissident's phone - would Apple want to risk that market by denying a request that they have complied with in the US?). It's unfortunate for them that the FBI is making this request while people still own phones like the 5c for which they could theoretically disable security features, as opposed to the newer phones which it is possible they are completely unable to defeat.
Does anyone know what the consequences for Apple will be if they keep refusing, but the courts say they must?
Massive fines? (we know they have the cash to cover it)
Jail time for execs? (whoa!)
If it is possible to build the requested OS, then it can be said that the iPhone already has a backdoor.
If the device were truly locked down, there would be no aftermarket solution to unlock it.
My understanding is that Apple was asked to supply software that would prevent their software from destroying evidence on a particular device. They should comply with this order, especially given the device in question.
Question: is it possible to design a cryptographic system where any access by a third party (such as a government) is made publicly visible in a log? Can blockchain technology help here?
So this was a work phone owned by his employer. Does that change things? I'm surprised they didn't already have management software installed to monitor the device.
I'm curious: is it likely that Apple was under a gag order regarding the backdoor proposals/discussions?
I've always wondered why large tech companies/corporations abide by such orders instead of speaking out. Even if Apple was under a gag order, they've created a PR nightmare for the alphabet agencies; Apple could be pursued in court, but that pursuit would now likely be done in the face of negative public opinion.
Is iPhone storage (and the files within it) and cloud content encrypted with a single private key that is stored in the secure enclave on the iPhone?
This is just huge hypocrisy, and full of lies. First of all, Apple CAN attempt to brute force the password. Compiling whatever new firmware is needed and signing it with their keys will not introduce any new backdoor like they claimed to the public - the backdoor is already there, and it is their private keys. Just as that "backdoor" could somehow end up in some bad guy's hands, so could their private keys.
I would agree with Apple if they wanted the FBI to pre-submit all their guessed passcodes for Apple to try, with Apple having sole responsibility for that, so that using said "backdoor" (which really is nothing more than a door handle) would be as hard as getting their private keys, and governments would not keep the backdoor in their hands. I would also agree if Apple claimed they don't want to be able to crack devices at a judge's order (although that would be against the law - so they can't claim that).
But this is NOT what Apple said. This whole letter is just one big PR bullshit. They CAN brute force a passcode. They failed to enforce a significant delay after failed passcode attempts - even though this issue had already been known for YEARS (will give citation if needed) when Apple designed the iPhone 5C in question - and they also failed to require a passcode to update the device. They already have their convenient backdoor in place in the form of their private keys.
You don't really own Apple hardware, so you can't protect your device.
This is an interesting chapter in the "Tim will never be Steve" saga that so many people are infatuated with.
This particular hill that Tim Cook has decided to defend is as important as anything Steve Jobs ever did at Apple.
If I were Cook, I'd draw a line in the sand: if we are forced to comply, we exit the phone business, because we won't make phones that compromise our customers' security.
But that would take more balls than anyone left in this "land of the free and home of the brave" seems to have anymore.
Bollocks. I'm sick and tired of these limp-wristed sissies crybabying about their privacy.
This is a great way to build public awareness for this issue. Hopefully this will allow more people to get involved in the fight.
> And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
Good one, Tim! I mean, how long did law enforcement think they could abuse the Constitution, put spy devices on people's cars without warrants, use stingrays, and do all sorts of other crazy stuff - including planning and executing false-flag attacks - without any consequences whatsoever? At some point, we the people - for good reason - will lose any and all trust we have in them. And that's what Tim is saying in this one sentence, backed by overwhelming evidence the US government would have a hard time arguing against!
You can't just be skeptical like that.
This is an Apple post.
Praise the iLord.
If they provide the government what it wants now, next year the government will come back with an even more ridiculous request. Mr. Cook is right - it would be great if we can avoid creating a precedent.
Oh wait, they already did, by providing their clients' data. Trying to stop the government now is like trying to stop a high-speed train. Still, good luck to them! Good to know they are not just being pushed around without any resistance.
Good on them. I was hoping they'd be able to manage a way to unlock this one phone without potentially breaking the whole model (by exploiting some bug in the presumably outdated installed version, or something that wouldn't meaningfully degrade the security model), but given that that's not the case, I think they're making the right choice.
OT but this post is well on its way to becoming the most popular post since Steve Jobs died.
[In walk the drones]
"Today we celebrate the first glorious anniversary of the Information Purification Directives.
[Apple's hammer-thrower enters, pursued by storm troopers.]
We have created for the first time in all history a garden of pure ideology, where each worker may bloom, secure from the pests of any contradictory true thoughts.
Our Unification of Thoughts is more powerful a weapon than any fleet or army on earth.
We are one people, with one will, one resolve, one cause.
Our enemies shall talk themselves to death and we will bury them with their own confusion.
[Hammer is thrown at the screen]
We shall prevail!
On January 24th Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like '1984.'"
Apple Superbowl AD "1984"
Transcription courtesy of George Gollin, 1997
Edit: Removed the link to the video. My goal wasn't to draw traffic anywhere; it was just to point out that some of the Big Brother sentences in an ad aired 30 years ago still resonate strongly today.
"Our enemies shall talk themselves to death" Hmm... I just read yesterday that the NSA is believed to use machine learning over cell-phone big data to determine drone targets...
Kudos to this guy for standing up for an idea.
> While we believe the FBI's intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect. - Tim Cook
Now, on a practical note: this is about security - providing a digitally secure platform to both users and providers, preventing tampering, and keeping data secure.
Microsoft could take a cue.
Huge props to Apple - here's hoping against hope that Google, Facebook, and Amazon get behind this.
One thing I was wondering is how Apple is even able to create a backdoor. It is explained toward the end:
"The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer."
This is actually quite reassuring - it means that even Apple can't break into an iPhone with a secure passphrase (10+ characters) and Touch ID disabled (Touch ID itself is hackable with a bit of effort to obtain your fingerprint).
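A back-of-the-envelope calculation shows why passcode length dominates once the software retry limits are removed. The guess rate below is an assumption for illustration: roughly 12 guesses per second, in the ballpark of the ~80 ms per attempt that the iPhone's hardware key derivation reportedly imposes.

```python
# Assumed hardware-limited guess rate (~80 ms per key derivation);
# illustrative, not an official Apple figure.
GUESSES_PER_SEC = 1 / 0.08  # ~12.5 guesses/second

def worst_case_days(alphabet_size: int, length: int) -> float:
    """Worst-case exhaustive search time in days for a passcode drawn
    from `alphabet_size` symbols with the given length."""
    return alphabet_size ** length / GUESSES_PER_SEC / 86400

# 4-digit PIN: the entire space falls in well under a day.
print(worst_case_days(10, 4))
# 10-character alphanumeric passphrase: astronomically long.
print(worst_case_days(62, 10))
```

So removing the escalating delays and the 10-try wipe makes a 4-digit PIN trivially brute-forceable, while a long passphrase stays out of reach even then.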
Tim found dead under mysterious circumstances in 3...2....1
A phone without a backdoor would be illegal in the UK once the Snooper's Charter comes into full effect. I'm very interested to see how the UK government will react to Apple's stance.
Way to go Apple.
And Edward Snowden just tweeted this a few minutes ago in response to another tweet proposing Google back up Tim Cook: "This is the most important tech case in a decade. Silence means @google picked a side, but it's not the public's."
As a software developer, I'm always looking for the real bug. Weapons kill. Not iPhones.
I heard this morning on (semi-conservative) FM radio that this is a national security issue, and that Apple is helping terrorists by not bypassing this.
I don't get it - the shooters are dead. How is what is on their phone a matter of national security? We probably have 99% of the information we'll ever have on them. There is no larger plot. I cannot imagine that not having what's on this device puts anyone at risk.
We need to surrender all firearms, encryption, and gold to our serf lords so as to ease the pain of "feeling the Bern".
Plus my obligatory comments:
- believing Apple is on the side of your rights is naive
- gray text on a white background promotes illiteracy
If you are interested in the technical details of iOS security: https://www.apple.com/business/docs/iOS_Security_Guide.pdf
To play devil's advocate:
Mr. Cook expressed concern that "the government could intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge".
As I read this I wondered, "what harm would actually happen if that occurred"? If the government did read my messages and get my health records & financial data and track my whereabouts, I can't think of anything bad that would actually happen as a result of that.
Is there anything specific that I should be worried about in that scenario?
I'm surprised that nobody in this thread has commented on the real substance of this response. It has nothing to do with Apple brute forcing iPhones for the police (which it has done for years, with a simple court order). Instead, Apple is making it abundantly clear that if they comply (or are forced to comply) with the All Writs Act of 1789 to create this particular backdoor, it opens the floodgates for all sorts of future requests to add backdoors and decrease security.
It's entirely possible that the FBI could then use this precedent to simply have Apple remove all security from an iPhone in pursuit of an active investigation, which can be done with a straightforward firmware update - something iOS users tend to apply without much thought.
Apple does deserve the respect they're getting for standing up to the government on this. They're absolutely right that this is an attempt to fatally undermine security for a whole host of devices, and it sets a disturbing precedent.
What I do find interesting is that Apple isn't the first manufacturer the government has ordered to crack a device. An "unnamed smartphone manufacturer" was ordered to crack a lock screen on October 31, 2014. No one made a fuss then, so someone caved.
I like the position Apple is taking. However, after reading the letter, I noticed it misses a point I consider even more important than just "a dangerous precedent".
Apple sells devices across the whole planet, not just in the USA. So what the FBI (an American agency) is requesting is dangerous not only for American citizens, but also for iPhone owners in Europe, Asia, Africa, and Oceania. Hell, these people are not even part of the debate, because they don't belong to the "American democracy".
If I'm going to be affected by someone else's policies, I would like to at least be allowed into the discussion.
OK, so I completely fail to see how a random crazy guy with a gun who shoots up a bunch of unarmed people has "national security implications". This seems to be a "fact" everyone wants to agree on, but it is frankly a load of BS if one considers that the government probably already has his entire call/texting history for the last couple of years.
I see this as just another "it's for the children" ploy, which I'm completely sick of.
So I fully support Apple et al. for finally gaining a backbone. If more people stood up, I wouldn't have to be naked-body-scanned at the airport, or subjected to the dozens of other privacy invasions the government performs on a daily basis simply to give itself something to do. Rather than admit they won't ever be able to predict or protect the population in any meaningful way from random people willing to give their lives to make a statement, they waste our time and money coming up with ever more invasive ways to peek into everyone's most private possessions.
A friend of mine at Apple reported multiple black vehicles (Lincoln Town Cars and Escalades), at least one with MD license plates, at the Apple Executive Briefing Center this morning between 11AM and noon. Occupants had earpieces and sunglasses, and were accompanied by a CHP (California Highway Patrol) cruiser and three motorcycle escorts. I suppose it's possible this was a quick (less than one hour) VIP stop, but given Tim's message last night, as well as the reaction of folks on campus bandying about comments like "I don't want to work on this because I don't want to be deposed," the impression certainly was that it was not a friendly visit. Given Tim's very public push-back, I'd think delivery of an NSL with accompanying intimidation is at least possible. I submitted this to HN and updated in real time. There's a bit more discussion here:
To all the in-love-with-Apple downvoters: please read Schneier's sound analysis of the same type of situation RIM (BlackBerry) was met with: https://www.schneier.com/blog/archives/2010/08/uae_to_ban_bl...
> RIM's carefully worded statements about BlackBerry security are designed to make their customers feel better, while giving the company ample room to screw them.
I have lost enough points on this thread to simply double down on this issue.
This is not a good sign at all. While Google can't compete with Apple on the principle of "not spying on their users", all Apple has to do is publicize that principle and then ask its users for forgiveness later.
So the FBI is asking Apple to build a tool that will unlock the security measures of an existing iPhone, like the one in the San Bernardino shooting, and allow it to be read.
The problem with this is that no such tool should be possible to build. It should not be a matter of yes or no; it should be simply impossible for Apple to build such a tool without the private key of the user, which Apple does not have.
If it is possible to write a piece of software which can circumvent the protections of the iPhone without the user's private key, then Apple wrote its security software incorrectly. Either they wrote it with an appalling lack of security understanding; or they left in important backdoors, either knowingly or through ignorance. But if they wrote the software correctly and did not create backdoors of which they're aware, then the government's request is actually impossible -- cannot be done.
So which is it, Apple? Is the point moot because you did this right? Or have you already placed backdoors in the product which the FBI is now asking you to exploit for their benefit?
Instead of the FBI paying Apple engineers to hack a phone, why don't they ask their kids!? It would probably save millions of dollars.
While I think protecting user data is important, I don't understand what the fuss is about. Anyone could (given technical knowledge + tools) take apart a phone, pull the encrypted data out of storage, and then brute force the encryption on a large machine.
The FBI doesn't need the modified iOS code, and that Apple write/not-write it doesn't change anything in the end, since someone else could just as well write the software with some reverse engineering.
[edit: if you downvote because I'm wrong, please explain, because I'd love to know why]
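One likely answer to the edit above, hedged as a rough comparison: on devices like this, the file-system key is a full-entropy AES key entangled with a per-device hardware UID, so an attacker who pulls the raw flash faces the entire 256-bit key space offline. Only guessing on the device itself reduces the problem to the (tiny) passcode space, which is why the FBI needs Apple's help rather than a big machine. The numbers are order-of-magnitude only:

```python
# Offline attack target: the full 256-bit AES key space.
KEYSPACE_AES = 2 ** 256
# On-device attack target: a 4-digit PIN.
KEYSPACE_PIN = 10 ** 4

# Grant the offline attacker an absurdly generous trillion guesses/sec.
RATE = 1e12

seconds_offline = KEYSPACE_AES / RATE
seconds_on_device = KEYSPACE_PIN / RATE

# Even at that rate, the offline search dwarfs the age of the
# universe (~4e17 seconds) by dozens of orders of magnitude.
print(seconds_offline > 1e50)   # True
print(seconds_on_device < 1)    # True
```

So brute-forcing the encryption "on a large machine" is not an option; the passcode guessing has to happen through the device, where the software limits (delays, wipe-after-10) apply.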
Can they publish a copy of the FBI letter? Otherwise, Apple's description feels a bit circumstantial and opinionated. I feel like I could make a better judgment on this whole issue if the request were made public.
What are the odds that Apple has been ordered to do this before, but every other time they were asked it was in a FISA court? That would mean that this is the first time they've been allowed to talk about it.
Doesn't the phone belong to San Bernardino County?
This is the FBI going after a Parallel Construction path. They already have all the information from the NSA bag o'tricks, but none of it can be used in court. But an unlocked phone unlocks the legal obstacles.
This is the only acceptable response.
I am happy AAPL is taking this stance. But I can't help but believe that it has very little to do with liberty, and very much to do with the bottom line. Either way, I guess we should be grateful for small mercies.
Tim Cook admits iOS is already back-doored in the most weaselly worded message I've ever seen.
Heads up on something I just recently discovered: if your iPhone has Touch ID enabled, you can go into the Touch ID settings and selectively disable Touch ID for phone unlocking while keeping it for the App Store.
The real security risk is the ability to update the phone's OS without authorized user consent guarded at least as strongly as the original protections the FBI is trying to break.
Right now it all hinges on Apple's private key, and that's a very thin wire to hang all this privacy off.
I hope Apple employees and executives read this:
I am now, officially, an Apple fanboy. That's right, I'm gloating to family and friends about how Apple is standing up to the man, doing the right thing, and refusing to compromise their security.
Keep up the good fight.
As others have noted, this is probably mostly about branding. But that's why it is genius. Tim Cook is committing Apple to this pro-privacy position in a very public way. This means that a reversal of this position or a revelation that Apple has been acting contra it, would be extremely expensive to Apple's reputation with its customers, effectively costing the company a huge amount of money.
By publicly committing Apple to this cause, Cook makes it more likely that internal teams at Apple as well as future versions of the company will adhere to this position. By defining a set of actions which, if made public, would ruin the company's brand, Cook makes it less likely Apple will take those actions.
With due legal process the police can search property, safety deposit boxes, bank accounts, vehicles, etc. Why should a smartphone be any different just because Apple says it is?
As much as I value privacy I really don't agree with Apple's stance here - if due legal process has been followed, why shouldn't they be able to read the contents of an iPhone ?
And yes, I get that third-party encryption can be used, which isn't owned by Apple and which there's little the authorities could do about - but that's not the case at hand here.
The easy solution to this is to have the government send Apple the phone. Apple breaks into it themselves, then hands back the phone with the passcode turned off and whatever software they needed to install removed, leaving no trace of how they actually did it.
No software backdoor is created, the FBI gets its data, and we all go on with our lives. Why are we spending so much time gnashing teeth over something that has a very simple solution?
What's all this talk about pushing updates to locked phones? I have to get involved every time there's an OS update for any of my iDevices. That damn red dot on Settings.app just stares at me while I try to find a time I'd like to be without my device for half an hour.
The way I read this, Tim Cook hasn't said it can't be done, only that it shouldn't be done. This leads me to suspect that Apple can decrypt your phone, and they know precisely how to do it, but doing so would disrupt their entire marketing campaign around safe and secure encryption.
I'm just a government relations guy, not a security person, so please forgive me, but I'm not sure where I fall on this. I want the FBI to be able to decrypt the San Bernardino attackers' phone. At the same time, I don't want the government to be able to decrypt my phone. This is one hell of a damned-if-you-do, damned-if-you-don't situation, and I'm really stuck.
I wonder what the response of other manufacturers making Android phones will be.
Applying an update to break encryption would violate chain of custody and render the information obtained inadmissible in court.
Apple's encryption appears to be done in such a way that government entities can safely use iPhones as well as "consumers". But what may happen is that Apple will be forced to produce two kinds of iPhones: one for consumers, with strong encryption but a "backdoor" for warrant-"cough"-based access, and a second type for government use (strong encryption, no backdoors).
They may already have this in place now, and what we are seeing is a show. They are testing how people/consumers will react to this situation. Our government probably figures that nobody will care in the end.
In the USA, we have lost our liberty. It's time to wake up and see what is happening. It's getting worse, and the people within our government are working hard to enslave us even more.
The fact that they can create this backdoor, doesn't that mean it already exists?
What Apple needs to do then, instead of writing this letter, is release an update that closes this backdoor.
I've never been an Apple fan but this was a fantastic and bold move by them. Software security and hacking is already an enormous problem that every single person has to deal with. Even major companies like the NYTimes have been hacked by malicious users in the recent past. We need to take every reasonable action to combat this threat. Building deliberate vulnerabilities (yes, every backdoor is a vulnerability) into our software and devices is going to make all of us less safe, and all of us more vulnerable to unforeseeable attacks in the future.
I'd love to know the names of the people within the FBI who are pushing this agenda. The only way this foolishness is going to stop is if those people are out of a job.
And what happens to the engineer tasked with writing this hack if he fails and ends up bricking the phone?
In the future, once terrorists have Touch ID iPhones, couldn't they just use the corpse's finger to unlock the phone?
As much as I would love to believe in Apple (and any other large tech company), a part of me still thinks that maybe they are working with the government in this letter. The FBI knows that the average US citizen does not want to be hacked. What is to stop the FBI from allowing Apple to say these things and put on a show publicly while simultaneously handing over the 'master key' anyway?
I think, as a society, it boils down to this: "And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."
Can a private, for profit, company deny the will of an elected government working to solve a heinous crime based not on what they say they will do but because they cannot give a 100% guarantee that this is the only time/way it'll be used? Apple acknowledges that the government is saying it's limited to this case but because there's no guarantee (100% certainty) they feel they can deny it?
If yes, what does that mean as a broader precedent? Are we comfortable with private companies denying an elected government based not on what it agrees to, but because there's a chance it'll be used in other ways?
However terribly flawed one might feel government is, very few would think it has less accountability than a private company.
This may be one of the most important things Apple has done. Whether or not you agree with their position, it's incredibly important that tech companies start publicly explaining things like the fundamental problems with backdoors so that a lay person can understand it. Apple have the credibility to make non-technical people take their argument seriously, and the reach to get the message out to a vast number of people. I'm really pleased they're taking this position.
Everyone in the U.S., please write to your Congressional representatives and also to the Presidential candidates you support. They need to know they can't get away with this.Reply
under what kind of pressure would Tim write this public letter?Reply
I thought Apple already had backdoors. I feel relieved that my iPhone is not backdoored, and I'm also very happy for a company whose products I use daily.Reply
I don't see how this message is reassuring. Are they expecting the customers to just take their word? Without Apple showing the world, every bit of software that they run on their phones, these statements are at best, meant to mislead the users that Apple is doing something on the user's behalf.Reply
I'm betting there are similar vulnerabilities in the current "Apple doesn't have the keys" versions of iOS and the hardware. For instance, do a similar mandated firmware update to the secure enclave, and now you get unlimited guesses at a PIN.
Ah, I've found a couple of sources claiming that the secure enclave wipes its keys if its firmware is updated. Makes sense.Reply
I'm clearly in the minority here, but I don't really understand Apple's position here, nor do I understand why everyone is rallying behind them.
Apple built hardware which was not particularly secure. The software defaults to a four-digit PIN. They attempt to mitigate this by adding an escalating interval between entries, and by optionally wiping the phone after too many failed tries, but this is not set in stone and those limits can be removed with a software update.
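The escalating interval mentioned above can be sketched as a simple delay schedule. This is a toy model of the policy, not Apple's actual code; the exact delay values here are assumptions loosely based on Apple's published lockout behavior.

```python
# Sketch of an escalating passcode-retry policy, loosely modeled on
# iOS lockout behavior (exact delay values here are assumptions).
def retry_delay(failed_attempts: int) -> int:
    """Return the lockout delay in seconds after N failed attempts."""
    schedule = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}
    if failed_attempts < 5:
        return 0  # the first few tries incur no delay
    return schedule.get(failed_attempts, 60 * 60)

# Total wall-clock time to burn through ten attempts before a
# (software-enforced) wipe would trigger:
total = sum(retry_delay(n) for n in range(1, 11))
```

The point of the comment stands: because this schedule lives in software, a signed update can simply remove it, which is exactly what the order asks for.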
The government is coming to Apple and saying, "You can remove these limits. Do that for us on this phone." Coming as a legitimate court order, I see no problem with this request. The government isn't even asking them to crack the phone, they just want Apple to remove the limits so the government can try to brute force it. They're even paying Apple for their trouble.
If Apple didn't want to be put in a position where the government can ask them to compromise their users' privacy, they should have built hardware which even they couldn't crack. And of course they did; starting with the A7 CPUs, the "secure enclave" system prevents even Apple from bypassing these limits. The phone in question just happens to predate that change.
If the government was demanding that Apple do the impossible, I'd be on their side. If the government was demanding that Apple stop manufacturing secure phones, I'd be on their side. But here, all they're asking is for a bit of help to crack an insecure system. They're doing this in the open, with a court order. What's the problem?Reply
This is now the most-upvoted story HN has ever had.Reply
sounds like the backdoor already exists, but only Apple knows how to use it. same as if Apple knew a master password for this phone but refused to give it. they are saying they don't want to give it because once the FBI has it, then they are free to use it anywhere. pretty strange post from Apple.
probably they try to fight this request by arguing that the government is actually asking them to effectively remove security from all the phones (of this model at least). they would be happy to help break this one phone as long as it doesn't affect any other phone.
in that case, then Apple should just break the phone and give it back to the FBI after removing the backdoor.Reply
Why is there very little talk about the First Amendment in this whole discussion? They are asking Apple to write custom software.
The Supreme Court has ruled in separate cases that: 1. software is speech, and 2. a person (corporations are people, according to them) cannot be compelled to speak.
It would seem to me that the FBI could perhaps subpoena technical documentation from Apple but it should be required to hire their own developers to write this software.Reply
Not getting an iPhone, even secured - Check!
I bet hardware vendors are just salivating at the concept of having to produce thousands of iPhone cracking docking stations.Reply
• Can Apple upgrade iOS on a single device that is locked, from a new untrusted laptop without wiping it?
• Can Apple OTA upgrade iOS when the device is locked?Reply
"Apple's reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device's flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE."
People hyperventilating that the tool could be used to crack other phones can relax, given the last clause in the quoted text (from the actual order).Reply
A company with courage. Hard to believe when virtually no institution, government or corporate has it.Reply
It may well be that Cook's stand will soon become unworkable in the US. The US is always at war, after all, at least effectively. I wonder if Apple would just leave. It's already earning ~60% of revenue outside the US, after all. And hey, it's sitting on tons of offshore cash. Maybe it could build its own country on an unclaimed reef somewhere.Reply
Hmmm. If this pans out in Apple's favor, I may finally buy an iPhone.Reply
seems that this forum is 'moderated'. Views that don't kiss up to the self-proclaimed savior of freedom are deleted.Reply
The important lesson here is that it is time to design the next phones so that either it is impossible to install a software update without unlocking the device, or the auto-erase functionality is implemented in hardware.
That way for future phones at least, the issue would become moot: there would be no way for Apple to build and/or install a custom software image that allows brute-force password cracking.Reply
If I were a betting man I would put good money on the bet that a bypass exists and is well known to the government.
What parts of the government is a different matter.
This is a perfect setup. Get all the bad guys to run out and buy iPhones (good for Apple) believing that they are safe from the US surveillance machine.
Then the appropriate agency can slurp up whatever it wants.Reply
I'm a libertarian. But an Islamic terrorist's phone is just evidence - Apple must unlock it for the FBI.Reply
I wonder how much of that was personally written by Tim Cook, vs. various other people within Apple (I'm sure legal, PR, product, etc. all had input, but this feels like something he wrote himself.)Reply
Can someone explain this to me? The FBI requests a new version of iOS to be installed on a single phone that was involved in the attack. What, exactly, does this mean? If the phone is locked, how will they install new software on it without unlocking? People are suggesting an update to iOS that will get pushed-out to all users, but contain a backdoor that is specific to that one particular device -- but how will the new iOS version be installed without unlocking first?Reply
I'm generally not an Apple supporter (I don't like the closed ecosystem), so I am very pleasantly surprised they posted this.
I am quite disappointed that the US courts are trying to force Apple to do this, and in my opinion it's just to use this case to set a precedent.
I hope Apple can't get it to work, but I'd hate to see what the courts would do if that happened.Reply
I think Apple tried to prove that they don't give any user data to agencies. A PR stunt. But they got it fundamentally wrong, as this was actually a case of national security after a real attack. So: a huge PR stunt, but an own goal.Reply
There is one way to brute force an iPhone called IP Box. It's a hardware device which can brute force a 4 digit pin in ~111 hours. http://blog.mdsec.co.uk/2015/03/bruteforcing-ios-screenlock....
But it only works on iOS 8.1 or earlier, was patched in iOS 8.1.1Reply
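The IP Box figure above is easy to sanity-check: exhausting the 4-digit PIN space in roughly 111 hours works out to about 40 seconds per guess, which matches the box's approach of power-cycling the phone after each wrong attempt to dodge the lockout counter. A quick back-of-the-envelope check:

```python
# Sanity check on the IP Box numbers: 10,000 possible 4-digit PINs
# brute-forced in ~111 hours implies roughly 40 seconds per attempt
# (the box reboots the phone after each guess to evade the counter).
pin_space = 10 ** 4
total_seconds = 111 * 3600
seconds_per_attempt = total_seconds / pin_space  # ~40 s per guess
```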
Can't they dump the drive's data to protect it from being erased?Reply
The FBI's intentions are not good. They have abused the data, tech, power they already have at every turn. They fabricate terror and crime, then use it as an excuse to violate our human and constitutional rights. Stopping crime and terror is secondary to undermining our security and maintaining a culture of fear.Reply
The numerical passcode is likely his ATM pin, or a code from his bank/PayPal or some such. I hope the government can simply subpoena his bank/PayPal etc and this will end at that.Reply
It took me a bit, and I believe no one has summarized this very well yet.
FBI: "You've built a device that makes it nation-state-difficult to install custom software without DRM keys. We'd like you to assist us in deploying software signed with your keys."
Apple: "That feels way too much like asking a CA to sign a cert for you, so fuck off."
I'm honestly not sure which side I'm on here.Reply
Techie question: if Apple can compile a neutered version of iOS to bypass encryption, why can't a hacker (or US govt nerd) at least in theory reverse engineer iOS and patch it accordingly?
(guess answer: iOS needs to be signed. So what they are really asking of Apple is to sign a lobotomized iOS image...)Reply
I really hope they actually physically can't access the data on this phone. It's entirely possible this could be the case -- I've been trying to consider the vectors they could use:
- lightning cable delivered iOS patch (probably won't work because iOS won't negotiate over USB until you tap a dialog box)
- OTA update (not connected to internet)
- Cracking open the device and accessing the storage directly (encrypted until boot time)
The most likely vector I can think of:
- Lightning cable delivered iOS patch from a trusted computer (i.e., one that the terrorists actually owned)
It's quite impressive that Apple is taking a stand like this, though perhaps unfortunate timing WRT the larger encryption debate.Reply
I feel like Apple is intentionally oversimplifying it for the purposes of this letter, or maybe to push back on the FBI's ask more easily.
Apple could propose to secure access to the FBI using the same level of security that it uses to protect the access to the phone content for the owner of the phone himself. Tim Cook only talks about one solution of a "tool" that it could install.
If the same level (and method) of security is used then saying that there is a risk of the backdoor being hacked would be equivalent to saying that there is a similar risk of the user access being hacked.Reply
It's hard for me to have respect for an organization that was built by J. Edgar Hoover, a person who did not respect the law or Americans' rights.
The philosophy of corruption and oppression still echoes throughout the FBI. Even today, there are FBI agents that work for private interests. You can't reform a mafia, you must abolish it and start over.Reply
Even though the matters are slightly different, I couldn't help but think that Cook is giving off a Boards of Canada vibe in this post (in a good way).
"Now that the show is over, and we have jointly exercised our constitutional rights, we would like to leave you with one very important thought: Some time in the future, you may have the opportunity to serve as a juror in a censorship case or a so-called obscenity case. It would be wise to remember that the same people who would stop you from listening to Boards of Canada may be back next year to complain about a book, or even a TV program. If you can be told what you can see or read, then it follows that you can be told what to say or think. Defend your constitutionally protected rights - no one else will do it for you. Thank you."Reply
This is quite unlike Apple. Is this the same company that insists on keeping its source proprietary and is always against FOSS? The idea that you care for your users' privacy yet still keep control over them by not giving them the freedom to modify the source code is not something I buy.Reply
I have never been more proud to have worked for Apple. Tim isn't afraid to give the government the old double forks when it counts!Reply
"We have no sympathy for terrorists."
They felt the need to state that, huh?Reply
I can't read it from the letter - are they going to refuse to cooperate? Can they do that?Reply
No sympathy for terrorists, no sympathy for weakening encryption.
I can understand someone outside of tech not understanding how those are comparable statements, but if anything the latter is more important.Reply
Publicizing the case themselves is a very good move.
However, the iPhone of the attacker is an iPhone 5C, which does not have Touch ID or a Secure Enclave. This means that the time between passcode unlock attempts is not enforced by the cryptographic coprocessor. More generally, there's no software integrity protection, and the encryption key is relatively weak (since it is only based on the user's passcode).
The amount of work needed to turn security into good user experience is phenomenal: https://www.apple.com/business/docs/iOS_Security_Guide.pdfReply
Many may hate Apple, but it's undeniable that they're committed to user security.Reply
Cannot be more glad to see Apple's stand on this. Let's not forget what happened with BlackBerry some 5 years ago: India, Saudi Arabia and the UAE got monitoring ability on its platform.Reply
The fact that Apple indicates they would be able to produce such a software version shows that, in effect, a backdoor already exists in the iPhone.Reply
> The San Bernardino Case
We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.
When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
This is just awful. They admit to helping the FBI. How can we trust them?Reply
I'm no security expert, but how would Apple access previously encrypted data with a different version of iOS? Doesn't having that ability imply they already have a "back-door"? Could someone explain what I'm missing here or is it more that that would be a one-off solution and the FBI is asking for a global, remote, no apple needed solution...Reply
Even if Apple created a backdoor, how are they going to install it on a locked phone? Are locked phones able to update without internet access or the user's passcode?Reply
Just so you know: when I forgot the passcode to my iPhone, I remembered that I chose 1 digit from the top row, 2 repeated digits from the row below, 1 from the row below that, and then a zero. So I tried combinations until it said "iPhone disabled, connect to iTunes". That's when I found out you can reset the lockout timer by clicking the backup button on a computer. Therefore you could write a program that tries 5 or 10 passcodes, then starts and deliberately fails a backup of the iPhone. (There is no need for a backdoor or anything fancy.)Reply
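The attack described in the comment above can be sketched as a loop. This is purely illustrative: `try_passcode` and `reset_counter_via_backup` are hypothetical stand-ins for the physical steps (entering a PIN, starting and cancelling an iTunes backup); nothing here talks to a real device, and whether the counter reset actually works on a given iOS version is the commenter's claim, not an established fact.

```python
# Sketch of the counter-reset brute force the parent comment describes:
# try a few passcodes, then trigger a failed backup to reset the
# "iPhone disabled" lockout timer. The two callbacks are hypothetical
# stand-ins for the physical steps on a real device.
from itertools import product

def brute_force(try_passcode, reset_counter_via_backup, batch_size=5):
    attempts = 0
    for digits in product("0123456789", repeat=4):
        pin = "".join(digits)
        if try_passcode(pin):
            return pin
        attempts += 1
        if attempts % batch_size == 0:
            reset_counter_via_backup()  # clears the lockout before it escalates
    return None
```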
As far as I'm aware, the most proper attacks here are, in order of cost:
0) Find some errata. Apple presumably knows as much as anyone except NSA. Have plausible deniability/parallel construction.
1) OS level issues, glitching, etc. if the device is powered on (likely not the case). Power stuff seems like a particularly profitable attack on these devices.
2) Get Apple, using their special Apple key, to run a special ramdisk to run "decrypt" without the "10 tries" limit. Still limited by the ~80ms compute time in hardware for each try.
(vs. an iPhone 5S/6/6S with the Secure Enclave:)
3) Using fairly standard hardware QA/test things (at least chip-level shops; there are tens/hundreds in the world who can do this), extract the hardware key. Run a massively parallel cluster to brute force a bunch of passphrases and this hw key, in parallel. I'd bet the jihadizen is using a shortish weak passphrase, but we can do 8-10 character passphrases, too. They may have info about his other passphrases from other sources which could be useful.
While I'm morally against the existence of #3, I'm enough of a horrible person, as well as interested in the technical challenge of #3, that I'd be willing to do it for $25mm, as long as I got to do it openly and retained ownership. In secret one-off, $100mm. I'd then spend most of the profits on building a system which I couldn't break in this way.Reply
Tim Cook: a really nice guy with blue-whale-sized cojones.
There can be no compromise because China, Syria and Turkey would also lean on Apple to break into phones of dissidents, and pretty soon, future whistleblowers here in US too in order to prevent leaks (iPhone 7 and iCar notwithstanding).
That's the tradeoff in not giving in to faint, vague "maybes" that there was "external coordination", when in all likelihood it was the ultraconservative, Saudi half leading this duo into the kookooland of violent extremism.
The security services will just have to buy exploits, develop malware, cultivate human intelligence sources and monitor everything the old-fashioned way... It's not like that kid in a YouTube video finding a jailbreak exploit for an iPhone and not releasing a tool is going to sit on it, he's going to auction it off to the shop or country with the most $$$.Reply
Surprising the FBI doesn't have a division of highly paid individuals who can crack iPhones... There are plenty of people online with a vested interest in this topic who I'm sure you could hire to help.
My guess is that this is more about pushing back the law and people's rights than it is about getting access to this device.
But then I'm highly cynical about what the government claim they can do with technology for obvious reasons.Reply
Threat to humanity should trump all the garbage Apple and its lackeys are spewing out. Terrorists are not human.Reply
I'm no security expert, but how would Apple access previously encrypted data with a different version of iOS? Doesn't having that ability imply they already have a "back-door"? Could someone explain what I'm missing here or is it more that that would be a one-off solution and the FBI is asking for a global, remote, no apple needed solution...Reply
Very important message!Reply
Their iMessage encryption is fascinating. It basically makes it impossible to retroactively decrypt iMessages. With a court order, they can start MITMing conversations, but unless they intentionally generate a MITM keypair they are cryptographically locked out of the conversation.
http://techcrunch.com/2014/02/27/apple-explains-exactly-how-... (Link to Apple's paper is in the article)
(Yes, Apple could add this key for everybody at the beginning, but if their intention is security then it is a brilliant system.)Reply
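The structural point in the comment above (and the parenthetical caveat) can be shown with a toy sketch. This is NOT real cryptography: the "wrap" below is a stand-in for per-recipient public-key encryption, and the key names are invented. What it illustrates is the fan-out design: one message key, wrapped once per recipient, so a single extra "escrow" entry silently grants a third party access.

```python
# Toy illustration (NOT real crypto) of an iMessage-style key fan-out:
# one random message key, wrapped once per recipient. Adding one extra
# wrap for an "escrow" key is all a surveillance backdoor would take.
import hashlib
import os

def wrap(message_key: bytes, recipient_key: str) -> bytes:
    # Stand-in for public-key encryption: XOR against a hash of the key.
    pad = hashlib.sha256(recipient_key.encode()).digest()
    return bytes(a ^ b for a, b in zip(message_key, pad))

def unwrap(wrapped: bytes, recipient_key: str) -> bytes:
    return wrap(wrapped, recipient_key)  # XOR is its own inverse

message_key = os.urandom(32)
envelope = {r: wrap(message_key, r) for r in ["alice-pubkey", "bob-pubkey"]}

# One extra line is all the escrow design needs:
envelope["escrow-pubkey"] = wrap(message_key, "escrow-pubkey")
```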
What I'm reading is that Apple can remotely install an update that disables encryption. They don't want to do it.
But the fact that they have the capability is a bit scary.Reply
Gotta give it to Apple, they sure know how to pull off a PR stunt.Reply
My guess is that it's likely that the FBI can access the data without Apple's help. Based on what we know, how do we distinguish between these two situations, and which seems more likely?
A) Apple has created unbreakable security. The FBI cannot access the data and needs Apple's help.
B) iPhone security, like all other security, is breakable. iPhones are a very high-value target (all data on all iPhones); therefore some national security organizations, probably many of them in many countries, have developed exploits. The FBI, following normal practice, does not want to reveal the exploits or capability and therefore must go through this charade.Reply
Being realistic, how many fewer iPhones will Apple sell if they remove the Secure Enclave? How many people will not buy an iPhone if they are told that their info can be accessed with a judge's warrant? I'm guessing a 0.1% drop in sales?Reply
Link to the full order:
It is a PDF.Reply
Maybe I am missing something here, but the Washington Post says "Federal prosecutors stated in a memo accompanying the order that the software would affect only the seized phone". What is so wrong with that, if they use it only on this phone? Or is the problem that the weapon, once created, could be used elsewhere?Reply
Since Apple is part of PRISM, the FBI can just ask the NSA.Reply
> Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
They really need to put that paragraph closer to this one:
> The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
The first paragraph without the second implies that iOS isn't actually secure at all.Reply
Apple's stand is a bunch of BS. The main issue should be the safety and protection of humanity. Terrorists are not human; they are bent on destroying humanity.Reply
Am I wrong to think that this brute forcing could still be applied when the raw memory chip is taken off the iPhone? The wipe-all-data feature requires write access to the chip plus some intelligence and monitoring. These capabilities should be physically removable from the actual memory chip, right?Reply
This is why technology companies have to go farther than implementing proprietary security systems: They have to put the capability to circumvent security out of reach of themselves.
Real data security has to be a mix of services that are friendly to reliable key exchange and strong unbreakable encryption, and verifiably secure endpoint software, which in practice means open source software where the user can control installation, that implements encryption.Reply
Given the way a lot of people (and the media) tend to go completely bonkers when somebody says "terrorist", this is commendable.
It remains to be seen, though, what Apple will actually do, in legal terms. Will they flat-out refuse to cooperate, even if this means that they will be fined or Mr. Cook will be imprisoned for contempt or something like that? Will they actually send their lawyers to challenge the court decision? That would be very interesting to watch, and if they succeeded, it would create a precedent for a lot of other companies. But so would their failure.Reply
Am I the only one buying a new iPad because of this announcement?Reply
Cry, US Government!Reply
Does anyone have a decent architectural overview of iPhone (6) security? These enclaves etc. sound good, but the devil is in the details.Reply
Very impressive letter. They've expressed their position in language that a layman can understand, there's abundant evidence that they respect the intent of the law authorities, and even clearer evidence that they are drawing a line in the sand based on their principles. They will protect their customers.
I wish more companies could speak so clearly and courageously.Reply
I know that because this is on Hacker News, everyone is talking about whether it's possible to access the data, and if possible, how easy or hard it is. But the focus should be on whether there should be a clear line/understanding between security and privacy, or whether we should keep everything black and white as it is now, just looking at extreme cases.
If they cannot co-exist, I'd rather have more security and less privacy. But ideally, I shouldn't have to choose between them.Reply
All of a sudden I'm starting to think my PiPhone is looking pretty good.Reply
A possible compromise would be to add a backdoor to the security module that would unlock the phone in exchange for a proof of work.
It would be relatively easy for the chip to offer a challenge and accept, say, a $100,000 proof of work to unlock the phone. This way, we prevent bulk surveillance but still allow the government to access high value targets' devices.Reply
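The proposal above is essentially hashcash applied to device unlock, and it can be sketched in a few lines. The difficulty parameter here is a toy value chosen so the example runs quickly; a real "$100,000 of compute" target would be enormously higher, and mapping dollars to hashes (plus making the scheme tamper-proof inside the secure chip) is the hard, unsolved part of the idea.

```python
# Hashcash-style sketch of the proof-of-work unlock proposal: the chip
# issues a random challenge and only unlocks when shown a nonce whose
# hash clears a difficulty target. DIFFICULTY_BITS is a toy value; a
# real dollar-denominated target would be far higher.
import hashlib
import itertools
import os

DIFFICULTY_BITS = 16  # expected ~2**16 hashes of work; illustrative only

def meets_target(challenge: bytes, nonce: int, bits: int = DIFFICULTY_BITS) -> bool:
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") >> (256 - bits) == 0

def solve(challenge: bytes) -> int:
    # The unlocker burns compute searching for a valid nonce.
    for nonce in itertools.count():
        if meets_target(challenge, nonce):
            return nonce

challenge = os.urandom(16)  # issued fresh by the chip, so work can't be reused
nonce = solve(challenge)
```

Because the challenge is random per unlock, the work cannot be precomputed or amortized across devices, which is what makes bulk surveillance uneconomical while a targeted unlock stays merely expensive.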
This is actually the result of a barter. The Gov gets to have some low level TOP-SECRET access in trade for this easy access code and that Apple gets to go public to keep the populace calm and pretend they are fighting this thing.Reply
What I think Apple should (also?) do is appeal to both the law enforcement themselves, and the government - basically go "All secret communications from law enforcement and government figures - up to the President - would be at risk", or something to that effect.
I doubt the ones giving these orders would be comfortable with their own privacy being at risk.Reply
It makes sense for them.
If they put a backdoor in iPhone for US government, they are effectively thrown out of Chinese market.
Interestingly enough, what will Apple do if the Chinese government demands they decrypt or add a backdoor in exchange for staying in the market?Reply
This was his employer's phone, right? As in, it was government-owned property being used in the course of terrorism. Were they using Apple's Mobile Device Management (MDM) framework or some other form of key escrow? If not, why should Apple bail out a government entity, at the expense of its own customers and security, that couldn't even be bothered to follow best practices?Reply
This is ugly. If Apple can indeed break into the phone, they need to say, "We have to stop production now. All of our engineers will need to be behind this. It will cost us at least a billion dollars, if we can do it. We will miss deadlines for new products and software. Write us a check for $1 billion and we will start on it. We may need a few billion more. Write the check - we'll do what we can do. And lets hope we don't accidentally destroy the evidence doing it."Reply
It bothers me that Tim Cook lied: he stated in his open letter that if they provided the modified OS it could be used on other phones, but the court order specifically says Apple should make the software only work on the specific phone in question.Reply
Remember, iPhones are available worldwide. If the US wants to play world police, then I want a vote in the US election.Reply
I'm really impressed that Apple is standing up to the government and protecting its users' rights. I've never really considered the iPhone worth the premium price tag, but policies like this have changed my mind.
Could someone answer a question I have though? The government wants Apple to create this backdoor and tailor it to the specific device, so presumably it will have a line that goes
if (!deviceID.equals("san_b_device_id")) return;
To make the backdoor general purpose, this line would need to be removed. But doing so would invalidate the signature, and it can't be re-signed afterwards because the attacker won't have Apple's signing key. So is the open letter a matter of principle that they won't build any backdoor, now or in the future, rather than a specific concern about this backdoor?Reply
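The signature argument in the comment above can be demonstrated with a toy, stdlib-only sketch. HMAC here is a hypothetical stand-in for Apple's real code-signing scheme (which uses asymmetric keys, not a shared secret); the point it illustrates is only that the device-ID check is inside the signed blob, so editing it out breaks verification, and only the signing-key holder can produce a new valid signature.

```python
# Toy demo (HMAC standing in for Apple's real code signing) of why the
# device-ID check can't simply be patched out: the check is part of the
# signed image, so any edit invalidates the signature, and an attacker
# cannot re-sign without Apple's key.
import hashlib
import hmac

APPLE_KEY = b"apple-private-signing-key"  # the attacker never has this

def sign(image: bytes) -> bytes:
    return hmac.new(APPLE_KEY, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(image), signature)

image = b'if (!deviceID.equals("san_b_device_id")) return; ...rest of iOS...'
signature = sign(image)

# Stripping the device check changes the bytes, so the old signature fails:
patched = image.replace(b"san_b_device_id", b"any_device_____")
```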
Apple should be clearer that this is a 5C and not the latest model.Reply
I don't quite understand - what is the actual purpose of being able to push a new version of iOS while locked? Apple don't seem to use this - people stick to whichever version they're comfortable with on old devices and accept whatever limitations.. so why does the functionality even exist?
Even with the restriction of being plugged in, outside of Apple who needs to push iOS versions at tethered devices and will be hindered too badly by having to unlock them first?Reply
This is a very disappointing letter for me. It means that Apple can indeed build a backdoor into existing phones; they just don't want to do it (or so they say). I was under the impression that Apple employs security hardware which protects the keys and makes it impossible to penetrate that defense. If that's not the case, iOS security is not as good as it could be.Reply
"Backdoor" is somewhat of a misconception. What they want are two front doors: we encrypt your message with the recipient's public key, and we make a copy encrypted with our (in this case Apple's) public key. We send both messages over the internet, and Apple or your ISP/cell service provider (we can also assume NSA PRISM has it too) stores the Apple-keyed message, or both. When the government wants access, they can issue a subpoena to the ISP/cell provider for the encrypted data (or just download it from Saratoga Springs), then issue a warrant to Apple to decrypt it with their private key. This is likely the only reasonable and responsible outcome I can see resulting from this debate. Or, pessimistically, it becomes political fodder and we leave it up to politicians who have little to no understanding of the technology to devise some technologically inept solution.Reply
Is there any doubt that when the FBI brings up a law from the 1700's to justify breaking digital encryption in 2016 that they are completely making it up as they go along?Reply
Wouldn't many countries like Russia and China stop allowing the sale of iPhones or at least their use by government officials if the FBI succeeds?Reply
The court order was posted on HN hours before this letter, and either Tim Cook has not read the order or he's lying about the backdoor. What the court ordered was the removal of the auto-wipe.Reply
Wow this made my day. I think my faith in Apple's privacy concerns got a much needed revitalization. Privacy and encryption are the number one reason I stick with iPhone and Mac with File Vault. It was always hard to completely trust them after PRISM. However, that was arguably a different Apple.
This stance against the government reaffirms my faith in the genuineness of Apple's encryption efforts and in Tim Cook specifically.Reply
it's a slippery slope.Reply
There's a simple way to defeat Apple's argument. The judge could simply ask Apple to flash the new firmware on that phone, let the FBI run the brute force under their supervision and obtain the contents they need, and then flash back a non-compromised version of the OS.
The government would never have access to a phone with a compromised version of an OS that they could use to repeat this trick. Rather, the government would have to obtain court orders and have forensics done under supervision.
This isn't a backdoor and doesn't affect consumers, and it sets a really high bar for any attempt to scale this for the government, because it requires Apple as the gatekeeper to agree to do the one-off hack every time.
The cynic in me thinks that this letter is more about brand image. Apple wants to claim they can't hack their own phones, even if the government asks, but clearly in the case of the iPhone 5C it IS possible for them to do it, and this creates a contradiction with their public marketing and privacy position. If they didn't release this open letter, then simply complying with the judge's order would make them look bad.Reply
This is the sort of thing that a professional organization - like what medical doctors have - could help with. Let me explain.
The court order gives Apple an out: "To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief".
Now, imagine if this was court ordering a company to engage in unethical medical procedures, rather than unethical software development. The professional medical community would sanction doctors that cooperated and support those that stood by their ethical principles and refused to cooperate. If there was a similar professional organization for software development, Apple could reasonably rebut that telling their engineers to work on this would be unreasonably expensive (since they'd expect to fire people or have them resign over it).
This is another avenue for fighting the order - have a good chunk of Apple's engineering department sign an open letter saying that they'd resign before working on that project. The incentives seem like they'd work for making it a thing.Reply
Actually, someone other than Apple is already able to do the things requested in the court warrant (brute force the passcode on a locked iPhone) - ih8sn0w has an iBoot exploit for the A5 chipset (same as the iPhone 5c), so he can probably boot an unsigned kernel and use some public tools already published to crack said passcode. If some lone hacker can do it, don't be fooled for a minute that the NSA can't, or that the feds couldn't buy something similar from another hacker. This is Apple covering their ass with the press.Reply
Well, they seem to be saying that the approach they describe, making a modified OS, would actually work to circumvent encryption on a preexisting device. That means they already know the device is not actually secure.
They aren't talking about putting a back door into systems to be used in the future, they are saying it's indeed feasible to place a backdoor on a device already out there and then use the backdoor to access the device. That means the device is not actually secure.Reply
Mods: can you please update title to add some context?Reply
Is it possible for a human just to try all 10,000 passcode combinations? Assuming the 10-failure erasure is switched off -- a bad assumption, I know. Is there an additional slowdown after a lot of failed attempts?Reply
Link to the FBI order: https://assets.documentcloud.org/documents/2714001/SB-Shoote...
(Edit: deleted part where I was wrong. Thanks robbiet480 for correcting me. It's 2am here and I was tired.)
Also, prediction: if Apple refuses to build a brute forcer, someone else will do it and sell it to the FBI. Just wait and watch.Reply
There needs to be a distinction between state security and "retail" security. State security agencies have the legal framework to compel Apple to do anything and not even talk about it. What I call "retail" security is any act by any law enforcement agency in the country. Their requests are bound to be large in number and for all kinds of things. On top of that, these requests, apparently, are not yet covered by a legal framework. Hence the need to fall back on an old law to try and make Apple comply.
What's at stake for Apple is not only their principles but also one of their marketing pillars: "you, the user, can trust us with your data/privacy." By asking Apple to give that up, and quietly, you are actually asking them to undermine their business model. Shareholders will not appreciate it if they don't get a chance to hear about it first. The Apple brand would lose some of its value, and that would reflect in the AAPL share price.
My point is that the whole thing needs to have legal backup. And Apple is asking for this exact thing: give me a law to use. And not something from the 1700's.Reply
I always wondered why Apple took so much trouble with the secure enclave design, I thought it was really overkill, now I see it was really necessary for instances like this.Reply
So only Apple has the ability to do this...not the US government. So we trust Apple but not USG?Reply
Kudos to Apple for standing up to the US government and standing by its users.Reply
The All Writs Act is a United States federal statute, codified at 28 U.S.C. § 1651, which authorizes the United States federal courts to "issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law."
well, as far as I can see, it is not agreeable to the usages and principles of law to force a company (or a person, since corporations are people) to spend money and waste resources to compromise its own security systems, which happens to be something they morally object to.Reply
"For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them."
Sounds just like gun control :)Reply
And the government wonders why people from tech don't want to work for it.Reply
Privacy is obviously the foremost issue at hand with the Government's request here, but there is also a huge potential impact on the future of the iPhone software. There is a huge difference between granting access to a user's data at the Government's request vs demanding a customized build of the iPhone's OS. Imagine the long-term implications of having a third-party tether its misaligned feature requests to every OS update that the iPhone makes. What would be the continued relationship with Apple and the agency behind this? Would this evolve into something analogous to HIPAA compliance?Reply
If you look at past cases where the All Writs Act has been invoked, the Courts have rejected this type of government conscription.
Effectively, the government is forcing Apple to take receipt of a device that it does not own or possess, perform potentially destructive services on that device, and then perform services that could potentially require Apple to testify at trial under the Confrontation Clause of the Sixth Amendment.
I really think that Apple's in the clear here, and the AUSAs in the case are pulling out all the stops to get Apple ordered to break the encryption.Reply
> People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes
I wonder if this is a grammar mistake, or whether Apple actually considers the private conversations, notes, and photos to be theirs?Reply
Massive props to Apple, again I am impressed by their commitment to customer privacy.Reply
I think it's outstanding that Apple is standing up for this.
Will they, can they, do anything about data in iCloud as well? While you can turn off iCloud, I'd guess the majority of people are using it. Given that you can access much of it at iCloud.com, it would seem that whether or not you can unlock an iPhone, most customers' data is available directly from Apple. Mail, notes, messages, photos, etc. No idea about other apps' data that gets backed up.
Again, I'm applauding Apple for standing up for encryption. If they could somehow offer the same on iCloud I'd switch from Google (Google is not encrypted either. My point is I'd switch to a service that offers it)Reply
Can someone explain this to me: if the data is encrypted, how does switching the operating system out enable one to read the data? I'm a layman in this area but I can only surmise that the data is stored unencrypted and it's the operating system itself that's somehow locked. If a change of operating system can open up encrypted data, then what's the point of encrypting hard drives or data sent over a network?Reply
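The short answer to the question above: the data really is encrypted, but the key is derived from the passcode entangled with a device-unique hardware secret, and the *operating system* is merely what enforces retry limits and auto-wipe. Swap in an OS without those limits and a short passcode falls to on-device brute force. A sketch under those assumptions (the names and KDF parameters here are illustrative, not Apple's actual scheme):

```python
# Why replacing the OS helps: the ciphertext alone is useless off-device,
# because the key mixes in a per-device hardware secret (the "UID"). But
# once the OS-enforced guess limits are gone, a 4-digit code is trivial.
import hashlib

DEVICE_UID = b"burned-into-silicon"   # assumption: per-device hardware secret

def derive_key(passcode: str) -> bytes:
    # Stand-in for the real hardware-entangled key derivation.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 1000)

real_key = derive_key("4471")         # hypothetical user passcode

# With no retry limits or wipe, enumerate every 4-digit code on-device:
found = next(f"{i:04d}" for i in range(10_000)
             if derive_key(f"{i:04d}") == real_key)
assert found == "4471"
```

So hard-drive and network encryption are still worthwhile: there the attacker has no device to run guesses on, and the key space of a real key (not a 4-digit code) can't be enumerated.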
Wow. This is the first HN submission to exceed 5,000 points!
To honour Tim, and his advocacy for our industry, I'm going to spend the rest of my week developing privacy/security projects. I encourage everyone else to do likewise.Reply
This is all brave talk until they publicly say the same thing to China, until then this political bluster. http://qz.com/332059/apple-is-reportedly-giving-the-chinese-...Reply
I think there are two orthogonal questions:
* Does Apple claim that the FBI cannot access its devices?
* Can the FBI actually access its devices?
The only thing we learn here is the answer to the first question. We learn nothing about the second.Reply
What changed so that companies can now talk about these government requests? The most nefarious thing about these government orders about terrorism is that the companies were forbidden to discuss them publicly.Reply
Possible or not, the FBI seems to have formalized the issue using this opportunity. They are asking the questions they have been wanting to ask since the release of smartphones.Reply
Cook says any iOS device could be breached if this software were created. But other articles have led me to believe that any iOS model with touchid is immune due to the secure enclave being in play even for non-touch passcode access. Is this wrong?Reply
This is interesting:
"Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession."
Am I reading this right? Apple, if they chose to, could make a version of iOS that disables security features and encryption and load it onto an existing phone even though the phone is locked and encrypted?Reply
a message to everyone:
fuck america, and fuck all you idiots.Reply
I see a lot of discussion about "Secure Enclave" and other hardware security features and such, and I'm not sure I see the relevance. Assuming that the data has already been properly encrypted, stored on disk, and purged from memory (by shutting down the phone) by a version of iOS that did not already contain a backdoor when the data was encrypted, there's no magic combination of hardware and software that can decrypt that data without the password, right? This seems to be supported by Apple's claim that the best they could possibly do is provide a channel for the FBI to brute force the password.
So am I missing something that makes the iPhone's internal security architecture relevant here?Reply
It is worth pointing out one salient fact: the phone in question did not belong to the shooter, it belonged to the shooter's employer, which in this case is the county government. That makes Apple's position much less tenable because the owner of the phone is (presumably) consenting to -- maybe even actively encouraging -- the recovery of the data.Reply
Can't they dump all the data from that particular device and then send it to the FBI? Maybe the judge will order that? Obviously they're confessing they can break the encryption but they would not do it, on principle. I don't see how they can win this fight. If it's the iphone of the shooter, and they can decrypt it, they should do it. It is not the same as to give the FBI a tool to unlock any iphone.Reply
Just unlock the freaking phone for them...Reply
Apple's OSs are closed and therefore inherently insecure. When Apple caves, and it undoubtedly will, it will have the beneficial consequence of being a boon to open source communications software.Reply
Dear Tim Cook,
> "Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession."
So there is already a backdoor. Apple are refusing to let the FBI use the backdoor.
The backdoor is the fact that Apple can push a firmware update that disables security features, without the device's password being entered at any point.Reply
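The point above can be made concrete: the device's boot chain checks only that firmware carries a valid vendor signature; the user's passcode plays no part in that decision. A toy model, with HMAC standing in for the real RSA-based signature scheme and every key invented for illustration:

```python
# Toy model of "the signing key IS the backdoor": a locked device will
# boot any image the vendor signs, with or without the retry limits.
import hmac, hashlib

VENDOR_SIGNING_KEY = b"hypothetical-apple-private-key"

def sign(firmware: bytes) -> bytes:
    return hmac.new(VENDOR_SIGNING_KEY, firmware, hashlib.sha256).digest()

def device_accepts(firmware: bytes, sig: bytes) -> bool:
    # Note: no passcode appears anywhere in this check.
    return hmac.compare_digest(sign(firmware), sig)

hardened = b"iOS with retry limits and auto-wipe"
weakened = b"iOS with retry limits removed"

assert device_accepts(hardened, sign(hardened))
assert device_accepts(weakened, sign(weakened))   # accepted just the same
assert not device_accepts(weakened, sign(hardened))
```

That is why the argument is about whether Apple will *use* the signing key this way, not about whether the capability exists.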
Would this really be true or is this just a decoy, to let you believe there is no backdoor?
I do believe there is no backdoor for when a city court requests it, but i don't really believe that the FBI or CIA doesn't have access to it.
Considering that the iPhone has already existed for a long time, they must have some means to backdoor the "iCloud"...Reply
I see a lot of people saying they're impressed, admired, etc. at Apple for doing this.
It's not about giving props: Apple is not doing this out of goodwill, or because they believe in protecting privacy. Apple has a competitive advantage over Google/Facebook in that its business model does not depend on violating its customers' privacy.
They are just exploiting that competitive advantage.Reply
I wonder why no one pointed out that privacy boils down to trust:
That letter might be the truth or could be some kind of decoy. Maybe the backdoor will come and Apple knows that already and they try to limit the damage to their brand.
Like "we tried to resist having a backdoor installed, but we couldn't do it ultimately".Reply
Cook wrote that "this software ... would have the potential to unlock any iPhone in someone’s physical possession." (emphasis mine)
Is that true? What if it's locked with a secure 128-bit (e.g. 10-word diceware) passphrase?Reply
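The arithmetic answers the question. A 10-word diceware passphrase drawn from the standard 7,776-word list carries about 129 bits of entropy, far beyond brute force even with unlimited guessing speed:

```python
# Entropy of the commenter's example: 10 diceware words, 7,776-word list.
import math

words, list_size = 10, 7776
bits = words * math.log2(list_size)            # ~129 bits

# Even at an (optimistic) billion guesses per second, the expected
# search time dwarfs the age of the universe.
guesses_per_s = 1e9
expected_years = (2 ** bits / 2) / guesses_per_s / 3.15e7
assert expected_years > 1e20
```

So no, the "any iPhone" claim really applies to short numeric passcodes; the software in question only removes guess limits, and removing limits doesn't help when the key space itself is astronomical.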