On Tuesday, the word spread that Apple’s latest release of its operating system for Mac computers, MacOS High Sierra, had a terrible, dangerous problem. Anyone could sit down at any Mac computer and in seconds log in to an administrator account, with full permission to install programs, change passwords, read files – to do anything to the computer. No password necessary, no hacking, no clever programming, no technical skills required. Anyone with physical access to a Mac could get into the computer and wreak havoc.
It was a bug. It was a mistake. It was a huge, unforgivable, embarrassing mistake. For many reasons I would call it one of the worst security blunders in modern technology. Every Mac running High Sierra was completely vulnerable, and Apple fixed the mistake only after a random Twitter user pointed it out.
Apple dropped everything to try to get a handle on this disaster. On Wednesday morning, less than 24 hours later, it released a patch that installs automatically on Macs worldwide, along with an apology: "We greatly regret this error and we apologize to all Mac users, both for releasing with this vulnerability and for the concern it has caused."
If you’re a Mac owner, don’t panic. The fix will reach your computer soon, hopefully before someone with bad intent sits down at it.
I feel for Apple. Mistakes happen. This one hits particularly hard because Apple has built its reputation on security and privacy. For twenty years it has taunted Microsoft because Windows computers are more vulnerable to hackers and viruses – and then Apple left its own computers open and unlocked. This will hurt Apple’s reputation, depress its stock price, and shake confidence in the company.
There’s a bigger point, though.
The case against a “golden key”
For several years Apple has encrypted iPhones so that information cannot be retrieved from them without the user’s password, PIN, or fingerprint. In the encryption world, those things are the “private key” that must be supplied to decrypt the data on the phone.
Apple does not have the private key for your iPhone. It can’t decrypt it. Well-designed encryption cannot be broken without the key. Apple does not have a copy in the cloud of everything on your phone, and it can’t get into your phone without your password, your PIN, or your fingerprint.
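The idea is simple enough to sketch in code. The toy Python below (illustrative only – real devices use hardware-backed AES, not this homemade XOR cipher, and the function names are my own) shows why the passcode matters: the encryption key is derived from the passcode itself, so a key that was never stored anywhere cannot be handed over.

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Stretch the passcode into a 32-byte key. Without the passcode
    # (and the per-device salt), this key cannot be reproduced.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: hash the key with a counter to make a
    # keystream, then XOR it with the data. Encryption and
    # decryption are the same operation.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

salt = os.urandom(16)
key = derive_key("1234", salt)   # the key exists only when the user enters the PIN
ciphertext = xor_stream(key, b"my private messages")

# The right PIN recovers the data:
assert xor_stream(derive_key("1234", salt), ciphertext) == b"my private messages"
# The wrong PIN yields garbage -- there is no shortcut:
assert xor_stream(derive_key("0000", salt), ciphertext) != b"my private messages"
```

Nothing in this design gives the manufacturer a way in: the only path to the plaintext runs through the user’s passcode.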
There are a couple of reasons that Apple has designed its phones this way: (1) it sincerely believes this is the best policy for society and its users; and (2) it allows Apple to market itself as the company most concerned with your privacy, which has been a very successful marketing pitch. (Google reportedly will include similar encryption in a future Android release.)
This drives law enforcement crazy.
In the wake of the 2015 San Bernardino massacre, the FBI could not decrypt the suspect’s iPhone, so it turned to Apple and demanded that it break the device’s encryption. Apple said it couldn’t do it. The FBI backed down in that case after much time in court and much discussion in the media, but law enforcement officials have complained ever since about the difficulty of investigating crimes when they are not able to obtain information from the devices that bad guys use to communicate, plan, and coordinate their evil plans. Last month Deputy Attorney General Rod Rosenstein put it this way:
“The advent of ‘warrant-proof’ encryption is a serious problem. . . . Our society has never had a system where evidence of criminal wrongdoing was totally impervious to detection, especially when officers obtain a court-authorized warrant. But that is the world that technology companies are creating. Those companies create jobs, design valuable products, and innovate in amazing ways. But there has never been a right to absolute privacy. Courts weigh privacy against other values, including the need to solve and prevent crimes. . . . Warrant-proof encryption defeats the constitutional balance by elevating privacy above public safety. Encrypted communications that cannot be intercepted and locked devices that cannot be opened are law-free zones that permit criminals and terrorists to operate without detection by police and without accountability by judges and juries. When encryption is designed with no means of lawful access, it allows terrorists, drug dealers, child molesters, fraudsters, and other criminals to hide incriminating evidence.”
That has led to repeated calls for a “golden key” that could be used by police to decrypt a smartphone. The argument is that the phone will still be encrypted for all purposes, just like today. But in the right circumstances, with a warrant, law enforcement officials could go to a secret bunker and use the golden key to unlock the phone. The Washington Post called for a magic golden key in a widely discussed editorial in 2014. Rosenstein was essentially calling for a golden key last month. There are frequent calls for Congress to pass a law requiring technology companies to design their devices with a back door.
The problem is obvious: if the golden key were stolen, or hacked, or leaked, then every smartphone in the world would instantly be insecure. As Gizmodo put it: “There is no way to put in a backdoor or magic key for law enforcement that malevolent actors won’t be able to abuse.”
That’s why the technical and security communities react so negatively to the golden key proposals. They understand that any back door used by the good guys could also be used by the bad guys to compromise all of our security.
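The structural flaw is easy to demonstrate. In the toy key-escrow model below (a sketch of the general idea, not any real proposal – the names and the XOR "wrapping" are my own simplifications), every device key is also stored wrapped under one master key, so a single leak of that master key unlocks every device at once.

```python
import hashlib
import os

MASTER_GOLDEN_KEY = os.urandom(32)   # the hypothetical escrow key

def wrap(device_key: bytes, golden_key: bytes) -> bytes:
    # Toy "key wrapping": XOR the device key with a pad derived from
    # the golden key. XOR is its own inverse, so the same function
    # both wraps and unwraps. Anyone holding the golden key can unwrap.
    pad = hashlib.sha256(golden_key).digest()
    return bytes(a ^ b for a, b in zip(device_key, pad))

# Every phone ships with its own key, plus an escrowed copy:
phones = {name: os.urandom(32) for name in ["alice", "bob", "carol"]}
escrow = {name: wrap(k, MASTER_GOLDEN_KEY) for name, k in phones.items()}

# One stolen master key unlocks every device at once:
stolen = MASTER_GOLDEN_KEY
for name, wrapped in escrow.items():
    assert wrap(wrapped, stolen) == phones[name]
```

The individual device keys never had to leak; compromising the single escrow key was enough. That asymmetry – one secret guarding everything – is the whole objection.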
But a golden key would only be dangerous if it got in the hands of the bad guys. Surely the tech companies could keep it safe, right? At this point law enforcement officials make vague hand-waving motions, and the Washington Post says the tech companies just have to use their “wizardry.”
That brings us back to the lesson of this week’s Mac security flaw.
Apple accidentally left every Mac computer in the world unsecured.
In 2016 Microsoft accidentally leaked its keys for Windows tablets, phones, and other devices.
Last month hackers working for the Russian government stole a trove of NSA hacking tools and highly classified files.
By definition a golden key for phones would unlock all the phones. This isn’t a request for Apple to keep a super-secret private key that only unlocks your house. This is a super-secret private key that unlocks every door on the planet. It can be stolen, and when it’s stolen it can be duplicated effortlessly so every bad guy in the world has a copy. From a security conference in 2015: “In the domain of cybersecurity and encryption, the bad guys are just as smart as the good guys. Their tradecraft is focused on identifying and exploiting vulnerabilities. If there is a back door, they will find it and exploit it. At the same time, it’s hard to imagine that government agencies, which are regularly breached, could be trusted to keep such a golden key safe from hackers and criminals.”
A golden key is one of the most dangerous things that can be imagined for your security, because mistakes happen.
Apple proved that this week.