Apple and Encryption

If you listen to the news, you’d be forgiven for thinking the recent Apple court order and the legal challenge Apple filed in response are about terrorism. They’re not.

It’s about encryption, and the story starts long before San Bernardino. It starts with iOS 8. With the release of the eighth version of Cupertino’s mobile operating system, Apple added a significant new feature: proper key management for whole-disk encryption. The system had disk encryption before, but since Apple’s servers held the keys, it wasn’t secure. Anyone with access to those servers, be it a government official, a disgruntled employee, hackers, or a foreign espionage agency, could decrypt anyone’s phone.

This is bad security practice. In the industry, it’s known as a “single point of failure.” Keeping millions of crypto keys on one server is a risk. Keeping that server (no matter how well firewalled) connected to the network is a ridiculous risk. Apple realized this and fixed its policies.

So now, the crypto keys for an iOS device never leave that device. Which means that if you forget your passcode, Apple can’t help you. The data is locked behind a layer of strong encryption which can’t be feasibly broken (given the current state of declassified encryption science). And this is a good thing; this is whole-disk encryption working as designed.
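To make that design concrete, here’s a rough sketch in Python. The names, the PBKDF2 choice, and the iteration count are my illustration, not Apple’s actual implementation; on a real device, the derivation is entangled with a hardware key fused into the silicon, which is exactly why the keys can’t leave the phone.

```python
import hashlib

# Hypothetical per-device secret. On real hardware this is the UID key,
# fused into the chip at manufacture and unreadable even by Apple.
DEVICE_UID = bytes.fromhex("2b" * 32)

def derive_disk_key(passcode: str, device_uid: bytes = DEVICE_UID) -> bytes:
    # PBKDF2 stands in for Apple's proprietary KDF. A high iteration
    # count makes each individual guess deliberately slow.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

print(derive_disk_key("1234").hex())
```

Because the device secret never leaves the hardware, the key derivation (and therefore any guessing) can only happen on the phone itself.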

So what is the case about, and what has the FBI asked Apple to do? There is one hole in the system: the crypto key is secured with the phone’s passcode, and the passcode can be attacked by brute force (trying every possible combination until one works). A four-digit lock code has only 10,000 possibilities. While that sounds like a lot (especially if you’re entering them manually), with automation it’s trivial to try them all. This is what the FBI wants to do.
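Here’s how small that search space really is, as a toy sketch (the SECRET value and the check callback are my stand-ins, not anything from a real phone):

```python
SECRET = "4831"  # hypothetical target passcode

def brute_force(check):
    # Walk every four-digit code, "0000" through "9999".
    for n in range(10_000):
        code = f"{n:04d}"
        if check(code):
            return code
    return None

# On a real device, check() would derive a key from the guess and see
# whether it decrypts a known block; here it's a plain comparison.
print(brute_force(lambda code: code == SECRET))  # finds "4831" near-instantly
```

Even if each guess took a full second, exhausting all 10,000 codes would take under three hours.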

However, to prevent people who steal phones from doing the same thing, the system is programmed to wipe the encrypted data after ten failed passcode attempts. The court order asks Apple to disable this protection. The FBI proposes that Apple author an OS update, sign it with Apple’s private key (required so that the phone will know Apple really released the update, not some random hackers), and load it onto the phone. Apple (rightly) refuses.
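A sketch of the policy in question, under my own assumptions about its shape (this is not Apple’s actual code):

```python
MAX_ATTEMPTS = 10

class PasscodeGate:
    def __init__(self, correct_code: str):
        self._code = correct_code  # stands in for the wrapped disk key
        self._failures = 0

    def try_unlock(self, guess: str) -> bool:
        if self._code is None:
            raise RuntimeError("key material erased; the data is gone for good")
        if guess == self._code:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self._code = None  # the "wipe": destroy the key, not the ciphertext
        return False
```

The update the FBI wants would simply remove that failure counter, which is exactly what makes the brute force described above practical.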

The FBI says this is a one-time deal. Everyone else (Apple, Google, Facebook, me, hopefully you) knows this isn’t the case. If this technology exists, it will be used. The FBI will ask again; other agencies will ask; local police will ask; foreign governments will ask. Eventually it will be leaked, or hackers or foreign intelligence agencies or the NSA will steal it, and then every iOS device will be insecure.

The government has periodically tried to insert backdoors into crypto systems, arguing that law enforcement needs the capability, that only the “good guys” will use it, and that the “law-abiding citizen” has nothing to fear. This ignores the facts: it’s technologically impossible to give them what they want without compromising the system for everyone.

In short, the crypto works. The FBI cannot crack it. The NSA cannot crack it (or won’t admit in open court that it can). Apple cannot crack it. Given the NSA’s history of warrantless wiretapping and intentionally weakening crypto systems, this is as it should be.

Author: karatorian

