
Apple Exposes iOS Security Details

Yes, Apple's Ivan Krstic announced a new bug bounty program at Black Hat. But before that, he explained several components of iOS security in unprecedented detail.

You've heard by now that Apple announced a new bug bounty program at the recent Black Hat conference. In an unusual appearance, Ivan Krstic, Apple's head of security engineering and architecture, made the announcement himself. But that was just the last 10 minutes of a 50-minute presentation. For the first 40 minutes, Krstic took an unprecedented deep dive into three components of iOS security. And by deep, I mean bathyspheric.

My overall takeaway was a sense of awe at how thoroughly these systems protect user data, even from Apple itself. I'll try to convey what was said, without getting too technical.

Hardened WebKit JIT Mapping
Sorry, that's what it's called. JIT stands for Just In Time, and refers to the way JavaScript code is compiled just in time for its execution. "This is necessary for high-performance JavaScript," explained Krstic. "But the code-signing policy has to be relaxed. The JIT compiler emits new, unsigned code. An attacker who managed a write-anywhere attack could enable execution of arbitrary code."

For a little background, areas of memory can be marked with read, write, and execute permissions. That distinction, introduced ages ago, snuffed out attacks that executed code in areas devoted to data. Briefly, Apple's solution involves a technique that puts the compiled JavaScript into a memory area that permits only execution. Processes can't read what's there or write new data. There's a bit more to it than that, but this change, specific to iOS 10, wipes out a whole range of possible attacks.
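For the curious, here's a minimal C sketch of one way to square "execute-only" with a compiler that still has to write code somewhere: map the same physical pages twice, giving the compiler a writable view and the CPU an executable view. This is my illustration of the general technique, not Apple's implementation; true execute-only memory (with no read access at all) needs hardware support that plain POSIX mmap can't express.

```c
#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    // Toy dual-mapped JIT region: two virtual views of one physical page,
    // so no single address is ever both writable and executable.
    size_t size = 4096;
    int fd = shm_open("/jit_demo", O_CREAT | O_RDWR, 0600);
    if (fd < 0) return 1;
    shm_unlink("/jit_demo");              // drop the name; the memory lives on
    if (ftruncate(fd, (off_t)size) != 0) return 1;

    void *write_view = mmap(NULL, size, PROT_WRITE, MAP_SHARED, fd, 0);
    void *exec_view  = mmap(NULL, size, PROT_READ | PROT_EXEC, MAP_SHARED, fd, 0);
    close(fd);
    if (write_view == MAP_FAILED || exec_view == MAP_FAILED) return 1;

    // The JIT compiler would emit machine code through write_view, then the
    // process would jump into exec_view to run it. iOS 10 goes further: its
    // executable mapping can't even be read, and the writable view's address
    // is kept secret from the rest of the process.
    printf("write view %p, exec view %p\n", write_view, exec_view);
    munmap(write_view, size);
    munmap(exec_view, size);
    return 0;
}
```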

Secure Enclave Processor
Applications on an Apple device run on a CPU called the Application Processor, or AP. Modern Apple devices have an entirely separate CPU called the Secure Enclave Processor, or SEP. "The SEP is protected by a strong cryptographic master key from the user's passcode," said Krstic. "Offline attack is not possible. It sidesteps the attack surface of the AP, even when the AP has been compromised. It arbitrates all user access and manages its own encrypted memory. On first initialization it uses a true random number generator to create a unique device key within the processor. It's not exportable, and it's stored in immutable secure ROM."
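Here's a hedged sketch of why that entanglement defeats offline attack: if the master key depends on both the passcode and a key that never leaves the chip, guessing passcodes off-device computes nothing useful. I'm using libsodium's password hash as a stand-in for the SEP's internal, unpublished derivation; everything here is illustrative.

```c
#include <sodium.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    if (sodium_init() < 0) return 1;

    // Stand-in for the unique, non-exportable device key created by the
    // SEP's true random number generator and stored in secure ROM.
    unsigned char device_uid[crypto_pwhash_SALTBYTES];
    randombytes_buf(device_uid, sizeof device_uid);

    const char *passcode = "123456";
    unsigned char master_key[crypto_secretbox_KEYBYTES];

    // Entangling the passcode with the device key means the master key can
    // only be derived on this device, never on an attacker's cluster.
    if (crypto_pwhash(master_key, sizeof master_key,
                      passcode, strlen(passcode), device_uid,
                      crypto_pwhash_OPSLIMIT_MODERATE,
                      crypto_pwhash_MEMLIMIT_MODERATE,
                      crypto_pwhash_ALG_DEFAULT) != 0) {
        return 1;  // not enough memory for the hash
    }
    printf("derived a %zu-byte master key\n", sizeof master_key);
    return 0;
}
```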

Krstic went on to explain how the device uses four types of internal security keys with different characteristics. Type A exists only when the device is unlocked. Type B is an always-present public key, plus a private key that exists when the device is unlocked. Type C comes into existence the first time the device is unlocked after boot. And type D is always available.
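As a rough mental model of those availability rules (the type letters are from the talk; the code and names are my own illustration):

```c
#include <stdbool.h>
#include <stdio.h>

typedef enum {
    KEY_TYPE_A,  // exists only while the device is unlocked
    KEY_TYPE_B,  // public half always present; private half needs unlock
    KEY_TYPE_C,  // comes into existence at first unlock after boot
    KEY_TYPE_D   // always available
} key_type;

typedef struct {
    bool unlocked_now;         // is the device unlocked right now?
    bool unlocked_since_boot;  // unlocked at least once since boot?
} device_state;

// Can the secret half of a key of this type be used in this state?
static bool key_usable(key_type t, device_state s) {
    switch (t) {
        case KEY_TYPE_A:
        case KEY_TYPE_B: return s.unlocked_now;  // B can still encrypt anytime
        case KEY_TYPE_C: return s.unlocked_since_boot;
        case KEY_TYPE_D: return true;
    }
    return false;
}

int main(void) {
    device_state fresh_boot = { false, false };
    device_state locked_after_use = { false, true };
    printf("type C at boot: %d, after first unlock: %d\n",
           key_usable(KEY_TYPE_C, fresh_boot),
           key_usable(KEY_TYPE_C, locked_after_use));
    return 0;
}
```

Type B is the interesting one: because its public half is always present, the device can encrypt newly arriving data even while locked, but nothing can decrypt that data until you unlock.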

The presentation moved on to a number of seriously intricate diagrams. One walked through the process of booting and unlocking the device, showing how each key type was created and stored. Every file on your device has its own unique encryption key; another diagram showed the intricate dance that lets the SEP authenticate and decrypt that file while keeping the essential security keys inside itself. Another explained the complex process that makes it possible for you to choose "Update later." And yet another walked through the process that permits unlocking via Touch ID without keeping the master key visible in any way.
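A simplified sketch of that per-file dance, with libsodium standing in for what really happens inside the SEP (the real design uses its own key-wrapping scheme, and the class key never leaves the enclave):

```c
#include <sodium.h>
#include <stdio.h>

int main(void) {
    if (sodium_init() < 0) return 1;

    // Class key: in the real design this lives inside the SEP.
    unsigned char class_key[crypto_secretbox_KEYBYTES];
    crypto_secretbox_keygen(class_key);

    // Unique per-file key, generated when the file is created.
    unsigned char file_key[crypto_secretbox_KEYBYTES];
    crypto_secretbox_keygen(file_key);

    // Wrap (encrypt) the file key under the class key. The wrapped blob is
    // what gets stored with the file; the raw file key is never persisted.
    unsigned char nonce[crypto_secretbox_NONCEBYTES];
    unsigned char wrapped[crypto_secretbox_MACBYTES + sizeof file_key];
    randombytes_buf(nonce, sizeof nonce);
    crypto_secretbox_easy(wrapped, file_key, sizeof file_key, nonce, class_key);

    // On access, the SEP unwraps the file key on demand -- the class key
    // itself stays inside the enclave the whole time.
    unsigned char unwrapped[sizeof file_key];
    if (crypto_secretbox_open_easy(unwrapped, wrapped, sizeof wrapped,
                                   nonce, class_key) != 0) return 1;
    puts("file key unwrapped; file can be decrypted");
    return 0;
}
```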

The key takeaway from this part of the talk is that Apple has really, really thought through what's required to manage encryption completely inside the Secure Enclave Processor, without forcing the user to go to much trouble at all. If you'd like to see those diagrams for yourself, check out Krstic's full presentation.

Synchronizing Secrets
It's awfully convenient that you can sync your data between multiple Apple devices. HomeKit lets you manage IoT devices, Auto Unlock makes your Mac unlock when your Apple Watch is nearby, your photos sync through iCloud, and so on. But security-wise, syncing is a problem.

"Traditional approaches are not good," said Krstic. "One way is to make the user enter a strong 'sock drawer key' on all devices; lose it, and access to the secrets is lost. The other way is to wrap the data in a derived key that leaves the data exposed to the account provider."

"We had a number of goals here," continued Krstic. "Secrets must be available all devices, protected by strong crypto. Users can recover secrets even if they lose all connected devices. Data is not exposed to Apple, and there's no possibility of a brute-force attack."

Authentication in the basic iCloud Keychain system is simple. Every device has its own key pair, and in order to add a new device to the sync circle, you must approve it from one of your existing devices. Apple's backend is not involved; it has no privilege. If a user loses access to all devices, access can be regained using both the iCloud Security Key and the iCloud password.
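A toy version of that approval flow, with libsodium signatures standing in for Apple's actual elliptic-curve scheme (the flow is my simplification of what was described):

```c
#include <sodium.h>
#include <stdio.h>

int main(void) {
    if (sodium_init() < 0) return 1;

    // Every device generates its own signing key pair.
    unsigned char old_pk[crypto_sign_PUBLICKEYBYTES], old_sk[crypto_sign_SECRETKEYBYTES];
    unsigned char new_pk[crypto_sign_PUBLICKEYBYTES], new_sk[crypto_sign_SECRETKEYBYTES];
    crypto_sign_keypair(old_pk, old_sk);
    crypto_sign_keypair(new_pk, new_sk);

    // To join the sync circle, the new device's public key must be signed
    // (approved) by a device that's already a member.
    unsigned char approval[crypto_sign_BYTES];
    crypto_sign_detached(approval, NULL, new_pk, sizeof new_pk, old_sk);

    // Any member can check the approval. No server key is involved anywhere,
    // which is the point: Apple's backend has no privilege in the circle.
    int ok = crypto_sign_verify_detached(approval, new_pk, sizeof new_pk, old_pk);
    printf("new device %s the circle\n", ok == 0 ? "joins" : "is rejected");
    return 0;
}
```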

Krstic explained in great detail how Apple manages this system without leaving open the slightest possibility that anyone, including anyone at Apple, could access data from the back end. The system involves what are called admin cards, created at the time a new fleet of crypto servers is commissioned. "Admin cards are created in a secure ceremony when the fleet is commissioned, and stored in separate physical safes in custody of three different organizations at Apple, in tamper-proof evidence bags," said Krstic.

That situation lasts only until the fleet is actually put into operation. At that time, said Krstic, "We put the admin cards through a novel one-way hash function." Pulling a clearly used blender from under the podium, he continued, "Yes, a trip through the blender." Once the fleet of crypto servers is active, it can't be updated or modified in any way, not even by Apple, because the admin cards have been destroyed. If it happens that an update really is required, Apple must spin up a new fleet and send out a software update that makes user devices connect to the new fleet.

"Why do we do this," said Krstic. "Why do we take this last step that's extremely unusual? We go to great lengths to engineer the security systems to provide trust. When data leaves the device, the stakes are even higher. We need to maintain that trust. If we keep possession of those admin cards, there's the possibility that's not true. That's how seriously we take our mission about user data."

Asked, "Did you do this because of the FBI requests for information?" Krstic replied, "I'm an engineer. I can only answer questions about why I presented today." OK, fair enough. But I think the questioner was right. Making a self-contained system that you can't even modify yourself is a pretty good way to keep someone else from making unwanted changes.

I hope I've conveyed the level of detail in Krstic's talk without making your eyes glaze over. Judging by the chatter as the group dispersed, the true byte-level security geeks in the room were highly impressed.
