iOS Keychain Security

phx · Aug 24, 2010 · Viewed 29.3k times

We want to use certificates on the iPhone to authenticate for MS Exchange sync. We are not sure how the security concept is implemented to protect these certificates.

For example, is it possible to get "full" keychain access on the iPhone if no screen lock is enabled (or on a jailbroken iPhone)?

Does anybody have some links about this?

Answer

pepsi · Jul 15, 2011

Fraunhofer's study on iOS keychain security:

From what I can tell, there are two levels of encryption that the iOS keychain uses. The first level uses the lock screen passcode as the encryption key. The second level uses a key generated by and stored on the device.

Fraunhofer's researchers have figured out how to get around the second level. This is the "easier" level to get around, since the encryption key is stored on the device. So on iOS 4, their method only works with keychain entries which do NOT use kSecAttrAccessibleWhenUnlocked or kSecAttrAccessibleWhenUnlockedThisDeviceOnly, because entries without those attributes are kept with the first (passcode-based) level already decrypted, even when the phone is locked.

  • Starting from iOS 4, keys with kSecAttrAccessibleWhenUnlocked and kSecAttrAccessibleWhenUnlockedThisDeviceOnly are protected by an extra level of encryption (see the sketch after this list)
  • On iOS 3.x and earlier, all keys can be decrypted using Fraunhofer's method, regardless of the accessibility attribute used
  • Devices with no passcodes at all will still be vulnerable
  • Devices with weak passcodes (less than six digits) will still be somewhat vulnerable
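
As a concrete illustration, here is roughly what opting a keychain item into that stronger protection class looks like through the Security framework. This is a minimal Swift sketch, not code from the study; the service and account names are placeholders and error handling is reduced to a status check.

    import Foundation
    import Security

    // Minimal sketch: store a password so it is only decryptable while the
    // device is unlocked, and never migrates to another device via backup.
    // "ExchangeSync" / "alice" are placeholder identifiers.
    let password = Data("secret".utf8)
    let attributes: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "ExchangeSync",
        kSecAttrAccount as String: "alice",
        kSecValueData as String: password,
        // The extra passcode-derived encryption layer discussed above:
        kSecAttrAccessible as String: kSecAttrAccessibleWhenUnlockedThisDeviceOnly
    ]
    let status = SecItemAdd(attributes as CFDictionary, nil)
    assert(status == errSecSuccess || status == errSecDuplicateItem)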

≈50 ms per password try → ≈20 tries per second → ≈1.7 years for a 50% chance of guessing the correct passcode for a 6-digit alphanumeric code with base 36. The standard simple code of 4 numeric digits would be brute-forced in less than 9 minutes. This is based on the assumption that the counter for wrong tries in iOS can be bypassed, as it is not hardware-based.

Apple Inc. WWDC 2010, Core OS, Session 209 "Securing Application Data", Slide 24
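
The slide's arithmetic checks out as a back-of-the-envelope estimate; here is a quick sketch of the same numbers (Swift, purely illustrative):

    import Foundation

    // Rough check of the WWDC slide's brute-force estimates.
    let triesPerSecond = 1.0 / 0.050              // ~50 ms per attempt -> ~20 tries/s

    // 6-character alphanumeric passcode, base 36: time for a 50% chance of a hit.
    let keyspace6 = pow(36.0, 6.0)                // ~2.18e9 combinations
    let yearsFor50Percent = (keyspace6 / 2) / triesPerSecond / (3600 * 24 * 365)
    print(yearsFor50Percent)                      // ~1.7 years

    // Standard simple passcode of 4 numeric digits: exhausting the whole space.
    let keyspace4 = pow(10.0, 4.0)                // 10,000 combinations
    let minutesForFullSearch = keyspace4 / triesPerSecond / 60
    print(minutesForFullSearch)                   // ~8.3 minutes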

Bottom line: if you must store sensitive data, it is better to add your own layer of encryption, and don't store the key on the device.
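
To make that suggestion concrete, here is one way to sketch it, using today's CryptoKit API (my choice, not something from the study; any AES library would do): the key is derived from a passphrase the user supplies at runtime, so nothing stored on the device is enough to decrypt the blob by itself.

    import CryptoKit
    import Foundation

    // Sketch only: encrypt before writing to the keychain/disk, with a key
    // derived from a user-supplied passphrase rather than stored on the device.
    // A deliberately slow KDF (PBKDF2/scrypt) would be preferable to plain HKDF.
    func encryptBlob(_ plaintext: Data, passphrase: String, salt: Data) throws -> Data {
        let key = HKDF<SHA256>.deriveKey(
            inputKeyMaterial: SymmetricKey(data: Data(passphrase.utf8)),
            salt: salt,
            outputByteCount: 32)
        return try AES.GCM.seal(plaintext, using: key).combined!
    }

    func decryptBlob(_ blob: Data, passphrase: String, salt: Data) throws -> Data {
        let key = HKDF<SHA256>.deriveKey(
            inputKeyMaterial: SymmetricKey(data: Data(passphrase.utf8)),
            salt: salt,
            outputByteCount: 32)
        return try AES.GCM.open(AES.GCM.SealedBox(combined: blob), using: key)
    }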

Edit: There are numerous news articles which cite the Fraunhofer study and reassure their readers not to worry unless their devices are stolen, because this attack can only be done with physical access to the device.

I'm somewhat doubtful. The fact that the researchers did their tests with physical access to the phone seems to have been just a way to simplify the problem, rather than a limitation of the attack. This is their description of what they did to decrypt the keychain entries:

After using a jailbreaking tool, to get access to a command shell, we run a small script to access and decrypt the passwords found in the keychain. The decryption is done with the help of functions provided by the operating system itself.
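
The quote doesn't reproduce the scripts, but the point about "functions provided by the operating system itself" is easy to picture: once code runs on the device, the ordinary Security API returns keychain secrets already decrypted, as in this hypothetical query (placeholder service and account names):

    import Foundation
    import Security

    // Illustration of "functions provided by the operating system itself":
    // the OS performs the decryption transparently when an item is queried.
    // This is the public API an app uses for its own items; the Fraunhofer
    // scripts worked around the per-app access controls via the jailbreak.
    func readPassword(service: String, account: String) -> String? {
        let query: [String: Any] = [
            kSecClass as String: kSecClassGenericPassword,
            kSecAttrService as String: service,
            kSecAttrAccount as String: account,
            kSecReturnData as String: true,
            kSecMatchLimit as String: kSecMatchLimitOne
        ]
        var item: AnyObject?
        guard SecItemCopyMatching(query as CFDictionary, &item) == errSecSuccess,
              let data = item as? Data else { return nil }
        return String(data: data, encoding: .utf8)
    }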

As anyone who has used jailbreak.me knows, jailbreaking does not require physical access to the device. Theoretically it should be trivial to modify the jailbreak.me code and have it automate the following:

  1. Perform the jailbreak as normal (all this requires is for the user to open a maliciously crafted PDF)
  2. Run Fraunhofer's scripts after the jailbreak is complete
  3. Send the passwords over the network to a location the attacker can read them from

So once again, be cautious about what you put in the keychain.