OK, so I’ve been thinking about this whole kerfuffle (or has it escalated to a brouhaha at this point?) about the FBI going to court and forcing Apple to (as I understand it) develop a special version of iOS that doesn’t have the “10 attempts and the data self-destructs” option. I’ve got thoughts, believe me, and some of them disturb even me.
Let’s start with the request itself. Tim Cook is right: not only is the request potentially improper on its face, but the mere creation of a special software stack without the self-destruct feature opens a big can of worms probably nobody wants to open. It’s not just that the Fed would use it improperly (they would, no questions asked). It’s that the moment Apple creates it, there’s the potential for ne’er-do-wells to get the code. No airlock is ever 100% airtight, and as secure as I know Apple’s campus to be, it would leak out somehow. It’s … pun totally intended here … forbidden fruit, almost worthy of a 007 movie-plot scenario. Somebody would get it, you know it.
But then I start thinking about the implications here. The FBI can’t unlock it? Now, I know more than one person is saying “oh, it’s not that they can’t, they just want the legal precedent!” I can accept that, but it still lingers in the back of my mind: all my life, as a hacker and somebody who often skirts right up to the edge of “legal” on certain things, I’ve wasted a hell of a lot of time covering my tracks. I mean, think about this seriously. Going to court, especially in California (Apple’s home turf), to get this writ required some pretty serious lawyering. If you’ve read the docket, it’s actually a novel approach to the problem, and it wasn’t done by some first-year law intern. Real lawyers had to craft those briefs, and as an armchair legal scholar I can honestly say it was VERY well done. It had the hallmarks of someone who knew exactly what they were doing.
And the Fed never wastes lawyers on trivial shit. I believe that the FBI can’t crack it, or at least crack it easily.
As a teenager I heard horror stories of the FBI using data recovery techniques to do discovery on hard drives of BBSes and the like to get evidence of warez that were deleted. I’ve heard some wild theories as to how the FBI did some of these things, probably fueled by too many teenagers watching too many X-Files episodes. But even saner and cooler heads thought the FBI had some pretty good tech, and could do some pretty impressive things with the then-primitive tech of the 80s. Hell, some of the equipment we regularly used back then was so dodgy even the owner couldn’t guarantee his files were there at any given moment.
So here we have the FiBbys asking Apple to help them crack this with (what amounts to) a copy-protection defeat just about any script kiddie could hack.
I’ve pointed this out glibly in my Facebook post, but… don’t they have code crackers at the FBI? I mean, really. How hard is it to JMP around the 10-times-and-you’re-out code? Isn’t there some way you could just pull the WRITE line from the memory store so that a delete couldn’t happen?
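To make the glib version concrete: if the retry limit lived in ordinary software, the whole “self-destruct” would hinge on one conditional branch. Here’s a toy sketch (entirely my invention, nothing like Apple’s actual code, with a made-up placeholder passcode) of why that would be trivial to defeat:

```c
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Toy sketch -- NOT Apple's code. If the wipe policy were plain
 * software, it would reduce to one conditional branch, and a cracker
 * would simply patch the jump so the wipe path never executes. */

#define MAX_ATTEMPTS 10

static int failed_attempts = 0;

/* Stand-in passcode check; "0000" is a made-up placeholder. */
static bool passcode_matches(const char *guess) {
    return strcmp(guess, "0000") == 0;
}

bool try_unlock(const char *guess) {
    if (passcode_matches(guess)) {
        failed_attempts = 0;   /* correct guess resets the counter */
        return true;
    }
    if (++failed_attempts >= MAX_ATTEMPTS) {  /* <-- the JMP to patch */
        puts("wiping key material");          /* never runs once patched */
        failed_attempts = 0;
    }
    return false;
}
```

Patch that one comparison (or NOP the jump it compiles to) and you can brute-force passcodes forever. That’s the level of “copy protection defeat” the glib reading assumes.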
Oh, wait. Mmmmmmaybe it’s not that simple. Maybe Apple not only won’t code around this, maybe they can’t.
Consider this possibility. Apple now does most of their own chip design (the fabrication is farmed out, but the designs are theirs). And we know that newer models of the iPhone use some of the more advanced techniques to guard sensitive data in a secure part of the machine, with its own protection rings and a dedicated coprocessor (the Secure Enclave, introduced with the A7). But the 5C (the phone in question) runs an A6, and doesn’t have that protected partition.
But what if the crypto is handed off to an internal cipher chip that has a “bad password” function INSIDE THE CHIP, not easily rewritable in the firmware, that destroys the data after 10 failed password attempts are recorded? There are no signals to mine on the bus, and no easy way to just (theoretically) fiddle with I/O pins and get the behavior to stop. Ten bad passwords, data store is corrupted, all done inside Apple’s little A6 magic box. It may even be coded in the microcode of either the CPU or GPU portion of the chip, and that’s not likely easily changed (assuming it can be changed at all).
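Here’s a minimal model of that hypothetical (entirely my assumption about how such a chip could behave, not documented A6 behavior; the passcode and interface names are made up): the attempt counter and the wipe live behind the chip boundary, and all the host ever gets is a status code. There is nothing on the bus to patch:

```c
#include <string.h>

/* Hypothetical in-silicon attempt counter -- my guess at a design,
 * NOT documented A6 behavior. Everything in the marked section is
 * "inside the package": no bus signal exposes the counter, and no
 * I/O pin resets it. */

enum chip_status { CHIP_OK, CHIP_REJECTED, CHIP_WIPED };

/* -------- internal chip state, invisible to the host -------- */
static int         chip_attempts = 0;
static int         chip_is_wiped = 0;
static const char *chip_secret   = "1337";  /* placeholder passcode */
/* ------------------------------------------------------------ */

/* The only interface the host gets: submit a guess, read a status. */
enum chip_status chip_try(const char *guess) {
    if (chip_is_wiped)
        return CHIP_WIPED;            /* too late, forever */
    if (strcmp(guess, chip_secret) == 0) {
        chip_attempts = 0;
        return CHIP_OK;
    }
    if (++chip_attempts >= 10) {
        chip_secret   = NULL;         /* key material destroyed */
        chip_is_wiped = 1;
        return CHIP_WIPED;
    }
    return CHIP_REJECTED;
}
```

With a layout like this, patching iOS buys you nothing: the tenth bad guess corrupts the store no matter what the host software does, even if the correct passcode arrives on the eleventh try. That’s exactly the “can’t, not won’t” scenario.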
This is theory. But I think it’s a valid point: Apple thought this through. They’re going to fight this in the courts, and if they lose they’re going to throw their hands up and publicly say “we can’t even do it, and here’s why.” Congress will argue, and we’ll wind up with another lousy key-escrow proposal (or something even more idiotic); it’s already starting. But in the end there are millions of iPhones out there that CAN’T BE HACKED.
Steve Jobs may have been an ok guy after all. One of his (likely) design decisions may cause McAfee to eat a shoe.