Former Microsoft executive thinks he has a solution to the cryptography war with law enforcement

William Gayde

As encryption becomes more commonplace in our devices, the job of law enforcement is becoming increasingly difficult. The central question surrounding this "Crypto War" is whether there is a way to maintain a consumer's privacy while still giving law enforcement a way in for matters of national security or criminal investigations.

The view among privacy advocates is that encryption should be so strong that no one, not even the device's manufacturer or the government, should be able to gain access to the information. On the other hand, law enforcement officials have called for limited encryption and security that is just weak enough to allow them access in times of need.

These two ideas may seem mutually exclusive, but former Microsoft executive Ray Ozzie has come out with a solution that aims to please both sides. Originally published in Wired, his "Clear" proposal combines strong cryptography with a method for government officials to gain legitimate access in times of crisis.

The system works through a pair of public and private keys, the same asymmetric scheme that underpins much of the encryption used today. Anything encrypted with the public key can only be decrypted with the private key, and vice versa.
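
To make the keypair idea concrete, here is a minimal sketch using RSA-OAEP from Python's cryptography library; the algorithm, key size, and padding are illustrative assumptions, as the proposal itself does not specify them.

```python
# Minimal illustration of public/private key (asymmetric) encryption.
# RSA-OAEP is used here only as an example; Ozzie's proposal does not
# mandate a particular algorithm or key size.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Generate a keypair (in the proposal, the manufacturer does this once).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anything encrypted with the public key...
ciphertext = public_key.encrypt(b"secret message", oaep)

# ...can only be recovered with the matching private key.
assert private_key.decrypt(ciphertext, oaep) == b"secret message"
```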

Device manufacturers, such as Google and Apple, would generate a keypair and install the public key on all of their devices. The private key would be kept in an ultra-secure location that only the manufacturer has access to, similar to the way code-signing keys are stored. The phone would then automatically encrypt the user's PIN using the public key pre-installed on the device. If "exceptional access" were required, law enforcement would need to physically obtain the device as well as a search warrant for the data on it.
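
A rough sketch of the device-side step under those assumptions: the phone carries only the vendor's public key and wraps the user's PIN with it, so the device itself can never undo the encryption. The function name and PEM input below are hypothetical, not from the proposal.

```python
# Device-side sketch: the phone ships with only the manufacturer's PUBLIC
# key and uses it to wrap (encrypt) the user's PIN. The private key never
# touches the device.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def wrap_pin(pin: str, vendor_public_key_pem: bytes) -> bytes:
    """Encrypt the user's PIN with the pre-installed manufacturer public key."""
    vendor_public_key = serialization.load_pem_public_key(vendor_public_key_pem)
    return vendor_public_key.encrypt(pin.encode(), OAEP)
```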

Once this warrant is obtained, a special recovery mode on the phone can be enabled that presents the investigators with the encrypted PIN. The encrypted PIN and proof of the warrant are then sent to the manufacturer, who can decrypt the PIN for that device specifically.
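
And a matching sketch of the manufacturer-side recovery step; check_warrant is a hypothetical placeholder for the legal review, which in practice would be a human process rather than code.

```python
# Manufacturer-side sketch: after the warrant is verified, the escrowed
# private key decrypts the wrapped PIN for that one device only.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def check_warrant(warrant) -> bool:
    # Placeholder: in reality this is a legal/human review, not code.
    return warrant is not None

def recover_pin(wrapped_pin: bytes, warrant,
                vendor_private_key: rsa.RSAPrivateKey) -> str:
    if not check_warrant(warrant):
        raise PermissionError("warrant not valid for this request")
    return vendor_private_key.decrypt(wrapped_pin, OAEP).decode()
```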

Once this recovery mode is accessed, the phone is effectively "bricked": data cannot be erased from it and the phone cannot be used further. Requiring physical access and intentionally bricking the phone means law enforcement cannot covertly gain access to a device, which is meant to ensure they go through the proper legal channels and cannot abuse the system.

While this is certainly one of the most robust solutions to the cryptography issue we have seen so far, it is not without its drawbacks. The most obvious is that if a hacker were to obtain the manufacturer's private key, they would automatically gain access to every device. A similar key escrow system, known as the Clipper Chip, was attempted back in the 1990s and failed miserably.

With no clear answer that pleases both sides of the argument, it is still good to see new proposals that produce meaningful debate.


 
"Once this warrant is obtained, a special recovery mode on the phone can be enabled that presents the investigators with the encrypted PIN. This encrypted PIN and proof of the warrant are sent back to the manufacturer who can then decrypt the PIN for that device specifically."

I like the proposal, especially since it is workable and protects the privacy of the regular citizen, BUT I would strongly recommend that any warrant (1) be issued by a Federal judge and (2) that the judge not be (now or previously) part of FISA. The second requirement is primarily due to the appearance that all FISA judges are corrupt. Not that they are, but the entire nature of having "secret judges in secret courts" flies in the face of the expectations of fairness and honesty in our judicial system.
 
If this were to be implemented, I hope the manufacturer's private key would be stored offline.

Also, IF this happens before quantum computing becomes standard (3-5 year estimate), it's not going to end well.
 
I agree with @Uncle Al, but I'm also waiting for the crypto experts to comment.....not my field, but the idea sounds good.
 
90% of world governments would never accept this solution because they want NO evidence of their spying.
 
Technical means that would make agencies get a warrant to search the device (or our mailboxes etc.) - this sounds very good. But I don't think the private keys in this particular solution could be kept safe for long...
 
If this were to be implemented, I hope the manufacturer's private key would be stored offline.

Also, IF this happens before quantum computing becomes standard (3-5 year estimate), it's not going to end well.
You can't have the secret key stored offline since your equipment is online, though the key could be put in a chip that is not accessible from outside the chip, which would also need to do the encryption and decryption itself. That is the only way to keep it secure: everything done on the same chip.
 
I agree with @Uncle Al, but I'm also waiting for the crypto experts to comment.....not my field, but the idea sounds good.

You mean like the backdoor put into Cisco IOS that was discovered by the Chinese, who thanked the NSA for it, which means they must have been using it for years.

No - to backdoors, except for the Mrs....
 
Ok, so they can use that cert to encrypt the phone itself, and then some official agency can get into that phone with their key.
That's fine for any average Joe, but anyone wanting to hide stuff, like real big criminal organisations and all that, will just encrypt their data (pictures, phone records, messages, anything of sensitive value) with another, private certificate that cannot be cracked (for now).
Generation of such certificates can be done on *any* machine, and encryption can be done out of the box on any device/platform with relative ease.
So I think this has no value whatsoever. Yes, you'll be able to get data from some unaware folks, but not the criminal organisations, since they'll easily write their own encrypted apps (or use whatever's publicly available), or they could even use an older or alternative OS without this new feature, or contact a Chinese manufacturer and order 1,000 'custom' smartphones for $100,000 or something...
Really, all of this is too little too late: there are many easy and cheap ways around it.
 
This could be improved by having each manufacturer create a new keypair on a regular basis, say weekly or monthly. The private keys would then be kept in separate locations to avoid a single large breach. Now at least if one private key is compromised, only the devices manufactured that week are at risk.
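
A quick sketch of what that rotation might look like, purely hypothetical and not part of Ozzie's proposal: the manufacturer keeps a registry of keypairs indexed by production week, so a leaked private key only exposes the devices built that week.

```python
# Hypothetical weekly key rotation, as suggested above: one keypair per
# production week, so a single compromised private key only exposes the
# devices manufactured in that week.
from cryptography.hazmat.primitives.asymmetric import rsa

class EscrowKeyRegistry:
    def __init__(self):
        self._keys = {}  # e.g. "2018-W18" -> private key (stored separately in practice)

    def _key_for_week(self, week_id: str) -> rsa.RSAPrivateKey:
        # Generated lazily here; a real deployment would pre-generate keys
        # and split their storage across locations.
        if week_id not in self._keys:
            self._keys[week_id] = rsa.generate_private_key(
                public_exponent=65537, key_size=2048)
        return self._keys[week_id]

    def public_key_for_week(self, week_id: str):
        # Devices built in this week would ship with this public key installed.
        return self._key_for_week(week_id).public_key()
```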
 