Police in Britain have announced that two people have been successfully prosecuted under a UK law that forces defendants to give up their encryption keys and penalizes those who don’t comply. Another UK woman’s case had attracted attention two years ago, when the government demanded she give up her encryption keys after the police found encryption software on her computer, but the police say she was not one of the two defendants charged. Is there a software solution to this problem — a way that people can encrypt files on their computers, without arousing the suspicion of law enforcement if the computers are seized?
File encryption, if properly implemented, is generally considered mathematically unbreakable. But preventing suspicion from falling on people just for encrypting files in the first place requires a human solution as well as an engineering one. One way or another, some file encryption software would have to be in widespread use that has these two properties: (1) it’s deployed on a large number of people’s machines — not just a large absolute number, but a significant proportion of the total population, so that suspicion does not fall on people just for possessing the software — and (2) it should not be possible to tell the difference between machines where the users use the software regularly, and machines where the software has never been run. Then, and only then, would it be possible to use the encryption software on your machine, without anyone who seizes the machine having reason to think that you had ever encrypted anything at all.
(Of course, in a relatively free society, if law enforcement has probable cause to seize your machine in the first place, then they would presumably already have some evidence against you. But this would at least prevent police officers and judges from becoming more suspicious as a result of encryption software being present on your machine.)
Note that this is similar to the kind of problem that is normally solved with steganography, but by my reasoning, I don’t think that using stego would actually gain anything in this situation. Whether you’re talking about encryption software or stego software, if it’s a program that not a lot of people have installed, then just by virtue of having it on your machine, you’ll attract suspicion if your machine is seized. On the other hand, suppose you’ve cleared that hurdle and the software is installed on a lot of people’s computers, so that just having installed it is not by itself grounds for suspicion. If it’s stego, then you can embed the hidden data inside other images or videos, so that an intruder can’t tell whether you’ve been using the software to hide anything (assuming the stego software is good enough that the intruder can’t tell the images have been tampered with). But you could achieve the same thing with straight encryption software: just have every installation of the program create a "storage volume" file, where encrypted files will be stored. As long as a storage volume file with files embedded in it, is indistinguishable from a storage volume file that has never been touched, the presence of the storage volume file won’t give you away.
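To make that "storage volume" idea concrete, here is a minimal sketch of how such a volume could be initialized. The trick is that strong ciphertext is statistically indistinguishable from random bytes, so if every install fills the container with random data up front, a volume that holds encrypted files looks exactly like one that has never been touched. (The function name and the fixed size are my own inventions for illustration, not any real product's API.)

```python
import os

VOLUME_SIZE = 1024 * 1024  # fixed size for every install (1 MiB, just for the sketch)

def create_volume(path: str) -> None:
    # Fill the container with random bytes at install time. Since good
    # ciphertext is indistinguishable from random data, a volume that has
    # been written to looks the same as one that has never been used.
    with open(path, "wb") as f:
        f.write(os.urandom(VOLUME_SIZE))

create_volume("storage.vol")
print(os.path.getsize("storage.vol"))  # → 1048576, whether it's ever used or not
```

An examiner who finds this file sees a fixed-size blob of uniformly random bytes either way; nothing about its contents reveals whether any encryption has taken place.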
I’m not actually aware of any encryption program that has that property: that for a given machine with the software installed, it’s impossible to tell whether the software has ever been used to encrypt data. This is probably because this would normally not be a useful feature of an encryption program. The whole point of making it impossible to tell whether someone has used the program or not, is that people who have used the program would not attract undue attention to themselves as a result. But if the encryption program is only used by one thousandth of one percent of total Internet users anyway, then just the fact that a user has the program installed, would be enough to draw suspicion to the user if their computer is seized, so there’s no benefit to concealing the fact that the program has been used. On the other hand, if the encryption program is installed on a significant proportion of users’ machines anyway, then simply having the program installed is no longer grounds for suspicion. And that’s when it would become a valuable feature for it to be difficult to tell whether the owner of the machine actually uses the encryption program or not.
This may be hard to implement correctly, and there are some tradeoffs that will have to be decided. For example, if the program creates a default "storage volume" file when it’s installed, how big should that initial volume be? The problem with creating a small storage file initially and then letting it grow as encrypted files are added, is that this now makes it easy to tell who is using the program and who isn’t — anyone whose storage file has grown beyond the default size, is using it to encrypt files (and is therefore a terrorist movie-downloading child pornographer, etc.). In order to avoid suspicion falling on people who use the program, the storage file would have to be the same size on everyone’s computer. If you make it 1 GB, that wastes a lot of space on the machines of people who aren’t using it. On the other hand, if it’s only 1 GB, it also means that users will only be able to store up to 1 GB of encrypted data — any more than that, and they’ll have to expand the size of the storage file, thus calling attention to themselves if the machine is ever seized. And then, what about the fact that a large file which is created all at once, is normally not fragmented very much, but if the storage file is frequently modified, it is likely to become more and more fragmented — thus giving people a way to tell if the encryption program is being used frequently. (So you’d either have to deliberately create a very fragmented storage file by default on the first install, or create an unfragmented file on first install but then make sure to read and write from the file in a way that doesn’t fragment it further.) I don’t want to get too bogged down in implementation details. The point is just that you’d have to block all the possible ways that an intruder would be able to tell whether the software is used frequently — forget one thing, and you’ve given an intruder a way to identify people who are actually using the software to encrypt files.
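The fixed-size requirement can be sketched in code as well: if the container is preallocated and all writes happen in place, the file never grows no matter how much is stored (up to its fixed capacity). The "cipher" below is a toy SHA-256 counter keystream standing in for a real one, purely to keep the sketch self-contained; do not use this construction for actual security.

```python
import hashlib, os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy stream cipher: concatenated hash(key || nonce || counter) blocks.
    # Illustrative only; a real tool would use a vetted cipher like AES-XTS.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def write_in_place(path: str, offset: int, plaintext: bytes,
                   key: bytes, nonce: bytes) -> None:
    # XOR the plaintext with the keystream, then overwrite bytes inside the
    # existing container. "r+b" updates in place, so the file never grows.
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    with open(path, "r+b") as f:
        f.seek(offset)
        f.write(ct)

open("demo.vol", "wb").write(os.urandom(4096))  # preallocated random container
write_in_place("demo.vol", 100, b"secret notes", b"key", b"nonce0")
print(os.path.getsize("demo.vol"))  # → 4096: storing data did not change the size
```

Note this only addresses the size side channel; as the paragraph above says, fragmentation, timestamps, and every other observable would need the same treatment.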
A program called TrueCrypt achieves something close to this — TrueCrypt allows you to encrypt a storage volume with two different passwords, so that one password provides access to "innocent-looking" data, while the other password provides access to the data that you really want to keep secure. If someone is compelled to give up their password, they could provide only the password that unlocks the "innocent-looking" data — and there’s no way, from examining the encrypted file, to tell that there is a second password guarding even more secret data. (Of course, the "innocent-looking" data can’t be truly innocent-looking, because it has to look like the kind of thing that someone would believe you might want to encrypt — so it should look suspicious enough that you would genuinely want to hide it, but not bad enough to get you in real trouble if you’re forced to reveal it!) The Achilles heel of this scheme is that just having TrueCrypt on your computer in the first place, would at least signal to an intruder that you’re encrypting files. And even if they can’t prove that you might have another "super-secret password" guarding more private data on your encrypted volume, they would certainly suspect it, if they already had grounds to be investigating you and if they knew anything about how TrueCrypt works. To provide true plausible deniability of any encryption at all, you need a program that already exists on lots of people’s machines, so that an intruder doesn’t suspect anything when they find it on your computer.
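A simplified sketch of the hidden-volume idea, to show why the second volume is undetectable: the decoy data lives at the front of the container, and the second password deterministically points into what otherwise looks like free, random space. (This is not TrueCrypt's actual on-disk format — the offset derivation, names, and sizes here are my own assumptions for illustration.)

```python
import hashlib

VOLUME_SIZE = 1 << 20            # 1 MiB container, assumed pre-filled with random bytes
DECOY_REGION = VOLUME_SIZE // 2  # first half holds the "innocent-looking" volume

def hidden_offset(password: str, salt: bytes) -> int:
    # Only someone who knows the second password can even locate the hidden
    # region; without it, the back half of the container is indistinguishable
    # from unused free space (it's all random-looking bytes either way).
    h = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return DECOY_REGION + int.from_bytes(h[:8], "big") % (VOLUME_SIZE - DECOY_REGION - 4096)

off = hidden_offset("second-password", b"per-volume-salt")
print(DECOY_REGION <= off < VOLUME_SIZE)  # → True
```

Handing over only the first password reveals the decoy region and nothing else, which is exactly the "plausible deniability of the contents" the article describes — while leaving unsolved the problem that possessing the tool at all is conspicuous.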
(The same objection also applies to many other non-solutions to the problem, like using a Linux distro that encrypts your entire file system. Even assuming this would be within the technical means of the average person who wanted to do encryption, it’s still going to look suspicious as long as the vast majority of people are not doing it.)
Which leads to the other half of the problem, which is getting the software widely deployed enough that it would not look suspicious for someone to have the program installed in the first place. Best of all for the purpose of avoiding suspicion, of course, would be for the program to come installed by default with a popular operating system. Windows XP and Vista have the built-in ability to encrypt folders, but anyone who seizes the machine can still see that you encrypted a folder, so this doesn’t have the undetectability factor. Built-in deniable encryption of the kind that I’m describing, doesn’t instinctively feel like the sort of thing that Microsoft would start bundling with its operating system. (Among other things, they might say that while companies often have business reasons for encrypting files, it’s harder to think of a business case where employees would need to encrypt files and hide the fact that they were encrypting anything.)
Perhaps instead it could be bundled with a popular free software program beholden to no for-profit corporate masters. (My first thought was Firefox, but I was quickly told that Firefox was created specifically to strip out many of the features that had caused bloat in the original Mozilla project, and that any bundling of unnecessary tools would go against the whole ethos of the project.) Maybe a good place to include something like this would be the Google Pack — it’s installed by lots of people, and currently doesn’t have a file-encryption tool in the bundle. Beholden to for-profit corporate masters, yes, but ones that frequently declare "Don’t Be Evil" and often seem to do cool stuff just to see what would happen.
Another possibility would be for a next-generation P2P program to bundle this capability with their software. This provides a nice dovetailing of interests — P2P users might want a way to hide the files that they’ve downloaded, while at the same time, intruders who seize the computer and find the P2P application installed, wouldn’t necessarily suspect the owner of anything more than a little copyrighted file trading. "Well, he’s got this NiftyP2P program installed, which comes with ‘plausibly deniable’ encryption, but most people just use NiftyP2P to download mp3 files and movies anyway. And I can’t tell if he was actually using the encrypted file storage volume, because that’s how ‘plausibly deniable’ encryption works. Is this the same guy who uploaded those subversive anti-government documents? I dunno."
Anyway, if you actually want to give people a way to run encryption software on their PCs, while ensuring that anyone who seizes their machine cannot tell that any encryption has been going on, these are the hurdles that you’d have to clear. I’m not sure whether this is better viewed as a blueprint for how to achieve this goal, or an argument for why it will probably never happen. There are lots of almost-solutions, like TrueCrypt with its ability to encrypt different sets of data into the same storage volume. But you still can’t actually hide the fact that you’re doing encryption in the first place.
(If you’re willing to store your encryption software away from your computer, you could keep a steganography program on a CD or USB drive hidden in your house, and then whenever you need access to the encrypted data, plug in the program and use it to extract data that has been hidden in a large number of image or video files. That would achieve the goals I’ve outlined in the article: the ability to encrypt files, while still ensuring that anyone who seizes your computer won’t be able to tell that you’ve encrypted anything. The problem is that it would require enough self-discipline to always return the CD or USB stick to its hiding place when you were done with it — and still, you’d have to hope that whatever authorities seize your computer, don’t also search your house and find the CD or USB stick where you keep your stego software.)
Finally, risking the wrath of my civil-libertarian allies, I’ll admit it may not actually be a positive thing for every citizen to be able to hide the fact from their local law enforcement that they’re encrypting files on their computer. Many times if the police in a mostly-free country like the US or the UK seize a person’s computer, they’re trying to prevent real harm, and not every person with an encrypted file volume is a good guy. For some of the people who have left enough of an evidence trail that their computers get seized, it would be perfectly rational to view them with suspicion because of an encrypted volume found on their computer. But if you assume it’s a worthwhile goal for people to be able to encrypt files without attracting suspicion, my argument is that the prerequisites in this article are necessary for that to work. At the moment it seems a long way off. But if someone created an encryption program with "deniability" — so that it was impossible to tell whether the program had ever been used after it was installed — and someone at Google thought "Hey, that’s cool" and added it to the Google Pack, everything would change very suddenly.
And some Comments:
A) A smart crook with stolen state secrets or child porn on their encrypted drives would just tell ’em to fuck off.
1. It makes forgetting your decryption key/passphrase/whatever illegal. Yes, seriously. The burden of proof is on the accused to show that they can no longer decrypt the data – how the hell do you prove you don’t have something?
2. The people who it was originally intended to inconvenience – the real terrorists, if you like – aren’t going to be even remotely concerned by it. They know full well that there is a risk they’ll be caught and spend time in jail. If it’s a choice between "reveal the decryption key, thus providing the police with the only evidence they’re likely to find which implicates you and a number of others for so many criminal activities you’ll be in prison for 20 years and when you get out you’ll get a bullet in the head from the people who you dropped in it" or "keep your mouth shut, go to prison for two years", I wonder which one they’ll choose?
B) This is a perfectly viable option but, as someone working in computer forensics, the major issue missed in this editorial and the subsequent comments is that most people really can’t be bothered with encryption. I have examined many computers with versions of TrueCrypt and other, less reputable, encryption packages on them that are simply not used. Maybe I was foiled, I hear you say, and maybe yes I was (in my recollection there were no large unknown files with cryptic-looking signatures and unfathomable data structures, normally a big pile of what looks like junk), but the evidence was still resident (possibly replicated) in the unencrypted portion of the filesystem anyway.
If I were to have the ability and/or inclination to design a system of encryption designed to not arouse suspicion, it would have to be something that is there by default, like having a separate partition or container file for each user with the encryption tied in to their user account, so that when logging in, their login credentials are the encryption key and the volume is auto-mounted transparently. Maintaining a separate file or partition for each user would assure privacy both within the system and upon any kind of post-mortem analysis (such as a forensic analysis using EnCase, FTK or TSK). These are just my musings and, as the author of the article said, getting any kind of wide support for such a technology is unlikely and will probably never happen. It’s interesting to muse on, however!
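The per-user scheme this commenter sketches could derive the volume key directly from the login credentials, so mounting happens transparently at login with no extra secret to remember. A minimal sketch of the key-derivation step (the KDF parameters, salt, and file naming are illustrative assumptions, not any real login system's):

```python
import hashlib, os

def account_key(username: str, password: str, salt: bytes) -> bytes:
    # Derive the volume key from the login credentials themselves, so the
    # user's container can be unlocked as a side effect of logging in.
    return hashlib.pbkdf2_hmac("sha256", f"{username}:{password}".encode(),
                               salt, 200_000)

def container_path(home: str, username: str) -> str:
    # One container file per account, created by default for every user —
    # so its mere presence says nothing about whether it's ever been used.
    return os.path.join(home, f".{username}.vol")

key = account_key("alice", "hunter2", b"per-machine-salt")
print(len(key))  # → 32, usable as a 256-bit volume key
```

Because every account gets a container by default, the presence of the file itself carries no information, which is exactly the property the article argues for.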
C) What all the talk about crypto seems to forget is that crypto only protects your data when you are not using it.
If they are investigating you to the point where they are going to seize your computer, they have means of acquiring your password.