Yesterday and today I attended Chris Mitchell's talks at FOSAD 2007 on Trusted Computing / Treacherous Computing (TC) (Slides: part1, part2). The presentation was fairly neutral, even though Chris is one of the people working on OpenTC, so I assume he is also one of the proponents of this controversial technology, which I oppose.
Here are some aspects that became even clearer to me from the two talks and the discussions that followed.
- Remote Attestation. The main purpose of this technology is to prevent “corporate drones” from using unauthorized software on their corporate computers. The way they do this is by having the corporate computer remotely attest to other systems that the hashes of the software running on the machine match a list of predefined hashes. This will prevent end users from running the software they actually want to run and is against the very principle of free software, where you should be able to run modified versions of the software.
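To make the mechanism concrete, here is a toy sketch of the measurement step behind attestation. This is not the actual TPM protocol (no signing, no challenge-response); the function names and component strings are my own, and only the "extend a register with hashes, then compare against a whitelist" idea is taken from the talk.

```python
import hashlib

# Toy model of a PCR (Platform Configuration Register): each measured
# component extends the register with its hash, so the final value
# commits to the entire chain of software that was loaded.
def extend(pcr: bytes, component: bytes) -> bytes:
    return hashlib.sha1(pcr + hashlib.sha1(component).digest()).digest()

def measure(components) -> bytes:
    pcr = b"\x00" * 20  # PCRs start zeroed at boot
    for c in components:
        pcr = extend(pcr, c)
    return pcr

# The remote verifier only accepts a machine whose final PCR value
# appears on its list of approved configurations.
approved = {measure([b"bios-v1", b"os-v1", b"word-processor-v1"])}

print(measure([b"bios-v1", b"os-v1", b"word-processor-v1"]) in approved)      # True
print(measure([b"bios-v1", b"os-v1", b"modified-word-processor"]) in approved)  # False
```

Note how any modification anywhere in the chain changes the final value, which is exactly why a modified (free) version of an "approved" program fails the check.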
- DRM and SIM-lock. The Trusted Platform Module (TPM) chip is the perfect tool to uniformly enforce Digital Rights Management / Digital Restrictions Management (DRM) on any sort of device, not only PCs but also cell phones and others. But as Chris noted, TC is not just about DRM. I'm sure it can do a lot of other evil things as well. For example, the SIM-lock "feature" on mobile phones can quite easily be implemented uniformly using the mobile version of the TPM. Both DRM and SIM-lock were given as examples in the talk, and both are official use cases of the mobile TPM specification.
- Ownership and Control. The user of a device containing a TPM is not necessarily the one who controls the TPM (the TPM owner). In a corporate setting it may be reasonable for the company to keep control over its computers. For mobile phones, however, you can expect the mobile network operator to try to be the one that controls the TPM, not the owner of the device.
- Ownership of Data. Having a file on your "trusted" computer does not mean you can use it in any way you like if you are not the owner of its content. And this is not restricted to media files. For example, you can think of scenarios in which encrypted documents can only be opened with a particular version of Microsoft Word 2007, running on a particular version of Windows Vista. Encrypted storage is already a feature of Windows Vista (BitLocker), so this scenario is not hard to implement using the TPM.
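The mechanism behind such a scenario is usually called "sealing": data is encrypted under a key derived from an expected platform state, so it only decrypts when the approved software stack is running. The sketch below is my own toy illustration of that idea, not the real TPM seal/unseal commands; the hash-based XOR cipher is for demonstration only and is not secure.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream from repeated hashing -- illustration only, not secure.
    out, block = b"", key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def seal(data: bytes, pcr: bytes) -> bytes:
    # Bind the data to an expected platform state (a PCR value):
    # the encryption key depends on that state.
    key = hashlib.sha256(b"storage-root-key" + pcr).digest()
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def unseal(blob: bytes, current_pcr: bytes) -> bytes:
    # Yields the plaintext only if current_pcr matches the value
    # the data was sealed against (XOR is its own inverse).
    return seal(blob, current_pcr)

doc = b"confidential report"
blob = seal(doc, hashlib.sha1(b"approved-word-v1").digest())

print(unseal(blob, hashlib.sha1(b"approved-word-v1").digest()))          # original text
print(unseal(blob, hashlib.sha1(b"modified-word").digest()) == doc)      # False
```

The point is that the content owner, not the machine owner, decides which software state can recover the plaintext.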
- Pseudo-anonymity. The TPM specification was not constructed with anonymity in mind. Version 1.2 of the specification mitigates this problem by adding the Direct Anonymous Attestation (DAA) protocol, which uses zero-knowledge proofs in the certificate generation phase. However, this provides only a limited level of anonymity, and it relies on the user generating many certificates and using different certificates for different purposes.
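Why the burden falls on the user can be shown with a trivial sketch. This is not DAA itself, just an illustration of linkability: if the same credential is presented to several services, those services can correlate the transactions; fresh credentials per purpose prevent that. All names here are hypothetical.

```python
import secrets

# A credential here is just an opaque identifier a service sees
# during attestation -- a stand-in for a real certificate.
def new_credential() -> str:
    return secrets.token_hex(8)

single = new_credential()
lazy_user = {"shop": single, "bank": single}                  # reuses one credential
careful_user = {"shop": new_credential(), "bank": new_credential()}

def linkable(profile) -> bool:
    # Colluding services can link a user iff any credential repeats.
    creds = list(profile.values())
    return len(set(creds)) < len(creds)

print(linkable(lazy_user))     # True  -- services can correlate the user
print(linkable(careful_user))  # False
```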
- Terribly complicated. The TPM specification is terribly complicated (I didn't count all the different acronyms in the slides, but I think there are at least 100). Even more importantly, the specification ignores many security engineering principles that have been known for a very long time. And since the security of a TC platform is only as strong as its weakest link, there will certainly be ways to break it. This doesn't really harm the companies that designed the technology, as long as the exploits are not obvious to uneducated end users.
- The TPM is just the beginning. Most recent computers already have a small chip on their mainboard that implements some of the functionality needed for TC. The TPM will be embedded in more and more devices, and there will probably soon be further chips that add the missing "features".
- Non-evil uses of the TPM. Even if it seems to be designed with very evil goals in mind, the TPM can still perform some operations which might be useful. For example it can enforce isolation between virtualized operating systems. It can also generate and manage secret keys. The only thing I still wonder about is whether the TPM generates randomness from the pure evilness of the “Trusted” Computing initiative.