Owner-Controlled Computers

A computer is owner-controlled if it offers an owner-controlled boot process.

An owner-controlled boot process is one in which all mutable software executed by processors of privilege equal to or higher than the CPU is under the control of the computer's owner, and not of some other party (e.g. the manufacturer).

Clarifications

  • ROM images which cannot be changed (i.e. MaskROM) are not mutable. Software contained on flash or similar chips is mutable, regardless of signing schemes.

  • In order for software to be under the control of its owner, the owner must have the complete source code (in "the preferred form in which a programmer would modify the program") for it and for any necessary compilers, as well as any signing keys and legal permissions necessary to cause the computer to boot using the result.

  • A peripheral which is capable of commandeering control of the CPU is of privilege equal to or higher than the CPU. This includes all peripherals capable of non-IOMMU-guarded DMA, since unguarded DMA lets them rewrite the memory from which the CPU executes.
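
To make the criteria above concrete, here is a toy sketch (in Python) that models them for a list of boot components. The component names, fields, and the example machine are hypothetical illustrations, not part of the definition; the real question is always about the actual firmware and peripherals in your machine.

```python
# Toy model of the "owner-controlled boot process" criteria above.
# All names and attributes here are invented for illustration.
from dataclasses import dataclass
from typing import List

@dataclass
class BootComponent:
    name: str
    mutable: bool                    # MaskROM -> False; flash or similar -> True
    privileged: bool                 # CPU privilege or above, incl. unguarded-DMA masters
    owner_has_source: bool           # complete preferred-form source plus needed compilers
    owner_can_boot_own_build: bool   # signing keys / legal permission to boot a rebuilt image

def owner_controlled(components: List[BootComponent]) -> bool:
    """True only if every mutable, privileged component is under the owner's control."""
    return all(
        c.owner_has_source and c.owner_can_boot_own_build
        for c in components
        if c.mutable and c.privileged
    )

# A hypothetical machine whose flash firmware is vendor-signed fails the test;
# the immutable boot ROM does not count against it.
machine = [
    BootComponent("boot ROM", mutable=False, privileged=True,
                  owner_has_source=False, owner_can_boot_own_build=False),
    BootComponent("SPI-flash firmware", mutable=True, privileged=True,
                  owner_has_source=False, owner_can_boot_own_build=False),
]
print(owner_controlled(machine))  # False
```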

Wait, don't we have to trust hardware manufacturers?

"You need to be able to increase the costs of getting caught" -- Edward Snowden

By using a chip as your CPU you are, of course, trusting that its manufacturer hasn't included a hardware backdoor. Why shouldn't you trust software from that same manufacturer to run at the highest privilege level on the same device?

A hardware backdoor or bugdoor can be publicly demonstrated to exist once discovered, and is "perfectly undiscoverable" only if it is never used. Immutable proof of crime or incompetence is in the hands of every customer. Discovery would be catastrophic for the manufacturer, both reputationally and financially. I can easily trust that my hardware manufacturers are existentially terrified of this outcome, even in the face of government pressure. Properly designed software bugdoors, on the other hand, are practically risk-free (especially when designed in coordination with hardware) and cost little to remediate.

Trust, but deblobbify.

Debian prefers that its developers keep their code signing keys on commodity microcontrollers (such as the STM32 used in Gnuk) rather than on commercial fixed-purpose HSMs like Yubikeys. Ian Jackson's explanation for this preference uses similar reasoning.

What if I don't care about security, trust, or power?

Instead, you can be awed at the kinds of amazing things people discover when "maybe the CPU traps out to some software I don't control" can be ruled out as an explanation. Care about science, and about being able to do it instead of accepting unknowability.