Palladium: technical limits and implications
Tim Dierks
tim at dierks.org
Mon Aug 12 12:28:15 PDT 2002
At 07:30 PM 8/12/2002 +0100, Adam Back wrote:
>(Tim Dierks: read the earlier posts about ring -1 to find the answer
>to your question about feasibility in the case of Palladium; in the
>case of TCPA your conclusions are right I think).
Adding an extra security ring with a secured, protected memory space
does not, in my opinion, change the fact that such a ring cannot
accurately determine that a particular request is consistent with any
definable security policy. I do not think it is technologically feasible
for ring -1 to determine, upon receiving a request, that the request was
generated by trusted software operating in accordance with the intent of
whoever signed it.
Specifically, let's presume that a Palladium-enabled application is being
used for DRM; a secure & trusted application is asking its secure key
manager to decrypt a content encryption key so it can access properly
licensed content. The OS is valid & signed and the application is valid &
signed. How can ring -1 distinguish a valid request from one which has been
forged by rogue code which used a bug in the OS or any other trusted entity
(the application, drivers, etc.)?
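To make the point concrete, here is a minimal sketch (all names are
invented, not any real Palladium API) of the information a ring -1 key
manager actually has when it receives an unseal request: it can verify
*who* is asking, by measuring the calling code image, but nothing in the
request tells it *why* the request was made.

```python
import hashlib

def measure(image: bytes) -> str:
    """Measurement = hash of the loaded code image (hypothetical model)."""
    return hashlib.sha256(image).hexdigest()

# A signed, approved application image and its trusted measurement.
SIGNED_APP = b"licensed media player v1.0"
TRUSTED_MEASUREMENTS = {measure(SIGNED_APP)}

def handle_unseal(request: bytes, caller_image: bytes) -> bool:
    """Service the request iff the caller measures as trusted code.

    Note what is missing: nothing in `request` or `caller_image`
    reveals whether the request was produced by the application's own
    logic or by exploit code injected into the application's address
    space via a bug in the OS, a driver, or the application itself.
    """
    return measure(caller_image) in TRUSTED_MEASUREMENTS

# A legitimate request and a forged request issued from inside the same
# signed-but-exploited application are indistinguishable to the manager:
legit = handle_unseal(b"unseal content key 42", SIGNED_APP)
forged = handle_unseal(b"unseal content key 42", SIGNED_APP)  # from exploit
```

Both calls return True; the manager has no basis on which to refuse the
second one, which is exactly the problem described above.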
I think it's reasonable to presume that desktop operating systems which are
under the control of end-users cannot be protected against privilege
escalation attacks. All it takes is one sound card with a bug in a
particular version of the driver to allow any attacker to go out and buy
that card & install that driver and use the combination to execute code or
access data beyond his privileges.
In the presence of successful privilege escalation attacks, an attacker can
get access to any information which can be exposed to any privilege level
he can escalate to. The attacker may not be able to access raw keys & other
information directly managed by the TOR or the key manager, but those keys
aren't really interesting anyway: all the interesting content &
transactions will live in regular applications at lower security levels.
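The point can be illustrated with another small sketch (again, invented
names, not a real design): even if the raw key never leaves the TOR/key
manager, the decrypted content has to reach the application, so it ends
up in ordinary application memory, readable by anything running at that
privilege level.

```python
class KeyManager:
    """Stands in for the ring -1 key manager; the raw key never leaves it."""
    _key = 0x5A  # toy XOR key, purely illustrative

    @classmethod
    def decrypt(cls, ciphertext: bytes) -> bytes:
        return bytes(b ^ cls._key for b in ciphertext)

class MediaApp:
    """The signed application, running at ordinary application privilege."""
    def __init__(self, ciphertext: bytes):
        # The plaintext lands in normal application memory.
        self.buffer = KeyManager.decrypt(ciphertext)

ciphertext = bytes(b ^ 0x5A for b in b"the licensed content")
app = MediaApp(ciphertext)

# An attacker who has escalated to the app's privilege level needs no
# key at all; the interesting data is already sitting in app memory:
stolen = app.buffer
```

The key manager's secret was never exposed, yet the attacker got
everything worth having, which is why protecting the keys alone isn't
enough.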
The only way I can see to prevent this is for the OS to never transfer
control to any software which isn't signed, trusted and intact. The problem
with this is that it's economically infeasible: it implies the death of
small developers and open source, and that's a higher price than the market
is willing to bear.
- Tim
PS - I'm looking for a job in or near New York City. See my resume at
<http://www.dierks.org/tim/resume.html>
---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at wasabisystems.com