A SANS Handler Notebook entry by Toby Kohlenberg reports on data leakage in cloud computing, and links to a terrific paper from some UCSD/MIT people: Ristenpart, Tromer, Shacham, and Savage.
If we set the wayback machine to the early 1970s, we find a paper by Butler Lampson about something called the confinement problem. It's the same thing. Ristenpart et al. pick up some of the threads (like noninterference), though their paper doesn't point all the way back to Lampson.
This is a hard problem to solve. The only defense right now is that attackers lack the motivation to exploit it.
The Bottom Line
Virtual machines are not analogous to safe deposit boxes, much though we might wish they were. Once your data is visible in plaintext on the same machine as your adversary, there are countless risks of disclosure, few of which you control.
Sharing a computer is like sharing a meal. You may share more than you intend even if you don't touch each others' plates.
All security starts with physical security. You don't want to store your crown jewels in the same place as the jewel thief's booty.
Back in the 1960s, when computers strode the earth like dinosaurs, gulping megawatts and tons of air conditioning, it could cost $100 for one second of computing time. The paper by Ristenpart et al. notes that Amazon's "Elastic Compute Cloud" (EC2) service costs as little as $0.10 per hour. So '60s computer owners had a gigantic inducement to keep their costly machines as busy as possible.
This led to the multilevel security problem. Briefly, it's the problem of preventing sensitive data from leaking into non-sensitive computing environments. The DOD didn't want to have separate computers for every stripe of security restriction. They wanted to be able to run less-secret jobs on the same equipment as top secret jobs, preferably at whatever times were convenient. On the other hand, they'd do everything possible to prevent top secret information from leaking into less-secret jobs. After all, the results of the less-secret jobs would be available to people who weren't trusted to protect top secret information.
The NSA spent millions trying to encourage computer designers to solve the problem under the Trusted Computer System Evaluation Criteria, or Orange Book. The problem has been solved in practice by Moore's Law: it became cheaper to host sensitive processing on separate, dedicated hardware. In defense and government applications, at least, they manage to avoid most of the risky sharing situations. The remaining problems produce today's cross-domain sharing problem.
The Underlying Problem
I think the easiest way to describe the problem is in terms of noninterference. If two processes don't interfere with one another, then they can't communicate. On the other hand, if one can interfere with the other, then it can send information to the other. The interference can be of any kind: slowing it down, restricting its RAM or network bandwidth, or forcing cache misses. If one process can sense the presence of another, then it can establish a communications channel.
The channel may be noisy and unreliable, but that doesn't prevent communication. It simply restricts the channel to a lower bandwidth.
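To make this concrete, here is a toy simulation (not a real exploit, and all parameters are my own illustrative choices) of a contention-based covert channel. The sender modulates its use of a shared resource in each time slot; the receiver observes only how long its own work takes. A fixed jitter pattern stands in for timing noise, and a 5x repetition code with majority voting shows the point of the paragraph above: noise lowers the bandwidth, but the message still gets through.

```python
from itertools import cycle

REPEAT = 5                              # slots per message bit (repetition code)
JITTER = [0.2, -0.1, 0.3, -0.3, 0.0]    # deterministic stand-in for timing noise
THRESHOLD = 1.25                        # midpoint between "idle" and "busy" times

def transmit(bits):
    """Return the receiver's observed slot timings for the sender's bits."""
    noise = cycle(JITTER)
    observed = []
    for bit in bits:
        for _ in range(REPEAT):
            base = 1.5 if bit == 1 else 1.0   # contention adds ~0.5 per slot
            observed.append(base + next(noise))
    return observed

def receive(observed):
    """Decode bits by majority vote over each group of REPEAT slots."""
    bits = []
    for i in range(0, len(observed), REPEAT):
        votes = sum(1 for t in observed[i:i + REPEAT] if t > THRESHOLD)
        bits.append(1 if votes > REPEAT // 2 else 0)
    return bits

message = [1, 0, 1, 1, 0, 0, 1, 0]
# The jitter flips one slot in every group of five, yet majority voting
# recovers every bit - a noisy channel, just a slower one.
assert receive(transmit(message)) == message
```

Note that the sender and receiver never touch each other's data; the channel exists purely because each can sense the other's load.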
Moreover, the very features that make cloud computing interesting - huge multiprocessor systems - are the most fruitful for high-bandwidth covert channels. Digital Equipment Corporation (DEC) made the first major attempt to put multilevel security on a multiprocessor: the VAX Virtual Machine Monitor. Early studies found a multiprocessor contention channel that could transmit data at megabytes per second. That was particularly fast, considering that processor speeds were still measured in megacycles.
Such channels didn't disappear simply because we haven't thought much about them in today's post-MLS security environment.
On the other hand, it's not clear that we've really tried to design a system from the bottom up to avoid covert channels. Proctor and Neumann made some suggestions on managing resources to avoid channels. It's not clear that we can apply this to modern multi-core applications, but then we haven't tried, either.
Risk and Profit
As soon as cloud computing hosts serious work, attackers will be motivated to go after it. It might not be this year or next year, but it'll happen.
Meanwhile, customers have to decide if they are going to try this latest form of outsourcing. The early investors may profit from it if they can save money in the short run and move on before the security risks catch up. When Computerworld wrote about cloud computing channels, the writer indicated that some customers already avoid cloud computing because of trust issues. Some managers may become near-term heroes if they can save money with a shift to the cloud, and then get promoted to a different job before the troubles arrive.