Friday, March 2, 2012

Rise of the (64 bit) Machines

In my bytebuffer presentation, I note that 64-bit machines have become inexpensive and ubiquitous, and ask the rhetorical question “what are we doing with all this memory?” This is the setup to a fake graph that proclaims that it's being used to play FarmVille™.

A joke, yes, but it points to a bigger truth: when a consumer-class machine has multiple cores and 8 GB of memory, there's a lot of computing power that isn't being used. And these machines represent the next round of corporate desktop purchases. Five years from now, a typical office user will have four cores, eight gigabytes of RAM, and a terabyte of disk. You don't need that to run Outlook.

I think this is the next opportunity for virtualization technology. To date, virtualization has been used to carve a large machine into smaller pieces. In the future I see it being used to carve up desktop machines: the person sitting in front of the computer will get only a fraction of the capacity of his or her machine. The IT department will get the rest.

There are a lot of problems to be solved before a business runs its critical infrastructure on the desktops of its employees. For one thing, the power buttons have to be disabled — or at least repurposed to shut down only the virtual machine. And virtualization software will have to evolve so that it's invisible to the desktop user: at the least, the local display, keyboard, and mouse have to act like they always have.

But more important, our ways of thinking about enterprise-scale software will also have to change. We must write software that is completely unaware of where it's running, and that can recover even if the computer is unplugged.
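One way to sketch that "recover even if the computer is unplugged" principle: a worker pulls tasks from a shared queue, acknowledges a task only *after* completing it, and makes its processing idempotent, so a task that gets redelivered after a crash does no harm. This is a minimal illustration, not a prescription — the class and method names are hypothetical, and the in-memory queue and set stand in for durable shared state that would really live off the worker's machine:

```java
import java.util.ArrayDeque;
import java.util.HashSet;
import java.util.Queue;
import java.util.Set;

// Sketch of an unplug-tolerant worker: acknowledge only after completion,
// and make processing idempotent so redelivery is harmless.
public class RecoverableWorker {
    // Stands in for durable state shared by all workers (e.g., a database).
    final Set<String> completed = new HashSet<>();

    // Idempotent: applying the same task twice has the same effect as once.
    void process(String taskId) {
        completed.add(taskId);
    }

    // Attempt one task. crashBeforeAck simulates the machine being
    // unplugged mid-task: the task is never acknowledged, so the
    // queue will redeliver it to the next worker that asks.
    void run(Queue<String> queue, boolean crashBeforeAck) {
        String task = queue.peek();      // look at the task; no ack yet
        if (task == null) return;
        process(task);
        if (crashBeforeAck) return;      // power pulled before the ack
        queue.poll();                    // acknowledge: remove from queue
    }

    public static void main(String[] args) {
        RecoverableWorker worker = new RecoverableWorker();
        Queue<String> queue = new ArrayDeque<>();
        queue.add("task-1");

        // First attempt: the desktop is unplugged before acknowledging.
        worker.run(queue, true);
        if (queue.size() != 1) throw new AssertionError("task should stay queued");

        // Another machine (or this one, rebooted) picks the task up again.
        worker.run(queue, false);
        if (!queue.isEmpty()) throw new AssertionError("task should be acknowledged");
        if (!worker.completed.contains("task-1")) throw new AssertionError("task should be done");
        System.out.println("recovered and completed: " + worker.completed);
    }
}
```

The key design choice is the ordering: complete, then acknowledge. Reversing it (ack first) means an unplugged machine silently loses work; this ordering means the worst case is doing the same work twice, which idempotency makes safe.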

Today's “cloud,” be it EC2 or a corporate data center, is our chance to experiment. Think of it as training wheels for a fully distributed future.

No comments: