Allow me to interject some sanity here for a moment, because I seem to have stumbled upon a very interesting point which should be quite clear... but apparently is not.
Just because you're running a bunch of virtual servers does not mean you're adopting cloud computing.
Here's the situation...
Acme Widget Corp has a ton of under-utilized servers, so they decide they're going to go virtual. They buy big hardware and install VMware infrastructure on everything so they can virtualize and finally use up those precious processor cycles on the bare metal they're paying lots of money for.
You see, at present Acme Widget Corp has 1,000 servers running just under 400 applications, which means many applications are spread across several servers, each server hosting only a small piece of the application and probably using a tenth of its total computing power.
Along comes a brilliantly thought-up project to virtualize everything, squeeze more applications onto less hardware, and get better utilization. Today they've got 600 servers, most of them well-utilized with a bit of capacity (say, 25%) to spare in case they need it.
So... in a meeting to describe the latest advances in cloud computing, their CIO steps forward and volunteers that they've already virtualized everything and are well on their way to this not-so-new thing called "cloud computing". Yes, the CIO did air quotes.
Now, you're reading this and thinking one of two things: either "what a fool" or "so what's wrong with that statement?" Either way you should read on, because there are several reasons their situation is not cloud computing.
- It's missing elasticity, meaning they can't simply add capacity on demand. Sure, they can create another virtual machine and distribute load as their software packages and architecture allow, but cloud computing is much more than that. It's about being able to add cores and memory on demand, without buying physical metal, at least until you run out of cloud capacity.
- It's missing mobility, meaning you can't just take one virtual server and move it to some other environment. Granted, you can with something like vMotion if the other cloud is also running VMware, but that isn't typical.
- It's missing bursting capability, meaning that when their virtual environment (not cloud) hits capacity during a high-usage period, they can't just spin up a few instances of their application architecture on a public cloud provider and keep going.
- It's missing self-service, which is where much of the promise of cloud computing comes from: taking IT from being provisioners and builders of computing technology to brokers of that technology through a self-service model. The ability for the business to spin up a useful, template-driven compute instance on its own is critical to cloud computing.
- The necessary orchestration isn't there, meaning that if Acme found they needed to patch their virtual machines against a critical OS vulnerability, they'd have to visit each (virtual) machine and patch it. What you want from a cloud computing environment is a patch-once capability that saves time, money, and critical IT resources while preserving uptime.
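To make the self-service and elasticity points above concrete, here's a minimal toy sketch in Python. Everything in it is hypothetical (the `Template` and `CloudBroker` names are mine, not any vendor's API): it models a broker that hands out template-driven instances against a finite capacity pool, scaling up on demand and giving capacity back when demand drops, which is exactly what a plain virtualized estate like Acme's doesn't offer its business users.

```python
from dataclasses import dataclass, field


@dataclass
class Template:
    """A reusable, template-driven instance definition the business can self-serve from."""
    name: str
    cores: int
    memory_gb: int


@dataclass
class CloudBroker:
    """Hands out instances against a finite pool of cloud capacity (hypothetical model)."""
    capacity_cores: int
    instances: dict = field(default_factory=dict)
    _next_id: int = 0

    def used_cores(self) -> int:
        return sum(t.cores for t in self.instances.values())

    def provision(self, template: Template) -> str:
        """Self-service: spin up an instance from a template, no ticket required."""
        if self.used_cores() + template.cores > self.capacity_cores:
            # The "until you run out of cloud capacity" case from the list above.
            raise RuntimeError("out of cloud capacity")
        self._next_id += 1
        instance_id = f"{template.name}-{self._next_id}"
        self.instances[instance_id] = template
        return instance_id

    def release(self, instance_id: str) -> None:
        """Elasticity works both ways: return capacity when the peak passes."""
        del self.instances[instance_id]


web = Template("web", cores=2, memory_gb=4)
broker = CloudBroker(capacity_cores=8)

a = broker.provision(web)   # scale up on demand...
b = broker.provision(web)
broker.release(a)           # ...and back down afterwards
```

A real cloud platform adds metering, authentication, and orchestration on top, but the shape is the same: templates plus an on-demand broker, rather than a hand-built VM per request.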
So, you see, this isn't as simple as buying a bunch of VMware licenses or one of the other virtualization platforms (I'm not picking on VMware specifically here). Cloud computing is a fundamentally cheaper, more robust, and more expansive way of computing.
Cross-posted from Following the White Rabbit