A History Lesson for Cloud Detractors


We’ve all seen cloud computing discussed ad nauseam on blogs, on Twitter, Quora, Stack Exchange, your mom’s Facebook page… you get the idea. The tech bloggers and performance experts often chime in with graphs and statistics showing clearly that, dollar for dollar, cloud-hosted virtual servers can’t compete with physical servers on performance, so why is everyone pushing them? It’s just foolhardy, they say.

On the other side, management and their bean counters would simply roll their eyes, saying this is why the tech guys aren’t running the business.

Seriously, why the disconnect?

Quite simply, they have very different agendas. The operations team is trying to deliver application speed, reliability, and scalability. Meanwhile, the management team is trying to balance cash flow, capital investments, and operating expenses, all while retaining as much flexibility in decision making as possible.

Sound familiar? Yes, the bottom line for both is the same, but they approach the problem from two different sides.

After hearing this refrain from enough customers, I realized we were asking the wrong question in operations. It is true that cloud servers do not perform as well as traditional servers, at least not yet. The question to ask instead is: given that constraint, how can I alleviate the performance limitations of virtual servers and storage and still deploy into the cloud?

A little bit of history

Anyone who’s been around computing since the mid-1990s will see where I’m heading with this. Commoditization of computing, that’s where. In the late nineties, as Linux was maturing, commodity servers became all the rage, and a big shift drove cheaper PC hardware into the datacenter. Where once Sun stood supreme, suddenly all these upstarts were pushing crappy commodity hardware at a tenth of the cost.

Old-guard systems administrators at the time would balk, telling you the stuff wasn’t reliable, performance was worse, and the boxes just failed too often. And you know what? They were right! Nevertheless, look where we are today. Linux barreled through the datacenter because it lowered costs and introduced flexibility, giving you more choices and more redundancy.

All this rings very true today with the push to the cloud. Startups are the biggest adopters of cloud computing because the future is mostly unknown to them. Investing a quarter million dollars in hardware today when you don’t know where you’ll be in six months – that’s a very hard prospect to entertain.

But cloud computing does more than just lower costs. Don’t have the money for redundant servers to support disaster recovery? With Amazon EC2 (here’s a handy guide to EC2), you simply write scripts that can rebuild your infrastructure and keep them handy for when such an ill-fated day arrives. Flexibility, scalability, and easier cost management: that’s what the cloud delivers.
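To make that concrete, here is a minimal sketch of what such a rebuild script might look like, assuming Python with the boto3 library; the AMI ID, instance type, key pair, and security group below are placeholders you would swap for your own resources, not anything from the original post.

# Minimal disaster-recovery sketch: relaunch a web tier on EC2 from scratch.
# Assumes boto3 is installed and AWS credentials are configured; the AMI ID,
# key pair, and security group are hypothetical placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def rebuild_web_tier(count=2):
    """Launch fresh instances to replace a lost web tier."""
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # pre-baked image of the app server
        InstanceType="t3.small",
        MinCount=count,
        MaxCount=count,
        KeyName="dr-keypair",
        SecurityGroupIds=["sg-0123456789abcdef0"],
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "Role", "Value": "web"}],
        }],
    )
    ids = [i["InstanceId"] for i in response["Instances"]]
    # Block until the new instances are running before repointing DNS at them.
    ec2.get_waiter("instance_running").wait(InstanceIds=ids)
    return ids

if __name__ == "__main__":
    print("Rebuilt web tier:", rebuild_web_tier())

Kept in version control alongside your configuration, a handful of scripts like this is far cheaper insurance than a rack of idle standby hardware.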
