The Cloud-Computing Myth

There's always a new era of "network computing" around the corner, but we won't reach it soon.
Cisco Chief Executive John Chambers, among others, has reignited a fervor around the prospects of "cloud computing."

Distributed computing has been championed on several occasions, reemerging every five years or so, failing each time to overcome significant hurdles and operational risks, and then fading back into remission.

Let me save you some suspense: I believe the hype around cloud computing will once again taper off, even with the advancements in Internet applications and improvements in connectivity of the past few years. Bandwidth constraints and the growing cost of incremental traffic will partly be to blame (this is not a trivial hurdle; carriers grapple with an inability to charge by usage). The concept will also fail because of the complexity of maintaining and supporting so many remote devices and the cost of unavoidable outages; because part of a business's competitive advantage is its operational customization of off-the-shelf software; and because of regulatory hurdles and restrictions that are not often considered part of the discussion.

Advantages: Nothing Cerebral Here

The drivers behind the cloud computing movement have always been clear, especially from the Cisco perspective: If you increase data usage across the network, you sell more routers. Flow everything through the network and you can sell a lot more routers. Everyone benefits: More data flowing means more servers, more backup, more fiber, more towers, more data centers ... you get the point. It helps enterprise information technology staffs, which can better control what their devices are used for and maintain them remotely. It even helps end users, who can finally jettison some of the device maintenance and upkeep they so hate doing and often neglect.

So what's wrong with cloud computing?

Detractor 1: Cost of Connectivity

Revenue generated per dollar of infrastructure invested is falling at carriers. Having previously failed to push tiered-pricing structures (proportional to usage) on customers, and with the growing strain from iPhone users hogging resources, carriers are reluctant to sell unlimited access to heavy users. New plans that bundle subsidized netbooks with service contracts are popping up, and they come with very limited bandwidth caps; the new base Verizon plan limits users to 250 MB of downloads a month before overage charges kick in, compared with roughly 900 MB of monthly usage by the average smart phone user and about 3 GB for the average land line (Alcatel-Lucent, CDMA Development Group, October 2008).

Cloud computing would take all applications and data and place them remotely. Any time you accessed a video, a picture, Word, Excel, anything, you would need to do it over the network, generating much greater strain on it. Given the tight economic environment, the more restrained cap-ex budgets and the lack of acceptance of tiered pricing, it is less and less likely that carriers would encourage and subsidize such a high-usage model.
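To put the numbers in perspective, here is a back-of-the-envelope sketch. The 250 MB cap is the plan figure cited above; the daily activity counts and per-file sizes are purely illustrative assumptions, not figures from any carrier or from the article.

# Back-of-the-envelope sketch (hypothetical usage figures): estimate how quickly
# a capped mobile plan is exhausted when everyday files live in the cloud and
# must be fetched over the network each time they are opened.

CAP_MB = 250          # monthly cap cited for the base netbook plan
WORK_DAYS = 22        # assumed working days per month

# Assumed daily remote activity (illustrative only)
docs_per_day = 20          # Word/Excel files opened remotely
avg_doc_mb = 0.5           # average document size, MB
photos_per_day = 10
avg_photo_mb = 2.0
video_minutes_per_day = 5
video_mb_per_minute = 4.0  # modest streaming quality

daily_mb = (docs_per_day * avg_doc_mb
            + photos_per_day * avg_photo_mb
            + video_minutes_per_day * video_mb_per_minute)
monthly_mb = daily_mb * WORK_DAYS

print(f"Estimated monthly traffic: {monthly_mb:.0f} MB")
print(f"Cap exceeded by a factor of {monthly_mb / CAP_MB:.1f}x")

Even with these modest assumptions, the sketch lands at roughly four times the cap before a single large download or software update.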

Detractor 2: Unavoidable Outages

Downtime is a real risk to business. If a consumer can't access the Web for several hours a couple of times a year because of outages, it is, while unfortunate, hardly the end of the world. Losing that much productivity across a region or across your entire organization, however, is not acceptable.

Take the Google outage: Even with a very robust and redundant architecture supported by the best and most expensive purpose-built hardware out there, Google's network goes down occasionally. Think about your own experience with access and how many outages you suffer in a typical year. Try to imagine a situation where you lose all access three, four, five times a year for three hours to two days. Consider local outages too: There are still some locations in Boston where I have a tough time getting uninterrupted access over the course of an hour or two.
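As a rough illustration, the outage frequencies and durations imagined above translate into availability figures like these. This is a minimal sketch; the specific scenarios are assumptions drawn from the ranges in the paragraph, not measured data.

# Availability sketch: convert assumed outage counts and durations into
# yearly downtime and an availability percentage.

HOURS_PER_YEAR = 365 * 24

scenarios = [
    (3, 3),    # 3 outages a year, 3 hours each (optimistic)
    (4, 12),   # 4 outages a year, 12 hours each
    (5, 48),   # 5 outages a year, 2 days each (pessimistic)
]

for outages, hours_each in scenarios:
    downtime = outages * hours_each
    availability = 100 * (1 - downtime / HOURS_PER_YEAR)
    print(f"{outages} outages x {hours_each}h = {downtime}h down "
          f"-> {availability:.2f}% availability")

Even the middle scenario works out to about 99.5 percent availability, well short of what most businesses expect from systems they run themselves.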

Detractor 3: Not Sharing Secret Sauce

Many businesses consider their optimization of off-the-shelf software part of their proprietary and competitive advantage. They spend many man-hours configuring and tuning products to support their sales efforts, their manufacturing processes, their customer interactions and so on. I'm not sure how much of this secret sauce would be trusted to external software vendors that also serve your competitors.

Detractor 4: Regulatory and Compliance Restrictions

The final detractor to cloud computing can be lumped into compliance and regulatory problems. These are rarely mentioned but present a significant hurdle.

Compliance and regulatory concerns have to do with where data are physically located and actually make little logical sense except for the fact that this is how the law is written. Government regulations require customer records, certain kinds of billing information and various types of other compliance information to be physically located in the same state or country as the customer or the regulator. Assuming a cloud computing environment in Europe, these requirements would necessitate a data center or a storage site in every country.

Obviously, such an IT structure would not be feasible; large organizations typically consolidate information from around the world into two or three primary data centers. Regulators don't particularly care about IT consolidation, virtualization or cloud computing. They require organizations not only to locate required information quickly but also to prove where it resides.

Try doing that in a virtualized cloud. Currently we see no signs that governments will ease these restrictions; if anything, given the realities of the last year, regulatory hoops at every level are likely to increase.

While it seems clear that high prices and a variety of constraints will dampen the rollout of cloud computing, I hate to stand as a naysayer to progress and innovation. After all, technology has a long history of tearing down barriers and overcoming significant hurdles. A positive takeaway, as we approach this vision, is the growing worldwide need for significantly more bandwidth and much bigger pipes. So run out and buy the base-station and long-haul guys; the future should continue to be very bright for them. Given the cost of the network upgrades, we should prepare to pay more for the growing amount of bandwidth we consume as we stream video and data to our phones and computers.

Finally, at least until the next re-emergence of distributed computing, most of us will still need to carry around full-function PCs, so don't sell the PC component providers just yet.

-----------------------------
By Avi Cohen
Source: Forbes

2009 Forbes.com LLC™ All Rights Reserved.
