Berkeley gives forecast for cloud computing

A group of leading computer scientists published a white paper Thursday (Feb. 12) that lays out a broad road map for cloud computing. They called for a standard application programming interface (API) to spark adoption among a broad set of providers and users.
Cloud computing lets users tap into the power of massive data centers to run their applications, a capability currently offered by Amazon, Google, Microsoft and a handful of other companies. The trend could "transform large parts of the IT industry," said the group of eleven professors at the University of California at Berkeley who wrote the report.

The report called for cloud services to apply encryption and improve software tools for virtualization, debugging and code scaling to nurture the trend. In addition, it said computer makers need to do a better job of optimizing their hardware for today's large data centers, and they should embrace solid state drives as part of the storage hierarchy.

"Cloud computing is going to happen, and it will impact applications as well as infrastructure software and hardware," said David Patterson, head of Berkeley's new Parallel Computing Lab and one of the authors of the report.

In March 2008, the group released a white paper on the critical need for a new programming model for multicore processors. The heavy interest in that paper prompted the group to write the cloud computing report, which they spent the last five months drafting.

One of the more interesting positions put forward in the new paper is that the industry needs an API for cloud computing so users can have some confidence their data will not get locked in to one proprietary service. The white paper describes the very different technical approaches Amazon, Google and Microsoft have taken in cloud computing services to date.

Amazon's EC2 service operates at a relatively low level, letting users control most of the software stack, but it lacks automated scaling and failover features. At the opposite end of the spectrum, Google's AppEngine is more of a high-level application framework that creates a "clean separation between a stateless computation tier and a stateful storage tier [resulting in] impressive automatic scaling and high-availability mechanisms," the report said.

Somewhere in the middle, Microsoft's recently announced Azure service compiles the company's .NET software libraries to its Common Language Runtime, a language-independent managed environment.

"The three main players have all made very different bets," said Patterson. "We argue more people will consider using these services if there is a common API," he said.
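To make the idea concrete, here is a minimal sketch of what a provider-neutral cloud API could look like. The interface and its method names are purely illustrative, my own construction rather than anything proposed in the report; the toy in-memory backend stands in for a real provider.

```python
from abc import ABC, abstractmethod

# Hypothetical common cloud API; class and method names are
# illustrative, not taken from the Berkeley report.
class CloudProvider(ABC):
    @abstractmethod
    def launch(self, image: str, instances: int) -> list:
        """Start `instances` copies of `image`; return their IDs."""

    @abstractmethod
    def store(self, key: str, data: bytes) -> None:
        """Persist a blob of data under `key`."""

class InMemoryProvider(CloudProvider):
    """Toy in-process backend standing in for a real vendor."""
    def __init__(self):
        self._next_id = 0
        self.blobs = {}

    def launch(self, image, instances):
        ids = ["%s-%d" % (image, self._next_id + i) for i in range(instances)]
        self._next_id += instances
        return ids

    def store(self, key, data):
        self.blobs[key] = data
```

Code written against such an interface could switch vendors by swapping in a different `CloudProvider` implementation, which is the lock-in protection the authors are after.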

Cloud computing providers should not compete so much in how they offer services as in the quality of those services, the authors wrote.

"Our view is that different offerings will be distinguished based on the level of abstraction presented to the programmer and the level of management of the resources," they wrote. "Success comes in how well offerings use statistical multiplexing to hide the details of how they appear to offer infinite and easy to tap computing, storage and communications," they added.
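The statistical multiplexing argument can be illustrated with a toy simulation (my own, not from the report): when many bursty users share one pool, the pool's peak demand is far below the sum of each user's individual peak, which is what lets a provider appear to offer effectively infinite capacity.

```python
import random

# Toy illustration of statistical multiplexing; all numbers are
# made up for demonstration.
random.seed(42)
USERS, STEPS, PEAK = 100, 1000, 10

# Each user demands its full peak only 10% of the time, else a trickle.
demand = [[PEAK if random.random() < 0.1 else 1 for _ in range(STEPS)]
          for _ in range(USERS)]

# Capacity needed if every user is provisioned for its own peak:
sum_of_peaks = USERS * PEAK
# Capacity needed if all users share one pool:
pooled_peak = max(sum(d[t] for d in demand) for t in range(STEPS))

print(sum_of_peaks, pooled_peak)  # pooled peak is far below the sum of peaks
```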

Similarly, the paper said cloud services need to use encryption, virtual private networks and firewalls to secure user data.

Among its other hardware recommendations, the authors said computer makers need to do a better job driving power management technologies throughout their systems. Servers should be able to turn off unused DRAM banks and spin disks at a slower rate when not in use, Patterson said.

"If you care about energy that could make a big difference," Patterson said. "These are not necessarily hard changes, they've just been ignored."
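A power policy of the kind Patterson describes can be sketched in a few lines. The threshold and wattage figures below are illustrative assumptions, not numbers from the report.

```python
# Hypothetical per-component power policy; the threshold and the
# watt figures are illustrative, not from the report.
IDLE_THRESHOLD_S = 30  # seconds idle before a bank is put to sleep

def dram_bank_power(idle_seconds, active_watts=4.0, sleep_watts=0.2):
    """Wattage a DRAM bank should draw given how long it has sat idle."""
    return sleep_watts if idle_seconds >= IDLE_THRESHOLD_S else active_watts

# A server with 8 banks, 6 of them idle past the threshold:
total = sum(dram_bank_power(idle)
            for idle in [0, 5, 60, 60, 90, 120, 45, 300])
```

Under this toy policy the six sleeping banks draw 1.2 W instead of 24 W, the kind of saving Patterson says has simply been ignored.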

A bigger issue is rewriting data center software to make use of flash disks as a middle tier of storage in between today's DRAMs and hard drives.

"That's a fairly revolutionary change," he said. "A whole new hierarchy of storage in computer systems doesn't come around very often."
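The three-level hierarchy he describes can be sketched as a tiered store: reads fall through DRAM, then flash, then disk, with recently used blocks promoted upward and cold blocks demoted. This is a minimal LRU sketch of the general idea, assuming tier sizes and eviction policy of my own choosing.

```python
from collections import OrderedDict

# Toy sketch of flash as a middle storage tier between DRAM and disk.
# Tier sizes and the LRU policy are illustrative assumptions.
class TieredStore:
    def __init__(self, dram_slots=2, flash_slots=4):
        self.dram = OrderedDict()   # fastest, smallest tier
        self.flash = OrderedDict()  # the middle tier the paper proposes
        self.disk = {}              # slowest, largest tier
        self.dram_slots = dram_slots
        self.flash_slots = flash_slots

    def write(self, key, value):
        self.disk[key] = value      # all data ultimately lives on disk

    def read(self, key):
        if key in self.dram:        # DRAM hit: fastest path
            self.dram.move_to_end(key)
            return self.dram[key]
        if key in self.flash:       # flash hit: middle tier
            value = self.flash.pop(key)
        else:                       # miss: fall through to disk
            value = self.disk[key]
        self.dram[key] = value      # promote the block into DRAM
        if len(self.dram) > self.dram_slots:
            old_key, old_val = self.dram.popitem(last=False)
            self.flash[old_key] = old_val   # demote LRU block to flash
            if len(self.flash) > self.flash_slots:
                self.flash.popitem(last=False)  # coldest block leaves flash
        return value
```

The "revolutionary" part is that software must manage this demotion path explicitly; today's data center code mostly assumes only DRAM and disk exist.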

More broadly, computer makers need to optimize their systems "at the scale of a container--at least a dozen racks--which will be the minimum purchase size" for cloud providers, the paper said. Cloud vendors will measure systems based on their costs of operation and how many simultaneous virtual machines they can run, it added.

In terms of software, the paper called for advances in virtual machines, debuggers, storage and automated scaling capabilities. "Software needs to be aware that it is no longer running on bare metal but on virtual machines, [and] it needs to have billing built in from the beginning," it said.
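One way to read "billing built in from the beginning" is metering woven into the code itself. The decorator below is my own illustration of that idea, with a made-up rate; it is not an API from the report.

```python
import time

# Illustrative metering sketch: charge for wall-clock time consumed.
# The rate and the Meter class are hypothetical, not from the report.
RATE_PER_SECOND = 0.10

class Meter:
    def __init__(self):
        self.charges = 0.0

    def billed(self, fn):
        """Wrap `fn` so every call accrues charges for its run time."""
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                self.charges += (time.perf_counter() - start) * RATE_PER_SECOND
        return wrapper

meter = Meter()

@meter.billed
def render_page():
    time.sleep(0.01)  # stand-in for real work
    return "ok"
```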

-----------------------------
By Rick Merritt
Source: EE Times

Copyright © 2009 TechInsights, a Division of United Business Media LLC All rights reserved.
