A Berkeley View Of Cloud Computing: An Analysis – the good, the bad and the ugly


I read through the technical report from UC Berkeley, Above the Clouds: A Berkeley View of Cloud Computing, with interest. My analysis:

<summary>

  • As undergraduate work on cloud computing, the paper gets an A+. But as a position paper from eminent academics, I can only give it a C-. Granted, it correctly identifies many of the trends and obstacles, but that material is widely available!
  • With the title “A Berkeley View of Cloud Computing,” the report misses the point. “A Berkeley Observation…” is more like it – a view requires original thinking and interpretation, which the report lacks.

</summary>

<the_good>

  • The authors got some of the essentials of cloud computing right, viz.: infinite capacity, no up-front commitment, and pay-as-you-go.
  • The three classes – the Amazon, Microsoft, and Google models – are interesting, but there are more in between.
  • They have some good points on the cost advantage of power et al., and on leveraging that aspect by building datacenters in the appropriate locations.
  • The new application models, viz. analytics, parallel batch processing, compute-intensive desktop applications, and so forth, are excellent observations.
  • They have done some good work in characterizing elasticity. Pages 10 and 11 are a good read – the models are very simplistic, though.
  • They have also done a good job of showing the economies of scale that can be achieved by a cloud computing infrastructure.
  • I like their assertion that there are “no fundamental obstacles” to making cloud-computing environments secure as well as compliant with data and processing rules. Declarative policy, and enforcement thereof, is my answer.
  • They have correctly identified scalable storage as one of the bottlenecks. BigTable (Google), Dynamo (Amazon), and Cassandra (Facebook) are all solutions to this challenge.
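Their elasticity argument (pages 10 and 11) boils down to comparing peak-provisioned ownership against pay-as-you-go rental: a fixed datacenter pays for its peak around the clock, while a cloud user pays only for hours actually consumed. A minimal sketch of that trade-off – all rates and the workload below are my own illustrative assumptions, not figures from the report:

```python
# Sketch of the elasticity trade-off: peak-provisioned ownership vs.
# pay-as-you-go rental. Rates and demand are assumed, not from the paper.

def fixed_cost(peak_servers, hours, cost_per_server_hour):
    """Cost of owning capacity sized for peak, regardless of utilization."""
    return peak_servers * hours * cost_per_server_hour

def cloud_cost(demand_per_hour, cost_per_server_hour):
    """Cost of renting exactly the demanded capacity each hour."""
    return sum(demand_per_hour) * cost_per_server_hour

# A spiky diurnal workload over one week: 100 servers during business
# hours, 10 otherwise (illustrative numbers).
demand = [100 if 9 <= h % 24 <= 17 else 10 for h in range(24 * 7)]

owned = fixed_cost(peak_servers=max(demand), hours=len(demand),
                   cost_per_server_hour=0.10)          # assumed amortized rate
rented = cloud_cost(demand, cost_per_server_hour=0.14)  # assumed cloud premium

print(f"owned (peak-provisioned): ${owned:.2f}")
print(f"cloud (pay-as-you-go):    ${rented:.2f}")
```

Even with a per-hour premium, the rented total comes out lower here because utilization of the owned capacity is so poor – which is exactly why I say elasticity, not the payment mechanism, is the first-order effect.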

</the_good>

<the_bad>

  • But they got the model wrong! The essence of utility computing is the consumption model, not the payment model. No doubt the pay-as-you-go model is attractive to startups, but the payment model is a second-order effect. For enterprises and other organizations, the value proposition is the elasticity and the just-in-time availability of resources. Even for startups, pay-as-you-go is attractive, but elasticity is much more important.
  • Their argument about the increase in performance and the resultant cost reduction is just Moore’s law, and it is achievable within traditional IT environments as well as in the cloud. Computers are typically on a five-year amortization and depreciation schedule, and a refresh can be done – with the associated efficiency gains – whether by a cloud provider or an IT shop.
  • I think the major disconnect in the paper is the basic definition of a cloud as public. The artificial separation of public and private clouds, and the focus on payment, are the two areas where their definition goes awry. A cloud is an architectural artifact and a business model of computing, but clouds are clouds – internal or external, public or private. Internal vs. external is only a spatial artifact – which side of the firewall the infrastructure sits on – and should not be the criterion; it is not worth a demarcation when we talk about the domain of cloud computing. By their definition, they have disenfranchised the whole set of clouds inside organizations. Internal–external cloud integration across the data, management, policy, and compute planes is an important topic which this model conveniently skips. Also, as I mentioned earlier, utility is a consumption model, not a payment model. A big organization can have a cloud computing infrastructure whose business units leverage the elasticity – no need for a credit card; a chargeback model will do.
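The chargeback alternative is straightforward: an internal cloud meters usage per business unit and allocates cost internally, no credit card required. A minimal sketch – the rate, unit names, and metering records are assumptions of my own, purely for illustration:

```python
# Minimal internal chargeback sketch: meter server-hours per business
# unit, then allocate cost at an assumed internal amortized rate.
from collections import defaultdict

RATE_PER_SERVER_HOUR = 0.12  # assumed internal rate, not a real figure

usage_log = [  # (business_unit, server_hours) metering records (invented)
    ("marketing", 120), ("engineering", 900),
    ("marketing", 80), ("finance", 40),
]

invoice = defaultdict(float)
for unit, server_hours in usage_log:
    invoice[unit] += server_hours * RATE_PER_SERVER_HOUR

for unit, amount in sorted(invoice.items()):
    print(f"{unit:12s} ${amount:8.2f}")
```

The business units still get the elasticity of a shared pool; only the settlement mechanism differs from a public cloud's credit-card billing.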

</the_bad>

<the_ugly>

  1. I really didn’t get the “statistical multiplexing” they mention a few times. What exactly is it, and what is its relevance? Just a buzzword to jazz up the paper?
  2. I literally got lost in their characterization of DDoS attacks, and the cost models thereof, on p. 15. It is really convoluted, and it does not change between traditional and cloud deployments. They found a break-even point for a DDoS attack based on very slim assumptions.
  3. I do not think the data transfer bottleneck, as described in the paper (p. 16), is an issue. Who is going to transfer 10 TB of data routinely for cloud processing? It looks like a force-fit for some calculations someone had already done.
  4. The report has no useful tables or equations. Equations 1 and 2 (which are the same, by the way) are not right – in the sense that the datacenter cost already includes utilization, and I do not think we need to account for it additionally.
  5. I am sorry to say that all the cost models and equations look forced and very unnatural. Even the assertion of a 1/5 to 1/7 cost advantage for a datacenter is at best questionable. No value whatsoever – sorry, folks.

</the_ugly>

Updates:

  1. Good comments. Thanks folks.
  2. James Urquhart has an excellent blog on the subject. Thanks James. He is more generous than me ;o)
  3. [Feb 19, 2009] Blog from GoGrid – good analysis.

13 thoughts on “A Berkeley View Of Cloud Computing: An Analysis – the good, the bad and the ugly”

  1. Krishna,

    Totally agree with your analysis. Not particularly insightful – and the paper takes a rather narrow view in an attempt to make its points.

    H.

  2. I agree with your good and bad assessments, and even some of the “ugly” ones – e.g. the economic multipliers in Table 2 being somewhat contrived (even Tier 2 colos do much, much better than $95 per Mbps-month).

    But have to take issue with some of the other “ugly” classifications.

    Statistical multiplexing: Not sure if this is a trick question. Elasticity based on oversubscription (which, again, relies on not everyone peaking at the same time) is baked into access networks and, more recently, into storage (the “thin provisioning” buzzword).
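The oversubscription effect is easy to demonstrate: with bursty tenants, the peak of the aggregate load sits far below the sum of the individual peaks, so a shared pool needs much less capacity than isolated provisioning. A toy simulation – tenant counts, burst probability, and load levels are all invented for illustration:

```python
# Toy illustration of statistical multiplexing: bursty tenants rarely
# peak simultaneously, so shared capacity << sum of individual peaks.
# All parameters are assumed, purely illustrative.
import random

random.seed(42)
TENANTS, HOURS, PEAK = 50, 1000, 100

# Each tenant idles at 1 unit and bursts to PEAK about 5% of the time.
loads = [[PEAK if random.random() < 0.05 else 1 for _ in range(HOURS)]
         for _ in range(TENANTS)]

sum_of_peaks = sum(max(l) for l in loads)      # isolated provisioning
aggregate_peak = max(sum(l[h] for l in loads)  # shared-pool provisioning
                     for h in range(HOURS))

print(f"sum of individual peaks: {sum_of_peaks}")
print(f"peak of aggregate load:  {aggregate_peak}")
```

That gap between the two numbers is the capacity a provider saves by pooling many uncorrelated workloads – the same idea behind oversubscribed access networks and thin provisioning.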

    Data transfer bottleneck: The first three instances of grid/cloud computing that I heard about as spectacular successes all involved very significant amounts of data. They are: a) SETI (not a commercial cloud, but still), which ingests significant amounts of data from all sorts of telescopes; b) a transcoding application that took a massive media library from one of the mainstream owners and transcoded it into several web-friendly formats on EC2 as a one-time effort; c) actually more an S3 application (mostly storage), where SmugMug uses AWS to safely house its paid content.

    So they do have a very valid point there as well.

  3. You are missing the significance of some key points:

    The 5x to 7x advantage is due to economies of scale that are very difficult to achieve privately.

    Coupled with this scale is the availability of a very wide sampling of workloads, which is difficult to get in a private setting – leading to better statistical multiplexing, leading to better utilization, leading to a better payment model, leading to better ROI. So, in a sense, the payment model is truly the only differentiating feature that makes cloud computing unique – not any technology component or platform.

  4. I have to disagree with your analysis of the definition of cloud computing, specifically internal vs. external clouds. The primary benefit of cloud computing is the cost efficiency derived from the economies of scale that large cloud providers can offer. Simply said, internal clouds cannot offer this benefit. Yes, there are other benefits, but the customers I talk to (actual customers, not just academics discussing the cloud) are considering the cloud to save $$$. And yes, the definition excludes some from the cloud computing club, but such a definition is needed to clear the clutter and help customers distinguish between a real cloud provider and one that has just re-branded.

  5. Pingback: blog.dsa-research.org » Archives » Haizea and Private Clouds

  6. Pingback: A Management Consultant’s View of Cloud Computing or Why McKinsey shouldn’t leave it’s day job ! « My missives

  7. I agree with a lot of what you say. In my opinion it is not necessary to make a distinction between public and private clouds. This is particularly true for large enterprises. All of us have a large number of computing resources sitting idle somewhere because they are not in use. A cloud computing paradigm could significantly reduce IT costs.

    Ravi

  8. Well, cloud computing has many benefits. The organizing principles underlying today’s datacenters have actually outlived their utility, and a new paradigm is emerging.

    Not to forget that there are specific pain points within the underlying IT infrastructure for which we often cannot devote time to formulating long-term solutions. The challenges that current data centers face include:

    — Ballooning labor costs
    — Sky-high energy consumption
    — Growing demands from users
    — Chaotic data silos
    — Exponential growth in data volume

    The hidden cost of responding to these pain points is business innovation, and this is where the role of cloud computing becomes important.

  9. Well, in my point of view, the authors of the paper are correct on the point of private and public clouds. The public cloud represents the true spirit of cloud computing, while private clouds behave like normal data centers. If you say there is no difference between private and public clouds, then why introduce the new terminology of a cloud at all? A private cloud and a normal data center are the same thing.

  10. I would agree to some extent with the above analysis; in particular, the term “statistical multiplexing” is rather confusing. The other assumptions they made are acceptable to a large extent, and they have really apprehended some important merits and demerits of cloud computing.

    If anybody has a real understanding of statistical multiplexing in terms of cloud computing, please write to me at akobakog@gmail.com

    Thanks
    Cheelem Dala Haramey Gee

Leave a comment