Wednesday, February 25, 2009

Windows Azure and Cloud Computing Posts for 2/23/2009+

Windows Azure, Azure Data Services, SQL Data Services and related cloud computing topics now appear in this weekly series.

Note: This post is updated daily or more frequently, depending on the availability of new articles.

• Updated 2/24/2009 9:00 AM PST: SDS to receive more SQL Server features
• Updated 2/25/2009 5:00 PM PST: Additions

Azure Blob, Table and Queue Services

•• Jaikumar Vijayan reports the World Privacy Forum’s claim that cloud-based services may pose risks to data privacy in this 2/25/2009 Computerworld post. Jaikumar writes:

The report runs counter to comments made last week at an IDC cloud computing forum, where speakers described concerns about data security in cloud environments as overblown and "emotional." But the World Privacy Forum contends that while cloud-based application services offer benefits to companies, they also raise several issues that could pose significant risks to data privacy and confidentiality.

•• Andrew Brodie demonstrates programming Azure Queue Services with IronPython in his Get in the Queue, your locks don't work here post of 2/26/2009. His earlier Azure Table Storage in IronPython post of 2/20/2009 did the same for Table Services, and his Import AntiGravity and I'll see you on Cloud Azure post of 2/21/2009 demos creating tables with IronPython.
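Behind demos like Brodie's, the Azure Queue service is just a REST API, so any Python flavor can drive it. Here's a minimal sketch (plain Python; the account and queue names are hypothetical) of building the PutMessage URL and XML body; the SharedKey Authorization header a real request also needs is omitted:

```python
import base64

# Hypothetical storage account and queue names, for illustration only.
ACCOUNT = "myaccount"
QUEUE = "orders"

def put_message_request(text):
    """Build the URL and XML body for the Azure Queue PutMessage
    REST operation. The message text is base64-encoded, as the
    client libraries do, so arbitrary content survives the XML
    envelope."""
    url = "http://%s.queue.core.windows.net/%s/messages" % (ACCOUNT, QUEUE)
    encoded = base64.b64encode(text.encode("utf-8")).decode("ascii")
    body = "<QueueMessage><MessageText>%s</MessageText></QueueMessage>" % encoded
    return url, body

url, body = put_message_request("hello, cloud")
# A real request would POST this body to the URL with a SharedKey
# Authorization header, which is beyond the scope of this sketch.
```

The same URL-plus-XML shape underlies the IronPython versions; only the HTTP plumbing differs.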

Brian H. Prince’s Azure Tables are for Squares in the Cloud post of 2/20/2009 provides an overview of Azure’s Table data storage (without comparing it to SDS).

SQL Data Services (SDS)

• Gavin Clarke reports 'Full' SQL Server planned for Microsoft's Azure cloud in a 2/23/2009 midnight (GMT) post to The Register. Clarke writes:

The company told The Reg it's working to add as many features as possible from SQL Server to its fledgling Azure Services Platform cloud as quickly as possible, following feedback.

General manager [of] developer and platform evangelism Mark Hindsbro said Microsoft hoped to complete this work with the first release of Azure, currently available as a Community Technology Preview (CTP). But he added that some features might be rolled into subsequent updates to Azure. Microsoft has not yet given a date for the first version of Azure, which was released as a CTP last October.

"We are still getting feedback from ISVs for specific development scenarios they want. Based on feedback we will prioritize features and get that out first," he said.

"The aim is to get that in the same ship cycle of the overall Azure platform but it might be that some of it lags a little big and comes short there after."

For more background and commentary on the subject, see my A Mid-Course Correction for SQL Data Services post of 2/24/2009.

.NET Services: Access Control, Service Bus and Workflow

No significant new posts as of 2/23/2009 2:30 PM PST 

Live Windows Azure Apps, Tools and Test Harnesses

Robin Wauters describes Microsoft Research’s Social Desktop, which “leverages the Windows OS and Windows Azure” to integrate the Web with users’ desktops, in a Microsoft Research: A Look At The Intriguing Social Desktop Prototype post of 2/23/2009 to TechCrunch. Robin writes:

Unfortunately, a Microsoft spokesperson told NetworkWorld that Social Desktop at this point is merely a research prototype which will not be a feature in Windows 7, nor will it be available for public use.

Steve Marx’s live Simple Word Search Puzzle Generator Azure project generates puzzles from words you supply in a multi-line text box.

Commenced 2/23/2009; ends 2000 hours later? Are Azure evangelists exempt from quotas?

Azure Infrastructure

•• Orleans Software Platform appears to be the codename for the Azure Fabric Controller and related Azure software, according to Dan Reed, head of Microsoft Research’s new Cloud Computing Futures (CCF) group. For more information, see my Microsoft Research Announces Cloud Computing Futures Group, Orleans Software for Azure at TechFest 2009 of 2/25/2009.

•• My Ballmer Says Azure Will Release at PDC 2009 post of 2/25/2009 quotes Elizabeth Montalbano’s Ballmer: Azure ready for release by end of year InfoWorld post of 2/24/2009.

•• Mary Jo Foley delivers her promised interview with Dave Cutler in “Red Dog: Five questions with Microsoft mystery man Dave Cutler” of 2/25/2009. Cutler answers his own sixth question:

One of the things you did not ask is why aren’t we saying more about Azure and in the process filling the marketplace with sterling promises for the future. The answer to this is simply that the RD group is very conservative and we are not anywhere close to being done. We believe that cloud computing will be very important to Microsoft’s future and we certainly don’t want to do anything that would compromise the future of the product. We are hypersensitive about losing people’s data. We are hypersensitive about the OS or hypervisor crashing and having properties experience service outages. So we are taking each step slowly and attempting to have features 100% operational and solidly debugged before talking about them. The opposite is what Microsoft has been criticized for in the past and the RD dogs hopefully have learned a new trick.

I don’t believe the skunk works approach will work for Azure, as I note in my Dave Cutler Rationalizes Azure “Skunk Works” post of 2/25/2009.

• Mary Jo Foley continues her series on the Azure Services Platform with part II How the Red Dog dream team built a cloud OS from scratch of 2/24/2009, which describes how Corporate Vice President Amitabh Srivastava built the Azure team. According to Mary Jo:

“We said, let’s not try to copy Google or Amazon,” Srivastava recalled. “We said we’d run things very differently.” …

The team decided to build a layer — with pieces akin to what is inside a modern-day operating system — to manage the thousands of Windows Server machines. A “fabric controller” would manage the cloud; a storage subsystem would act like a traditional “file system” for all of the servers; a virtualization layer, derived from Microsoft’s Hyper-V hypervisor, would be at the lowest level between the servers and the rest of the datacenter “operating system.”

Mary Jo’s topic for tomorrow will be “What’s Dave Cutler been up to?” … a Q&A with the father of Windows NT on his role in the Red Dog team.

Elise Ackerman’s Ballmer's big bet article of 2/23/2009 in the San Jose Mercury News analyzes Steve Ballmer’s enthusiasm for cloud computing with Azure:

But Ballmer, whose never-say-quit style has been compared to Gen. George S. Patton's, has a plan, and many say it is an audacious one. He has doubled down on the Internet, investing billions in data centers and in a new operating system called Azure that will enable Microsoft to move all its existing software to the Internet and sell it through subscriptions or pay for it with advertising. [Emphasis added.]

I’m not so sure about that last sentence, Elise.

Joe McKendrick writes in his Commentator ‘depressed’ that cloud sounds just like SOA ZDNet post of 2/23/2009:

Stacey Higginbotham, a commentator for GigaOm, recently remarked that a recent HP tutorial on cloud computing was “depressingly similar to the idea of service oriented architecture,” noting that “HP offered clouds as merely a means to deliver IT as a service inside the enterprise.”

Joe argues “that SOA is the private cloud, and the public cloud is one massive SOA. Pretty exciting stuff if you ask me.” He concludes:

Plus, there are still reasons for enterprises to be nervous about cloud computing. There are issues with a lack of portability and vendor lock-in — “yet another reason that enterprises may want to keep their data out of the clouds for a bit longer.”

Joe also quotes Phil Horvitz, CTO of Apptis in his SOA will accelerate cloud computing — here’s why post of 2/23/2009:

“SOA is not required to take advantage of the benefits of cloud computing, but well-designed architectures will require less work for them to be cloudified.  From this perspective I see SOA as an enabling technology that will accelerate migration to the cloud. Going forward, I think the dream for enterprise architects is to be able to design systems using loosely coupled SOA services on one or more clouds and to pull them into a single, unified, homogeneous application without regard or concern for whose cloud they run on.”

Dominic Green blogs on 2/23/2009 that this animated Cloud Computing Explained… video “… gives a nice simplified explanation of cloud computing in ‘plain English’. Covering the basics of scalability, XaaS, and Utility computing.”

I agree wholeheartedly with his characterization.

Lori MacVittie takes on the growing interest in cloud computing standards for interoperability in her Approaching cloud standards with end-user focus only is full of fail post of 2/23/2009 to the F5 blogs. Lori leads off with:

If you’re looking at standardization and interoperability efforts only as they relate to providers or end-users then you’re not thinking long term nor are you really considering the potential of cloud computing and virtualization to revolutionize data center architectures. In a nutshell, if you equate “cloud” with “providers like Amazon and Google” then you don’t really get the big picture

Gary E. Smith posted this press release, Citrix Extends 20-Year Partnership with Microsoft into Server Virtualization with “Project Encore”, which announced Citrix Essentials™  for Microsoft® Hyper-V™, on 2/23/2009. Starting at $1,500 per physical server, Citrix Essentials for Hyper-V provides:

  • Advanced Storage Integration using Citrix® StorageLink™ technology makes it easy for Hyper-V and Microsoft System Center customers to fully leverage all the native power of their existing array-based storage systems.
  • Dynamic Provisioning Services lets customers centrally manage a common set of master images, which can be streamed on-demand into Hyper-V virtual machines or physical servers.
  • Hypervisor Interoperability makes it easy for customers to manage virtual machines across heterogeneous Hyper-V and XenServer™ environments.
  • Automated Lab Management enables Hyper-V customers to develop, test, and deliver applications faster by automating and simplifying the entire virtual machine lifecycle, including movement across virtualization platforms.

How this will affect Azure, and especially cloning Azure to create private clouds, isn’t clear yet.

James Niccolai gives his take on the Citrix announcement in his Microsoft, Citrix join forces against VMware post of 2/23/2009 for IT World. Keith Ward does the same for Redmond Developer News with his Citrix, Microsoft Take Aim at Enterprises with Essentials post of the same date.

This story was overshadowed by news that Citrix is launching a free version of XenServer, as announced in Larry Dignan’s It’s official: Citrix aims to blow up enterprise virtualization pricing and Citrix to offer free XenServer; Takes shot at VMware posts of 2/23/2009.

Scott Lowe’s It’s Official: XenServer Available for Free post of the same date offers additional technical details. Forrester Research analyst James Staten offers his opinions in Red Hat and Citrix ratchet up open source virtualization relevancy.

Mary Jo Foley’s Red Dog: Can you teach old Windows hounds new tricks? (Part 1: It’s not just about Windows any more) post of 2/23/2009 answers these questions she poses:

What led Microsoft — which has spent a good part of the past decade-plus protecting the Windows franchise at the expense of the Web — to finally create an infrastructure that would support not just Windows developers, but also Web programmers?

And how did a company known for its slipping dates more than making its shipping dates manage to build a cloud-computing platform that developers could begin test-driving in less than two years?

She promises, “Over the course of this week, I’m going to be publishing a post a day about Red Dog.”

Cloud Computing Events

Whose Cloud Is It Anyway? is TechCrunch’s roundtable on cloud computing scheduled for Friday, February 27, 2:30 - 6:30 pm at Microsoft’s Mountain View Conference Center, 1065 La Avenida St, Building 1, Galileo Auditorium, Mountain View, CA 94043. The schedule:

  • 2:30 - 3:15 Product Pitches: Three product pitches from early-stage enterprise focused start-ups. Feedback from panel of experts.
  • 3:30 - 5:00 Roundtable Discussion: The cloud is many things to many people. It is a data center in the sky, a platform for a new breed of enterprise apps, a way to bring Web-scale computing to businesses small and large.
  • 5:00 - 6:30 MeetUp Reception and Demo Tables

The all-star roundtable (expanded to include Amazon’s chief technology officer Werner Vogels, Ning CEO Gina Bianchini, Facebook VP of Engineering Mike Schroepfer, and Jon Engates, CTO of Rackspace, according to Erick Schonfeld’s Amazon, Ning, Facebook, And Rackspace Join Our Cloud Roundtable post of 2/19/2009):

Roundtable Moderators:

  • Erick Schonfeld, co-editor TechCrunch
  • Steve Gillmor, editor TechCrunchIT

Get tickets here via Eventbrite: $75 each based on availability.

Dmitri Sotnikov will present an Active Directory, User Identity and Azure/BPOS for IT Professionals session at The Experts Conference (TEC) for Directory & Identity to be held at the Green Valley Ranch Resort and Casino in Henderson, NV on March 22 – 25, 2009. Here’s his abstract:

In this session we will dive into identity management, federation and sign-on process for Windows Azure and Microsoft’s BPOS products such as Exchange Online and SharePoint Online. How do you set up federation between your existing Active Directory and these “cloud” applications? Which options do you have? How does authentication actually happen? How much of the infrastructure and management effort can be shared across these applications and how much is application-specific?

BPOS = Business Productivity Online Suite; Henderson is a suburb of Las Vegas.

Danny Kim’s Implementing an Identity-Based Solution using Microsoft's Cloud Based Infrastructure is also on the TEC-D&I agenda:

Microsoft’s new Cloud based infrastructure provides the building blocks for hosting scalable and highly available services for Corporations, ISVs and developers to leverage all of the hardware and software of a global datacenter. This session will cover the main components of building an application for the Cloud along with additional complementary building block services Microsoft will have released such as Identity and Access services, BizTalk.net services (Internet Service Bus), Workflow Services, Database Services, etc. To tie the pieces together, the session will cover the mechanics of building a live service running on Microsoft's Cloud infrastructure.

TEC-D&I is the new name of the former Directory Experts Conference.

Other Cloud Computing Platforms and Services

Aaron Ricadela’s VMware Raises the Cloud-Computing Ante article for Business Week’s Technology section carries a “VMware joins Microsoft, Google, and Amazon in the race to help build the world's next generation of software” deck. Aaron writes:

On Feb. 24, VMware (VMW) released key pieces of an ambitious new product that's designed to help companies more efficiently juggle complex computing tasks. Dubbed the Virtual Data Center Operating System (VDC-OS), the software creates a bank of computers, storage devices, and networking equipment that a company can tap at will, as computing needs arise—say, during a December spike in Web traffic for an online retailer.

The software, due later this year, reflects VMware's push into so-called cloud computing, which lets a business rely on an outside provider for storage, data processing, and other computing tasks. The idea is that a company can reduce expenses and save time by turning costly computing over to a better-equipped provider. By making the leap, VMware becomes the latest tech company, along with Microsoft (MSFT), Google (GOOG), and Amazon.com (AMZN), that wants to supply the tools for building the world's next generation of software.

The UC Berkeley Above the Clouds team’s “IBM Software Available Pay-As-You-Go on EC2” post of 2/23/2009 casts a vote for pay-as-you-go software licensing for IBM’s DB2 database on Amazon EC2. They write:

At Berkeley we believe that at least in the short term, one of the biggest advantages of cloud computing is ease of experimentation. Before today, one could use a cloud service like EC2 to test out multiple operating systems, machine images with pre-configured open-source software stacks, and large-scale experiments (will my software scale to 100 nodes?). The availability of software like DB2, WebSphere sMash, etc from IBM means even more prototyping and experimentation is possible without negotiating long-term contracts or having to go through a complicated setup process. This potential for prototyping and experimentation helps both software users and commercial software vendors.

The same can be said for Microsoft’s SQL Server that’s available per-hour on EC2. Hopefully, Amazon Web Services will move from Windows Server 2003 R2 and SQL Server 2005 to the current versions.

John Foley describes How To Get Started With Storage-As-A-Service in this 2/23/2009 post that uses Nirvanix as an example:

Pay-as-you-go online storage services are a flexible way to deal with exploding data volume. For 25 cents per month, you can rent a gigabyte of storage from Nirvanix. But is that any way to buy enterprise storage?

Foley seems to think so:

As companies get more serious about storage as a service, Nirvanix ratchets up the level of engagement. When I talked to [Nirvanix VP of sales Joe] Lyons the other day, he was getting ready to go into a three-hour meeting with a customer that was in the market for 200-plus TBs of storage. Those in-person meetings cover use cases, data migration strategies, cost, technical nitty-gritty, service levels, and so on.
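The back-of-the-envelope arithmetic for a deal that size is straightforward. At the quoted list rate of 25 cents per gigabyte per month, and ignoring the volume discounts a 200 TB contract would surely carry, the monthly bill works out like this:

```python
# List-price estimate for the 200 TB deal mentioned above.
# Assumes the quoted $0.25/GB-month rate with no volume discount,
# and 1 TB = 1,024 GB.
RATE_PER_GB_MONTH = 0.25
terabytes = 200
gigabytes = terabytes * 1024

monthly_cost = gigabytes * RATE_PER_GB_MONTH
print(monthly_cost)  # 51200.0 dollars per month at list price
```

At roughly $51,200 a month before discounts, it's easy to see why such sales involve three-hour meetings rather than a credit-card form.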

Terremark Worldwide’s Government Selects Terremark's Enterprise Cloud(tm) to Power USA.gov press release of 2/23/2009 describes how:

The GSA will leverage the Enterprise Cloud's reliability and agility to dynamically provision highly available cloud computing resources to handle any spikes in online traffic. The on-demand nature of The Enterprise Cloud allowed Terremark to provide the GSA complete access to secure cloud-based computing resources within minutes instead of weeks. Terremark's solution will also supply GSA with industry-leading physical and logical security and robust connectivity to some of the world's leading carrier networks.

Krishnan Subramanian delivers his favorable analysis of the Terremark press release in his USA.Gov Moving To Clouds post of 2/23/2009.

Krishnan describes how enterprises can minimize the risks of a Platform as a Service (PaaS) vendor going out of business, using Coghead’s demise as an example, in PaaS, Trusting Beyond Its Initial Hype of 2/23/2009.

Microsoft Research Announces Cloud Computing Futures Group, Orleans Software for Azure at TechFest 2009

Rick Rashid, senior vice president of Microsoft Research, announced on 2/24/2009 a new research organization called Cloud Computing Futures (CCF). According to Rob Knies, CCF will be:

[F]ocused on reducing the operational costs of data centers and increasing their adaptability and resilience to failure. The group, led by Dan Reed, director of Scalable and Multicore Systems, will strive to lower hardware costs, power consumption, and the environmental impact of such facilities.

The Peering into Future of Cloud Computing post continues with a Q&A session about CCF with Reed. Among the group’s immediate goals is:

Our goal is to reduce data-center costs by fourfold or greater while accelerating deployment and increasing adaptability and resilience to failures, transferring ideas into products and practice. To date, we have focused our attention on four areas, though our agenda spans next-generation storage devices and memories, new processors and processor architectures, system packaging, and software tools. …

We have been working with researchers from Microsoft Research on several approaches to data-center networking. The most mature of these is Monsoon, which uses much of the existing networking hardware but replaces the software with a new set of communications protocols far better suited for a data center. This work will not only lead to more efficient networks, but by relaxing the constraints of existing networks, it also will open new possibilities to simplify data-center software and to build more robust platforms.

Reed adds the following about the Orleans software platform:

  • Orleans software platform: The software that runs in the data center is a complicated, distributed system. It must handle a vast number of requests from across the globe, and the computers on which the software runs fail regularly—but the service itself should not fail, even though the software is continually changing as the service evolves and new features are added. Orleans is a new software platform that runs on Microsoft’s Windows® Azure™ system and provides the abstractions, programming languages, and tools that make it easier to build cloud services.

This is the first reference to “Orleans” with Azure that I’ve found and “Orleans software platform” gets only one Google hit this morning. I assume it encompasses the Azure Fabric Controller, but its “programming languages” remain a mystery.

The theme of TechFest 2009, being held at the Redmond campus, is Technology “Signposts” Point to Future of Computing.

Ballmer Says Azure Will Release at PDC 2009

According to Elizabeth Montalbano’s Ballmer: Azure ready for release by end of year InfoWorld post of 2/24/2009:

Microsoft plans to release its Windows Azure cloud computing platform before the end of the year, Microsoft CEO Steve Ballmer said Tuesday [at the company’s Winter Wall Street Analysts briefing].

In comments made to members of the financial community, Ballmer said Microsoft will have "the ability to go to market" with Azure by the end of this year at its Professional Developers Conference (PDC) in November.

"[Azure] will reach fruition with the PDC this year," he said. Ballmer spoke to Wall Street analysts Tuesday to give them an update on Microsoft's financial status and what they can expect from the company for the remainder of the fiscal and calendar year. Microsoft's fiscal year ends on June 30.

This represents a significant delivery slip for SQL Data Services (SDS), which was slated to release in 2009H1 when introduced as SQL Server Data Services (SSDS) at MIX08. My SQL Server Data Services to Deliver Entities from the Cloud post of 3/7/2008 provides the low-down on SDS’s predecessor.

The Azure team will need to speed development substantially if promised relational features are to be included in SDS v1 (see my A Mid-Course Correction for SQL Data Services post of 2/24/2009). Most features promised by the SSDS team in mid-2008 still haven’t made it into the current SDS CTP.

Details will be added when a transcript is available.

Dave Cutler Rationalizes Azure “Skunk Works”

Dave Cutler answers his own sixth question in Mary Jo Foley’s “Red Dog: Five questions with Microsoft mystery man Dave Cutler” of 2/25/2009. Cutler writes:

One of the things you did not ask is why aren’t we saying more about Azure and in the process filling the marketplace with sterling promises for the future. The answer to this is simply that the RD group is very conservative and we are not anywhere close to being done. We believe that cloud computing will be very important to Microsoft’s future and we certainly don’t want to do anything that would compromise the future of the product. We are hypersensitive about losing people’s data. We are hypersensitive about the OS or hypervisor crashing and having properties experience service outages. So we are taking each step slowly and attempting to have features 100% operational and solidly debugged before talking about them. The opposite is what Microsoft has been criticized for in the past and the RD dogs hopefully have learned a new trick.

I don’t believe the skunk works approach is appropriate for Azure. Developers need a full description and timeline for required features. Failure to deliver promised features in a timely manner is what led to A Mid-Course Correction for SQL Data Services.

IDC Increases Year-Over-Year Cloud Growth Estimate for 2009 from 36% to 40.5%

According to a 1/26/2009 press release, the November 2008 IDC study, Economic Crisis Response: Worldwide Software as a Service Forecast Update (IDC #215504) finds:

Recent IDC surveys and customer interviews support the finding that the harsh economic climate will actually accelerate the growth prospects for the software as a service (SaaS) model as vendors position offerings as right-sized, zero-CAPEX alternatives to on-premise applications. Buyers will opt for easy-to-use subscription services which meter current use, not future capacity, and vendors and partners will look for new products and recurring revenue streams. As such, IDC has increased its SaaS growth projection for 2009 from 36% growth to 40.5% growth over 2008.

Additional findings from the IDC study include:

    • By the end of 2009, 76% of U.S. organizations will use at least one SaaS-delivered application for business use.
    • The percentage of U.S. firms which plan to spend at least 25% of their IT budgets on SaaS applications will increase from 23% in 2008 to nearly 45% in 2010.
    • This market's growth prospects will accelerate the shift to SaaS for the whole value chain as the promise of a recurring revenue stream, and the opportunity to tap OPEX and project-related dollars, will benefit the whole SaaS ecosystem.
    • While demand for SaaS is strongest in North America, new contracts from customers in Europe, Middle East, Africa (EMEA) and Asia/Pacific (excluding Japan) also look particularly positive, and IDC expects that by year-end 2009, nearly 35% of worldwide revenue will be earned outside of the U.S.
    • On the downside, IDC interviews with SaaS providers highlighted several issues, such as cash-flow shortfalls related to slow-paying current clients, liquidity challenges stemming from tight credit at lenders, and — on the horizon — limited resources to scale up with expanded infrastructure to support new customers and new service offerings.

The new study augments other revised IDC forecasts by offering a post-2008 financial crisis update for the worldwide SaaS market, and specifically updates Worldwide Software on Demand 2008–2012 Forecast and 2007 Vendor Shares: Moving Toward an On-Demand World (IDC #213197, July 2008).

Tuesday, February 24, 2009

A Mid-Course Correction for SQL Data Services

The Azure Services folks have decided that SQL Data Services (SDS) needs more relational attributes at the expense of the “simplicity” policy espoused by the original SQL Server Data Services (SSDS) team. First news about the change in direction came at the MSDN Developer Conference’s visit to San Francisco on 2/23/2009 in conjunction with 1105 Media’s Visual Studio Live! conference at the Hyatt Regency.

Gavin Clarke reports in 'Full' SQL Server planned for Microsoft's Azure cloud in a 2/23/2009 midnight (GMT) post to The Register:

[Microsoft] told The Reg it's working to add as many features as possible from SQL Server to its fledgling Azure Services Platform cloud as quickly as possible, following feedback.

General manager [of] developer and platform evangelism Mark Hindsbro said Microsoft hoped to complete this work with the first release of Azure, currently available as a Community Technology Preview (CTP). But he added that some features might be rolled into subsequent updates to Azure. Microsoft has not yet given a date for the first version of Azure, which was released as a CTP last October.

"We are still getting feedback from ISVs for specific development scenarios they want. Based on feedback we will prioritize features and get that out first," he said.

"The aim is to get that in the same ship cycle of the overall Azure platform but it might be that some of it lags a little big and comes short there after."

Hopefully, SDS hasn’t reached the point of no return.

Adding RDBMS Features Reverses Original Policy

Traditional relational databases don’t deliver the extreme scalability expected of cloud computing in general and Azure in particular. So SQL Server Data Services (SSDS) adopted an Entity-Attribute-Value (EAV) data model built on top of a customized version of SQL Server 2005 (not SQL Server 2008), as I reported in my Test-Drive SQL Server Data Services cover story for Visual Studio Magazine’s July 2008 issue.

SSDS architect Soumitra Sengupta posted Philosophy behind the design of SSDS and some personal thoughts to the S[S]DS Team blog on 6/26/2008. According to Soumitra, the first and foremost problems the team needed to solve were:

  1. Building a scale free, highly available consistent data service that is fault tolerant and self healing
  2. Building the service using low cost commonly available hardware
  3. Building a service that was also cheap to operate - lights out operation

The team favored simplicity at the expense of traditional relational database features, which potential users (such as me) expected.

Since we made an early decision to limit the number of hard problems we needed to solve, we decided that we would focus less on the features of the service but more on the quality of the service and the cost of standing up and running the service.  The less the service does we argued, the easier it would be for us to achieve our objectives.  In hindsight, this was probably one of the best decisions we made.  Istvan, Tudor and Nigel deserve special credit for keeping us focused on "less is better".

The result of this policy was schemaless EAV tables that offered flexible properties (property bags) in an Authority-Container-Entity (ACE) architecture that mystified .NET developers, who were then in the process of about-facing their mindsets from traditional SQL queries to .NET 3.5’s Language Integrated Query (LINQ) constructs and object/relational mapping with LINQ to SQL and the Entity Framework. SSDS offered SOAP and REST data access protocols with a very limited query syntax.

The SSDS folks believed the simplified ACE construct made it easy for developers who weren’t database experts to create data-driven applications that used SSDS instead of Amazon Web Service’s SimpleDB or the Google App Engine as a scalable data store in the cloud.
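To illustrate the property-bag idea (the names here are hypothetical, not the actual SSDS API), a container can be modeled as a collection of entities whose attributes vary entity by entity, with no fixed schema:

```python
# Minimal sketch of the Entity-Attribute-Value "property bag" model:
# a container holds entities that share no fixed schema, unlike a
# relational table. Names are illustrative, not the SSDS API.
container = []  # an SSDS "container" held heterogeneous entities

def insert_entity(entity_id, kind, **properties):
    """Store an entity as a bag of flexible name/value properties."""
    entity = {"Id": entity_id, "Kind": kind}
    entity.update(properties)
    container.append(entity)
    return entity

# Two entities with entirely different shapes coexist in one container.
insert_entity("c1", "Customer", Name="Contoso", City="Oakland")
insert_entity("o1", "Order", CustomerId="c1", Total=99.95, Rush=True)

# Queries filter on whatever attributes happen to be present.
oakland = [e for e in container if e.get("City") == "Oakland"]
```

The flexibility is real, but so is the mismatch: nothing here looks like the typed tables, joins, and mapped objects that LINQ to SQL and the Entity Framework had trained .NET developers to expect.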

Less wasn’t Better

Apparently, “less” didn’t turn out to be “better” when it comes to the .NET developers who are Azure’s target audience. Microsoft promotes the Azure Services Platform’s ability to leverage their Visual Studio 2008 expertise. VS 2008 is all about ADO.NET, object/relational mapping (O/RM), and integration with SQL Server 200x via the SqlClient classes. SSDS’s REST interface didn’t even align with the heavily promoted ADO.NET Data Services.

Gavin continues:

According to Hindsbro, partners want a full SQL Server database in the cloud. The current SQL Data Services (SDS), which became available last March, provides a lightweight and limited set of features. Prior to SDS, Microsoft's database service was called SQL Server Data Services.

"If you go there now you will find more rudimentary database experiences exposed. Not a lot of these apps would be interesting without a full database in the cloud, and that is coming," Hindsbro said.

He did not say what SQL Server features Microsoft would add to Azure, other than to say it'll include greater relational functionality.

Microsoft in a statement also did not provide specifics, but said it's "evolving SDS capabilities to provide customers with the ability to leverage a traditional RDBMS data model in a cloud-based environment. Developers will be able to use existing programming interfaces and apply existing investments in development, training, and tools to build their applications."

The pre-beta SDS restricts what users can do in a number of ways that make it hard to set up and manage and that limit its usefulness in large deployments.

Gavin’s last paragraph is an understatement, to be charitable.

Less is Azure Table Services

Azure’s early testers are mystified by SDS’s overlap with Azure’s Table Service, which has a feature set that’s almost identical to SDS today, but is aligned with ADO.NET Data Services and its LINQ to REST queries.

Microsoft’s standard answer to Azure and SDS Forum questions, such as “The confusion here is why are there two different kinds of storage. Are they different? If so why and if not what is the relation?” (from the Azure Forum’s Difference between Azure Storage and SDS Storage thread and the SDS Forum’s What Are the Advantages of SDS Over Table Storage Services with the StorageClient Helper Classes? thread), is:

"SDS will provide scalable relational database as a service (today, Joins, OrderBy, Top...are supported) and as it evolves, we plan to support other features such as aggregates, schemas, fan-out queries, and so on.  SDS just like any other database also supports blobs.  SDS is for Unstructured, Semi, and Structured data, with a roadmap of having highly available relational capabilities."

Microsoft won’t reveal pricing for Azure services, but it’s clear that SDS is positioned as a value-added feature with premium per-hour and per GB storage charges compared with prices for renting plain old tables (POTs).

Early RDBMS Feature Promises

The SDS team began promising more SQL Server features shortly after releasing the SQL Server Data Services (SSDS) invitation-only CTP on 3/5/2008 at the MIX08 conference. Primary examples were optional schemas, full-text indexing and search, blob data type, ORDER BY clauses for queries, cross-container queries, transactions, JOINs, TOP(n), simplified backup and restore, and alignment of the REST API with ADO.NET Data Services.

The team delivered blobs, pseudo-JOINs, ORDER BY, and Take (but not Skip) by PDC 2008 (late October 2008), when the Azure invitation-only CTP was released. My SQL Data Services (SDS) Update from the S[S]DS Team post of 10/27/2008 describes the new features in Sprint #5.

The SDS team will need to deliver all the promised features, and perhaps a few more, to justify a significant increase in service charges over those for Azure tables.

Competition from Amazon

In the meantime, Amazon Web Services (AWS) announced on 10/1/2008 that Amazon EC2 “will offer you the ability to run Microsoft Windows Server or Microsoft SQL Server … later this Fall.” My Amazon Adds SQL Server to Oracle and MySQL as EC2 Relational Database Management Systems post of 10/1/2008 has more details. Amazon announced support for IBM DB2 and Informix Dynamic Server in this IBM and AWS page on 2/11/2009.

EC2 currently supports Windows Server 2003 R2 and SQL Server 2005 Express and Standard editions. There’s no surcharge for the Express edition and the surcharges for the three instance types that offer SQL Server Standard edition are:

Instance Type          Surcharge/Hour   Surcharge/Year
Standard Large         US$ 0.60         US$  5,256
Standard Extra Large   US$ 1.20         US$ 10,512
High CPU Extra Large   US$ 1.20         US$ 10,512

Note that SQL Server Standard isn’t available for the Standard Small instance type, which costs US$ 0.375 per hour less than Standard Large. If you need SQL Server Standard but don’t need Standard Large’s added capacity, renting the larger instance anyway adds US$ 3,285 to the yearly cost.
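The yearly figures are just the hourly surcharge times 8,760 hours, which a few lines of Python can confirm, including the US$ 3,285 penalty for being forced onto Standard Large:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def yearly(hourly_surcharge):
    """Yearly SQL Server Standard surcharge for a given hourly rate."""
    return round(hourly_surcharge * HOURS_PER_YEAR)

standard_large = yearly(0.60)
extra_large = yearly(1.20)        # both Extra Large instance types
# Standard Small's base rate is US$ 0.375/hour below Standard Large's, so
# renting Large just to get SQL Server Standard adds this much per year:
forced_capacity = yearly(0.375)
```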

You probably can’t beat AWS’s SimpleDB for low-cost usage and storage charges. Amazon now offers a simplified SQL subset for querying SimpleDB EAV tables.

Soumitra’s SQL Server Data Services (SSDS) is simple, but it is not SimpleDB post of 3/7/2008 claims that SSDS isn’t a SimpleDB-compete and concludes:

Underneath the hood, the service is running on SQL Server.  So the rich capabilities of our server software is all there.  We have chosen to expose a very simple slice of it for now.  As Nigel explained, we will be refreshing the service quite frequently as we understand our user scenarios better.  So you can expect to see more capabilities of the Data Platform to start showing up in our service over time.  What we announced here is just a starting point, our destination remains the extension of our Data Platform to the cloud.  I know you are asking "I need more details and a timeline".  As we on-board beta customers and get their feedback, we will be able to give you more details.

Whether or not SSDS is a SimpleDB-compete, I’m sure that the SDS Team would like to offer their product at a surcharge that’s competitive with Amazon’s for SQL Server Standard.

Silence Isn’t Golden

In the first few months of SSDS’s existence, the team posted frequently to the S[S]DS Team Blog, but went silent after PDC 2008. I mentioned the lack of communication in my The SQL Data Services Team’s Recent Silence Isn’t Golden post of 1/3/2009.

Jeff Currier replied in a comment:

We've been a bit more silent than usual because the features we've been focusing on have been more of a[n] operational nature (and therefore not customer facing). This should explain the recent silence (along with the holidays).

That might be an explanation, but it isn’t a very satisfactory one.

Dave Robinson posted SQL Data Services – What’s with the silence? today, presumably in response to The Register’s article:

Just wanted to drop a quick note. People are starting to question what’s going on in the SDS world and why we have been so silent. Well, to be honest, we have been so silent because the entire team has been heads down adding some new exciting features that customers have been demanding. Last year at Mix we told the world about SDS. This time around we will be unveiling some new features that are going to knock your socks off. So, that’s it for now. Just wanted to let everyone [know] the team is alive and well and super excited for the road ahead. We are 3 weeks away from Mix so hang on just a little bit longer. Trust me, it’s worth it.

I’d like to know why it’s “worth it” to wait for MIX09 to find out what’s in store for SDS, and when we can finally expect it.

What new features are going to knock my socks off?

Saturday, February 21, 2009

Is C# the Only Supported Language for Azure Roles?

The “About the Azure Services Platform and Windows Azure” topic of the current Windows Azure SDK states in its “Compute Service” topic:

A service may be comprised of one or both types of roles. C# is currently the supported language for writing managed code to be deployed as roles. [Emphasis added.]

This document is now in its second edition, at least.

Dave Chappell writes on page 14 of his “Introducing the Azure Services Platform” whitepaper:

Developers are free to use any .NET language (although it’s fair to say that Microsoft’s initial focus for Windows Azure has been on C#). [Emphasis added.]

Despite Dave’s demurrer, the way I read the official SDK tea leaves is: C# is currently the only supported language for writing managed code to be deployed as roles.

Surprisingly, I haven’t heard an outcry about this denigration of Visual Basic from well-known and respected VB proponents, such as Beth Massi, Bill McCarthy, or Julie Lerman.

As a long-time (since VB 1.0’s Professional Extensions, a.k.a. Rawhide) VB user, I object!

Security Issues Receive Focus at IDC’s Cloud Computing Forum

The IDC Cloud Computing Forum took place on Wednesday, 2/18/2009 at the Stanford Court hotel atop San Francisco’s Nob Hill. Due to a schedule conflict, I wasn’t able to attend, but several reporters had filed their stories by the Saturday morning deadline for this post.

Sessions included:

    • Achieving High Performance with Cloud Computing in Uncertain Times
    • Building the Business Case for Cloud Computing
    • Building Your Own Internal Enterprise Cloud
    • Managing the Complexities of Security, Privacy and Compliance in the Cloud
    • New IT Models for Growth and Innovation
    • Next Generation of Server and Storage Virtualization
    • Powering the Cloud: Addressing Electricity Use and Efficiency for The Next Generation Data Center

but Security, Privacy and Compliance topics got most of the column-inches.

Cloud Computing Brings Challenges, Opportunities by Mark Long of CIO Today concentrated on Joseph Tobolski’s and Marie Wieck’s keynotes:

"For organizations eager to delay, reduce or eliminate capital spending, the pay-as-you go cloud computing model is proving to be attractive," says Joseph Tobolski, the director of cloud computing at Accenture. But as is the case with other earlier technological advances, "Cloud computing brings major challenges as well as big opportunities," noted Tobolski.

"As with many popular new technology trends, there are probably as many definitions out there as there are different analysts and vendors," said Marie Wieck, vice president of middleware services at IBM. "In IBM's view it's really a fundamental extension of the Internet computing model, and it is a platform that provides the ability for companies to access services and resources much more quickly."

IDC sees a number of opportunities for cloud computing to catch on in the government, health care and manufacturing industries, which typically invest heavily in IT systems and infrastructure and are currently looking for ways to achieve cost savings. "The nature of cloud computing also makes it suitable for the budget- and resource-constrained SMEs, where not requiring in-house teams to deploy and manage the solutions is a huge advantage," the firm's analysts said.

How to Prepare for the Cloud by Richard Adhikari of InternetNews.com quotes IDC chief analyst Frank Gens and Tobolski:

"If you have that service-oriented delivery model [that] good CIOs have been working on for the last decade, add scalability, self-service provisioning and pay-per-use options, you have the private enterprise cloud," Gens said. "You can then easily pull in some of the services you need from external cloud service providers."

[T]he problem of security is not as big an issue as many fear, both speakers said. "You have to ask yourself, what are you comparing, say, Amazon's security to," Gens said. "Your own security? If you're a mid- to low-level enterprise, Amazon's probably going to beat you."

However, enterprises must ensure their internal systems are efficient before going to the cloud, warned Joseph Tobolski … . "If you're inefficient on the inside and add cloud stuff, you're going to get into a lot of trouble."

Cloud security fears are overblown, some say by James Niccolai of IDG News Service writes:

It may sound like heresy to say it, but it's possible to worry a little too much about security in cloud computing environments, speakers at IDC's Cloud Computing Forum said on Wednesday.

Security is the No. 1 concern cited by IT managers when they think about cloud deployments, followed by performance, availability, and the ability to integrate cloud services with in-house IT, according to IDC's research.

"I think a lot of security objections to the cloud are emotional in nature, it's reflexive," said Joseph Tobolski … . "Some people create a list of requirements for security in the cloud that they don't even have for their own data center."

Security, Privacy And Compliance In The Cloud by InformationWeek Analytics’ Roger Smith leads with:

One of the more interesting panel discussions at the IDC Cloud Computing Forum on Feb 18th in San Francisco was about managing the complexities of security, privacy and compliance in the Cloud. The simple answer according to panelists Carolyn Lawson, CIO of California Public Utilities Commission, and Michael Mucha, CISO of Stanford Hospital and Clinics is "it ain’t easy!"

"Both of us, in government and in health, are on the front-lines," Lawson proclaimed. "Article 1 of the California Constitution guarantees an individual’s right to privacy and if I violate that I’ve violated a public trust. That’s a level of responsibility that most computer security people don’t have to face. If I violate that trust I can end up in jail or hauled before the legislature," she said. "Of course, these days with the turmoil in the legislature," she joked, "the former may be preferable to the latter."

Stanford’s Mucha said that his security infrastructure was built on a two-tiered approach using identity management and enterprise access control. Mucha said that the movement to computerize health records nationwide was moving along in fits and starts, as shown by proposed systems like Microsoft’s HealthVault and Google’s Personal Health Record. "The key problem is who is going to pay for the computerization of health records. It’s not as much of a problem at Stanford as it is at a lot of smaller hospitals, but it’s still a huge problem."

AMD: Considering the chips that comprise the cloud by Jacqueline Emigh starts:

In all the talk lately about "the cloud," the topic of computer processors doesn't always happen to float by. But it should, according to AMD's Margaret Lewis.

AMD's Lewis focused much of her remarks on how AMD's hardware virtualization works hand-in-hand with software virtualization technologies such as hypervisors to support cost effective cloud computing. Cost performance benefits can accrue across areas that run the gamut from infrastructure to energy use, she said. Cloud infrastructures perform better when cloud resources are kept as close as possible to users, for reduced network overhead, she elaborated later, in an interview with Betanews.

Quad-core AMD Opteron chips are being used as the underlying processor behind implementations of Microsoft's Azure virtualization software, for instance. "We [have] lots of other examples, too, many of them involving very advanced approaches to virtualization," she told Betanews. [Emphasis added.]

Stay tuned for additional stories from the IDC Cloud Computing Forum as they appear on the Web.

Friday, February 20, 2009

Azure and Cloud Computing Posts for 2/16/2009+

Windows Azure, Azure Data Services, SQL Data Services and related cloud computing topics now appear in this weekly series.

Note: This post is updated daily or more frequently, depending on the availability of new articles.

• Update 2/18/2009: Several additions
Update 2/21/2009 11:30 AM PST: Several additions

Azure Blob, Table and Queue Services

•• Dan Vanderboom’s Windows Azure: Blobs and Blocks post of 2/21/2009 describes an extension to the StorageClient library to use blocks (BlobContainerRest.PutLargeBlobImpl) when creating blobs smaller than 4 MB.
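For context, block-based uploads map to the Blob service REST API’s Put Block and Put Block List operations. Here’s a hedged Python sketch of just the client-side chunking; the function name and ID scheme are my own, not Dan’s code:

```python
import base64

BLOCK_SIZE = 4 * 1024 * 1024  # stay at or below the 4 MB per-request limit

def split_into_blocks(data, block_size=BLOCK_SIZE):
    """Yield (block_id, chunk) pairs, one per PUT ...?comp=block request.

    Block IDs must be base64 strings of equal length; the blob is then
    committed with a single PUT ...?comp=blocklist listing the IDs in order."""
    for offset in range(0, len(data), block_size):
        # A fixed-width index keeps every encoded ID the same length.
        block_id = base64.b64encode(b"%08d" % (offset // block_size)).decode()
        yield block_id, data[offset:offset + block_size]

blocks = list(split_into_blocks(b"x" * (9 * 1024 * 1024)))  # 9 MB -> 3 blocks
```

Splitting a blob this way also lets a client retry a single failed block instead of re-uploading the whole blob.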

•• Joe Gregorio of the Google App Engine Team and champion of Atom and AtomPub protocols posted Back to the Future for Data Storage on 2/19/2009. Joe explains why relational databases aren’t well suited for scalability across a distributed system. He cites Michael Stonebraker’s paper, "One Size Fits All": An Idea Whose Time Has Come and Gone, which posits that there are common datastore use cases, such as Data Warehousing and Stream Processing, that are not well served by a general purpose RDBMS, and that abandoning the general purpose RDBMS can give you a performance increase of one or two orders of magnitude.

Joe concludes:

It's an exciting time, and the takeaway here isn't to abandon the relational database, which is a very mature technology that works great in its domain, but instead to be willing to look outside the RDBMS box when looking for storage solutions.

And, of course, Google’s Bigtable is “outside the RDBMS box.”

mh415 asked How to mimic the RDBMS "auto increment" feature in Azure Tables? in the Windows Azure forum and MSFT evangelist Steve Marx replied:

With the functionality in Windows Azure tables today, the only thing you can do is query and then conditional write on a "last used index" entry, which will require a minimum of two round-trips to the storage service and retry logic.  The etag tracking you get from the ADO.NET client library should protect you from your race conditions; you'll just need to handle the retries.

So the algorithm looks like this:

  1. Query "last used index" value (stored in a table)
  2. Increment value
  3. Try to conditionally write new value, repeating steps 1-3 until successful
  4. Use the new index freely, knowing no other instance will be using it

In effect, you'll have created a lock by using the conditional write to detect races.
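Steve’s four steps are plain optimistic concurrency. Here’s a minimal Python sketch that simulates the etag-guarded conditional write in memory (the MockTable class is a stand-in of my own devising, not the StorageClient API):

```python
class MockTable:
    """Stand-in for an Azure table holding a single "last used index" row.

    Every successful write bumps the etag, so a conditional write with a
    stale etag fails, much like an If-Match precondition failure."""
    def __init__(self):
        self.value, self.etag = 0, 0

    def read(self):
        return self.value, self.etag

    def conditional_write(self, new_value, expected_etag):
        if expected_etag != self.etag:   # another instance won the race
            return False
        self.value, self.etag = new_value, self.etag + 1
        return True

def next_index(table, max_retries=10):
    for _ in range(max_retries):
        current, etag = table.read()                     # step 1: query
        if table.conditional_write(current + 1, etag):   # steps 2-3: increment, write
            return current + 1                           # step 4: index is ours alone
    raise RuntimeError("gave up after repeated races")

table = MockTable()
ids = [next_index(table) for _ in range(3)]
```

Against the real service each loop iteration costs at least two round-trips, which is exactly why Steve cautions against mimicking auto-increment this way.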

Simon Davies’ Changing the Default SQL Server Instance For Windows Azure Development Storage post of 2/17/2009 shows you how to change the Azure Table storage instance from the default localhost\SQLEXPRESS instance to another default or named SQL Server instance.

Jon Udell advances Derik’s topics (see below) to Azure Services in his Using the Azure table store’s RESTful APIs from C# and IronPython post of 2/17/2009. Jon’s “general strategy” is:

    • Make a thin wrapper around the REST interface to the query service
    • Use the available query syntax to produce raw results
    • Capture the results in generic data structures
    • Refine the raw results using a dynamic language

Derik Whittaker’s Getting data from a REST service using C# of 2/15/2009 uses a helper method with HTTP GET to return data as plain old XML (POX), text string, or in JavaScript Object Notation (JSON) format.

Derik’s Posting data to a REST service using C# post of 2/14/2009 shows the code to POST the following to a generic REST service:

  1. someValue which is a string
  2. anotherValue which is a string
  3. finalValue which is an Int32
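For comparison, here’s what the same form-encoded POST body looks like when built with Python’s standard library. The endpoint URL is hypothetical and nothing is actually sent:

```python
from urllib.parse import urlencode
from urllib.request import Request

# The three values from Derik's example; finalValue is the Int32.
payload = {
    "someValue": "first string",
    "anotherValue": "second string",
    "finalValue": 42,
}
body = urlencode(payload).encode("ascii")  # someValue=first+string&...

# Hypothetical endpoint; supplying data= makes urllib issue a POST.
req = Request(
    "http://example.com/api/resource",
    data=body,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
```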

SQL Data Services (SDS)

•• The .Net and SQL Services Teams report they are Resuming New .NET Services and SQL Services Account Provisioning as of 11:12 AM PST 2/20/2009.

Was the outage in deploying Azure projects and services reported by Steve Marx in his RESOLVED: Windows Azure Outage Windows Azure forum post caused by the account provisioning hiatus?

Niraj Nagrani and Nigel Ellis are interviewed at PDC 2008 by Keith and Woody in Deep Fried Bytes’ Episode 26: Discovering Azure SQL Services podcast of 2/13/2009. The podcast probably emphasizes SDS because Niraj is a senior product manager with the Microsoft SQL Server Technical Marketing team and Nigel is SQL Data Services Development Manager.

.NET Services: Access Control, Service Bus and Workflow

•• Maureen O’Gara’s Call for Cloud Security Guidelines Heard post of 2/20/2009 observes “Chief information security officers concerned about software services’ security in the cloud” and goes on to report:

Infosecurity Europe, which granted is a show but one takes things as one finds them, says it surveyed 470 organizations and found that 75% of them intend to reallocate or increase their budgets to secure cloud computing and software as a service in the next 12 months.

It also interviewed a panel of 20 chief information security officers (CISOs - a new ‘C') of large enterprises only to learn that they are concerned about the availability and security aspects of software services in the cloud. It said they were particularly concerned about the lack of standards for working in the cloud, SaaS and secure Internet access. All of them reportedly said they would welcome the development of guidelines.

•• David Pallman offers a two-part Introduction to Live Services:

Introduction to Live Services, Part 1: How Windows Live Fits Into Azure Cloud Computing, which “clarifies the intersection of Live Services and cloud computing.”

Introduction to Live Services, Part 2: A Guided Tour of Live Services is “a guided tour of the many capabilities in Live Services that are available to you.”

•• Wictor Wilén demonstrates Custom code with SharePoint Online and Windows Azure in this 2/20/2009 post, which describes a Worker Cloud Service for processing a SharePoint list with a SharePoint workflow in response to added documents.

Girish P’s Windows Azure bringing Cloud computing to the mainstream post of 2/19/2009 promotes Windows Live Services to “bring it all together” for developers and end users.

Folks giving presentations might want to borrow Girish’s better-than-average diagrams.

Update 2/21/2009: Girish’s diagrams originated on the How Does It Work? page of Microsoft’s main Azure Services Platform page.

Live Windows Azure Apps, Tools and Test Harnesses

•• Jason Young’s Azure – Performance, IoC, and Instances post of 2/19/2009 states that he’s “disappointed by [Azure’s] current reality” when it comes to performance. Azure’s currently in the pre-Beta stage, and it’s unfair to judge the performance of even beta versions. My experience shows the Windows Azure OS to be reasonably performant; it’s Internet latency that appears to me to be the problem.

Check out my live Azure Table and Blob test harnesses, which include timing data for execution of storage operations without including time for HTTP request and response operations over my DSL connection.

•• Steve Marx suggests: Try Windows Azure Now! (Almost) No Waiting as of 2/20/2009. Steve writes that he’s:

[H]appy to report that these days you can expect to receive a token within two business days of registering. [Emphasis Steve’s.]

• Reuven Cohen’s Testing The World Wide Cloud post of 2/18/2009 describes SOASTA’s new CloudTest Global Platform. Reuven writes:

SOASTA has unveiled an ambitious plan to utilize an interconnected series of regionalized cloud providers for global load and performance testing Web applications and networks. They're calling this new service the CloudTest Global Platform, which is commercially available today, and is said to enable companies of any size to simulate Web traffic and conditions by leveraging the elasticity and power of Cloud Computing.

This probably won’t help many of us until we get a few more Azure instances.

John O’Brien and Bronwen Zande gave a Windows Azure: A developers introduction to coding in the cloud presentation to the Queensland MSDN Users Group on 2/17/2009.

The session included a demonstration of a port of John’s earlier Silverlight DeepZoom sample to Windows Azure. You can download John’s source code for the session from a link on his Qld MSDN User Group – Windows Azure talk complete post of 2/17/2009. Be sure to read John’s comment regarding edits to the configuration files and his earlier Silverlight Deep Zoom Sample Code and Silverlight Deep Zoom Sample Code Part 2 posts to understand the session code, which includes a Worker Role and a Web service.

Click here for a live Azure demonstration with images about print making from the Australian government: http://www.printsandprintmaking.gov.au/RSSFeed.ashx?mode=3&page=0&index=1&itemtypeid=3. Click the Go button at the top of the page to load the images, wait for their thumbnails to appear in the carousel at the bottom of the page, and then click one of the thumbnails to manipulate it in the upper pane.

Tanzim Saquib’s Building applications for Windows Azure of 11/7/2008 is an early (missed) tutorial that shows you how to build a simple ToDo list application with Azure.

Azure Infrastructure

•• Chris Capossela describes five issues he believes will be front and center for business customers as they prepare for the move to cloud computing in his The Enterprise, the Cloud, and 5 Key Drivers for 2009 post of 2/21/2009 on GigaOm. The five issues he details are:

  1. Value
  2. Due diligence
  3. Timing
  4. The right tools
  5. Trust

Thanks to David Linthicum for the heads-up on Twitter.

•• James Urquhart’s Infrastructure 2.0 and the New Data Center Culture article of 2/20/2009 for SNS News Service explains how and why:

The number of people and skill sets required to run computing is an increasing burden on corporate IT. …

It takes real expertise to tend to the routers and switches that form the basis of a network infrastructure, but most of that expertise is applied through highly manual processes. …

So IT relies on “clerks” to get the network job done. …

Virtualization also enables levels of automation that were previously impractical with highly customized physical infrastructure. As the virtual infrastructure has to be completely controlled through a computer program, it has not taken long for IT operations to begin to drive out the manual tasks that were once required to provision, maintain, recover, and retire computer servers in the past. …

The tactical IT administrator is about to become another excellent example of the effects of automation – thanks in large part to Infrastructure 2.0.

In other words, IT clerks will soon be asking “Do you want fries with that?”

•• Kyle Gabhart offers 10 Steps to Successful Cloud Adoption when “Adopting your very own cloud” in this 2/20/2009 post. Kyle writes:

Adoption can be a scary process.  In your fear of doing something wrong, you may be tempted to buy a big, expensive consulting package and just have someone else handle everything.  You don’t need to do that.  Simply find a subject matter expert to serve as a mentor that can guide you through the process of pragmatically evaluating and possibly even adopting Cloud.  Along the way, make sure that this mentor is educating you and your team so that you are able to function effectively once this person has left the building

•• Gregg Ness’s The Coming Cloud Computing Wars post of 2/20/2009 claims “Cloud Isn't Hype - It's a Vendor Struggle for Relevance” as he leads with:

The cloud computing meme continues to billow as Juniper and IBM announce a cloud management partnership [and] rumors swirl about heavy petting between VMware and shareholder/partner Cisco. A few months ago it seemed like every cloud discussion included Google and/or Amazon; now it appears that “the network infrastructure issue” has finally reared its head and ushered in networking and management leaders into the cloud conversation.

•• Christopher Hoff reviews the increasingly controversial "Above the Clouds: A Berkeley View of Cloud Computing" technical paper in his Berkeley RAD Lab Cloud Computing Paper: Above the Clouds or In the Sand? post of 2/19/2009. Chris’s critique concludes:

Given that it was described as a "view" of Cloud Computing and not the definitive work on the subject, I think perhaps the baby has been unfairly thrown out with the bath water even when balanced with the "danger" that the general public or press may treat it as gospel. …

That being said, I do have issues with the authors' definition of cloud computing as unnecessarily obtuse, their refusal to discuss the differences between the de facto SPI model and its variants is annoying and short-sighted, and their dismissal of private clouds as relevant is quite disturbing.  The notion that Cloud Computing must be "external" to an enterprise and use the Internet as a transport is simply delusional. …

However, I found the coverage of the business drivers, economic issues and the top 10 obstacles to be very good and that people unfamiliar with Cloud Computing would come away with a better understanding -- not necessarily complete -- of the topic.

Chris’s review is probably closer to my take on the paper than any other critique I’ve read so far.

•• Chris also expresses his views about What People REALLY Mean When They Say "THE Cloud" Is More Secure... on 2/20/2009. Chris writes:

Almost all of these references to "better security through Cloudistry" are drawn against examples of Software as a Service (SaaS) offerings.  SaaS is not THE Cloud to the exclusion of everything else.  Keep defining SaaS as THE Cloud and you're being intellectually dishonest (and ignorant.) …

I *love* the Cloud. I just don't trust it.  Sort of like why I don't give my wife the keys to my motorcycles. [Emphasis added.]

•• Chris Evans raises a question about the Storage Network Industry Association (SNIA) standardizing on AWS’s S3 API in his Cloud Computing: Common API Standards post of 2/19/2009. Chris writes:

What’s really needed, is to standardise on:

    • Security Model - users want consistency of security across cloud storage providers.  The security model needs to be consistent to provide ease of access, integration with technologies like Active Directory or LDAP.
    • Access Method - standardisation on the use of XML, REST, SOAP, FTP or other protocols to access storage.

Fortunately, Azure uses REST and SDS uses REST and SOAP protocols to access storage.

Dmitry Sotnikov points in his Gartner on Cloud and information control post of 2/18/2009 to a Gartner report entitled “Trusted SaaS Offerings for Secure Collaboration” and priced at US$195. Dmitry writes:

The report is really valuable for anyone either building clouds or cloud-related products, or considering to move sensitive data to a SaaS application.

The key areas they look into are:

  • List of typical SaaS applications which have high trust requirements.
  • Key security features which such applications should possess.
  • Transparency measures which cloud computing/SaaS providers need to implement.

Excellent report: short, to the point, and with material you can use while developing or evaluating SaaS application with trust requirements.

Simon Davies’ Dynamic Languages and Windows Azure post of 2/17/2009 discusses reflection and the Azure Development Fabric’s Code Access Security restrictions, as well as an implementation of Lisp called L Sharp created by Rob Blackwell at Active Web Solutions, which you can read about on his blog here and try at http://lsharp.cloudapp.net/default.aspx.

Paul Miller adds to the review traffic with his Digging into Berkeley's View of Cloud Computing post of 2/17/2009. “To understand more, [Paul] spent some time on the phone with two of the paper's authors this morning, [Armando Fox and Dave Patterson,] and the result has just been released as a podcast.”

Brenda Michelson’s Unintentional Cloud Watching -- Cloud Computing for Enterprise Architects post of 2/17/2009 draws on her 19 years in corporate IT, most recently as Chief Enterprise Architect for L.L. Bean. Before L.L. Bean, over the span of 10 years, she provided development services for insurance, banking, a chip manufacturer, and a world leader in aircraft engine design and manufacturing. Brenda writes:

On "the morphing of boxes to platforms", what follows is a slide I created for last summer's ComputerWorld Data Center Directions conference.  I was asked to do a mini-presentation on server management, but as you can see, I started with a broader view of "boxes morphing to platforms" and then spoke of related management implications.

Dan’l Lewin, Microsoft’s Corporate VP of Strategic and Emerging Business Development, defines cloud computing with a 50,000-foot view of Azure in this three-minute BeetTV video (2/17/2009, thanks to Alin Irimie.)

Kyle Gabhart says the Industry [Is] Buzzing with Interest Around Cloud Computing and contributes more buzz with the slides from his The Role of Cloud in the Modern Enterprise webinar of 2/16/2009. Kyle claims that Owning Hardware is soooooo 2008 in this post of 2/17/2009, citing Fortune magazine’s Tech Daily post: "Goodbye hardware. Hello, services"

David Linthicum’s SOA needs to learn from the cloud, and the other way around post of 2/17/2009 warns that cloud hype will result in ignoring proper cloud architecture. David writes:

Cloud computing is indeed disruptive technology, and something that needs to be understood in the context of a holistic IT strategy, as well as understood, defined, and leveraged from domain to domain. The silliness that hurt the adoption in SOA is bound to infect cloud computing as well, if we let it happen. I urge you to get below the surface here quickly, else history will repeat itself.

Mike Kavis provided the backstory for David’s post (above) in his If SOA is Dead, Cloud Computing better start writing its will post of 2/12/2009.

Cloud Computing Events

•• Security Issues Receive Focus at IDC’s Cloud Computing Forum from OakLeaf contains excerpts of stories by reporters who attended San Francisco’s one-day IDC Cloud Computing Forum on Wednesday 2/18/2009.

•• Cloud Computing Conference 2009 will take place 5/28 to 5/29/2009 at the ISEP Conference Center - Instituto Superior de Engenharia do Porto (Engineering Institute of Porto), Rua Dr. António Bernardino de Almeida, 431 P-4200-072 Porto, Portugal. Free registration is here and the agenda is here. Discussion topics are:

    • Digital Identity: How Identity 2.0 could be the backbone (driving force) of cloud computing
    • Future of telecommunications: How cloud computing will depend on telecommunication development and network quality
    • User data protection and confidentiality: Reputation and trust; how to use past experiences and well-known examples as starting points
    • Interoperability: How cloud computing platforms will talk to each other, and how users will be able to move their data among clouds
    • IT departments’ perspectives and integration: How we can already start getting benefits from cloud computing
    • User perspective: How the Internet (the cloud) will become our PC
    • Small companies’ and startups’ opportunities: How to become a cloud computing provider and how to use cloud computing to add (real) value to business.

The Cloud Computing Interoperability Forum will participate in an all-day workshop entitled "Strategies and Technologies for Cloud Computing Interoperability (SATCCI)" to be held in conjunction with the Object Management Group (OMG) March Technical Meeting on 3/23/2009 at the Hyatt Regency Crystal City, Arlington, VA.

According to his Joint CCIF / OMG Cloud Interoperability Workshop on March 23 in DC post of 2/19/2009, Reuven Cohen will present his thoughts on the creation of an open unified cloud interface and the opportunity for unification between existing IT and cloud-based infrastructures (a.k.a. hybrid computing.)

OpSource SaaS Summit 2009 starts Wednesday 3/11/2009 at the Westin St. Francis in San Francisco and continues through 3/13/2009. According to OpSource:

SaaS Summit 2009, the largest on-demand industry gathering in the world, is being held in San Francisco on March 11 – 13, 2009. This year’s agenda focuses on the opportunities emerging from the depths of the current economic downturn for SaaS, Web and Cloud computing companies.

Topics include:

      • Thriving, Not Just Surviving
      • SaaS Marketing in a Downturn
      • Cloud Confusion
      • Selling SaaS to the Enterprise
      • Funding the Cloud
      • Minimal Cost, Maximum Gain with Social Networking
      • SaaS Channels: Money Maker or Money Waster

OpSource recently received US$10 million funding from NTT.

2nd International Cloud Computing Conference & Expo will take place, 3/30-4/1/2009, at the Roosevelt Hotel in New York City, with more than 60 sponsors and exhibitors and over 1,500 estimated delegates from 27 different countries. The conference is gathering an array of cloud luminaries as presenters, including Amazon’s Werner Vogels as a keynoter. The Early Bird Price of US$1,695 ($300 saving) expires 2/20/2009.

2009.cloudviews.org promises to be an international conference with lively discussion and demonstrations of how changes are already happening. Following are the proposed discussion topics:

    • Digital Identity - How Identity 2.0 could be the backbone (driving force) of cloud computing
    • Future of telecommunications - How cloud computing will depend on telecommunication development
    • User data protection and confidentiality - Legal perspective
    • Cloud computing platforms’ interoperability and data management (mobility)
    • IT departments and cloud computing integration
    • User perspective - How the Internet (the cloud) will become our PC
    • Small companies’ and startups’ opportunities - How to become a cloud computing provider and how to use cloud computing to add (real) value to business.

What’s missing are the venue and conference dates.

Update 2/2-/2009: See the first entry of this topic (Cloud Computing Events)

Seems to me that cloud hype is overshadowed only by the number of conferences devoted to the topic.

Other Cloud Computing Platforms and Services

•• Jan Pabellon reports that Open Source Vendor SugarCRM Embraces Cloud Computing in a Big Way in his 2/22/2009 post from the Philippines. Jan writes:

Just recently SugarCRM launched a new way to meld Internet services and open source software by launching Cloud Services and Social Feeds. These new Cloud Connectors for SugarCRM allow for company and contact data residing in other cloud environments to be called and presented in SugarCRM. These services include such sites as LinkedIn, ZoomInfo and Crunchbase. The Sugar Feeds feature on the other hand provides a Facebook-like rolling set of notices and alerts based on activity within SugarCRM.

•• James Urquhart reports (along with many others) on 2/21/2009 that Ubuntu now has 'cloud computing inside', which probably would be a more accurate statement in the future tense.

•• Amazon Web Services has updated SimpleDB’s Select API with the Count(*) aggregate function and will now return the partial result set of entities retrieved during the five-minute time limit, as reported in this Announcing Count and Long Running Queries release note of 2/19/2009.

If SimpleDB can return Select Count(*) Where … aggregate values why can’t SDS and Azure Tables?

AWS’s New WSDL Coming Soon post of the same date announces “a new WSDL version which excludes the Query and QueryWithAttributes APIs” in favor of the Select API.
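To illustrate the new Select behavior, here’s a minimal Python sketch of a client loop that sums partial Count(*) responses until SimpleDB stops returning a NextToken. The StubSimpleDB class and the select signature are illustrative stand-ins, not the actual AWS SDK:

```python
# Sketch of SimpleDB's new long-running Count(*) query behavior: when a
# query exceeds the five-minute limit, a partial result plus a NextToken
# is returned, and the caller resumes from that token.

def select_count(client, domain):
    """Sum Count(*) values across partial responses until no NextToken remains."""
    expression = "select count(*) from `%s`" % domain
    total, next_token = 0, None
    while True:
        response = client.select(expression, next_token)
        # Each partial response carries its own Count value to be summed.
        total += response["Count"]
        next_token = response.get("NextToken")
        if next_token is None:
            return total

class StubSimpleDB:
    """Simulates a long-running query split into two partial responses."""
    def __init__(self):
        self._pages = [
            {"Count": 250000, "NextToken": "opaque-token-1"},
            {"Count": 120000},
        ]
    def select(self, expression, next_token=None):
        return self._pages.pop(0)

print(select_count(StubSimpleDB(), "mydomain"))  # 370000
```

The point of the sketch is that aggregation continues transparently across the time limit, which is what makes Count(*) usable on large domains; neither SDS nor Azure Tables offers an equivalent server-side aggregate today.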

•• Thorsten von Eicken’s The Skinny on Cloud Lock-in post of 2/19/2009 describes Thorsten’s Lock-in Hypothesis: “The higher the cloud layer you operate in, the greater the lock-in,” which posits that vendor lock-in increases as you move from Infrastructure as a Service (IaaS, e.g. Amazon Web Services) to Platform as a Service (PaaS, e.g. Azure or Google App Engine) to Software as a Service (SaaS, Salesforce.com).

The most interesting element of the post was the results of a survey RightScale recently conducted asking its customers and prospects what concerned them most about lock-in:

What piqued my interest is RightScale’s confirmation of my conclusion that data lock-in is more important than vendor lock-in. Still, I was surprised that concern for data lock-in outweighed single-vendor worries by 3.5:1. (Image courtesy of RightScale.)

•• Andrew Conry-Murray’s A Cloud Conservative post of 2/19/2009 reports that the Vanguard Group, Inc. chose a private cloud. Andy quotes Bob Yale, who runs technology operations for the company and is very concerned about client data in a public cloud:

"You read a lot about the providers and their security, but given the nature of our business and the criticality of our client data, we aren't comfortable that providers bring the same rigor around data protection as we do. We aren't ready to give up control of our data."

•• rPath, VMware and BlueLock will demonstrate how to maximize application portability, deployment options in virtual and cloud environments in a Webinar on 2/25/2009 according to a press release of 2/19/2009. Erik Troan, founder and CTO, rPath; Wendy Perilli, director of product marketing, VMware; and Pat O’Day, CTO, BlueLock will deliver “Blending Clouds: Avoiding Lock-In and Realizing the Promise of Hybrid Compute Environments — Today,” a webinar and live multi-cloud demonstration. You can register for the event here.

Paul Miller podcasts a 40-minute conversation with Armando Fox and Dave Patterson of Berkeley’s RAD Lab about their Above the Clouds: A Berkeley View of Cloud Computing, which has gathered considerable notoriety among the clouderati. You can listen to the “cloudcast” here.

When I was a kid in Berkeley, RAD Lab meant the radiation laboratory, a nickname for the cyclotron at the top of Strawberry Canyon.

Krishnan Subramanian describes the San Diego Supercomputer Center’s new National Science Foundation grant in his Academic Research On Cloud Computing Gets Funded post of 2/18/2009. Krishnan quotes HPC Wire:

Researchers from the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, have been awarded a two-year, $450,000 grant from the National Science Foundation (NSF) to explore new ways for academic researchers to manage extremely large data sets hosted on massive, Internet-based commercial computer clusters, or what have become known as computing "clouds."

John Foley wants more transparency from Amazon about its plans for Amazon Web Services (AWS), data center location and construction schedule, and granular income data for AWS. In his Amazon's Cloud Is Too Cloudy post of 2/18/2009, Foley writes:

So I was glad to see the interview with TechFlash, as it presented an opportunity to learn more about Amazon's groundbreaking IT service model. In the Q&A, Jassy talks about enterprise adoption of AWS, service level agreements, and how he and a group of buddies get together every month to scarf down chicken wings at the Wingdome.

But Jassy clams up when asked about the size of AWS and future plans. …

As I pointed out in a post a few days ago, Amazon is growing in influence in the IT industry, having struck agreements with IBM, Microsoft, Oracle, Red Hat, and Sun in the past 12 months. As Amazon's reach expands, its reticence becomes a bigger issue. How can IT pros, with confidence, turn over their IT workloads to a service provider that provides such limited visibility into its core operations?

Krishnan Subramanian’s Asterisk On The Clouds post of 2/17/2009 explains how the open-source PBX and telephony platform Asterisk can manage voice services from the cloud. He points to two articles that explain how to install and run Asterisk on Amazon EC2.

John Foley takes on VMware’s vCloud “initiative” in his VMware To Take Its Next Steps Into The Cloud post of 2/17/2009. John writes:

VMware’s work in private clouds entails bringing some of the capabilities of public cloud services, including self-service provisioning and usage-based billing, to the corporate data center. It’s also working on securing private clouds by isolating virtual machines in environments where multiple business units share IT resources (which describes most companies).

George Crump discusses performance issues with WebDAV, HTTP, NFS and CIFS in his Getting Data To The Cloud post of 2/17/2009 and points to his Web cast on Cloud Storage Infrastructures to learn more.

Reuven Cohen’s Describing the Cloud MetaVerse post of 2/17/2009 coins the term Cloud MetaVerse to describe the inter-relationships among various Internet-connected technologies and platforms.

Reuven’s Red Hat Announces It's Kinda Interoperable, Sort Of, Maybe? post of the same date is skeptical of the capability of the

[R]eciprocal agreement with Microsoft to enable increased "interoperability" for the companies’ virtualization platforms. Both companies said that they would offer a joint virtualization validation/certification program that will provide coordinated technical support for their mutual server virtualization customers. …

Digging a little deep[er] it appears that Red Hat and Microsoft don't fully grasp what Interoperability actually is or more to the point who it benefits. But rather they seem to [be] taking advantage of the buzz that interoperability has enjoyed in 2009. So now rather th[a]n slapping a "cloud" logo on your product, you slap an interoperable logo on there too.