Wednesday, April 21, 2010

Windows Azure and Cloud Computing Posts for 4/21/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this daily series.

 
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article you want to navigate.

Cloud Computing with the Windows Azure Platform published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts and Databases”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the January 4, 2010 commercial release in April 2010. 

Azure Blob, Table and Queue Services

Danny Cohen’s Azure Storage: Asynchronous delete operations and ghost objects post of 4/21/2010 begins:

Here’s the story of my attempt to make sense of Windows Azure Storage’s erratic behavior when it comes to reporting an object’s existence. The short version for this post is that there is an API design flaw in the way Windows Azure Storage addresses this issue, and that you need to jump through a few hoops in order to bypass this flaw. The long version is described below.

I’ve been reading Steve Marx’s blog, and I examined his proposed workaround to testing for the existence of a blob. I played around with it for a while, checking whether the same solution can be implemented with a container object (CloudBlobContainer), since the required API (“FetchAttributes” method) has the same signature and apparent functionality for both objects (although there is no base class, interface or any other mechanism that connects both objects).

After playing around with it for a while, I used the following code to examine the functionality. It basically does a create-verify-delete-verify test cycle.

// Create a container, verify it exists, delete it, then verify it's gone.
// Exists() here is the author's helper built on CloudBlobContainer.FetchAttributes().
CloudBlobClient blobStorageType = GetCloudBlobClient();
var name = "test-container";
var container = blobStorageType.GetContainerReference(name);
var result = container.CreateIfNotExist();
container = blobStorageType.GetContainerReference(name);
Assert.IsTrue(container.Exists());
container.Delete();
Assert.IsFalse(container.Exists());

However, I soon noticed that the results were inconsistent with my expectations: occasionally, and seemingly unpredictably, the “container.Exists()” call returned the wrong result (in both directions, true and false).

The behavior was quite erratic and annoying. My first thought was that this was an “eventual consistency” issue, since the behavior was so inconsistent. However, Windows Azure Storage is strongly consistent, so that was not really an option.

Danny continues with an explanation for the problem: asynchronous deletions.
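If you want to experiment with the workaround yourself, here’s a minimal sketch of the FetchAttributes-based existence check Steve Marx described, applied to a container, plus a retry loop for the asynchronous-delete window Danny describes. The helper names, retry count and delay are my assumptions, not code from either post:

using System;
using System.Net;
using System.Threading;
using Microsoft.WindowsAzure.StorageClient;

static class ContainerProbe
{
    // Existence check: FetchAttributes() throws ResourceNotFound once the container is really gone.
    public static bool Exists(CloudBlobContainer container)
    {
        try
        {
            container.FetchAttributes();
            return true;
        }
        catch (StorageClientException e)
        {
            if (e.ErrorCode == StorageErrorCode.ResourceNotFound) return false;
            throw;
        }
    }

    // Recreate a just-deleted container, retrying while the asynchronous delete completes;
    // the service answers 409 Conflict as long as the "ghost" container is still being removed.
    public static void CreateWhenDeleteCompletes(CloudBlobContainer container)
    {
        for (int attempt = 0; attempt < 10; attempt++)
        {
            try
            {
                container.CreateIfNotExist();
                return;
            }
            catch (StorageClientException e)
            {
                if (e.StatusCode != HttpStatusCode.Conflict) throw;
                Thread.Sleep(TimeSpan.FromSeconds(5));   // wait for the pending delete to finish
            }
        }
        throw new TimeoutException("Container was still being deleted after 10 attempts.");
    }
}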

V. J. Kumar posted his Azure Blob and Entity Table Integration, extending the Thumbnail sample to The Code Project on 4/21/2010:

Introduction

Windows Azure is an exciting platform from Microsoft which supports massively scalable tables in the cloud, which can contain billions of entities and terabytes of data. Windows Azure storage lets users store data in Windows Azure Blob (which provides storage for large data items) and Windows Azure Table (which provides structured storage). A clear understanding of how to interact with entity tables and blobs will enable us to fully appreciate and utilize all the features of Azure. This article describes the concepts for doing CRUD (Create, Read, Update, Delete) operations on Windows Azure Tables and how table data can interact with blobs.

Background

The Azure SDK comes with some samples which are helpful for understanding the technology. There have been two major releases of the Azure SDK; the first was in March 2009 and the second in November 2009. The March release had a Storage Client class for working with Blob and Table storage in the local development storage as well as in the cloud. In the November release Microsoft shipped an API for the Storage Service and simplified things like storage string manipulation.

Several samples ship with the SDK. Two of them, Thumbnail and Personal Web Site, give a very good flavor of the Azure development environment. The Thumbnail sample is useful for understanding blob manipulation and the worker role in Azure. The Personal Web Site sample shows how to work with tables and blobs. The Thumbnail example has been updated for the new Storage Client API; the Personal Web Site sample has not.

This article takes the Thumbnail sample and extends it. The Thumbnail sample lets you browse to an image file and save it in blob storage. A worker role creates a thumbnail of the image you uploaded, and the image and its thumbnail are stored as blobs. The web page displays the thumbnails in blob storage, and the page is refreshed every ten seconds.

In this extension the following features are added :

  1. Enter a name, date and comments for the picture you are uploading.
  2. Modify fields after they are stored in the Tables.
  3. Delete a picture and the associated data.
  4. The thumbnail in the list, when clicked, displays the original picture.
  5. Removed the automatic page refresh and used an UpdatePanel to update relevant sections of the page, avoiding unnecessary round trips to the server. …

The extended code described in this article produces the following default page:

[Screenshot: the extended sample’s default page]
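To give a rough idea of what the extension stores, here’s a minimal sketch of a table entity that could tie the picture metadata to its blobs; the class and property names are my own assumptions, not V. J.’s actual code:

using System;
using Microsoft.WindowsAzure.StorageClient;

// Hypothetical entity linking an uploaded picture's metadata to its blobs.
public class PictureEntry : TableServiceEntity
{
    public PictureEntry() { }                        // required by the table service serializer

    public PictureEntry(string partitionKey, string rowKey)
        : base(partitionKey, rowKey) { }

    public string Name { get; set; }                 // user-entered name
    public DateTime Date { get; set; }               // user-entered date
    public string Comments { get; set; }             // user-entered comments
    public string ImageBlobUri { get; set; }         // address of the full-size image blob
    public string ThumbnailBlobUri { get; set; }     // address of the thumbnail blob
}

Updating or deleting a picture then becomes a pair of operations: a TableServiceContext update or delete for the entity, plus the corresponding blob delete(s).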

Joseph Fultz explains Migrating Windows Service to Azure Worker Role: Image Conversion Example using Storage in this 4/2/2010 article, which I missed when posted:

In my work with Symon Communications we had to move pieces of their solution from their current Windows Services implementations to something that would work well in the cloud. These services run in the background to collect, transform, and prepare data, which seemed like a natural fit for a worker role. The simple scenario we used to prove out the idea was to read the images from one container, convert them, and save them to another storage container. It’s similar to the Thumbnails example in the Azure SDK, but in our case we wanted to simplify and felt the use of a queue was overkill for what we needed to accomplish.

The setup for this is to add a worker role to your cloud solution, create source and target containers in Azure Storage, and finally seed the source container with the files to be converted, in our case PNG files. This can all be done through development storage and the development fabric, and it works the same once deployed. I’ll be using “pictures” and “converted” as the names of the two containers; on development storage they’ll actually be referenced as devstoreaccount1/pictures and devstoreaccount1/converted. Let’s get started on the code by adding a new class file to the worker role project. I named the class ImageConverter. To keep this as simple as possible for demonstrating the worker role in place of a service, I use System.Drawing.Image’s built-in capabilities to do the transformation for me. …
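To make the shape of that worker concrete, here’s a minimal sketch of a Run() loop that converts blobs from one container into another. It’s my own illustration of the pattern Joe describes, not his ImageConverter code; the connection-setting name, output format, polling interval and delete-after-convert step are assumptions:

using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Threading;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

public class WorkerRole : RoleEntryPoint
{
    public override void Run()
    {
        var account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));  // assumed setting name
        var client = account.CreateCloudBlobClient();
        var source = client.GetContainerReference("pictures");    // source container from the article
        var target = client.GetContainerReference("converted");   // target container from the article
        source.CreateIfNotExist();
        target.CreateIfNotExist();

        while (true)
        {
            foreach (var item in source.ListBlobs())
            {
                var sourceBlob = source.GetBlobReference(item.Uri.ToString());
                using (var input = new MemoryStream(sourceBlob.DownloadByteArray()))
                using (var image = Image.FromStream(input))
                using (var output = new MemoryStream())
                {
                    image.Save(output, ImageFormat.Jpeg);    // the conversion step; PNG to JPEG here
                    output.Position = 0;

                    var name = Path.GetFileNameWithoutExtension(item.Uri.LocalPath) + ".jpg";
                    target.GetBlobReference(name).UploadFromStream(output);
                }
                sourceBlob.Delete();   // assumption: remove the original so it isn't reprocessed
            }
            Thread.Sleep(TimeSpan.FromSeconds(30));   // simple polling instead of a queue
        }
    }
}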

Joe continues with the details of his implementation. See also Joe’s Channel 9 Screencast with Symon Discussing Azure Deployment Architect post of 4/8/2010 here:

Directly related to a couple of my recent posts, we (Jared Bienz and [I]) talked about it with Joshua Kurlinski from Symon. You can see the screencast here.

<Return to section navigation list> 

SQL Azure Database, Codename “Dallas” and OData

The SQL Server Team reported SQL Server 2008 R2 Released to Manufacturing! in this 4/21/2010 post to its MSDN blog:

We are very excited to share that today, Microsoft released SQL Server 2008 R2 to manufacturing! Customers can expect availability in the next few weeks through Microsoft’s distribution channels. For more information, visit www.sqlserverlaunch.com.


SQL Server 2008 R2 enhancements continue to support mission-critical workloads providing a trusted, scalable platform, increased developer and IT efficiency and managed self-service business intelligence reporting and analytics. It is for these very reasons that SQL Server 2008 R2 is setting records for performance and price/performance on industry standard benchmarks. …

According to a 4/21/2010 conference call with Microsoft Senior Vice President Ted Kummert and General Manager of Business Intelligence Tom Casey, RTM bits will be available for download from TechNet and MSDN on 5/3/2010 and general download by 5/13/2010. The data sheet shows Express, Express with Tools and Express With Advanced Services editions have their maximum database size expanded to 10GB. You can download the Express or trial version as of 4/21/2010 from here.

Check out the Microsoft SQL Server 2008 R2 Hosted Trial; get more details from PASS: Come Play in the SQL Server 2008 R2 Hosted Trial Sandbox.

Lynn Langit’s SQL Saturday – SQL Azure – April 2010 deck update post of 4/21/2010 is brief and to the point:

Here’s the deck I’ll be presenting this Saturday in Huntington Beach at SQL Saturday – c’mon over!

SQL Azure April 2010


Julian Lai announced WCF Data Services for Silverlight 4 now available for download on 4/20/2010:

A couple of months ago, we released an update to .NET 3.5 SP1 and its counterpart ADO.NET Data Services for Silverlight 3 Update CTP3. I am now pleased to announce that we have shipped an updated Data Services client library with Silverlight 4. This release includes all the features that we shipped in the SL3 CTP3 Data Service release as well as support for NTLM, Basic and Digest authentication. If you are using the SL3 CTP3 release, the client library in SL4 represents the production release of that feature set. For more information regarding previously released features, check out our blog post here.

The enhanced authentication support is useful in same-domain and cross-domain scenarios. To use the feature, simply specify your credentials using the new Credentials property on the DataServiceContext class. For example:

Uri uri = new Uri("http://www.somesampleuri.com");   // placeholder service address
DataServiceContext svc = new DataServiceContext(uri);
svc.Credentials = new NetworkCredential(username, password, domain);
svc.HttpStack = HttpStack.Auto;   // setting this to HttpStack.ClientHttp is also supported
svc.LoadAsync(....);

More information can be found in our MSDN documentation.

Bob Beauchemin’s DAC support SQL Azure and vice-versa. It's live post of 4/19/2010 to the SQLSkills blog begins:

Last week I did a talk at SQLConnections on SQL Azure Database and Data-Tier Applications (DAC). At the time (it was the day of Visual Studio 2010 launch), I explained that conference abstracts had to be submitted 6 months ago. At the time, because of some coincidental feature correspondence (e.g. the DAC whitepaper suggests only using DAC deployment on databases of 10 GB or less; 10 GB is the current maximum size of a SQL Azure database) I'd actually thought that DAC and Azure were "joined at the hip" and that DAC might already be used in the cloud (internally) for SQL Azure deployment.

It isn't. In fact, DAC and SQL Azure Database didn't support each other at all. *Until last week*. At the VS2010 launch, the other DAC talk (by the team) said Azure would be supported as a development/deployment environment. But, except for "import from existing database", even the RTM VS2010 didn't work with SQL Azure.

Imagine my surprise on returning home to see this blog posting by the SQL Azure team. As of last Friday, SQL Azure enhancements "enables deployments of database applications directly from SQL Server 2008 R2 and Visual Studio 2010 to SQL Azure for database deployment flexibility". So, it does hook up, after all.

DAC is a pretty controversial feature because in V1 it only supports a subset of database objects and deployment via a "new database-copy data-rename databases" functionality. So it's not for everyone. But at both talks, attendees seemed to understand the target audience: the "departmental database application", the "600th database application" in a large company, the ones that usually have no DBA support because DBAs are busy with 24x7 line-of-business OLTP apps. If you've ever worked in a big company where database and software development is not the main business of the company (i.e., the main business is manufacturing cars, or banking, not developing software), you can grok exactly what a "departmental application" is.

The attendees got it. When I asked at the end of my talk whether, given that there are customers for it today, DAC ought to be postponed until it worked with all DBMS apps, I got only 1 taker (for postponing) out of 100. Not so controversial after all. …

See the Windows Azure AppFabric Team’s New OData Service for SQL Azure Uses AppFabric item in the AppFabric: Access Control and Service Bus (next) section.

Naveen Srinivasan’s Using OData, LINQPad, Reactive Extensions (Rx) to query stackoverflow post of 4/7/2010 showed up today on a Google alert for “OData”:

I saw this cool post from Scott Hanselman on creating an OData API for stackoverflow. I use LINQPad more often than anything, and sometimes when I am not very busy I also look for unanswered questions on stackoverflow. I have been playing around with Reactive Extensions, and FYI, LINQPad 4.0 supports Rx. So I thought how cool it would be if I could watch for unanswered “windbg” questions from stackoverflow so that I could answer them. Here is the query:

[Screenshot: the LINQPad query]

So this would essentially keep querying stackoverflow, if stackoverflow were to implement OData, and I wouldn’t have to launch an application to look for unanswered questions.

I know this will not work now. But how cool is it to combine these frameworks and write very succinct code to get what we want, without having to jump through hoops.
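Since the query itself only appears as a screenshot, here’s a rough reconstruction of what such a LINQPad query might look like, assuming a hypothetical Stack Overflow OData endpoint and a hand-written Question type (the Rx and WCF Data Services client assemblies referenced through LINQPad’s query properties); none of these names come from Naveen’s post:

void Main()
{
    // Hypothetical endpoint; Stack Overflow did not expose OData at the time of writing.
    var ctx = new DataServiceContext(new Uri("http://odata.stackoverflow.com/"));

    // Poll every five minutes and push any unanswered windbg questions to the subscriber.
    Observable.Interval(TimeSpan.FromMinutes(5))
        .SelectMany(_ => ctx.CreateQuery<Question>("Questions")
                            .AddQueryOption("$filter", "AnswerCount eq 0 and substringof('windbg', Tags)")
                            .ToList())
        .Subscribe(q => Console.WriteLine("{0}: {1}", q.Id, q.Title));
}

[DataServiceKey("Id")]              // System.Data.Services.Common
public class Question               // hypothetical shape of a question entity
{
    public int Id { get; set; }
    public string Title { get; set; }
    public string Tags { get; set; }
    public int AnswerCount { get; set; }
}

The ToList() call executes the OData request on each tick, which is exactly the “keep querying stackoverflow” behavior Naveen describes.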

<Return to section navigation list> 

AppFabric: Access Control and Service Bus

The Windows Azure AppFabric Team’s New OData Service for SQL Azure Uses AppFabric post of 4/20/2010 observes:

In his MIX10 keynote, Doug Purdy announced a new service that exposes data in SQL Azure to web-based clients using the OData protocol.

The new service is exciting for two reasons:

  1. It can be used to publish SQL Azure data with no code, and
  2. It uses AppFabric Access Control (ACS) exclusively for authentication and authorization.

You can learn more about it from the following blog posts:

Ron Jacobs apologizes for video problems in his endpoint.tv – Developing Services with IIS/Windows Server AppFabric post of 4/19/2010:

To those nearly 5,000 people who tried to watch this video today… sorry. I forgot to publish the video correctly for Channel 9 and didn’t realize it until a couple of hours ago.

As I mentioned previously in my post on Developing Services with IIS/Windows Server AppFabric it is possible to work with AppFabric directly from Visual Studio (and there is more than one way… I’ll cover that in another post/video).

<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Steve Marx recommends Call DiagnosticMonitor.Start() Only Once in this 4/21/2010 post:

If you’re using the cloud templates in Visual Studio, you’ll see a line in the OnStart() method in WebRole.cs or WorkerRole.cs that looks like this:

DiagnosticMonitor.Start("DiagnosticsConnectionString");

You might be tempted to put this line somewhere else (like in an ASP.NET page), but please resist that temptation. At the moment, if you call DiagnosticMonitor.Start() more than once, you’ll end up with multiple copies of the Windows Azure Diagnostics process. Much like a memory leak, this will sneak up on you over time (as more processes get started), and you’ll eventually starve your role instance of resources.

We’ve seen this come up more than once in customers’ applications, and we have a bug open on our side to prevent duplicate calls to DiagnosticMonitor.Start() from starting multiple copies of the process in the future.

What About Changing Diagnostics Settings Later?

One reason you might be tempted to call DiagnosticMonitor.Start() again is to use different configuration settings. (Maybe based on an admin user’s actions or a specific error condition, you want to ramp up your logging.)

One of the nice things about Windows Azure Diagnostics is that it can be dynamically configured, and you don’t need to call DiagnosticMonitor.Start() again to do that. …

Steve continues with “code that runs in an ASP.NET web page and changes the configuration settings for the current role instance” and concludes:

If you’re going to use Windows Azure Diagnostics at all in your application, I recommend calling DiagnosticMonitor.Start() exactly once, in your role’s OnStart() method. Then use dynamic configuration to change settings at runtime.
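For context, the dynamic reconfiguration Steve refers to looked roughly like the sketch below in that SDK generation. This is my own illustration of the pattern, not Steve’s code; it assumes the SDK 1.x RoleInstanceDiagnosticManager API and the same “DiagnosticsConnectionString” setting:

using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.Diagnostics.Management;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class DiagnosticsSettings
{
    // Ramp logging up (or down) at runtime without calling DiagnosticMonitor.Start() again.
    public static void SetTraceTransfer(LogLevel filter, TimeSpan period)
    {
        var account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("DiagnosticsConnectionString"));

        var manager = new RoleInstanceDiagnosticManager(
            account,
            RoleEnvironment.DeploymentId,
            RoleEnvironment.CurrentRoleInstance.Role.Name,
            RoleEnvironment.CurrentRoleInstance.Id);

        // Read the running configuration, adjust it, and write it back.
        DiagnosticMonitorConfiguration config = manager.GetCurrentConfiguration();
        config.Logs.ScheduledTransferLogLevelFilter = filter;
        config.Logs.ScheduledTransferPeriod = period;
        manager.SetCurrentConfiguration(config);
    }
}

An admin page could then call something like DiagnosticsSettings.SetTraceTransfer(LogLevel.Verbose, TimeSpan.FromMinutes(1)) to ramp logging up without touching DiagnosticMonitor.Start() again.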

See my Windows Azure Live DNS Brief Outage at All Data Centers on 4/21/2010 at ~ 2:30 PM PDT post of 4/21/2010 in the Windows Azure Infrastructure (next) section.

Steve Marx explains Capturing Filtered Windows Events with Windows Azure Diagnostics in this 4/21/2010 post:

If you download the source code for my Hosted Web Core Worker Role, you’ll see a few lines of code which configure Windows Azure Diagnostics to capture Windows Events generated by Hosted Web Core:

var cfg = DiagnosticMonitor.GetDefaultInitialConfiguration();
// HWC uses the event log to indicate what's broken.
// This config setting is really handy when debugging bad config.
cfg.WindowsEventLog.DataSources.Add("Application!*[System[Provider[@Name='HostableWebCore']]]");
diagnosticMonitor = DiagnosticMonitor.Start("DiagnosticsConnectionString", cfg);

When you add a Windows Event log data source, you get to specify an XPath selection criterion for the events you want to log. The Windows Azure Diagnostics documentation gives a simple example and then links to Consuming Events for more details. In neither place could I find examples like what I wanted (listening to events from a particular provider).

I was able to figure out what to put in that string by creating a custom view in the Windows Event Viewer on my laptop and then switching to the XML view to see the XPath:

[Screenshots: the Event Viewer custom view filter and its XML (XPath) representation]

It wasn’t hard from there to derive the right syntax to pass to WindowsEventLog.DataSources.Add.
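The same XPath technique covers other common filters too. The lines below are illustrative examples I derived the same way, not ones from Steve’s post, so verify them against Event Viewer’s XML view before relying on them:

// Only Critical (Level=1) and Error (Level=2) events from the Application log.
cfg.WindowsEventLog.DataSources.Add("Application!*[System[(Level=1 or Level=2)]]");

// Events from one specific provider in the System log (substitute your provider's name).
cfg.WindowsEventLog.DataSources.Add("System!*[System[Provider[@Name='YourProviderName']]]");

// Remember to set a transfer period, or nothing is shipped to table storage.
cfg.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);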

Panagiotis Kefalidis pops up again with a Yes, I’m still here post of 4/21/2010:

Well, I’m still here. I know it’s been like ages since my last post but believe me, I’ve been quite busy with tons of stuff and there was no time to blog.

So, let’s catch up a little bit:

  1. I’ve been awarded the MVP title for 2010 on Visual C#. Thank you very much Microsoft.
  2. I’ve attended the Regional CEE MVP Summit that took place in Athens, Greece. Photos will be up soon.
  3. I started playing with Cassandra DB from the Apache Foundation and I’m currently looking into a way to make it run on Windows Azure. I just started and I’ll keep blogging about it. It’s really cool and I hope it will work!

I’ll believe Panagiotis’ promise when I see the posts.

Lori MacVittie asserts “It’s all fun and games until application performance can’t be measured” in her Learn How to Play Application Performance Tag at Interop post of 4/21/2010:

We talk a lot about measuring application performance and its importance to load balancing, scalability, meeting SLAs (service level agreements) and even to the implementation of more advanced concepts like cloud balancing and location-based global application delivery, but we don’t often talk about how hard it is to actually get that performance data. Part of the reason it’s so difficult is that the performance metrics you want are ones that as accurately as possible represent end-user experience. You know, customers and visitors, the users of your application that must access your application over what may be a less than phenomenal network connection.

This performance data is vital. Increasingly customers and visitors are basing business choices on application performance:

Unacceptable Web site performance during peak traffic times led to actions and perceptions that negatively impacted businesses’ revenue and reputation:

  • 78 percent of consumers have switched to a competitor’s Web site because they encountered slowdowns, errors and transaction problems during peak traffic times.
  • After a poor online experience, 88 percent are less likely to return to a site, 47 percent have a less positive perception of the company and 42 percent have discussed it with family, friends and peers, or online on social networks.

-- Survey Finds Consumer Frustration with Web Site Performance During Peak Traffic Times Negatively Impacts Business Results

And don’t forget that Google recently decided to go ahead and add performance as a factor in its ranking algorithms. If your application and site perform poorly, this could certainly have an even bigger negative impact on your bottom line.

What’s problematic about ensuring application performance is that applications are now being distributed not just across data centers but across deployment models. The term “hybrid” is usually used in conjunction with public and private cloud to denote a marriage between the two but the reality is that today’s IT operations span legacy, web-based, client-server, and cloud models. Making things more difficult is that organizations also have a cross-section of application types – open source, closed source, packaged, and custom applications are all deployed and operating across various types of deployment models and in environments without a consistent, centrally manageable solution for measuring performance in the first place.

The solution to gathering accurate end-user experience performance data has been, to date, to leverage service-providers who specialize in gathering this data. But implementing a common application performance monitoring solution across all applications and environments in such a scenario is quite problematic, because most of these solutions rely upon the ability to instrument the application/site. Organizations, too, may be reluctant to instrument applications for a specific solution – that can result in de facto lock-in as the time and effort necessary to remove and replace the instrumentation may be unacceptable. …

Bruce Kyle reports RES Software Stores Desktop Profiles on SQL Azure in this 4/20/2010 post to the US ISV Developer Community blog:

RES Software uses SQL Azure to offer cloud storage for user desktop configurations. The company also offers on premises storage options.

Channel 9 Video: RES Software Stores Desktop Profiles on SQL Azure.

In this short video on Channel 9, I talk with CTO Bob Janssen about how and why the company offers both SQL Azure and SQL Server storage options for their customers.

RES Software is the proven leader in user workspace management. They are driving a transformation in the way today’s organizations manage and reduce the cost of their PC populations. Designed for physical or virtual desktop platforms, RES Software enables IT professionals to centrally manage, automate and deliver secure, personalized and more productive desktop experiences for any user.

For more information about how RES Software users access their desktop information in the cloud, see RG021 – How to create a MS SQL Azure Database on the RES Software blog.

<Return to section navigation list> 

Windows Azure Infrastructure

My Windows Azure Live DNS Brief Outage at All Data Centers on 4/21/2010 at ~ 2:30 PM PDT post of 4/21/2010 reports a problem with live DNS operations at all Microsoft data centers covered by the Windows Azure Service Dashboard:

I received the following alert from Pingdom on 4/21/2010:

PingdomAlert DOWN:
  Azure Tables (oakleaf.cloudapp.net) is down since 04/21/2010 01:48:21PM.

and from Mon.itor.us:

[Screenshot: the Mon.itor.us alert]

The post includes a Dashboard screen capture during the outage.

Finchannel.com’s Harris Poll: Cloud Computing – Are Americans Ready? post of 4/21/2010 highlights consumers’ views of cloud-based storage:

Computers and the Internet have certainly made people's lives easier. Besides just the ability to communicate via email, pictures and music are now stored in our computers, we easily share documents back and forth and people have the ability to conduct banking online and store tax documents in their hard drives, not just in a dusty file cabinet.

However, the new problem is how to move those files in an easy manner or access them from a remote location. There is a new technology, referred to as cloud computing, that allows files to be stored, edited, or played online from any location at any point in time. But will people use this new technology?

On the positive side, just under half of online Americans (47%) say they would be extremely or very interested or interested in using this service for email, while one in five (19%) would be somewhat interested and three in ten (31%) would not be interested at all. But this technology may not be the answer as over half of online Americans say they would only be somewhat or not at all interested in using cloud computing for pictures (55%), music (59%), office documents (61%), videos (63%) or financial services such as tax files or bank records (69%).

These are some of the findings of a new Harris Poll, a survey of 2,320 U.S. adults conducted online between March 1 and 8, 2009, by Harris Interactive.

As one might expect, there is an age difference in Americans' interest in cloud computing. The youngest generation, Echo Boomers or those aged 18-33, are more likely to say they are extremely or very interested or interested in using this technology. Over half of Echo Boomers express an interest in using it for email (56%), pictures (54%), and music (55%), and just under half say they are interested in using it for office documents (47%) and videos (46%). …

Issues with Cloud Computing

One of the main issues people have with cloud computing is security. Four in five online Americans (81%) agree that they are concerned about securing the service. Only one-quarter (25%) say they would trust this service for files with personal information, while three in five (62%) would not. Over half (58%) disagree with the concept that files stored online are safer than files stored locally on a hard drive and 57% of online Americans would not trust that their files are safe online.

But, at the same time, over three in five online Americans (63%) agree that having access to all their files wherever they are would make their lives much easier.

So what?

The concept behind cloud computing may make sense to Americans. A strong majority agrees that having remote access to their files would make life easier. So if the security issues and concerns could be resolved to the satisfaction of most people, they might be more likely to consider using it.

Matthew Weinberger quotes Unisys: MSPs Need to Go to the Cloud or Die in this 4/21/2010 article for MSPMentor (MSP = Managed Service Provider):

In a Cloud Computing Expo presentation evocatively titled “Storm Clouds: Disruptive Technologies Create the New Normal,” Unisys VP of Global Outsourcing Solutions Sam Gross explained how users have come to expect constant streams of data on demand — an expectation that’s completely shifting the market towards SaaS. Gross criticized the whole industry for being slow to adopt the cloud, but saved his harshest observations for MSPs. I was on the scene to hear him out.

“Those stick in the mud MSPs will vanish if they don’t change,” Gross said.

It’s a strong stance, and it’s hard to miss his meaning. While most of Gross’ presentation was spent talking about how Unisys had solved most, if not all, of the cloud’s security problems with their “Stealth” data-masking solution, he still made some excellent points in support of his argument.

The average customer, he says, doesn’t care how the service provider is managing their cloud platform, just that they have e-mail, phones, and whatever else they need to do business.

“It’s not the provisioning tools, it’s the content,” Gross said.

Most compellingly, Gross reiterated that cloud-delivered solutions allow for a hugely flexible business model. Basically, he says, MSPs who figure out how to combine cloud services with traditional services to be a “new breed” of solutions provider are going to be the ones who thrive in this changing IT climate.

There’s definitely something to his argument, but the fact remains that many MSPs just aren’t making money off the cloud. I suspect if you asked Gross — and I intend to — he’d say that they just haven’t hit on the right model yet, or that they’re being too rigid in their thinking. All the same, I still think it’s too early to issue a verdict on whether or not the cloud hype is justified.

Thomas Wailgum reports “A new Yankee Group study of enterprise cloud computing services finds cloud contracts full of disclaimers, ambiguous uptime guarantees, and uncertain privacy policies and compliance claims” in his Cloud-Computing Services: "Fine Print" Disappointment Forecasted post of 4/21/2010 to the CIO.com blog:

That's the crux of Yankee Group's latest research effort, Cloud 99.99: The Small Print Exposed, by VP and senior research fellow Camille Mendler.

Mendler examined the terms of service, service-level agreements (SLAs) and privacy policies for 46 software-, infrastructure- and platform-as-a-service (SaaS/IaaS/PaaS) offerings from 41 vendors. Those included stalwarts such as Amazon, Google, Microsoft and Salesforce.com.

Not surprisingly, the report uncovered some not-so-good news.

"Cloud vendors offer enterprises poor service guarantees and limited financial redress if their service fails," notes the report. "Get-out clauses are rife, and worryingly, robust privacy policies are rare, potentially exposing enterprises to litigation. Enterprises must take a close look at the small print before they proceed, and develop proactive strategies to get the best out of cloud services."

Mendler offers several key areas that enterprises and CIOs need to watch closely or they could suffer:

1. Slippery SLAs: "Whatever the number of 9s offered, 'uptime' definitions vary, and service demarcation points for uptime are rarely end to end," Mendler writes. "Vendors also tend to play fast and loose with scheduled maintenance windows."

The study found that just half of service providers offer SLAs, and "none offer financial compensation when they fail to perform against them." Mendler also claims that timelines to fix site problems are typically "notional" (or conceptual), and that customers should expect "limited reparation other than service credits or the ability to terminate their contract."

2. Cagey Compliance: "SAS-70-certification is not a blanket guarantee of safety or survivability," the report states. "Enterprises should also seek ISO 27000 credentials, and check vendor adequacy against international data protection regulations."

3. Self-Serving Metrics: "Beware vendors acting as both judge and jury in determining service performance," Mendler notes. "The use of third-party performance monitoring tools must become table stakes for credibility."

The report appears to be both a warning to customers and a call to action for the growing number of cloud computing vendors.

"Cloud service providers better clean up their act fast because major investment decisions hang in the balance," Mendler says in the press announcement. "Enterprises need transparency, professionalism and certainty to invest in cloud services—few providers are stepping up."

Here’s Yankee Group’s Bottom Line and Executive Summary:

The Bottom Line:

Cloud vendors offer enterprises poor service guarantees and limited financial redress if their service fails. Get-out clauses are rife, and worryingly, robust privacy policies are rare, potentially exposing enterprises to litigation. Enterprises must take a close look at the small print before they proceed, and develop proactive strategies to get the best out of cloud services.

Executive Summary

It’s time to get steamed about the cloud’s hot air. When it comes to contractual promises to enterprises, the cloud’s earth-bound risks currently outweigh its benefits. Weak service level guarantees, derisory financial redress and lack of operational transparency are all-too-common flaws of many cloud services offered to enterprises today.

Yankee Group’s assertions are the result of an investigation of 41 software-, infrastructure- and platform-as-a-service (SaaS, IaaS and PaaS) providers collectively marketing 46 different services. Included in the scope of research were standard terms of service (TOS), service-level agreements (SLAs) and privacy practices. We find that only half of service providers offer SLAs, and none offer financial compensation when they fail to perform against them (see Exhibit 1). Timelines to fix problems are often notional, and customers can expect limited reparation other than service credits or the ability to terminate their contract.

Cloud service providers must clean up their act fast: Major investment decisions hang in the balance, according to Yankee Group’s enterprise surveys. Cloud service outages are routinely hitting the headlines, driving sharper attention on the minutiae of cloud service commitments. In this report, we identify contractual risks for enterprises, as well as potential mitigation strategies.

Denise Dubie writes “Microsoft details how its management software portfolio will deliver heterogeneous orchestration and automation capabilities to help customers develop, deploy and manage applications in private and public cloud computing environments” as a preface to her Microsoft weaves management technology into cloud vision NetworkWorld report of 4/20/2010 from the Microsoft Management Summit in Las Vegas:

Microsoft's plans for cloud computing don't stop with infrastructure and applications. Company executives say Microsoft will also provide the heterogeneous management layer that customers will need to optimize application performance on-premises or in hosted environments.

Microsoft this week kicked off its Microsoft Management Summit 2010 conference in Las Vegas, welcoming some 3,500 attendees to learn more about the vendor's products and plans for future technologies. Bob Muglia, president of the server and tools business at Microsoft, opened the event officially with a keynote speech that detailed -- with product demonstrations -- how Microsoft intends to manage customer environments from today's data centers to tomorrow's shared clouds, on-premises and off.

Starting with its Dynamic Systems Initiative 10 years ago, Microsoft envisioned an environment that would connect the workflow between development and operations, providing end-to-end management, and helping customers reduce the overhead in delivering and optimizing business applications, Muglia said. It may not have been called cloud at the time, he clarified, but Microsoft recognized the model.

"There is a huge amount of opportunity to simplify the process and reduce the costs" associated with IT, Muglia told attendees.

Microsoft has worked on products such as Visual Studio and System Center, partnered with vendors such as HP for storage capabilities, and utilized acquired automation and orchestration software from Opalis to take automation to the next level, enabling IT staff to work on higher-priority tasks. The Opalis technology can work across platforms and enable Microsoft in the future to provide more heterogeneous management capabilities for customers. According to Paul Ross, group product marketing manager at Microsoft, the company already broadened its management reach into VMware environments with its System Center Virtual Machine Manager and added Unix and Linux support to System Center as well. …

Alex Williams’ Google's Vint Cerf on Private Clouds v. Public Clouds post of 4/20/2010 to the ReadWriteCloud blog begins:

The debate about private clouds continues as the traditional heavyweight enterprise software providers make their big and glossy pitches for their vision of a private cloud.

So, it may come from Google, but still, it is refreshing to hear the intellectual tone that a scholar like Vint Cerf provides. Cerf is Google's chief Internet evangelist, but his reflections give a sound bearing on how private and public clouds do interact.

He spoke last week at the Google Atmosphere Conference. We came across one of the discussions he had with fellow Google innovators. He repeats what we hear him say a lot. It comes down to interoperability. Private clouds are tools. Google develops tools that are distributed on the Internet. The question is how do clouds interact?

It's a contrast to what we see with Microsoft or Oracle in its quest to sell cloud computing environments into the enterprise.

In the meantime Amazon continues its own quest to dispel private cloud computing as a myth, not a reality.

Alex continues with excerpts from AWS VP Adam Selipsky’s interview with eWeek.

Bridget Botelho reports on 4/20/2010 from the Microsoft Management Summit in Las Vegas for SearchWindowsServer.com: Microsoft pushes cloud computing on reluctant IT pros:

Though cloud computing is high on Microsoft's agenda, IT managers attending the software vendor's annual management confab here this week either aren't ready to make the transition or simply don't want to.

IT pros at the Microsoft Management Summit 2010 were pummeled with cloud imagery as the software vendor disclosed plans for its System Center management portfolio. Some IT managers said cloud computing is interesting in theory, but they are not really sure how to make it work for their organization. IT managers are also realizing that cloud is no longer just a buzzword because, ready or not, giant vendors such as Microsoft have made it a core part of their product road maps.

Bryan Nettles, a help desk administrator at BJ Services Company, a Houston-based chemical processor, said his organization hasn't yet looked at cloud computing services as an option. The cost differential between premises-based enterprise technology and the cost to build a private cloud is still unclear. Nettles said he buys servers and uses virtualization software from VMware Inc., though on a small scale. For him, cloud computing will remain an abstract idea for some time.

Cloud computing? Zzzzzz

Plenty of IT managers have already had their fill of cloud computing hype. Erik Swenson, an IT manager with a construction company in Denver, expressed his frustration over the vendor fire hosing.

"Cloud computing is the new buzz and desktop virtualization is being rammed down the throats of us small IT shops before we can catch our breath," he said. "If our CFO reads about it in some business magazine he gets all excited and comes into my office asking me 'what about this, what about that?' Companies like Microsoft employ genius marketing people to make sure the 'keeping up with the Joneses' mentality holds sway. And IT people are left constantly chasing our tails in a futile effort to keep up."

But Microsoft has embraced cloud computing, as was evident in the MMS 2010 keynote. In addition, the company's CEO, Steve Ballmer, was recently quoted saying that 70% of Microsoft is dedicated to the cloud today, but it will be 90% sooner rather than later. …

Bridget continues with a cloud-positive quote from an IT architect.

The Cloud Communications Alliance announced its formation at the Cloud Computing Expo New York 2010 on 4/20/2010:

The Internet and ubiquitous broadband created a revolution in the way enterprises manage applications, giving rise to the concept of Cloud Computing.

The same revolution is now changing the way enterprises communicate, giving birth to a new industry: Cloud Communications. More than simply VoIP or Unified Communications, it’s an entirely new way to build, deploy, and scale enterprise communications systems.

The Cloud Communications Alliance brings together the nation’s leading Cloud Communications providers to create the first nationwide high-definition enterprise voice network in the cloud - with no PBX to buy and no long distance costs between cloud customers.

More information about what the Cloud Communications Alliance is up to is here.

<Return to section navigation list> 

Cloud Security and Governance

Thomas Bittman reports “The weighted score for ‘Security and Privacy’ was more than the score for the next three concerns combined” in his Polling Data on Public/Private Cloud Computing post of 4/21/2010 to the Gartner blogs:

[Doug Savage cartoon: chickens and falling clouds]

I’ve been looking for an excuse to use this cartoon – I finally found it!

I’m finishing a research note on some polls I took recently of data center executives, managers and decision-makers. Interesting results. Here’s a summary:

(1) The first poll was focused on the top three concerns that data center professionals have with public cloud computing. The weighted score for “Security and Privacy” was more than the score for the next three concerns combined. Sometimes, when it looks like a meteor, it is a meteor (see, I got the cartoon in here)!

(2) The next two polls focused on public cloud computing plans versus private cloud computing plans. Three-fourths said that they were or would be pursuing a private cloud computing strategy by 2012 (only 4% said they weren’t). Three-fourths said that they would invest more in private cloud computing than in public cloud computing through 2012. Hype plays a part here, but we continue to believe that IT organizations will spend more money on private than on public cloud computing through at least 2012.

(3) The final poll focused on challenges with private cloud computing. “Technology” was considered sixth out of seven challenges offered. “Management and Operational Processes” came in first, closely followed by “Funding/Chargeback Model.” Process, people and relationship changes will be bigger challenges with private cloud computing than technology.

Once again, thanks to Doug Savage for allowing me to use one of his cartoons (check out the others on his site).

<Return to section navigation list> 

Cloud Computing Events

Mike Erickson announces the Windows Azure Salt Lake City Users Group’s meeting on 4/21/2010 (today) at the New Horizons Learning Center, 2355 Technology Dr., West Valley UT 84119:

This month we will continue to cover the Windows Azure AppFabric by focusing on the Service Bus. We will discuss the various messaging patterns which the Service Bus supports. We will review the creation and configuration of an Azure project. Then we will build and run services locally which will be exposed and accessible through the internet. We will discuss the security that the Service Bus provides for these services and see how to secure access to the services. As always we will have pizza for everyone and some prizes for the lucky ones! Please register if you plan to attend so that we can plan appropriately and remember to invite your friends.

Please register so that we can order enough dinner for everyone! Register Here

My 27 TechEd North America Sessions with Keyword “Azure” post of 4/20/2010, repeated from Windows Azure and Cloud Computing Posts for 4/19/2010+, lists:

  • 18 Breakout Sessions
  • 5 Interactive Sessions
  • 4 Hands-on Labs

scheduled for TechEd North America 2010. The post’s abstracts are from the current TechEd Session Catalog.

I repeated this item because I added the item to yesterday’s post late in the day.

SRTSolutions announces a series of Windows Azure Bootcamps in May and June, 2010:

What is a Windows Azure Boot Camp?

Windows Azure Boot Camp is a two day deep dive class to get you up to speed on developing for Windows Azure. The class includes a trainer with deep real world experience with Azure, as well as a series of labs so you can practice what you just learned. ABC is more than just a class, it is also an event in a box. If you don't see a class near you, then throw your own. We provide all of the materials and training you need to host your own class. This can be for your company, your customers, your friends, or even your family. Please let us know so we can give you all of the details.

Awesome. How much does it cost?

Thanks to all of our fantabulous sponsors, this two day training event is FREE! We will provide drinks and snacks, but you will be on your own for lunch on both days. This is a training class after all.

Here’s the Schedule with dates and locations in the Midwest and Right Coast.

Cumulux presents An Hour In The Cloud With Windows Azure – Webcast, which lasts 90 minutes, on 5/14/2010 at 9:00 AM PDT:

Why is Azure important?

Have you wondered how cloud computing can bring your ideas to the web faster? Reduce your IT costs? Or be able to respond quickly to changes in your business and customers’ needs? If so, then you need to find out what Azure is all about!

What is this Windows Azure training?

In this Windows Azure training session you will learn about Azure: how to develop and launch your own application, as well as how it impacts YOUR daily life.

What does it cost me?

There is NO cost to participate in this training.

The Agenda:

  • Overview of Cloud Computing
  • Introduction to Windows Azure
  • Tour of the Azure Portal
  • Uploading your first Azure package
  • Real world Scenario
  • Experiencing your first cloud app & behind the scenes
  • Q & A

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Alex Williams reports Fujitsu Making $537 Million Investment in Cloud Computing in this 4/21/2010 post to the ReadWriteCloud blog:

The Nikkei Daily in Japan is reporting that Fujitsu will invest $537 million in cloud computing for 2011.

That seems like a staggering investment to us but perhaps it's not at all surprising considering the metamorphosis in the IT sector.

According to The Nikkei and Reuters, the investments will be for more servers and external memory storage at data centers in the U.S., the U.K., Germany, Australia and Singapore.

Fujitsu is not a name that is often thought of in terms of cloud computing. But it is one of the largest IT management services companies in the world, competing with the likes of CA and Microsoft, two providers with deep investments of their own in cloud computing. …

Alex continues with excerpts from an interview with Fujitsu CTO Dr. Joseph Reger.

James Governor asserts VMware’s SpringSource Redis and Rabbit acquisitions: A Database Play is Emerging in this 4/21/2010 post to the MonkChips blog:

The VMware Q1 financials call had some interesting futures stuff; it’s worth quoting CEO Paul Maritz in full:

“This is Paul. I would be happy if you had to come around and have a cup of coffee with me and we could discuss that for several hours. The very, very short answer to your question is that we are not trying to get into the database business per se. We are trying to be into the business of enabling applications for the cloud, both private and public. And as I said building off of our SpringSource acquisition we are adding to the repertoire of underlying middleware and technologies that we think are going to be needed to generate – to develop a new generation of applications. So, in that sense our hiring of the gentleman in question is a further indication as was the RabbitMQ acquisition of our intent to build a very compelling suite to enable you to build cloud based applications.

“If you want us to get into the whole database and data storage discussion, as I said, swing by and we can have a long and interesting debate about that.”

Developing cloud-based apps will require new data management and storage models. VMware is getting well ahead of the curve by investing in Redis, a well-regarded, blazing-fast key-value store of the kind now being called a NoSQL database. I wrote up why RabbitMQ is interesting the other day.

While Maritz may say VMware isn’t getting into the database business, what he means is that it isn’t getting into the relational database market. The fact is, application development has been dominated by relational models (Oracle on distributed systems, IBM on the mainframe). Cloud apps are changing that. As alternative data stores become natural targets for new application workloads, VMware does indeed plan to become a database player, or NoSQL player, or data store player, or whatever you want to call it.

We have been forcing square pegs into round holes with object/relational mapping for years, but the approach is breaking down. Tools and datastores are becoming heterodox, something RedMonk has heralded for years.

As Matthew Aslett from the 451 tweeted yesterday from the NoSQLEU conference:

“#nosqleu phrase of the day: choose the best solution/tool/storage model for the job. There might be something in ‘Not Only SQL’ after all”

The enterprise is beginning to notice that the web is working differently, and that there are alternatives emerging to the relational sledgehammer. VMware is positioning itself for the change.

Bart De Smet offers links to Some introductory Rx (Reactive Framework) samples in this 4/19/2010 post:

During my last tour I’ve been collecting quite a few fundamental and introductory Rx samples as illustrations for my presentations on the topic. As promised, I’m sharing those through my blog. More Rx content is to follow in the (hopefully near) future, with an exhaustive discussion of various design principles and choices, the underlying theoretical foundation of Rx, and coverage of lots of operators.

In the meantime, download the sample project here. While the project targets Visual Studio 2010 RTM, you can simply take the Program.cs file and build a Visual Studio 2008 project around it, referencing the necessary Rx assemblies (which you can download from DevLabs).

Enjoy!

<Return to section navigation list> 
