Sunday, February 28, 2010

Windows Azure and Cloud Computing Posts for 2/27/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article you want to navigate.

Cloud Computing with the Windows Azure Platform published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock.)

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts and Databases”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the January 4, 2010 commercial release in February 2010. 

Azure Blob, Table and Queue Services

Steve Nagy asserts “The Microsoft CDN is now also utilisable from Windows Azure Blob Storage” in his Azure Locations Around The World post of 2/28/2010. See the Windows Azure Infrastructure section for more details.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Rajib Bahar, Tim Filer and Jason Stratz participate in a 00:08:00 Series Of Discussion[s] On Project Management And Sql Azure Or Cloud Computing Part 2 podcast of 2/28/2010.

I haven’t been able to find Part 1.

David Robinson issued his Final Reminder – Please Upgrade your SQL Azure CTP Account TODAY post on 2/27/2010:

It is that time.  Remember this post  and this one reminding you to upgrade your SQL Azure CTP account? When we are so used to protecting and storing your data, it is hard for us to let go and just delete it.  Please don’t make us do it.  Upgrade your account today.  Here’s how:

To upgrade your Community Technology Preview (CTP) accounts to paid commercial subscriptions:

Please visit our offer page and select the offer of your choice. When you purchase the selected offer, you will need to sign in with the same Windows Live ID as that associated with your CTP accounts. If you wish to purchase a new commercial subscription but NOT upgrade your existing CTP accounts, please use a different Windows Live ID than the one used with your CTP accounts when ordering, or remove all applications and data associated with your CTP accounts prior to sign-up.

On March 1, 2010, SQL Azure CTP accounts that have not been upgraded will be deleted. It is important to export your data if you do not plan to upgrade to a commercial subscription before that date.

If you have questions or need assistance, please contact the Azure Support Desk.

I’ve never been able to remove my old CTP account from the MIX 08 days. Maybe it will go away tomorrow.

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

Dennis van der Stelt solves the AppFabric: Configuration binding extension could not be found exception in this 2/27/2010 post:

I have recently installed the Windows Azure AppFabric because I’m writing an article for the Dutch .NET Magazine. Problem is I try to do everything in Visual Studio 2010 these days, just because it’s so cool to have something that’s buggy. Seriously! Sometimes you get headaches because you just can’t figure out why something’s not working, only to find out it’s because it really isn’t working because of the current beta version you’re working with. But on the other side it’s really fun and you learn a lot.

Like now, when I got the following message:

“Configuration binding extension 'system.serviceModel/bindings/netTcpRelayBinding' could not be found. Verify that this binding extension is properly registered in system.serviceModel/extensions/bindingExtensions and that it is spelled correctly.”

It’s a System.ConfigurationErrorsException, which can mean that what you configured is actually right and the .NET runtime just can’t figure out what is wrong. This time it’s because some extensions to WCF weren’t added to the machine.config of .NET 4.0 RC. They were, however, added to the machine.config of .NET 2.0, so I took them from there. And for future reference, for my dear readers and all others who come in via Google, I’m posting the fix here.

Sidenote: I’m using versions 2.0.50727 and 4.0.30128 of the .NET Framework, but the versions might differ on your machine.

Go to C:\Windows\Microsoft.NET\Framework\v2.0.50727\CONFIG\ and open the machine.config there. In the node configuration\system.serviceModel\extensions\ you’ll find two nodes: the first is bindingElementExtensions and the second is bindingExtensions. You’ll see some bindings whose names contain “relay”. Copy these into Notepad or a similar editor.

Now open up C:\Windows\Microsoft.NET\Framework\v4.0.30128\Config\ and edit the machine.config there. Copy the lines from the 2.0 config that are missing from the 4.0 config, and your AppFabric service should be able to start.
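For reference, the relay entries being copied have roughly the following shape. This is a sketch based on the machine.config layout the post describes, not Dennis’s actual config: the binding names, type names, Version and PublicKeyToken values must be copied from your own 2.0 machine.config, not from here.

```xml
<system.serviceModel>
  <extensions>
    <bindingElementExtensions>
      <!-- hypothetical example entry; copy the real ones from your 2.0 machine.config -->
      <add name="tcpRelayTransport"
           type="Microsoft.ServiceBus.Configuration.TcpRelayTransportElement, Microsoft.ServiceBus, Version=X.X.X.X, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
    </bindingElementExtensions>
    <bindingExtensions>
      <add name="netTcpRelayBinding"
           type="Microsoft.ServiceBus.Configuration.NetTcpRelayBindingCollectionElement, Microsoft.ServiceBus, Version=X.X.X.X, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
    </bindingExtensions>
  </extensions>
</system.serviceModel>
```

The key point is that both lists need the relay entries: the transport elements under bindingElementExtensions and the binding collections under bindingExtensions.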

<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Gaurav Mantri posted on 2/28/2010 some Recent Ideas for Cloud Storage Studio on the Cerebrata community support site:

There also are several suggestions from Cloud Storage Studio users.

Adam Hicks describes Setting up a simple web app talking to a database in the Azure Cloud in this 2/27/2010 post:

I went to an Azure Open Space Coding Day in Birmingham organised by Dave Evans with help from Eric Nelson, Dave Gristwood and a few others from Microsoft. As a complete Azure newbie, I found the day extremely useful. There were 30 developers working together to learn more about Windows Azure, which made it a lot easier to get past any stumbling blocks that I would undoubtedly have hit if I were on my own! I have written up here a few things I learnt today so that I can look back and remember…

Adam continues with a detailed illustrated tutorial.

Jim O’Neil’s I Don’t Think We’re In Dallas Anymore post of 2/26/2010 describes his recent trio of presentations on Microsoft codename “Dallas”:

The topic was “Dallas”, but the weather was all New York winter…

[Photos: “On the way to Rochester,” “Squint really hard - I-90 toward Buffalo,” “Driving essentials,” “I-90 all the way”]

This week, I took a trip down I-90, specifically the New York Thruway, to visit three of our .NET User Groups:

Coordinating for one user group is a good deal of work by itself, so special thanks to Bob Nims, Andy Beaulieu, and Griff Townsend (the leaders of the above groups) for coordinating and adjusting their schedules to accommodate me.   And, of course, thanks to everyone who came out to hear about “Dallas.”  It’s not often I present to a .NET group where no one has heard of the topic, so it was kind of fun being able to introduce “Dallas” to everyone there.

What’s “Dallas,” you ask? Well, in a nutshell, it’s a Data-as-a-Service marketplace, hosted on Azure, that provides a low-friction, RESTful interface to huge amounts of data – such as from the AP, NASA, United Nations, InfoUSA and more. You can grab my slides from the presentation here and check out my Dallas blog postings for more information.

Brandon Werner shows you How To Host Your Site and Content On Azure Quickly and Easily in this 2/25/2010 post:

This entry seeks to provide you with a quick and easy way to get up to speed on Azure by deploying your own personal website as an MVC application into the cloud. Consider it a “Hello World.” I will do the following:

  • Demonstrate how to write and deploy a simple Azure hosted website
  • Demonstrate how to create your own image and content server using Azure Storage and expose your content publicly through URLs
  • Demonstrate how to use new tools like Azure Storage Explorer to access your cloud storage

Now that Azure has been released, a lot of people are busy coding a lot of awesome applications. I’m proud of you. I’m not one of them. I just have a personal website that I’ve hosted through a collection of GoDaddy, Amazon S3 (for images and PowerPoint slides, etc.) and some custom JavaScript.

So over the Thanksgiving week I decided to move all my stuff over to Azure for fun. This includes hosting my website, moving my RoR code over to MVC code (don’t freak, MVC is pretty much set up like RoR and PHP as far as directories and deployment, so it’s easy), and moving all my images and other media over to Azure Storage so that I can just reference images and CSS using URLs without needing to redeploy my website (much like I did with Amazon’s S3). …

Brandon continues with a detailed tutorial that’s longer than any of my posts.

Bob Familiar promoted ARCast.TV - Scalable Tax Solutions With CCH and Windows Azure on 2/25/2010:

While it's income tax season for most of us here in the States, for CCH, a Wolters Kluwer business, serving up sales tax information is a full-time job.

In this episode of ARCast, Denny Boynton sits down with Jones Pavan and Gurleen Randhawa at the 2009 Professional Developers Conference to discuss their use of Windows Azure to build highly scalable solutions for their customers.

The video is at ARCast.TV - Scalable Tax Solutions With CCH and Windows Azure.

<Return to section navigation list> 

Windows Azure Infrastructure

Steve Nagy’s Azure Locations Around The World post of 2/28/2010 summarizes recent expansion of data center support for the Windows Azure Platform:

During the early CTP of Azure you could only select 2 locations for your Windows Azure compute and storage accounts, and 1 location for SQL Azure (SQL Data Services) and AppFabric (.Net Services). Now we have a lot more options for all 3 main technologies:

  • North Central US
  • South Central US
  • North Europe
  • Southeast Asia

On top of this we now have a single location for Dallas accounts:

  • South Central US

The Microsoft CDN is now also utilisable from Windows Azure Blob Storage. Microsoft has been building a CDN for some time and uses it for a variety of purposes. There are over 18 edges in the CDN, Australia being one of those nodes.

So next time you’re wondering about the locations available, hopefully you won’t need to go into the actual portal project creation process just to find out.
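As a concrete illustration of what CDN support for Blob Storage means (the account, container, and endpoint names below are made up; the actual CDN endpoint is assigned when you enable the feature for a storage account), a public blob becomes reachable at a CDN edge URL alongside its normal storage URL:

```
http://myaccount.blob.core.windows.net/images/logo.png   <- storage origin URL
http://azXXXX.vo.msecnd.net/images/logo.png              <- assigned CDN edge URL
```

Requests to the second URL are served from the nearest CDN edge, falling back to the blob storage origin on a cache miss.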

Abel Avram quotes David Linthicum in a Windows Azure: Pending Success or Eventual Niche? post of 2/28/2010 to the InfoQ blog:

Microsoft has had its successes and failures over time, and it has managed to come first with some products even if it came later in the game. Is Microsoft going to be as successful with Windows Azure as it has been with the Windows operating system? Or will it remain a niche player like Windows Mobile?

Like any large company, Microsoft has had several great products, some reasonable ones and some total failures. Among those which are considered failures (or at the very least limited successes considering the amount of money and number of developers invested in them) we can start from Microsoft’s early days with Windows 1.0 (1985), Windows 2.0 and Windows 386; continuing with WebTV (1995), Windows Millennium (2000) and Internet Explorer 6; and more recent examples such as Zune, Windows Mobile or Vista.

Abel continues with a laundry list of Microsoft product failures and concludes:

Microsoft has often been late to the game with products, however in many cases it has managed to catch up and take the lead as with Windows or Office. In a continuation of this trend, Microsoft was definitely not the first out of the gate with a cloud offering, with Windows Azure entering the game after Amazon EC2,, Rackspace or Google had already established their presence in this market. However, in an article entitled “Can Microsoft Catch Up by Giving Away Azure?”, David Linthicum wondered if Microsoft will manage to catch up with AWS, Google or other cloud providers. Linthicum mentioned that Microsoft has managed to win even if they were not the first to enter a specific market:

“Once again Microsoft is late to the party. However, they continue to hold a special space in the hearts of many enterprises, a brand loyalty that most cloud computing providers just don't have. The concept here is to get as many users on the platform as possible, in the shortest amount of time. However, is that a good strategy for Microsoft?

“If you look at the history of Microsoft they seem to get into games late, and still win. Their entrance into the emerging Web in the '90s was almost kicking and screaming after the Microsoft Network was released. However, once they set their sights on the Web, they owned the browser market after only a year.”

But the cloud is different as Linthicum remarked:

“The cloud is a bit different. Cloud computing providers have already established their presence in the market. It's going to be difficult to attract users who are already loyal to one or two of the larger players, that is... unless you're willing to give it away for free.

“The reality of cloud computing is that the subscription cost of the platform has very little bearing on the ROI of the platform. Azure, like the other cloud providers, will have to prove to be productive in order to be truly cost effective. That also means being open, something that Microsoft has had issues with in the past. It does not look like the leopard has changed its stripes with Azure.

“What do you think? Will Windows Azure be as successful as the .NET Framework and Visual Studio, or is it destined to be a minor player like Windows Mobile or the Zune?”

Joannes Vermorel proposes MapReduce as burstable low-cost CPU for Windows Azure in this 2/27/2010 post:

About two months ago, when Mike Wickstrand set up a UserVoice instance for Windows Azure, I immediately posted my own suggestion concerning MapReduce. MapReduce is a distributed computing concept initially published by Google in late 2004.

Against all odds, my suggestion, driven by the needs of Lokad, made it into the Top 10 most requested features for Windows Azure (well, 9th rank, and with about 20 times fewer votes than the No. 1 request for scaled-down hosting).

Lately, I had the opportunity to discuss this more with folks at Microsoft gathering market feedback on this item. In the software business, there is a frequent tendency for users to ask for features they don't want in the end, the difficulty being that the proposed features may or may not correctly address the initial problems.

Preparing for the interview, I realized that, to some extent, I had fallen into the same trap when asking for MapReduce. Actually, we have already reimplemented our own MapReduce equivalent, which is not that hard thanks to the Queue Storage.

I care very little about framework specifics, may it be MapReduce, Hadoop, DryadLinq or something not-invented-yet. Lokad has no cloud legacy calling for a specific implementation.

What I do care about is much simpler. In order to deliver truckloads of forecasts, Lokad needs:

  1. large-scale CPU
  2. burstable CPU
  3. low-cost CPU

Windows Azure is already doing a great job addressing Point 1. Thanks to the massive Microsoft investments in Azure datacenters, thousands of VMs can already be instantiated if needed.

When asking for MapReduce, I was instead expressing my concern for Point 2 and Point 3. Indeed:

  • Amazon MapReduce offers 5x cheaper CPU compared to classical VM-based CPU.
  • VM-based CPU is not very burstable: it takes minutes to spawn a new VM, not seconds.

Joannes continues with a list of his needs for lighter-weight, lower-cost Windows Azure instances.
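Joannes doesn’t publish Lokad’s implementation, but the queue-driven MapReduce equivalent he alludes to can be sketched in-process. The following minimal Python example is my own illustration, using the standard-library queue module to stand in for Azure Queue Storage: workers drain a task queue, emit partial results per message, and a final reduce merges them.

```python
import queue
import threading
from collections import Counter

def run_word_count(documents, n_workers=4):
    """Queue-driven word count: each queued message is one 'map' task;
    a final 'reduce' merges the per-task partial counts."""
    tasks = queue.Queue()      # stands in for an Azure queue of work items
    partials = queue.Queue()   # collects per-task partial results

    for doc in documents:
        tasks.put(doc)

    def worker():
        # Each worker drains the task queue, like a worker-role instance
        # polling Queue Storage, and emits a partial Counter per message.
        while True:
            try:
                doc = tasks.get_nowait()
            except queue.Empty:
                return
            partials.put(Counter(doc.split()))

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Reduce step: merge all partial counts into one result.
    total = Counter()
    while not partials.empty():
        total += partials.get()
    return dict(total)
```

The same shape maps onto cloud queues: tasks.put becomes enqueuing a message, get_nowait becomes polling (with deletion of the message on success), and the reduce runs once the queue is empty.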

Dan Kasun challenges Matt Asay’s post about Microsoft lock-in in his Microsoft’s REAL choice for Governments… post of 2/27/2010:

I'm fresh from the US Public Sector CIO Summit - it was an amazing week and I had some truly enlightening conversations and experiences. Everyone I spoke with had very positive perceptions and feedback on Microsoft’s strategy - but I happened to come across Matt Asay's blog comments, "Software industry's false choice for governments," where he articulated a different perspective.

I'm not sure what conference Matt attended, but I think he may have missed a significant portion of the CIO Summit’s theme. Just about every discussion at the Summit focused on openness and interoperability, and the importance of providing choice to enable government innovation. Granted, my viewpoint may be a bit biased given my employment with Microsoft – but I had several discussions with customers and partners who had a similar experience and perspective.

I wanted to address some of the points that Matt outlined in his post (note: lines copied directly from Matt’s post here are indented and italicized. I’ve included the entire post, in segments, to avoid the perception of taking things out of context).

Dan continues with a refutation of each of Matt’s points.

Murray Gordon’s Windows Azure Resources for [Business] Decision Makers post of 2/26/2010 provides a useful link potpourri:

Jared Bienz, an ISV Architect Evangelist on our team covering the South Central Region, came up with this fairly comprehensive list of great resources on Windows Azure and Live Framework.

This content is great for business decision makers.

Don’t miss the link to the SQL Azure Migration Wizard at the bottom. If you haven’t seen it, this is a powerful tool.

Windows Azure Links

Windows Azure Homepage: The main public landing page for Windows Azure.

Windows Azure and ISVs – A Guide for Decision Makers: A great whitepaper written by David Chappell for ISV decision makers.

Azure Case Studies: A great selection of whitepapers from big-name companies like 3M, Siemens, Kelley Blue Book and more.

Azure Pricing: Main public pricing page.

Official Azure ROI and TCO Calculator: The official TCO and ROI wizard.

Getting Started with Azure: The primary public start page for those looking to move to or start development on Windows Azure.

Azure Application Compatibility Support: Compatibility resources offered through the Microsoft FrontRunner program.

Other Azure Resources: A great page summarizing many more Azure resources beyond what’s available in this list.

SQL Azure Links

SQL Azure Migration Wizard: A great tool for migrating data between local SQL Server and SQL Azure, even between one SQL Azure instance and another. It does its best to transfer both structure and data, and it lets you know why if it can’t.

My Using the SQL Azure Migration Wizard v3.1.3/3.1.4 with the AdventureWorksLT2008R2 Sample Database is a fully illustrated, detailed tutorial for a recent version of SQLAzureMW.

<Return to section navigation list> 

Cloud Security and Governance

Jonathan Penn puts cloud computing first on the list of his What I expect from the RSA Conference post of 2/26/2010:

I’ll be pretty busy at the RSA Conference this year, with participation in the always-well-attended Industry Analyst Roundtable discussion with my colleagues at Gartner and IDC (March 2, 1:00 PM, Orange Room 302), and moderation of a very interesting session on the changing nature of the vendor-CISO relationship (March 4, 9:10 AM, Green Room 123) with the CEO of Sophos and the CISO of Raymond James Financial.

And about 30 vendor briefings, with some time to cruise the exhibit floor. I’ll probably have to view many of the keynotes online, unfortunately. But I promise to blog each day about what I’m seeing (and not seeing) at the event.

Here’s what I expect:

  • Cloudiness. Lots of solutions focused on securing IT as it adopts cloud (IaaS, PaaS, and SaaS) computing. This is a marked difference from last year, which showed many vendors offering security products that simply exist “in the cloud” (i.e., cloud/SaaS as a delivery model)
  • Commotion …
  • Corroboration …
  • Consistency …

Jon notes in his blog’s sidebar:

I'll be participating in the "Industry Analyst Roundtable" session. Moderated by Asheem Chandna, Partner at Greylock Partners, the panel will include myself from Forrester, Chris Christiansen of IDC, and John Pescatore of Gartner. It will be held on Tuesday March 2 in Orange Room 302 at 1:00 PM PT.

<Return to section navigation list> 

Cloud Computing Events

MSDN Events presents The MSDN Mid Atlantic Roadshow on 3/3/2010 from 10:30 AM to 5:00 PM at the Sheraton Richmond West, 6624 W Broad St, Richmond, Virginia 23230, USA:

MSDN Events presents: Take Your Applications Sky High with Cloud Computing and the Windows Azure Platform

Join your local MSDN Events team as we take a deep dive into cloud computing and the Windows Azure Platform. We’ll start with a developer-focused overview of this new platform and the cloud computing services that can be used either together or independently to build highly scalable applications. As the day unfolds, we’ll explore data storage, SQL Azure, and the basics of deployment with Windows Azure. Register today for these free, live sessions in your local area. …

SESSION 1: Overview of Cloud Computing and Windows Azure

The Windows Azure platform is a set of high-performance cloud computing services that can be used together or independently and enable developers to leverage existing skills and familiar tools to develop cloud applications. In this session, we’ll provide a developer-focused overview of this new online service computing platform. We’ll explore the components, key features and real day-to-day benefits of Windows Azure.

Highlights include:

  • What is cloud computing?
  • Running web and web service applications in the cloud
  • Using the Windows Azure and local developer cloud fabric
  • Getting started – tools, SDKs and accounts
  • Writing applications for Windows Azure

SESSION 2: Survey of Windows Azure Platform Storage Options

Durable data storage is a key component of any cloud computing offering. The Windows Azure Platform offers many options, which can be used alone or in combination. Windows Azure itself offers ready-to-use and lightweight storage in the form of tables, blobs, and queues. Another choice for storage is SQL Azure, a true relational database in the cloud. In this session, we’ll explore the highlights of these implementations and how to both create and use storage in each form. We’ll give you guidance on choosing the right forms of storage for your application scenarios.

Highlights include:

  • Understanding table & blob storage
  • Programming against table & blob storage
  • Working with queue storage
  • Managing credentials and connection strings
  • Scaling and configuration
  • Understanding SQL Azure databases versus local SQL Server databases
  • SQL Azure firewall, logins and passwords
  • Database creation, deployments and migrations
  • Database management using SQL Management Studio
  • Programming against SQL Azure databases
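On the credentials-and-connection-strings topic above, an ADO.NET connection string for SQL Azure of this vintage looked roughly like the following sketch (server name, database, user, and password are placeholders; the user@server login form and mandatory encryption are the notable differences from on-premises SQL Server):

```
Server=tcp:myserver.database.windows.net;Database=mydb;
User ID=myuser@myserver;Password=<password>;Trusted_Connection=False;Encrypt=True;
```

Remember that the SQL Azure firewall must also be configured to allow your client’s IP range before any connection string will work.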

SESSION 3: Going Live with your Azure Solution

Windows Azure features a powerful yet simple deployment model. By focusing on your application and abstracting away the infrastructure details, you can deploy almost any app with minimal fuss. In this session, we’ll walk you through the basics of Windows Azure deployment, including site monitoring, diagnostics and performance issues.

Highlights include:

  • Start-to-finish Visual Studio demonstration of a realistic XML data-driven business web site from the desktop to the cloud
  • Windows Azure deployments
  • Start-to-finish Visual Studio demonstration of a realistic SQL Server data-driven business web site from the desktop to the cloud
  • Configuration of your application in the cloud
  • Guidance and suggestions to ensure your success

Registration Options: Event ID 1032439969. Register online, or register by phone: 1-877-MSEVENT (673-8368).

See Jonathan Penn puts cloud computing first on the list of his What I expect from the RSA Conference post in the Cloud Security and Governance section.

Mike Taulty describes Free Client Day at Microsoft TechDays UK – London, 15th April in this 2/26/2010 post:

I wanted to do a call out for the Client development day that we’re running as part of the UK TechDays event. This is a FREE event that’s running all day in London on the 15th April, and it’s targeted at developers who are interested in hearing about client development with .NET Framework V4.0 and for Windows 7.

The agenda on the website doesn’t quite do it justice in that we’ve managed to get some really fantastic speakers for this day:

… Windows Azure has been through a lot of changes since its announcement at PDC08 (including going live :-) ) and local Azure guy David Gristwood will give you an update on where Azure is, where it fits and what Cloud services you can make use of in your client apps today.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Maureen O’Gara claims “Cirious is HP’s vision for creating an enterprise cloud software platform” in her HP Is Cirious About Clouds post of 2/27/2010:

HP Labs has opened an advanced research facility in Singapore, where it means to re-examine data center and application design principles to explore how future cloud computing needs will be met.

The facility will support a number of cloud initiatives already underway at other HP Labs sites, collaborating closely with the Service Automation and Integration Lab (SAIL) in Silicon Valley and the Automated Infrastructure Lab (AIL) in England.

Together, the three will work on Cirious, HP's vision for creating an enterprise cloud software platform.

Through applied and exploratory research, Singapore is supposed to work with customers, partners, HP business divisions and academia to generate advances that drive Cirious research.

As part of the Open Cirrus project, HP has already partnered with Intel, Yahoo and the Infocomm Development Authority (IDA) of Singapore to create a global, multi-data center, open source test bed for cloud computing research and education. IDA houses one of nine test bed locations worldwide.

It is also in bed with SingTel building Alatum, Singapore's largest commercial grid services platform, under IDA's Grid Service Provisioning project. Alatum offers a variety of computing power, storage and software applications on a pay-per-use, on-demand and online basis. It currently has 15 ISV partners and 70 customers.

Not to mention its US$250 million partnership with Microsoft.

Friday, February 26, 2010

Windows Azure and Cloud Computing Posts for 2/26/2010+


Azure Blob, Table and Queue Services

No significant articles today.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Dave Robinson reminds SQL Azure users on 2/26/2010 about updates to the online SQL Azure Documentation:

Just a quick reminder that with every Service Update (SU), our awesome user education team makes updates to the documentation up on MSDN. Come and check out the latest version of the SQL Azure Documentation on MSDN, and don’t forget to take a look at the Developer’s Guide section for examples on PHP, ASP.NET, ADO.NET, ADO.NET Data Services, and more…

Nick Hill from the MCS UK Solution Development Team explains how to run Ruby on Rails on Windows Azure with SQL Azure in this detailed, illustrated 2/26/2010 tutorial:

I was recently talking to a customer about the possibility of moving a web site from Linux to Windows Azure. The hosting costs of the application are not excessive, and the customer is happy with the service received. Nevertheless they were very interested in exploring the hosting costs and potential future benefits of the Windows Azure platform.

The web site was developed using Ruby on Rails®, an open-source web framework with a reputation for "programmer happiness and sustainable productivity". The site also makes use of MySQL and memcached.

Most of the necessary jigsaw pieces are already in place:

  • Tushar Shanbhag talked about running MySQL on Windows Azure at the PDC last October.
  • Simon Davies blogged about getting up and running with Ruby on Rails on Windows Azure, and released a Sample Cloud Project.
  • Dominic Green recently discussed memcached on this blog.

We calculated that we could make savings in the hosting costs simply by moving to Azure. However, it soon became apparent that if we could replace MySQL with SQL Azure, then this would provide a number of additional benefits:

  • Further reduced hosting costs. MySQL would require a worker role, hosted on its own dedicated node. This would cost around $1,200/year, whereas hosting the 400MB database on SQL Azure would cost $9.99 per month.
  • Reduced complexity and management. The MySQL accelerator shows how to deploy in a worker role, and includes sample code for managing and backing up the database. However, this is all custom code, which would increase the total cost of ownership of the solution. Using SQL Azure would greatly simplify the architecture.
  • Business Intelligence. The application currently has a custom Business Intelligence module, developed using Flash. Moving to SQL Azure would allow the customer to take advantage of roadmap features to provide clients with a much more sophisticated Business Intelligence module, while again reducing total cost of ownership.

The only missing piece in this jigsaw is the connection of a Ruby on Rails application to SQL Azure.

Nick continues with the “missing piece” to create a project with a page similar to this live example:
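Nick’s own solution isn’t reproduced here, but a Rails database.yml for a SQL Server-family database over ODBC conventionally looks something like the following sketch. The adapter gem, DSN name, and credentials are my assumptions, not Nick’s; the SQL Azure-specific wrinkle is the user@server login form.

```yaml
# config/database.yml -- hypothetical sketch; assumes the activerecord-sqlserver-adapter gem
production:
  adapter: sqlserver
  mode: odbc
  dsn: sql_azure_dsn          # a system DSN pointing at myserver.database.windows.net
  username: myuser@myserver   # SQL Azure expects the user@server login form
  password: changeme
  database: mydb
```

The DSN itself would be configured on the role instance to point at the SQL Azure server with encryption enabled.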

    <Return to section navigation list> 

    AppFabric: Access Control, Service Bus and Workflow

    Mike Kirkwood’s Lady Gaga as the Killer App: Moving Identity into the Cloud post of 2/26/2010 to the Read/Write Cloud blog begins:

    Protocols, protocols, everywhere, and not a drop to drink. OAuth, OpenID, UX, Shibboleth, SAML, XRI, FOAF, Facebook Connect, that is a small sampling of some of the technologies that have been invented to move Internet Identity forward forward for the web.

    Today, at the Open ID User Experience Summit, a jaw-dropping statistic was given that 89% of users coming to chose a third-party logon rather than create a new account. "Signup with Facebook, Twitter, or MySpace" is the default option on - and it works.

    Mike continues with details about “why is this site getting such a high level of adoption of third-party logons, which hasn't been seen at this level anywhere else.”

    Nick Eaton’s Microsoft, UW partner for cloud-services integration article of 2/25/2010 for the SeattlePI’s The Microsoft Blog describes how:

    The UW's IT staff and Microsoft are working together on single-sign-on technology through Live@edu, Microsoft's student-oriented hosted e-mail, communications and collaboration service similar to the Business Productivity Online Suite (BPOS). (By the way, Ron Markezich, corporate vice president for Online Services, on Wednesday said Live@edu will soon be branded something along the lines of BPOS@edu.)

    The technology, currently in testing, will allow UW and non-UW researchers to collaborate and also allow students to connect to third-party online services through their UW NetIDs. …

    From the outside looking in, the technology isn't very sexy. In fact, in its current implementation the Identity Federation service looks like – well, is – a log-in page. [Emphasis added.]

    But there's a lot of complexity under the hood – lots of acronyms. It works through claims-based Federated Identity Management (FIdM) in Windows Identity Foundation (WIF), and through Active Directory (AD) via Active Directory Federation Services (ADFSv2 – formerly codenamed "Geneva"). Outside of a Microsoft-based undercarriage, SAML/Shibboleth federation supports OpenLDAP. … If you want more information on the technical aspects, go here or here.

    Dave Fisher, senior program manager for Microsoft's Live@edu team, said the technology will be broadly available by late 2010. "We have the technologies, they're real, we're putting this together," he told the workshop audience Thursday, "and we're looking forward to offering these to all of you, to all of our customers."

    <Return to section navigation list>

    Live Windows Azure Apps, APIs, Tools and Test Harnesses

    Brian Hitney announced the availability of the Azure Miniseries #4: Monitoring Applications screencast on 2/26/2010:

    In this screencast, we'll take a look at monitoring Azure applications by capturing event logs and performance counters. We'll also look at using PowerShell to deploy and configure applications using the management API. Finally, we'll take a sneak peek at Azure Diagnostics Manager, a tool from Cerebrata that allows you to explore event logs and look at performance counters visually.

    Brian continues with some PowerShell snippets for creating a self-signed certificate. Here’s the link to the screencast on Channel9.
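    The monitoring pattern the screencast walks through (sample performance counters on an interval, transfer them to storage, query and summarize them later) can be sketched generically. This is not the Windows Azure Diagnostics API; the counter name and the toy sink below are invented for illustration:

    ```python
    import statistics
    from collections import defaultdict

    class CounterStore:
        """Toy stand-in for a diagnostics sink: collects named samples
        and summarizes them, the way transferred performance counters
        are later queried from storage."""

        def __init__(self):
            self.samples = defaultdict(list)

        def record(self, counter, value):
            self.samples[counter].append(value)

        def summary(self, counter):
            values = self.samples[counter]
            return {
                "count": len(values),
                "mean": statistics.mean(values),
                "max": max(values),
            }

    store = CounterStore()
    # Simulated CPU samples gathered on a polling interval.
    for v in (12.0, 55.0, 23.0, 8.0):
        store.record(r"\Processor(_Total)\% Processor Time", v)

    cpu = store.summary(r"\Processor(_Total)\% Processor Time")
    print(cpu)
    ```

    A real diagnostics pipeline adds the transfer schedule and persistent storage; the aggregation step at the end is the part a tool like Azure Diagnostics Manager visualizes.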

    Here are some links from the screencast:

    Ryan Dunn and David Aiken posted the second Cloud Cover - Episode 2 Webcast to Channel9 on 2/26/2010:

    Steve was in Japan this week filming a commercial, so David Aiken replaced him. Join Ryan and David this week as they cover the Microsoft cloud.
    Follow and interact with the show at @cloudcovershow
    In this episode:

    • Walk through the RoleEntryPoint and the hooks you can use to build your web and worker services.
    • Learn about the billing model in Windows Azure.
    • Find out how to troubleshoot the Initializing-Busy-Stopping loop.
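    The RoleEntryPoint hooks covered in the episode reduce to a three-phase lifecycle: start, run, stop. A conceptual Python analogue of that contract (not the actual .NET API, whose methods are OnStart, Run, and OnStop, and whose host is the Azure fabric):

    ```python
    class RoleEntryPoint:
        """Conceptual analogue of the Windows Azure RoleEntryPoint
        contract: the host calls on_start once, then run, then on_stop."""

        def on_start(self):
            # Returning False signals a failed start; the fabric
            # would recycle the role instance.
            return True

        def run(self):
            pass  # long-running work loop in a real worker role

        def on_stop(self):
            pass  # graceful-shutdown cleanup

    class WorkerRole(RoleEntryPoint):
        def __init__(self):
            self.events = []

        def on_start(self):
            self.events.append("started")
            return True

        def run(self):
            self.events.append("running")

        def on_stop(self):
            self.events.append("stopped")

    def host(role):
        """Drive a role the way the fabric controller does."""
        if not role.on_start():
            return role.events
        role.run()
        role.on_stop()
        return role.events

    lifecycle = host(WorkerRole())
    print(lifecycle)
    ```

    The Initializing-Busy-Stopping loop mentioned above is what you see when the real equivalent of `on_start` keeps failing and the fabric keeps recycling the instance.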

    Show Links:

    1. Miami311
    2. SharpCloud
    3. Troubleshooting Initializing-Busy-Stopping

    Luke Timmerman’s Microsoft Builds Out Health IT Portfolio, Waits (and Waits) for Market to Materialize post of 2/26/2010 for XConomy/Seattle claims more than 70 health monitoring devices now connect to HealthVault:

    Patience has got to be the watchword for the 800 or so people who work at Microsoft’s Health Solutions Group. There’s certainly been a lot of political rhetoric over the past year about dragging the inefficient world of pen and paper medical records into the 21st century—but this is still one big market opportunity waiting to be tapped.

    More than four years have gone by since former CEO Peter Neupert re-joined Microsoft to spearhead its worldwide health strategy. This division isn’t going to pay the bills like Windows 7 does anytime soon, but Microsoft has shown it is willing to keep building a wide and deep portfolio of products, and to be patient for when its day will come. That was the sense I got during a wide-ranging conversation I had earlier this week with Nate McLemore, Microsoft’s general manager of business development and policy in the Health Solutions Group.

    “We are taking this very seriously and investing a lot,” McLemore says. “It’s a top-of-mind issue for governments, for businesses, and for consumers. They are all our customers.” …

    So what kind of traction is Microsoft seeing here? The company isn’t saying how many consumers are using HealthVault. It’s measuring progress in other ways, like how there were 46 healthcare organizations who adopted HealthVault when the platform was introduced in October 2007, and now that number has climbed to 150. When the program launched, there were nine devices that could upload data to be compatible with HealthVault—think blood sugar monitors for diabetics, for example. Now there are 70.

    But really, the various constituents—doctors, patients, hospitals—who all need to adopt these technologies are taking their sweet time. All that e-health money that was authorized for spending from the American Recovery and Reinvestment Act—something between $19 billion and $47 billion, depending on budget assumptions—is still waiting to be put to work, McLemore says. …

    Hospitals are another key piece of the puzzle. About 115 U.S. hospitals have bought a license to the Amalga Unified Intelligence System, including academic leaders like the University of Washington and Mayo Clinic, as well as community hospitals. This is the program that helps hospital workers run queries that enable all the various proprietary software programs in a hospital to talk to each other. Microsoft’s latest move to strengthen this product came via an acquisition earlier this month of Andover, MA-based Sentillion, for an undisclosed sum.

    Sentillion was considered useful because it enables a physician to check a lab report, a pharmacy record, or anything else while basically toggling between programs on a desktop, like anybody else would on a computer running Windows 7. It’s meant to eliminate the extra time-consuming hassles of logging in and out of separate programs, which might discourage a physician from double-checking something when they are busy—and might later prove to be important, McLemore says.

    Luke’s article is an excellent summary of HealthVault’s current status and its relationship to Microsoft’s other health-oriented properties, as well as President Obama’s health coverage expansion plans.

    Bob Familiar posted a link to ARCast.TV - Scalable Tax Solutions With CCH and Windows Azure on 2/25/2010:

    While it's income tax season for most of us here in the States, for CCH, a Wolters Kluwer business, serving up sales tax information is a full-time job.
    In this episode of ARCast, Denny Boynton sits down with Jones Pavan and Gurleen Randhawa at the 2009 Professional Developers Conference to discuss their use of Windows Azure to build highly scalable solutions for their customers: ARCast.TV - Scalable Tax Solutions With CCH and Windows Azure

    The Windows Azure Team posted a Real World Windows Azure: Interview with Jim Graham, Technical Manager at 3M on 3/25/2010:

    As part of the Real World Windows Azure series, we talked to Jim Graham, Technical Manager at 3M, about using the Windows Azure platform for the company's innovative Visual Attention Service. Here's what he had to say:

    MSDN: Tell us about 3M – what kind of products do you develop?

    Graham: We are a recognized world leader in research and development. We develop a wide range of consumer and industrial products but are most well-known for brands such as Post-it, Scotch, Thinsulate, and Scotch-Brite.

    MSDN: What was the biggest challenge 3M faced prior to implementing Windows Azure?

    Graham: We had a prototype Web-based application hosted in our data centers – the 3M Visual Attention Service (VAS) – which makes it possible for designers to test the effectiveness of their content using visual attention models. To make it a viable offering, the VAS application had to be available to customers in real time; be capable of processing images, returning near-immediate results, and scaling rapidly; and it had to carry a low up-front investment risk for us, especially in this economic climate.

    MSDN: Can you describe how 3M used Windows Azure to make the 3M VAS application a viable product?

    Graham: We built the user interface from the ground up on Windows Azure. We used the Windows Azure development fabric, which made it very easy to run and test the VAS application before deploying it. VAS incorporates a number of unmanaged, high-performance image-processing software libraries, and by using the development fabric, we were able to perform quick iterations of code. We're using the Windows Azure platform AppFabric Access Control Service to authenticate users, Microsoft SQL Azure to manage images that users upload, and Queues in Windows Azure to provide near real-time analysis. We launched the product in November 2009 and our growing customer base has exceeded expectations.
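    The queue-centric design Graham describes (a web role enqueues an analysis job per uploaded image, worker roles drain the queue) is the standard Windows Azure pattern for near real-time processing with elastic scale. A minimal in-memory sketch, with Python's `queue` module standing in for Windows Azure Queues and the image IDs invented for the example:

    ```python
    from queue import Queue

    def enqueue_jobs(q, image_ids):
        # Web role side: accept uploads and queue one analysis job per image.
        for image_id in image_ids:
            q.put(image_id)

    def drain(q):
        # Worker role side: poll the queue and "analyze" each image.
        results = []
        while not q.empty():
            image_id = q.get()
            results.append(f"analyzed:{image_id}")
            q.task_done()
        return results

    jobs = Queue()
    enqueue_jobs(jobs, ["img-001", "img-002", "img-003"])
    results = drain(jobs)
    print(results)
    ```

    The scale knob in the real system is the number of worker-role instances draining the queue; the producer side never has to know how many there are.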

    The interview continues with more of this series’ stock questions. You can:

    Nick Hill from the MCS UK Solution Development Team explains how to run Ruby on Rails on Windows Azure with SQL Azure in this detailed, illustrated 2/26/2010 tutorial that’s described in the SQL Azure Database (SADB) section above.

    <Return to section navigation list> 

    Windows Azure Infrastructure

    Lori MacVittie prefaces her Pay No Attention to the Infrastructure Behind the Cloudy Curtain advice of 2/26/2010 with “What is needed to customize the cloud is a pair of data center ruby slippers called Infrastructure 2.0:”

    Frank Gens of IDC discussed the “New IDC IT Cloud Services Survey: Top Benefits and Challenges” in his blog and what is not surprising is that security continues to top the challenges associated with cloud services. What may be surprising to some is the increasing focus on customization. It shouldn’t be. As customers continue to push at the boundaries of the cloud computing model they will inevitably find it unable to meet some need they have, such as customization.

    See, when IT professionals said they didn’t want to worry about infrastructure that didn’t necessarily mean they didn’t care about the infrastructure. What they meant was they didn’t want to bear the operational and capital expenses associated with infrastructure if they didn’t have to. That’s a very different story than not caring about the infrastructure or about their ability to provision it, manage it, and ultimately control it. Applications are never deployed in a vacuum, after all, and part of the way in which they are secured, optimized, and made highly available is through their supporting infrastructure. Many of those options are simply no longer available in “the cloud”, and this is likely to be a bullet point in the “against cloud” column for many organizations who employ a more infrastructure-inclusive strategy to delivering applications.

    We could easily argue that “lack of interoperability standards” (cited higher on the challenge scale at 80.2% of respondents concerned to very concerned about standards in the survey) is directly related to this lack of customization capability (76% cited this as a concern). After all, interoperability standards across infrastructure of similar ilk would, ostensibly, make it easier for cloud computing providers to offer the infrastructure services required to customize the environment.

    Lori goes on with a description of “composite data center structure:”

    Mary Jo Foley reports Behind the IDC data: Windows still No. 1 in server operating systems in this 2/26/2010 post to ZDNet’s All About Microsoft blog:

    International Data Corp. released its fourth-quarter global server data on February 25, listing the top providers of server hardware. But what about on the software front?

    According to IDC’s data, Windows is still the dominant player. The fourth quarter 2009 was more robust than the third, in terms of total revenues and units. Windows’ share of the total stayed constant unit-wise, yet declined, dollar-wise, when compared to the previous calendar quarter.

    That said, Windows is still far and away the No. 1 server operating system, in terms of units, and the definite leader in terms of dollars. …

    Mary Jo continues with IDC’s OS share data break out.

    Bruce Guptil, Charlie Burns, Bill McNee and Mark West wrote Saugatuck Technologies’ four-page Cloud IT: Stages of Simultaneous, Disruptive Growth and Change Research Alert of 2/25/2010 (requires site registration):

    Demand for and use of cloud-based business applications (SaaS), IT infrastructure, and business services – referred to in this research piece as “Cloud IT” – is exploding (see Note 1).

    What is Happening? This core finding is the foundation of Saugatuck Technology’s latest research study, titled “Lead, Follow, and Get Out of the Way: A Working Model for Cloud IT through 2014” (SSR-706). The study leverages interviews with experienced user organization executives, analysis of global survey data (from multiple research programs), and insights from leading Cloud providers – to build a realistic, working model of Cloud evolution, adoption, and most importantly, cost-effective management. …

    The authors continue with a diagram of the “Saugatuck Cloud IT Reality Model,” additional conclusions and detailed notes.

    Salvatore Genovese claims “The majority of IT professionals in Europe now have partial understanding of cloud storage” in his Europe Finally Discovers Cloud Computing post of 2/25/2010:

    A survey conducted by data management and storage vendor NetApp has revealed that overwhelmingly virtualisation (70%) is the top priority for IT investment in 2010, and that awareness of cloud computing is on the rise.

    The majority of IT professionals (74%) now claim to have partial understanding of cloud storage, placing the technology firmly on the IT and business agenda.

    The survey of more than 100 professionals revealed cost savings (22%) and a pay-as-you-go model (23%) as the most appealing benefits of cloud storage, as well as perceived security (29%) and integration (22%) issues as the biggest barriers to adoption.

    "What this tells us is that cloud is here and top of mind for businesses," said Dave Allen, UK MD, NetApp. "Just as virtualisation has gone full circle from an abstract concept to a technology that more than 70% of IT professionals plan to implement this year, we will see cloud follow the same pattern over 2010. The basic knowledge foundations have been laid, the next step is for businesses to overcome barriers and understand exactly how cloud could work for specific data and applications.

    "As server and storage virtualisation gains speed and momentum over 2010, we can expect to see more businesses also virtualise the network and move further towards a unified computing model. From a unified core, IT professionals will find it to easier to launch both internal and external clouds that deliver a lower cost of ownership and save the business money. …

    R “Ray” Wang reports Quarterly Financial Tracker: Q4 CY 2009 SaaS Vendors Continue To Trump On Premise Vendors In YoY Growth on 2/24/2010:

    … The recession continued to take its toll on software sales with a slight impact to the SaaS vendors.  Growth rates have come down from the high 30’s to the low 20’s.  But with “flat” the new growth metric in this down economy, results remain impressive.  Traditional on-premise vendors see some light at the end of the tunnel.  License revenues have started to stabilize on a year-over-year basis. …

    He then goes on to list major events in the 2009 calendar year (CY) Q4 and provides tables showing breakneck sales growth:

    and support for his claim that “Many On Premise Vendors Rely On Maintenance To Bolster Sagging License Revenues:”


    <Return to section navigation list> 

    Cloud Security and Governance

    Adam Swidler announced Google joins the Cloud Security Alliance on 2/26/2010:

    Today we're happy to announce that Google has joined the Cloud Security Alliance, a non-profit organization of experts focused on best practices and education efforts around the security of cloud computing.

    Cloud computing continues to gain momentum, and organizations such as the CSA are an important part of an ecosystem that works to increase transparency, lower risks, and promote independent research. The CSA's focus on security best practices offers valuable information to organizations looking to move to the cloud, and as a member of the CSA, we look forward to providing ongoing education about cloud computing and its value to the organizations that use it.

    Google's activities with the CSA include sponsoring the Cloud Security Alliance Summit at RSA Conference 2010 on March 1, 2010 in San Francisco, California, and participating in a CSA panel discussion at SecureCloud 2010, held on March 16 and 17 in Barcelona, Spain.

    Learn more about Google's cloud computing solutions for organizations.

    Glad to hear it!


    Gorka Sadowski claims logs are “The Last Barrier Between You and Disaster” in his Unleashing The Power of Logs post of 2/26/2010:

    This article discusses some of the main defensive security solutions used today and explains the reasons why employing a Log Management and Intelligence solution is critical to complement these protection methods.

    Let's first look at the most common defensive security solutions that have been popular these past few years. This is not an exhaustive list of all existing technologies, but rather a high-level view of some of the prevalent ones.

    1. Anti-virus
    2. Firewalls/VPN
    3. IDS/IPS
    4. Anti-Trojan/worms
    5. Anti-Spyware
    6. SIEMs

    These correspond to an approach called "Defense in Depth" that aims to put successive rings of protection between the bad guys and the information to protect, making successful attacks harder and harder. …
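    Those successive rings only become the "last barrier" Gorka describes if something actually correlates the logs they emit. A toy log-intelligence rule that flags repeated authentication failures per source illustrates the idea; the line format, addresses, and threshold here are invented for the example:

    ```python
    from collections import Counter

    def flag_brute_force(log_lines, threshold=3):
        """Count 'auth failure' events per source address and flag any
        source that reaches the threshold, a minimal log-intelligence rule."""
        failures = Counter()
        for line in log_lines:
            if "auth failure" in line:
                # Assumed line shape: "<timestamp> auth failure from <ip>"
                ip = line.rsplit(" ", 1)[-1]
                failures[ip] += 1
        return sorted(ip for ip, n in failures.items() if n >= threshold)

    logs = [
        "09:01 auth failure from",
        "09:02 auth failure from",
        "09:02 auth success from",
        "09:03 auth failure from",
        "09:04 auth failure from",
    ]
    flagged = flag_brute_force(logs)
    print(flagged)
    ```

    Real log management products do this across heterogeneous formats and devices at scale, but the value proposition is the same: turning inert event records into an actionable signal.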

    Gorka performs a detailed analysis of the preceding existing technologies and concludes:

    We talked about the importance of process and procedures surrounding the use of defensive security solutions. Indeed, we demonstrated that an "install and forget" approach to security is doomed to disaster.

    A sound Log Management and Intelligence system should not only be part of your bag of tricks, but integral to your process and procedures as a way to verify and ensure the validity of your security solutions. Log Management and Intelligence is more than just an added safety measure - it could be the last, and most effective barrier between you and disaster.

    Ellen Messmer asserts “Cloud computing providers are teaming with security vendors to shore up hosted environments” in her Cloud computing security challenges unite hosting providers, security specialists post of 2/26/2010 to Network World’s Security blog:

    As cloud computing adoption climbs, hosting providers are inking deals with security vendors to provide security-as-a-service options to customers. But will enterprise IT managers buy into these often novel forms of security woven into a cloud computing environment?

    There's definitely some resistance as IT and security managers struggle to sort out risk factors and compliance issues.

    "A good number of organizations are now using what they consider to be cloud services," says Bill Trussell, managing director of security research at TheInfoPro, which just published its semi-annual survey of information security professionals at large and midsize firms in North America. But when TheInfoPro asked respondents about whether they'd use cloud-based security services in cloud computing environments, less than 15% cited that as being very likely.

    "When asked whether organizations would extend functions such as user access and provisioning, or two-factor authentication, to cloud providers, it wasn't too popular," Trussell says. Enterprise security professionals are still nervous about something largely unfamiliar that doesn't sit on their premises and isn't under their direct control — or even under the direct control of the cloud-computing provider they use, since the security service is controlled by a third-party vendor with security expertise. …

    Steven J. Murdoch and Ross Anderson ask Online Payment Security: Should the Government Intervene? in this 2/25/2010 post to the Consumer Fin Tech Focus blog about their technical paper:

    Title: Online Payment Security: Should the Government Intervene?

    Background: On January 26, 2010, two researchers at Cambridge University, Steven J. Murdoch and Ross Anderson, released a working paper with the provocative title, "Verified by Visa and MasterCard SecureCode: or, How Not to Design Authentication". It directly attacks 3D Secure as a poorly designed authentication scheme, and calls for regulatory intervention to protect consumers. To what extent does the report raise fair criticisms, and how should the industry and/or regulators respond?

    More: RSA Security, the primary vendor of 3D Secure technology, responded to the paper in two blog posts, which can be found here and here.  Their main point seems to be that the Cambridge researchers are confusing the implementation of 3D Secure by some UK banks with the 3D Secure architecture, which is capable of supporting a wide variety of authentication methods, not just those criticized in the paper.

    However, this flexibility is exactly what the researchers are criticizing on p.4, where they write, "The 3DS specification only covers the communication between the merchant, issuer, acquirer, and payment scheme, not how customer verification is performed.  This is left to the issuer, and some have made extremely unwise choices."  As I see it, the real question here is: to what extent is a vendor obligated to restrict the choices its customers make with regard to the use of its technology, and if the vendor has no such obligation, should the government step in and establish such restrictions?

    <Return to section navigation list> 

    Cloud Computing Events

    The Healthcare Information and Management Systems Society (HIMSS) 2010 Conference starts 3/1/2010 and runs through 3/4/2010 at the Georgia World Congress Center, Atlanta, GA, USA. Following are the five sessions returned by a search on cloud:

    Almost everyone who is anybody in the Health Information Technology (HIT) business will be in Atlanta next week for HIMSS.

    Barbara Duck’s Microsoft Technologies for Connecting Patients, Physicians and Providers at HIMSS 2010 post of 2/26/2010 includes links to Microsoft-oriented presentations, videos, etc.

    Brian Loesgen announces a Day-long Azure Conference in San Diego, March 6th in this 2/26/2010 post:

    The San Diego .NET User Group will be hosting our first day-long Azure conference on March 6th. This first one will be at more of an overview level; we’re planning at least one other, more focused and deeper, for later this year, and then perhaps more after that. For this first one, we will be covering:

    • Azure Intro (David Chou)
    • Azure Development (Daniel Egan)
    • SQL Azure (Lynn Langit)
    • Azure AppFabric (Brian Loesgen)

    And yes, the way it turned out, all speakers for this are from Microsoft, and David is a fellow co-author from the oh_so_close_to_done_now book I’ve been involved with for quite some time…. “SOA with .NET and Windows Azure”.

    These conferences are usually a lot of fun and great investments, and this one in particular will be a great way to jump-start your Azure knowledge, or fill in some gaps you may have.

    The event page with further details is here. As usual, great discounts for user group members.

    Lynn Langit posted the slide deck from her 2/26/2010 Intro to Windows Azure for Developers – MSDN session in Burbank, CA and provides links to more of her presentations.

    Chris Hoff (@Beaker) reports Virtual Networking/Nexus 1000v Virtual Switch Blogger Roundtable/WebEx Logistics – March 2nd. on 2/25/2010:

    About a year before I started working at the Jolly Green Giant (Cisco) I had a rather loud and addictive hobby that was focused on proving that Cisco would offer a “third party” virtual switch for VMware environments.  This sort of unhealthy fascination also dovetailed with another related to “Project California” which later became the UCS (Unified Computing System.)  Both are now something I talk about in my day job quite a bit.

    So I don’t normally directly blog about specific work-related stuff here, but I’m going to make a quasi-exception.

    The PM’s from our SAVBU (Server and Virtualization Business Unit) who own the Nexus 1000v and UCS product lines asked me if I’d get together a bunch of bloggers, analysts, end users, pundits, crusaders, super heroes, networking and security geeks and have a discussion about virtual networking — specifically the 1000v.

    Chris continues with details for attending or connecting to the WebEx roundtable.

    Microsoft Public Relations presents a 00:41:32 video of Bob Muglia’s presentation to the Goldman Sachs Technology and Internet Conference 2010 on 2/23/2010, in which he forecast that Windows Azure wouldn’t return profits in the near future. A downloadable transcript in Word .doc format is here.

    David Linthicum will present the keynote at the SharePoint Conference .Org 2010, to be held 4/18 to 4/21/2010 at the Hilton Hotel in Baltimore, MD, USA:

    If your organization has implemented SharePoint, is planning to implement SharePoint, or is just evaluating SharePoint, this conference is possibly the best resource available to you.  Rather than spending countless hours sifting through the plethora of blogs, books, websites, and articles about SharePoint, learn exactly what you need to know in targeted sessions.  With general sessions and specific breakout sessions that focus both on business users and technical professionals, there is something for everyone.  Regardless of your organization's immediate interests -- Public facing Web sites, Intranets, BI solutions, Community Extranets, or Social Networking -- the SharePoint Conference .Org 2010 will help you learn what you need to know to get the most out of SharePoint.  The SharePoint Conference .Org is designed for you.  The conference is not designed for consultants or companies that provide SharePoint development services for third-parties. …

    <Return to section navigation list> 

    Other Cloud Computing Platforms and Services

    Michael Sheehan reports on 2/26/2010 a Video: GoGrid February 2010 Feature Release – Webinar & Presentation:

    On Wednesday February 24, 2010, GoGrid hosted a webinar for new and existing GoGrid users designed to discuss the recent February 2010 Feature updates to GoGrid. There is a blog post that details all of the new features included in the release as well as a screencast which walks through these features and important changes. The webinar covered the following information:

    • What is our view of Cloud Computing
    • What is GoGrid
    • New feature: GoGrid Dedicated Servers
    • What is Hybrid Infrastructure
    • A GoGrid Portal Demo
    • Deploying a GoGrid Dedicated Server
    • The new GoGrid List View
    • Walk-through of other Interface Enhancements & Links
    • Question & Answer Session

    The entire Webinar is below and is broken into two parts:

    • Part One – Overview presentation, discussion of Cloud & GoGrid, demonstration of the GoGrid Portal & GoGrid Dedicated Server Deployments (30 minutes in length)
    • Part Two – Question & Answer session from the audience and Additional Information (19 minutes in length)

    Also included later on in this post is the stand-alone presentation (without audio, demo walk-through or question and answers).

    Dana Gardner claims “GoToManage to tear down IT management boundaries for cloud computing” in his  Citrix Acquires Paglo, Launches GoToManage Cloud Computing Platform post of 2/26/2010:

    In a move to enter the burgeoning SaaS-based IT management market, Citrix Online announced its acquisition of Menlo Park, Calif.-based Paglo Labs on Wednesday. The first fruit of the acquisition is an integrated web-based platform for monitoring, controlling and supporting IT infrastructure.

    Dubbed GoToManage, the new service lets Citrix Online tap into the growing demand for software-as-a-service (SaaS)-based IT management, a market Forrester Research predicts will reach $4 billion in 2013. Citrix Online is positioning the latest addition to its online services portfolio as an affordable alternative to premise-based software. [Disclosure: Paglo is a sponsor of BriefingsDirect podcasts. Learn more about Paglo's offerings and value.]

    I expect that as more enterprises experiment with and adopt mixed-hosted services -- including cloud, SaaS, IaaS, and outsourced ecosystems solutions -- web-based management capabilities will become a requirement. In order to manage across boundaries, you need management reach that has mastered those boundaries. On-premises and traditional IT management is clearly not there yet.

    Elizabeth Cholawsky, vice president of Products and Services at Citrix Online, explains the reasoning behind the acquisition:

    “Our customers increasingly tell us they are interested in adding IT management services to our remote support capabilities. With the growing acceptance of SaaS and the increasing use of IT services in small- and medium-sized businesses, we decided IT management reinforced our remote support strategy.” …

    Dana continues with his analysis of the transaction.

    Reuven Cohen’s VMware's New Cloud Mission - The Bottom up post of 2/26/2010 begins:

    Ok, I admit it. At first I didn't have a clue what the point of VMware's new "Get it off the Board Agenda" site had to do with promoting cloud adoption. In a nutshell, VMware has created a new viral marketing campaign geared toward the idea of keeping executives out of the decision process for buying cloud-related services. But why? Doesn't this seem somewhat counterproductive, or does it?

    First, I suppose you need to get into the head of who's buying cloud products and services today. For service providers and telcos, this probably means an SVP of some sort has been given the job of defining a revenue-generating cloud strategy and service offering, so I'm not sure if this person would be the target for the campaign.

    It's probably more likely geared toward the end customers, the customers of my customers if you will. The Google-, Amazon-, Salesforce- and Microsoft-style clouds and how they're being adopted. The random developer or business unit with a problem to solve. The classic "New York Times" cloud story comes to mind. The story goes something like this: Derek Gottfrid, a random NYT programmer, had to solve a very hard problem with no time or money. So without prior permission he goes to Amazon Web Services, where he leverages the power of EC2 and the free open source Hadoop project. Within a few hours he is able to build a cloud application that utilizes hundreds of machines concurrently and processes 150 years' worth of data in less than 36 hours at next to no cost. Yup, it's called bottom-up adoption.

    Ruv continues with an answer to his “So what is VMware promoting you ask?” question.

    <Return to section navigation list> 

    Thursday, February 25, 2010

    Windows Azure and Cloud Computing Posts for 2/25/2010+


    Azure Blob, Table and Queue Services

    See articles by Jeff Barr, Werner Vogels and James Hamilton about new “strong consistency” features for Amazon SimpleDB in the Other Cloud Computing Platforms and Services section.

    <Return to section navigation list> 

    SQL Azure Database (SADB, formerly SDS and SSDS)

    No significant articles so far today.

    <Return to section navigation list> 

    AppFabric: Access Control, Service Bus and Workflow

    See Vittorio Bertocci’s Identity @ MIX10 post of 2/24/2010 in the Cloud Computing Events section for details on two WIF sessions. 

    <Return to section navigation list>

    Live Windows Azure Apps, APIs, Tools and Test Harnesses

    Eric Nelson reports a deployment change in his Windows Azure Portal surfaces the relationship between billing and deployed post of 2/25/2010:

    I previously posted on when you get charged with Windows Azure, as it is contrary to many folks' expectations. Today (25th Feb 2010) I was pleased to see a new warning on the portal.

    Note that you only need to delete the deployment, not the service. And to delete it, you must first stop it.

    Well done team.


    Milly Shaw describes on 2/25/2010 Southampton University’s use of Windows Azure in its Cloud computing in space program:

    Dr Steven Johnston from Southampton University explains how his team has used cloud computing to predict satellite collisions in space.

    10,000 space objects are currently being tracked. These objects include satellites but are typically space debris. Johnston’s team are using Azure to project the orbits to predict collisions.

    Cloud computing is elastic - it can be scaled up or down, which can be useful for processor-intensive work such as Southampton’s Space Situational Awareness System Tech[nology].

    There is a scale of cloud computing, from Amazon Web Services - where you are responsible for managing a virtual machine - to Google Docs, in which you have no control at all over the machines you are working with. In the middle lies Windows Azure. You can’t configure Azure yourself, but it does have a more managed infrastructure.

    While Southampton’s space-tracking system relies on Azure, Steven freely admits that both cloud computing in general and Azure in particular have limitations: there’s no standard API for moving a VM, which essentially locks in a user to a particular system; there are bandwidth problems with moving data; and the inherent access limitations make it more difficult to create clusters.

    Microsoft Research supports the Microsoft Institute for High Performance Computing at Southampton University; here’s a link to its Cloud Computing for Planetary Defense presentation, which includes “Space Situational Awareness” and “Azure Demo” topics.

    Kristin Bockius reports U.S. Public Sector CIO Summit 2010 Day 2: Windows Azure Platform Apps Meeting Real Government Needs in this 2/25/2010 post to the new Bright Side of Government blog:

    We recently launched the Microsoft Windows Azure Development Contest to provide an opportunity for our state and local government partners to showcase their development skills by creating a Windows Azure-based application that meets real needs of government customers. Voting closed last week, the judges have made their final decisions, and the results of the contest will be announced later today here at the CIO Summit. In the meantime, we would like to highlight a handful of the innovative applications that partners have already developed for government use:

    • COMAND – Developed by Trafrec Corp, this Windows Mobile application allows law enforcement officers to create or capture accident, citation, field interview, tow and DUI reports from the field. The data is synced to servers in the cloud in real time, and the sharing of captured data (which can also be accessed from the cloud back at the station) eliminates duplication and reduces reporting errors. Trafrec provides the software licensing at no cost to law enforcement agencies including police departments, DMVs, and DOTs. To view a demonstration of the application, check out this online presentation.
    • iLink GIS Framework – State and local government agencies are always looking for more efficient ways to deliver current contextual information online to their citizens, so iLink Systems created its “GIS Mapping Applications Framework” to meet this need. This reusable framework allows easy creation and deployment of intuitive, map-based solutions that can visualize social services, health services, public safety, schools, arts, and recreation. To try out this application, take a look at the demo script here.
    • LiveBallot – The Democracy Live system was deployed in 19 elections last year to help state and local governments reduce the costs associated with paper sample ballots, voter pamphlets, and other paper-based voter information. Democracy Live’s LiveBallot is capable of delivering voter information to over 200 million eligible U.S. voters and provides on-demand display of a voter’s specific balloting information. For more information, visit
    • FullArmor AppPortal – AppPortal is a software + services solution that enables government organizations to deploy and manage App-V virtual applications through a cloud-based service or through internal infrastructure. Built on the Windows Azure platform, AppPortal can be implemented without any capital expenditures in datacenters, servers, or traditional system management software, and the deployment can be centrally managed, self-service enabled or a combination of both. For more information, visit FullArmor’s AppPortal Web site.
    • Miami 311 – Miami 311 is a public-facing, open government transparency solution that allows citizens to monitor and analyze non-emergency requests, like a pothole repair or sidewalk damage, which are mapped out visually based on calls to 3-1-1. Miami 311 also serves as a dashboard for City Commissioners to see and monitor citizen requests in their district. For more information, see yesterday’s Bright Side blog post on Miami’s Windows Azure application. [See details in interview below.]

    As the Windows Azure Platform evolves and new tools are developed, our state and local government customers will benefit from even greater flexibility, functionality, and reliability in the cloud, and citizens will benefit from greater transparency and more open government at the state and local level. We will post the results of Microsoft Windows Azure Platform Development Contest later today, so be sure to follow additional coverage here on Bright Side and look for our tweets @Microsoft_Gov with the hashtag #USPSCIO.

    The U.S. Public Sector CIO Summit 2010 site is here.

    Eric Nelson’s Q&A: How do I find out the status of the Windows Azure Platform services? of 2/25/2010 describes the Windows Azure status dashboard and service history:

    The Windows Azure Platform includes a status dashboard as well as RSS feeds for individual services and regions.

    You get to see the current status:


    As well as the history of an individual service:


    As I’m based in the UK, I subscribe to the North Europe feeds for each of the services:
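    Because the dashboard exposes per-service, per-region RSS feeds, subscribing programmatically is straightforward. Below is a minimal sketch, using only Python’s standard library, of extracting recent items from such a feed; the inline XML is an invented RSS 2.0 sample, and the channel title and item fields are illustrative assumptions rather than the actual Azure feed schema.

```python
import xml.etree.ElementTree as ET

# Inline sample standing in for a per-service, per-region status feed
# (the real feed URL and schema may differ; this one is RSS 2.0 shaped).
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Windows Azure Compute - North Europe (sample)</title>
    <item>
      <title>Service degradation investigating</title>
      <pubDate>Thu, 25 Feb 2010 09:00:00 GMT</pubDate>
      <description>We are investigating reports of slow deployments.</description>
    </item>
    <item>
      <title>Service restored</title>
      <pubDate>Thu, 25 Feb 2010 11:30:00 GMT</pubDate>
      <description>Normal service has resumed.</description>
    </item>
  </channel>
</rss>"""

def latest_status_items(rss_xml):
    """Return (title, pubDate) pairs for each item in an RSS 2.0 status feed."""
    root = ET.fromstring(rss_xml)
    return [(item.findtext("title"), item.findtext("pubDate"))
            for item in root.iter("item")]

if __name__ == "__main__":
    for title, when in latest_status_items(SAMPLE_RSS):
        print(f"{when}: {title}")
```

In practice you would fetch each region’s feed URL on a schedule and alert on new items, but the parsing step is as simple as shown.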

    The Windows Azure Team presented Real World Windows Azure: Interview with James Osteen, Assistant Director of Information Technology for the City of Miami on 2/24/2010:

    As part of the Real World Windows Azure series, we talked to James Osteen, Assistant Director of Information Technology for the City of Miami, about using the Windows Azure platform to deliver the city's 311 citizen response application and the benefits that Windows Azure provides. Here's what he had to say:

    MSDN: Tell us about the City of Miami and the role that IT plays in its management.

    Osteen: The City of Miami is in southeastern Florida and serves more than 425,000 citizens. With a team of 80 employees, the IT department plays a critical role because we continuously strive to improve existing services and offer new services to the residents of Miami.

    MSDN: What was the biggest challenge the City of Miami faced prior to implementing Windows Azure?

    Osteen: The 311 application that we wanted to develop, which would give residents the ability to submit and track nonemergency incidents, used mapping technology that required significant computing resources. Unfortunately, we have a limited budget and resources, including a five-year procurement cycle for server hardware. We needed a cost-effective, scalable solution that would maximize the resources we do have.

    MSDN: Can you describe the solution you built with Windows Azure to help maximize your limited resources?

    Osteen: This month we're launching our 311 application. We're using MapDotNet UX, an off-the-shelf product built for Windows Azure that integrates with Windows Azure storage services and Bing Maps for Enterprise, and that also has an interface based on the Microsoft Silverlight 3.0 browser plug-in. It gives us the powerful mapping technology that we need to enable residents to track nonemergency service requests and view other open requests in a particular area. We're using Blob Storage to store spatial data, which we previously had in shapefile and KML formats.

    Figure 1: Miami 311 Application – Built on Windows Azure, the Miami 311 application enables citizens to report and track nonemergency incidents. …

    MSDN: What makes your solution unique?

    Osteen: Windows Azure is going to radically reshape how we develop new services for our citizens. For instance, the development fabric enables our developers to run and test an application on their local computer before deploying it, so we do not have to use multiple testing, debugging, and production environments before deploying new applications or updates. This means that we're going to significantly increase how fast we can go to market with new features or applications. Windows Azure is the future for the City of Miami.

    MSDN: What are some of the key benefits the City of Miami has seen since implementing Windows Azure?

    Osteen:  In addition to the fast development, we're able to reduce costs. With Windows Azure, we can scale horizontally and vertically very quickly as demand requires, without worrying about trying to predict our server needs five years in advance. Also, by relying on Microsoft-hosted data centers, we've improved our disaster-recovery strategy, which is important in our hurricane-prone region.

    Read the full story here.
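    Osteen mentions that the city’s spatial data previously lived in shapefile and KML formats before it was moved into Blob Storage. As a rough illustration of what KML data looks like to a consuming application, here is a minimal sketch using only Python’s standard library that extracts placemark names and coordinates from a KML document; the sample placemark below is invented for illustration and is not actual City of Miami data.

```python
import xml.etree.ElementTree as ET

# KML documents use a default XML namespace, so element lookups
# must be namespace-qualified.
KML_NS = {"kml": "http://www.opengis.net/kml/2.2"}

SAMPLE_KML = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Pothole report (sample)</name>
      <Point><coordinates>-80.1918,25.7617,0</coordinates></Point>
    </Placemark>
  </Document>
</kml>"""

def placemarks(kml_text):
    """Yield (name, lon, lat) for each Point placemark in a KML document."""
    root = ET.fromstring(kml_text)
    for pm in root.iter("{http://www.opengis.net/kml/2.2}Placemark"):
        name = pm.findtext("kml:name", namespaces=KML_NS)
        coords = pm.findtext("kml:Point/kml:coordinates", namespaces=KML_NS)
        # KML coordinates are "lon,lat[,alt]"; keep only lon and lat.
        lon, lat = [float(v) for v in coords.strip().split(",")[:2]]
        yield name, lon, lat

if __name__ == "__main__":
    for name, lon, lat in placemarks(SAMPLE_KML):
        print(f"{name}: lon={lon}, lat={lat}")
```

A service like the one Osteen describes would read documents of this shape out of blob containers instead of an inline string, but the parsing step is the same.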

    The Silverlight Team chimes in with a Visit the McNuggets® Village to see how McDonald’s® uses Silverlight in their latest campaign post on 2/24/2010 which includes a link to the Silverlight/Azure app:

    McDonald’s has used Silverlight and a host of other exciting Microsoft technologies to build a Vancouver 2010 Winter Olympics-themed campaign for McNuggets called “Dip it. Dunk it. Let the games begin”.  The site, which runs only during the Winter Olympic games, features a mini-game where you can give Gold, Silver or Bronze awards to either the all-new Sweet Chili Sauce® (only available during the games) or to one of your favorite classic dipping sauces.  Explore venues within the village while gaining sports information.

    Make a stop at the Athletes Dorm to watch video profiles of Olympic athletes.  At the Future Gold Training Center, browse through photos, videos and read blogs from “McDonald’s Champion Kids®”. 

    The campaign was created by Tribal DDB Worldwide, a leading creative agency based in the U.S. Midwest.  Tribal DDB chose to develop the McNuggets Village using Silverlight 3 for two primary reasons: cost and speed to market. 

    The Silverlight application calls out to WCF services running on Windows Azure. A Silverlight-based IE8 web slice tracks game scores using Windows Azure for hosting. Amazingly, the entire Windows Azure infrastructure was built in only 2 days! Data is stored in SQL Azure, and the WCF services/content (images, video, etc.) are running in Windows Azure. [Emphasis added.]

    John Moore reports a new partnership between Microsoft and electronic health record [EHR] vendor Eclipsys in his The Migration to Modular HIT Apps post of 2/24/2010:

    Chilmark has been seeing a progressive movement by a number of HIT providers, especially among HIE vendors (Axolotl, Covisint and Medicity) to open up their HIE platform (publish APIs) to potentially support a multitude of modular apps to meet various provider needs.  Basically, these vendors are moving to a Platform as a Service (PaaS) model, each taking a slightly different spin on a PaaS that will likely require Chilmark to produce a separate report to explore further.  What is important though for this industry is that this is a fairly nascent trend that will likely accelerate in the future.

    And today we can add one more vendor to the PaaS mix: Microsoft, which announced a partnership with EHR vendor Eclipsys, which has built several modular apps (Data Connectivity, Quick Order Entry and Visual Workflow) on top of Microsoft’s Amalga UIS.  Eclipsys will be demonstrating these apps next week at HIMSS.

    What’s in it for all Stakeholders:

    Microsoft is taking Amalga UIS from simply being a data aggregator/reporting engine to becoming a platform similar to HealthVault, thereby making the data that it aggregates actionable by the apps that ride on top of it.  This creates a higher value proposition for Amalga UIS in future deals with large hospitals and IDNs.

    Eclipsys & other HIT vendors now have an opportunity to enter accounts that may have been dominated by large, monolithic solutions from such companies as Cerner and Epic.  It may also provide smaller HIT vendors an ability to rise above the noise and gain some traction in the market.

    Hospital CIOs & end users will no longer be strictly tied to only those apps provided by their core HIT vendor(s), but may now be able to “flex-in” certain “best-of-breed” apps as needed to meet specific internal needs/requirements.  In our briefing call with Microsoft yesterday, Microsoft stated that the Amalga UIS APIs will also be made available to customers, allowing them to build their own apps, further increasing the utility of Amalga UIS.

    There’s no direct tie to Windows Azure in John’s post, but it’s a good bet that an Amalga PaaS offering would offer the option of hosting in Microsoft’s data centers.

    Gaurav Mantri announces the availability of Cerebrata’s Azure Diagnostics Manager - A WPF based rich client to manage Windows Azure Diagnostics as of 2/21/2010:

    I am pleased to announce the availability of the Azure Diagnostics Manager utility in private beta. At PDC, the Windows Azure team announced the new Diagnostics and Management APIs; however, we felt that there was a gap (and thus a great opportunity) in the tooling to surface this data. Thus we built this utility so that developers can get access to the diagnostics data in a user-friendly way.

    What is Azure Diagnostics Manager:
    Azure Diagnostics Manager (ADM) is a WPF-based rich-client application with which you can manage diagnostics data logged by your applications via the Diagnostics API made available by Windows Azure. Here are the features available in the current release of ADM:

    • Event Viewer: View/download events logged by your applications in a “Windows Events Viewer” like user interface.
    • Performance Counters Viewer: View/download performance counters data logged by your applications in a “Perfmon” like user interface.
    • Trace Logs Viewer: View/download trace data logged by your applications.
    • Infrastructure Logs Viewer: View/download infrastructure logs data.
    • IIS Logs: View/Download IIS logs.
    • IIS Failed Request Logs: View/Download IIS Failed Request logs

    In the next few versions we’ll be adding these features:

    • Support for Crash Dump Logs
    • Support for On Demand Transfer
    • Support for Remote Diagnostics Management
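    All of the viewers listed above reduce to the same underlying pattern: pull raw diagnostics records from storage and aggregate them for display. As a toy illustration of the “Perfmon-like” aggregation a performance-counters viewer performs, here is a minimal Python sketch; the sample rows are invented and much simpler than the real Windows Azure Diagnostics schema, which carries role, instance, and timestamp fields as well.

```python
from collections import defaultdict

# Invented sample records shaped loosely like performance-counter rows:
# (counter name, observed value).
SAMPLES = [
    ("\\Processor(_Total)\\% Processor Time", 12.5),
    ("\\Processor(_Total)\\% Processor Time", 87.0),
    ("\\Memory\\Available MBytes", 512.0),
    ("\\Processor(_Total)\\% Processor Time", 50.5),
    ("\\Memory\\Available MBytes", 498.0),
]

def summarize(samples):
    """Aggregate raw counter samples into (min, avg, max) per counter,
    roughly what a Perfmon-like viewer surfaces in its grid."""
    by_counter = defaultdict(list)
    for name, value in samples:
        by_counter[name].append(value)
    return {name: (min(vs), sum(vs) / len(vs), max(vs))
            for name, vs in by_counter.items()}

if __name__ == "__main__":
    for name, (lo, avg, hi) in summarize(SAMPLES).items():
        print(f"{name}: min={lo} avg={avg:.1f} max={hi}")
```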

    <Return to section navigation list> 

    Windows Azure Infrastructure

    Lori MacVittie claims “There’s a reason for the angst elicited by inaccurate definitions of cloud computing and it may lead to rethinking a laissez-faire view of such definitions” in her May I Mambo Dogface to the Banana Patch? post of 2/25/2010:

    Language impacts our perception and can dramatically change the way we understand – or don’t understand – ideas. Because one of the primary uses of language is to present arguments or assert propositions such as “We need to allocate X percent of our budget to a cloud computing initiative,” it is important that everyone involved in the conversation agrees on basic meanings and definitions. This is one of the reasons I, at least, have a conniption whenever someone who is attempting to educate people on a particular technological concept completely misses the ball. If we don’t clearly articulate what is and is not cloud computing, the danger is that business stakeholders and end users will see cloud computing as nothing all that difficult, or nothing spectacular … we are in danger of the folks who often fund such initiatives not “getting it.” Language is our common ground, or at least it’s supposed to be.

    The following grossly inaccurate definition is brought to you by the Reading Eagle. Its “What is cloud computing?” article asserts that “84% of Americans use cloud computing in some form,” then goes on to explain in more depth (and I use that phrase loosely and intentionally to prove a point).

    Lori continues with an indictment of the Reading Eagle definition she quotes as being for the Internet, not cloud computing and asks:


    It’s important to remember that IT does not exist in a vacuum; other constituents have a stake in IT and exert their influence over the direction IT takes in many ways, including the budget. Many cloud computing surveys and studies have asked about budget size and type, but very few take the extra step to ask who – or what group – actually has control and influence over those budgets, which is at least as interesting – if not more so – than the actual budget details themselves.


    Saugatuck Technology claims “Core research on Cloud use drives effective strategies of ‘Lead, Follow, and Get Out of the Way’” in its Saugatuck Releases First Cloud Leadership Study press release of 2/25/2010:

    Traditional IT and business leadership strategies and tactics will be of limited effectiveness in a Cloud-based environment where anyone can do practically anything they want or need to. Thus, how user organizations plan for and manage Cloud IT will have to improve upon established practices and policies. IT, Finance, and line-of-business executives must learn and use new, combined management tactics when it comes to Cloud IT.

    This core finding is the foundation of the latest research study from Saugatuck Technology Inc., titled “Lead, Follow, and Get Out of the Way: A Working Model for Cloud IT through 2014.” Released today via Saugatuck’s website, the study uses interviews with experienced user organization executives, analysis of global survey data, and insights from leading Cloud providers, to build a realistic, working model of Cloud evolution, adoption, and most importantly, cost-effective management.

    “Cloud adoption right now is a point-solution phenomenon, and we know from every previous instance of IT that point solutions cost more in the long run,” according to Saugatuck Managing Director Bruce Guptill, the study’s lead author. “If IT, Finance, and business leaders can see and understand how Cloud adoption is growing and changing, they can manage it, and take advantage of tremendous cost efficiencies. In this report, we have developed a simple, evolutionary model of Cloud IT adoption and its impact over time. And we deliver guidance to user executives as well as to providers of Cloud IT and traditional IT as to how best to manage the Cloud’s present and future.”

    The press release continues with more “key findings.” As I recall, a more common form of the title is Thomas Paine’s “Lead, follow or get out of the way.”

    Dilip Krishnan’s Book Excerpt and Interview: Cloud Computing and SOA Convergence in Your Enterprise: A Step-by-Step Guide begins:

    A new book by David Linthicum, Cloud Computing and SOA Convergence in Your Enterprise: A Step-by-Step Guide, describes how to get the enterprise ready for cloud computing by carefully modeling enterprise data, information services and processes in a service oriented manner to make the transition to providing and consuming cloud services easier.

    The book [offers] high-level prescriptive guidance on the approaches to moving enterprise services and business processes to the cloud. David starts the discussion of his book with the “why?” of moving to the cloud, goes on to define cloud computing and the various components that make up cloud solutions, i.e. Storage-as-a-Service, Database-as-a-Service, Information-as-a-Service, Application-as-a-Service, etc., and examines the need and business drivers for moving such services to the cloud.

    In his book, David suggests a bottom-up approach, starting from modeling the data, then the services that move the information to achieve business activities, and finally the processes that tie these activities together. He advocates modeling the governance of these services and processes and a plan to test these processes in the cloud.

    Addison-Wesley / Prentice Hall provided InfoQ readers with the excerpt describing techniques for modeling the data and services for the cloud. The excerpt is a condensed version of the following chapters from the book.

    • Chapter 5: Working from Your Data to the Clouds
    • Chapter 6: Working from Your Services to the Clouds

    Dilip continues with a lengthy Q&A session with David. I purchased the book and recommend it highly.

    TechNet Flash editor Mitch Irsfeld offers a Clearing the Fog Around Cloud Computing post, which begins:

    Describing the general concept of cloud computing, its promise and benefits, is fairly straightforward. But the varied approaches and emerging product offerings from a growing list of vendors create confusion, especially as you look at it from the standpoint of your existing infrastructure. Microsoft's approach to cloud services allows you to deploy cloud workloads and integrate them back into your existing on-premises systems.

    I've collected a few resources to get you going:

    <Return to section navigation list> 

    Cloud Security and Governance

    An Infosec Guru Ron Ross on NIST's Revolutionary Guidance podcast was posted on 2/15/2010:

    NIST senior computer scientist Ron Ross heads a National Institute of Standards and Technology-Defense Department team that created the just-released information security guidance for federal agencies: Special Publication 800-37, Revision 1, Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach.

    In an interview with, Ross discusses the:

    • Importance of the new guidance that provides for real-time monitoring of IT systems.
    • Challenges federal agencies face in adopting NIST IT security guidance.
    • State of cybersecurity in the federal government.

    Ross was interviewed by's Eric Chabrow.

    The highly regarded NIST senior computer scientist and information security researcher serves as the institute's FISMA implementation project leader. He also supports the State Department in the international outreach program for information security and critical infrastructure protection. Ross previously served as the director of the National Information Assurance Partnership, a joint activity of NIST and the National Security Agency.

    Robert Westervelt’s Cloud security issues, targeted attacks to be hot-button topics at RSA post of 2/25/2010 covers the upcoming RSA conference in San Francisco:

    … Many companies are moving slowly with cloud computing adoption, Crawford said, despite an increasing number of vendors refitting their technologies and their distribution methods to provide cloud-based services. The integration between servers, storage networks and management software is adding more complexity, and the complexity and lack of visibility into cloud environments breeds insecurity. All of the challenges being cited could be causing companies to be more cautious in adopting cloud-based services. In a recent Enterprise Management Associates survey of 850 IT executives, only 11% indicated they planned to implement cloud in the next 12 months, Crawford said.

    Some of the challenges include the lack of visibility in cloud environments and the loss of control over company data and how it's being secured by service providers. Crawford said too many vendor proprietary systems are complicating cloud adoption. New standards could change all that, he added, citing the MashSSL Alliance, which focuses on establishing secure channels during browser sessions, as a good start.

    "There's considerable attention being paid to the potential risks of cloud computing, but the number of organizations planning on adopting cloud computing models is small compared to a lot of the hype in the market," Crawford said. "A lot of the hype about cloud services out there is out of proportion to what companies are really doing."

    Click here for more details on the RSA Conference 2010.

    Angela Moscaritolo asks “What should IT personnel and executives at enterprises know before adopting a cloud computing model? How are CISOs dealing with the trend of consumerization? How will mobile app stores affect the threat environment?” in her Crime, cloud & consumerization on tap at RSA Conference article of 2/24/2010 for SC Magazine:

    … Individuals from Verizon Business, SaaS web security vendor Zscaler, and web-based DNS management software provider OpenDNS also plan to discuss outsourcing and cloud computing in a session called “SaaS-based security solutions,” scheduled for March 3.

    Cloud computing is one of the most hyped trends in IT security – and will also be another major theme of the show, said Scott Crawford, managing research director at Enterprise Management Associates, during Wednesday's call. He will moderate the session on SaaS.

    “The number of organizations planning to adopt a cloud computing model is small compared to the hype and expectations in the market,” Crawford said.

    A top priority and one of the most difficult challenges of cloud computing is ensuring data is protected, he said. Manageability and performance are other concerns.

    During the March 3 session, panelists plan to discuss what organizations need to know before jumping into a service-based model for security. 

    David Linthicum asserts “Legal issues such as data privacy and compliance regs may have you considering where the clouds actually reside” in his When clouds need to stop at the border post of 2/25/2010 to the InfoWorld Cloud Computing blog:

    Clouds are everywhere and can be used from anywhere, right? Wrong. The fact is that when considering national laws, you may find that your data is legally not able to leave the border.

    That's the case in many parts of Europe, which forbid some data from being transmitted or stored outside of the country. Canada also has some rules that prohibit some data from being stored in the United States due to the U.S. Patriot Act's provisions that let the federal government examine corporate records.

    To get around this issue, several cloud computing providers, such as and, have points of presence in many developed countries. There's a performance argument for this distribution of systems, but another reason is to adhere to many laws directing where some data can legally reside.

    It's important to note that the legal issues are local to where your customer resides. You have to understand the laws and make sure that personally identifiable data and some financial records are kept local if required by the law. …

    Jay Heiser asks Do We Need Cloud Computing Laws? in this 2/24/2010 post to the Gartner Blog:

    … Let me go on the record and say that I’m not aware of any modern society that has thrived without the rule of law.  Law is needed, yet it is impossible to ever get it exactly right.  Bad rules can be counterproductive, encouraging the negative behavior they are meant to reduce.  The Basel II Accord, which promulgated the quantification of operational risk, arguably encouraged financial service firms to take greater risks by giving them a justification mechanism.

    2010 is looking like ‘The Year of Privacy’ in the US Federal Government. H.R. 2221, the Data Accountability and Trust Act (DATA), was voted on by the House and has been read at the Senate, where it is currently under committee review. A parallel bill in the Senate, S. 1490: Personal Data Privacy and Security Act of 2009, was approved by committee, but appears to have been superseded by the House’s somewhat eponymous DATA bill. Up to a dozen other proposed bills nibble away at identity theft, social security number conventions, and the use of PII, so clearly the legislative branch has an appetite for this issue.

    DATA and S. 1490 require more than just breach notification. Organizations with PII must take proactive efforts to control private information, expecting “the Federal Trade Commission (FTC) to promulgate regulations requiring each person engaged in interstate commerce that owns or possesses electronic data containing personal information to establish security policies and procedures.” It is no coincidence that the FTC has begun a series of roundtable discussions on the privacy implications of new IT practices and technologies, including social networking and cloud computing. …

    The Brookings Institution held a governance studies event in January on Cloud Computing for Business and Society.  In his keynote speech, Microsoft’s Brad Smith strongly suggested the need for regulations on cloud computing. This is not a new theme for Microsoft.  Ray Ozzie suggested similar ideas last year during a Gartner interview.  I don’t want to second-guess Microsoft’s motivations in lobbying so consistently for a larger Federal presence in their new business area, but one interpretation could be that they expect regulations to function as a barrier to market entry, reducing competitive pressures in the cloud.

    <Return to section navigation list> 

    Cloud Computing Events

    Brian Gorbett recommends attending a two-day Azure Boot Camp on 6/1 and 6/2/2010 at the Microsoft Chicago office: 200 E Randolph St, Suite 200, Chicago, Illinois 60601, US:

    Event Overview: Join us for this 2-day deep dive program to help prepare you to deliver solutions on the Windows Azure Platform. We have worked to bring the region’s best Azure experts together to teach you how to work in the cloud. Each day will be filled with training, discussion, reviewing real scenarios, and hands-on labs. Snacks and drinks will be provided; we advise you to make your own lunch arrangements prior to the event.
    You will also need to bring a computer loaded with the software listed below. For more information please visit, or email us at

    See Event ID: 1032444477 to register, learn what software you need to preload onto the PC you bring, and read a detailed course-contents list.

    Bill Wilder announces an update to the agenda for the upcoming Boston Azure User Group meeting in his Curt Devlin to Speak about Identity in the Cloud at Boston Azure Meeting post of 2/25/2010:

    Thursday, February 25, 2010, at NERD in Cambridge, MA

    The following is an update to the agenda for the upcoming Boston Azure User Group meeting this coming Thursday.

    To RSVP for the meeting (helps you breeze through security and helps us have enough pizza on hand), for directions, and more details about the group, please check out

    To get on the Boston Azure email list, please visit

    [6:00-6:30 PM] Boston Azure Theater

    The meeting activities begin at 6:00 PM with Boston Azure Theater, which is an informal viewing of some Azure-related video. This month will feature the first half of
    Matthew Kerner’s talk on Windows Azure Monitoring, Logging, and Management APIs from the November 2009 Microsoft PDC conference.

    [6:30-7:00 PM] Upcoming Boston Azure Events and Firestarter

    Around 6:30, Bill Wilder (that’s me) will first show off an interesting CodeProject contest, then will lead a discussion about the future of the Boston Azure user group and the upcoming All-Day-Saturday-May-8th event.

    Curt Devlin will take the stage at 7:00 PM.

    Vittorio Bertocci’s Identity @ MIX10 post of 2/24/2010 asks:

    Have you booked your trip to Vegas yet? Amidst all the excitement for Windows Phone 7 Series, Silverlight & the cloud, if you know how to search the MIX content you’ll find two identity pearls:

    Caleb Baker: Using Windows Identity Foundation For Creating Identity-Driven Experiences in Silverlight

    Come learn how you can leverage Windows Identity Foundation to simplify access to your Silverlight applications and delight your users with custom-tailored experiences. Discover how you can enable single sign on for your Silverlight applications no matter where they are hosted, explore how you can use claims-based identity to adapt the user experience for customers, learn how to take advantage of web services protected by federated security… all with the same consistent developer APIs, that Windows Identity Foundation already offers for ASP.NET and WCF projects.

    That’s right, an entire session about one of the questions you ask the most about WIF: how to use it with Silverlight? Caleb is going to go into the details of how you can use those two technologies together.  Really looking forward to that one!

    Mike Jones: Improving the Usability and Security of OpenID

    OpenID is gaining popularity as an Internet identity system. Nonetheless, it is widely recognized that both usability and security issues are limiting the adoption and applicability of OpenID as it exists today; both kinds of issues can be improved by the introduction of an active client for OpenID. This session will describe a community collaboration to explore these issues through working code. We will demonstrate an experimental multi-protocol version of Windows CardSpace that enables end users to bring their OpenIDs with them to sites, while mitigating phishing attacks, including its use at production OpenID sites.

    Mike is going to explore another combination, CardSpace and OpenID. Among other things, that’s precisely what I mentioned in the Mordor post… another session I would definitely not miss.

    Besides the sessions themselves, this will be a chance to speak with them and others from the WIF PG. There is still time to register for the conference! If you are in Vegas that week, don’t forget to swing by & say hi… as usual, I am still very recognizable :-)

    <Return to section navigation list> 

    Other Cloud Computing Platforms and Services

    James Urquhart offers his commentary about CA to acquire cloud platform provider 3Tera in this 2/24/2010 post:

    … The technology assets and key personnel of utility computing management software provider Cassatt were acquired in June, and service level management software provider Oblicore was acquired earlier this year.

    CA also plans to expand 3Tera's virtualization support beyond the Xen virtualization platform, to include VMware ESX and Microsoft Hyper-V. [Emphasis added.]

    Jay Fry, vice president of Business Unit Strategy for CA's Cloud Products and Solutions Business Line, said in a phone interview that there were three key ways that 3Tera enhances CA's cloud story.

    First, they bring customers. According to 3Tera's Web site, there are around 30 service provider partners running their cloud services on AppLogic, and Fry noted that they have a handful of enterprises who have standardized on the platform as well. Second, their user interface and application delivery technologies are a strong complement to CA's portfolio of infrastructure management tools.

    Finally, 3Tera will allow CA to simplify the process of implementing and using private cloud computing services using CA's product portfolio, according to Fry. "Enterprises are not even sure how to get started with private cloud, and [AppLogic] makes it easier to talk about those first couple steps," noted Fry. …

    Jeff Barr’s Amazon SimpleDB Consistency Enhancements post of 2/24/2010 announces:

    We've added two new features to Amazon SimpleDB to make it even easier for you to implement several different data storage and retrieval scenarios.

    The first new feature allows you to do a consistent read. Up until now, SimpleDB implemented eventually consistent reads. You now have the option to choose the type of read which best meets the needs of each part of your application. Before I dive into the specifics, here's a quick guide to the two types of reads:

    • Eventually consistent reads offer the lowest read latency and the highest read throughput. However, the reads can return data that has been made obsolete or overwritten by a very recent write. In general, the time window where this can occur is smaller than one second.
    • Consistent reads always return the most recent data, with the potential for slightly higher read latency and a small reduction in read throughput.

    SimpleDB's Select and GetAttributes functions now accept an optional ConsistentRead flag. This flag has a default value of false, so existing applications will continue to use eventually consistent reads. If the flag is set to true, SimpleDB will return a consistent read.
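    The flag described above can be illustrated with a short sketch. SimpleDB is driven by signed HTTP query requests; the snippet below only builds the query-string parameters for a GetAttributes call. The request signing step is omitted, and the `accounts`/`user-42` names are illustrative assumptions, not details from Jeff’s post:

    ```python
    # Minimal sketch of the query parameters for a SimpleDB GetAttributes
    # call that opts into the new ConsistentRead flag. Parameter names
    # follow the SimpleDB REST query style.
    from urllib.parse import urlencode

    def build_get_attributes_query(domain, item_name, consistent_read=False):
        """Build the query-string parameters for a GetAttributes request."""
        params = {
            "Action": "GetAttributes",
            "DomainName": domain,
            "ItemName": item_name,
        }
        # The flag defaults to false, so callers that never set it keep
        # today's eventually consistent read behavior unchanged.
        if consistent_read:
            params["ConsistentRead"] = "true"
        return urlencode(sorted(params.items()))

    print(build_get_attributes_query("accounts", "user-42", consistent_read=True))
    # → Action=GetAttributes&ConsistentRead=true&DomainName=accounts&ItemName=user-42
    ```

    Because the default is false, an existing application that never passes the flag continues to get the lower-latency eventually consistent reads described above.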

    The second new feature allows you to issue SimpleDB PutAttributes and DeleteAttributes operations on a conditional basis. In other words, you can tell SimpleDB to perform the indicated operation if and only if a given single-valued attribute has the value specified in the PutAttributes or DeleteAttributes call. You can easily implement counters (the value itself is effectively the version number), delete accounts only if the current balance is zero, and insert an item only if it does not exist.
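    As a rough sketch of the counter idea, the toy store below mimics the check-and-set semantics of a conditional PutAttributes in memory (the class and method names are invented for illustration; real SimpleDB conditional operations go over HTTP with Expected-value parameters):

    ```python
    import threading

    class ConditionalStore:
        """Tiny in-memory stand-in for a conditional put: the write
        succeeds only if the attribute currently holds the expected value."""
        def __init__(self):
            self._data = {}
            self._lock = threading.Lock()

        def conditional_put(self, item, attr, new_value, expected_value):
            with self._lock:
                if self._data.get((item, attr)) != expected_value:
                    return False  # a concurrent writer got there first
                self._data[(item, attr)] = new_value
                return True

        def get(self, item, attr):
            return self._data.get((item, attr))

    def increment(store, item, attr="count"):
        """Counter via conditional put: the value doubles as its own
        version number, so a stale read simply forces a retry."""
        while True:
            current = store.get(item, attr)
            new = (current or 0) + 1
            if store.conditional_put(item, attr, new, current):
                return new
    ```

    The retry loop is the whole trick: if another writer bumped the counter between our read and our write, the expected value no longer matches, the put is rejected, and we re-read and try again.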

    Jeff continues with a discussion about implementing optimistic concurrency control with these new features. Amazon also offers an Amazon SimpleDB Consistency Enhancements tutorial of 2/24/2010.

    Werner Vogels compares eventual consistency with optimistic consistency based on SimpleDB’s new ConsistentRead flag and conditional PutAttributes and DeleteAttributes operations in his Choosing Consistency post of 2/24/2010:

    Amazon SimpleDB has launched today with a new set of features giving the customer more control over which consistency and concurrency models to use in their database operations. There are now conditional put and delete operations as well as a new "consistent read" option. These new features will make it easier to transition those applications to SimpleDB that are designed with traditional database tools in mind.

    Revisiting the Consistency Challenges

    Architecting distributed systems that need to reliably operate at world-wide scale is not a simple task. There are many factors that come into play when you need to meet stringent availability and performance requirements under ultra-scalable conditions. I laid out some of these challenges in an article explaining the concept of eventual consistency. If you need to achieve high availability and scalable performance, you will need to resort to data replication techniques. But replication in itself brings a whole set of challenges that need to be addressed. For example, updates to data now need to happen in several locations, so what do you do if one or more of those locations is (temporarily) not accessible?

    A whole field of computer science is dedicated to finding solutions for the hard problems of building reliable distributed systems. Eric Brewer of UC Berkeley summarized these challenges in what has been called the CAP Theorem, which states that of the three properties of shared-data systems--data consistency, system availability, and tolerance to network partitions--only two can be achieved at any given time. In practice the possibility of network partitions is a given, so the real trade-off is between availability and consistency: In a system that needs to be highly available there are a number of scenarios under which consistency cannot be guaranteed and for a system that needs to be strongly consistent, there are a number of scenarios under which availability may need to be sacrificed. These trade-off scenarios generally involve edge conditions that almost never happen, but in a system that needs to operate at massive scale serving trillions and trillions of requests on a daily basis, one needs to be prepared to handle these cases. …

    James Hamilton adds his I love eventual consistency but... discussion of new SimpleDB features on 2/24/2010:

    I love eventual consistency but there are some applications that are much easier to implement with strong consistency. Many like eventual consistency because it allows us to scale-out nearly without bound but it does come with a cost in programming model complexity. For example, assume your program needs to assign work order numbers uniquely and without gaps in the sequence. Eventual consistency makes this type of application difficult to write.

    Applications built upon eventually consistent stores have to be prepared to deal with update anomalies like lost updates. For example, assume there is an update at time T1 where a given attribute is set to 2. Later, at time T2, the same attribute is set to a value of 3. What will the value of this attribute be at a subsequent time T3? Unfortunately, the answer is we’re not sure. If T1 and T2 are well separated in time, it will almost certainly be 3. But it might be 2. And it is conceivable that it could be some value other than 2 or 3 even if there have been no subsequent updates. Coding to eventual consistency is not the easiest thing in the world. For many applications it’s fine and, with care, most applications can be written correctly on an eventually consistent model. But it is often more difficult. …
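    James’s T1/T2 example can be made concrete with a toy model: writes land on one replica immediately and reach the others only “eventually,” so a read routed to a lagging replica returns stale data. Everything here is an illustrative simulation, not SimpleDB’s actual replication design:

    ```python
    import random

    class EventuallyConsistentStore:
        """Toy model of eventual consistency: writes apply to one replica
        now and propagate to the rest later; reads hit a random replica."""
        def __init__(self, replicas=3):
            self.replicas = [dict() for _ in range(replicas)]
            self.pending = []  # (replica_index, key, value) not yet applied

        def write(self, key, value):
            self.replicas[0][key] = value           # one replica applies now
            for i in range(1, len(self.replicas)):  # the rest, "eventually"
                self.pending.append((i, key, value))

        def read(self, key):
            # A lagging replica may answer with stale (or missing) data.
            return random.choice(self.replicas).get(key)

        def settle(self):
            """Apply all pending replication; the replicas converge."""
            for i, key, value in self.pending:
                self.replicas[i][key] = value
            self.pending.clear()

    store = EventuallyConsistentStore()
    store.write("x", 2)   # T1
    store.write("x", 3)   # T2
    store.settle()        # once replication completes, every replica agrees
    print(store.read("x"))  # → 3
    ```

    Before `settle()` runs, a read can land on a replica that has not yet seen either write, which is exactly the window of uncertainty James describes; a consistent read or a conditional write closes that window at some cost in latency and throughput.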

    James continues with an analysis of how data stores that implement strong consistency can be scaled.

    <Return to section navigation list>