Monday, June 01, 2009

Windows Azure and Cloud Computing Posts for 5/25/2009+

Windows Azure, Azure Data Services, SQL Data Services and related cloud computing topics now appear in this weekly series.

This drawing changed on 5/30/2009. Can you spot the difference? Leave a comment.

Updated 5/30 to 5/31/2009: More on Azure SAS 70 and ISO/IEC 27001:2005 certs, Steve Marx’s sample blob code
Updated 5/27 to 5/29/2009: New Windows Azure SDK (May 2009 CTP), SAS 70 Type I and II attestation for Microsoft cloud, Amazon and Google announcements, BizTalk adapters, other additions
• Updated 5/26/2009: Additions

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

Azure Blob, Table and Queue Services

<Return to section navigation list> 

Steve Marx’s Sample Code for New Windows Azure Blob Features post of 5/31/2009 adds five new methods to the “sample” StorageClient library to implement Copy Blob and Get Block List. The methods are named:

  • public abstract bool CopyBlob(string to, string from, bool overwrite);
  • public abstract List<string> GetBlockList(string blobName, string eTag);
  • public abstract List<string> GetCommittedBlockList(string blobName, string eTag);
  • public abstract List<string> GetUncommittedBlockList(string blobName, string eTag);
  • public abstract List<string> GetAllBlockList(string blobName, string eTag);

Steve writes: In the code, these are declared in BlobStorage.cs.  They’re implemented in RestBlobStorage.cs with the help of a few additional constants in RestHelpers.cs.

There’s also a console application included that tests the new functionality.  To use it, modify Program.cs to use a valid storage account and key.

You can download the full source code (updated storage client library and console test application) from here.
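If you'd rather not wait for the library, the Copy Blob call that the new CopyBlob() method wraps is easy to sketch at the REST level. The following Python snippet is my own illustration, not Steve's code: the account, container, and blob names are placeholders, and you should verify the header names and the 2009-04-14 version string against the MSDN documentation. It only builds the request; SharedKey signing and the actual HTTP call are omitted:

```python
def build_copy_blob_request(account, container, source_blob, dest_blob):
    """Build the URL and headers for a Copy Blob REST call.

    Copy Blob is a PUT against the *destination* blob URL; the
    x-ms-copy-source header names the source blob, and x-ms-version
    opts the request in to the May 2009 API.
    """
    url = "http://%s.blob.core.windows.net/%s/%s" % (account, container, dest_blob)
    headers = {
        "x-ms-version": "2009-04-14",  # the new versioning header
        "x-ms-copy-source": "/%s/%s/%s" % (account, container, source_blob),
    }
    return "PUT", url, headers
```

Sign the request with your storage key and issue the PUT, and the destination blob becomes a copy of the source.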

My “Master Azure Blob Storage” article will be the cover feature for the July 2009 issue of Visual Studio Magazine. It describes the code for the live OakLeaf Systems Azure Blob Services Test Harness demo and you’ll be able to download the source code when the Redmond Media Group posts it in late June or early July.

Ryan Dunn puts a damper on the New Windows Azure Storage Features – May 2009 announcement by noting that the May 2009 CTP’s sample StorageClient library doesn’t support the new storage features described below. The following is from Ryan’s New Windows Azure Storage Features post of 5/28/2009:

The biggest impact to developers at this point is that to get these new features, you will need to include a new versioning header in your call to storage.  Additionally, the StorageClient library has not been updated yet to reflect these new APIs, so you will need to wait for some examples (coming from Steve) or an update to the SDK.  You can also refer to the MSDN documentation for more details on the API and roll your own in the meantime. [Emphasis added.]

Most Azure testers (including me) use StorageClient to manipulate tables and blobs. Does this mean we can expect a June 2009 CTP?
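For those who decide to “roll their own” in the meantime, here’s a hedged Python sketch of what a Get Block List request looks like with the new versioning header. The blocklisttype values mirror the three Get*BlockList methods in the sample library; as with everything here, check the MSDN documentation before relying on the exact parameter names (authentication is again omitted):

```python
def build_get_block_list_request(account, container, blob, block_list_type="committed"):
    """Build the URL and headers for a Get Block List REST call.

    block_list_type selects which blocks to return -- "committed",
    "uncommitted", or "all" -- matching the GetCommittedBlockList,
    GetUncommittedBlockList, and GetAllBlockList sample methods.
    """
    if block_list_type not in ("committed", "uncommitted", "all"):
        raise ValueError("unknown block list type: %r" % block_list_type)
    url = ("http://%s.blob.core.windows.net/%s/%s?comp=blocklist&blocklisttype=%s"
           % (account, container, blob, block_list_type))
    headers = {"x-ms-version": "2009-04-14"}  # omit this and you get the old API
    return "GET", url, headers
```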

Brad Calder’s New Windows Azure Storage Features – May 2009 post of 5/28/2009 briefly describes the following new Blob, Table, and Queue storage features:

  • Entity Group Transactions for Windows Azure Tables
  • Copy Blob for Windows Azure Blob
  • Get Block List for Windows Azure Blob
  • API Versioning
  • Unicode Property Names for Tables
  • PartitionKey and RowKey sizes of up to 1K characters
  • Future changes to max timeouts for Azure storage
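
To make the Table changes concrete, here’s a small Python validator for the rules an entity group transaction must satisfy. The 1K-character key limit comes from the list above; the 100-entity batch limit is my recollection of the MSDN documentation, so treat it as an assumption:

```python
MAX_KEY_CHARS = 1024      # PartitionKey and RowKey may now be up to 1K characters
MAX_BATCH_ENTITIES = 100  # assumed per-transaction entity limit

def validate_entity_group(entities):
    """Check a list of {"PartitionKey": ..., "RowKey": ...} dicts against
    the entity-group-transaction rules: a non-empty batch of bounded size,
    a single shared PartitionKey, and keys within the new 1K limit."""
    if not entities or len(entities) > MAX_BATCH_ENTITIES:
        return False
    if len({e["PartitionKey"] for e in entities}) != 1:
        return False  # all entities in the batch must share one partition
    return all(len(e["PartitionKey"]) <= MAX_KEY_CHARS and
               len(e["RowKey"]) <= MAX_KEY_CHARS for e in entities)
```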

See the details of the release of the new Windows Azure SDK (May 2009 CTP) on 5/28/2009 in the Azure Infrastructure section.

For more information about these new features, see the MSDN documentation here:
http://msdn.microsoft.com/en-us/library/dd179355.aspx

In addition, May 2009 versions of the Windows Azure Storage Table and Blob White papers, with information about these features, will be pushed here soon:
http://msdn.microsoft.com/en-us/azure/cc994380.aspx

Steve Marx will have some examples on using the above new functionality on his blog soon:
http://blog.smarx.com/

Important: According to Jim Nakashima’s May CTP of the Windows Azure Tools and SDK - Now Supports Visual Studio 2010 Beta 1 post of 5/28/2009:

You can now use Visual Studio 2010 Beta 1 to build your Cloud Services.  There are a couple of interesting things to note however:

  • The Windows Azure Cloud does not yet support .Net Framework 4.0 – the tools will always create Web and Worker roles that target .Net Framework 3.5. The tools will complain if you try to build a Role project that targets .Net Framework 4.0.
  • Visual Studio 2010 compatible samples are available here

Dmitry Lyalin describes his Azure Storage with Chris Rolon podcast of 3/25/2009:

In this Episode we speak to Chris Rolon of Neudesic Corporation about Windows Azure Storage. Chris gives an overview of the various ways one can store data in Windows Azure and provides insight on his experiences working with the platform.

The podcast is also discussed in this thread in the Windows Azure Forum.

SQL Data Services (SDS)

<Return to section navigation list> 

Alin Irimie’s SQL Data Services. Your Database in the Cloud post of 5/27/2009 is a comprehensive independent overview of the forthcoming SQL Data Services for Windows Azure.

David Robinson’s Scaling Out with SQL Data Services post of 5/27/2009 has a link to a public Tech*Talk of the same name with Rick Negrin.

.NET Services: Access Control, Service Bus and Workflow

 <Return to section navigation list>

Caleb Baker takes on this week’s Id Element chore in his 00:21:15 Channel 9 video segment, Caleb Baker on Geneva Server and SAML 2.0 Interoperability:

In this episode Caleb Baker, Sr. SDET on the Federated Identity team, discusses Geneva Server beta 2 and SAML 2.0 interoperability. Caleb was instrumental in testing Geneva Server with Sun’s OpenSSO Enterprise and Novell’s Access Manager products. The whitepapers for these interop tests are available here. Caleb also demonstrates how to configure Geneva Server both as a SAML Identity Provider and a Service Provider.

Vaneta Tashev shows you how to set up CardSpace issuance in the “Geneva” Team’s blog with Information Card Issuance: a small step for "Geneva" Server, a big leap for Federated Identity of 5/29/2009.

There’s no news so far whether the Windows Azure SDK (May 2009 CTP) fixed Windows Azure’s incompatibility with Geneva Beta 2.

Simon Davies’ Integration via the Cloud post of 5/27/2009 demonstrates using the .NET Services Service Bus to integrate similar Google App Engine and Windows Azure apps. The post contains a link to a video demonstration.

Matias Woloski reviews his “one hour presentation covering the basis of claim based identity and a demo of the work we did in this customer” in his Microsoft Architecture Day: Roadmap to Identity post of 5/27/2009:

The presentation started by telling a story about a typical company that created its first application. This app used Windows Authentication, so life was easy :). However, business grew over time and they ended up with multiple applications, each one with its own identity repository, different authentication methods, support for users accessing from intranet, internet, extranet, cloud, and so on and so forth. I grabbed this diagram from Stuart Kwan’s presentation from the Geneva Product Team (so thanks, Stuart, for the cool representation).

Where’s the video? And how about a Spanish-to-English translation of the graphics for those who no habla Español?

His Scenario: Token Exchange when you can’t change the client post summarizes a very interesting article published in the Identity issue of The Architecture Journal, which discusses different patterns in the federated-identity world. Last week Matias had an interesting requirement to solve in a project, and this article came to mind. Specifically, one of the scenarios the article points out is when you don’t want the consumer to have knowledge of the relying party’s STS, or you don’t want the client to “see” the token from that STS. The figure below illustrates this scenario: the relying party (service) receives a token from STS A and calls STS B to obtain the token.

Dmitry Sotnikov’s Active Directory, Exchange Online, and Azure applications post of 5/27/2009 reports that the video of his TEC session about IT professionals’ views of identity management and AD-integration for Exchange Online and Windows Azure has been posted.

Danny Garber announces the BizTalk .NET Services (ISB) Adapter in his New BizTalk WCF Custom Adapters for Azure Services Platform coming soon post of 5/27/2009. See the full monte in the Live Windows Azure Apps, Tools and Test Harnesses section.

Kalyan Bandarupalli’s .NET Access Control Service post of 5/25/2009 is a brief introduction to ACS:

.NET Access Control Service allows you to use authentication and authorization services from external sources that are maintained by security experts. Security experts control the authentication and issue the token to the application. The application just uses those tokens, avoiding the authentication process.

My CheckScoreWorkflow Sample Appears to Return Wrong Answer post of 3/24/2009 was updated on 3/25/2009 with more typos found in the Readme.htm document. The .NET Services sample apps team needs to hire a professional editor, a proofreader, or both.

Live Windows Azure Apps, Tools and Test Harnesses

<Return to section navigation list> 

The Windows Azure Team’s Changes to Alerts, Analytics, and Windows Live ID Integration post of 5/28/2009 announces a “set of changes to the Azure Services Developer Portal to alerts, analytics, and Windows Live ID integration” that went live on 5/28/2009.

The Cloud Computing Tools Team’s Now Available: May 2009 CTP of the Windows Azure Tools and SDK post of 5/28/2009 has links to:

  • Release notes/known issues are available here
  • Visual Studio 2010 compatible samples are available here

Olivier Meyer’s Videos from Tech-Ed: Building Location-Aware Services with SQL Server 2008 Spatial - WiE Demo and Code Walkthroughs post of 5/22/2009 reports:

Our session on WiE (Building location-aware services using SQL Server 2008) was recorded and is available on the Tech-Ed web site for viewing by those who attended Tech-Ed. Please visit Tech-Ed Online to view any of the sessions from this year’s Tech-Ed.

Unfortunately, I am not allowed to post the entire recording of the session directly on my blog. That said, like any good database guy, I have a backup plan: I had made some recordings of the demos before the show as a backup in case the demo machine failed during the show. These are posted as part of this blog article (hoping that the video quality survives the posting process).

A direct link to the Tech*Ed video is here. The video demonstrates the WiE project and source code posted to CodePlex that Olivier describes in his 12/19/2008 post. Hopefully, he’ll update the project to the upcoming SQL Data Services release.

Danny Garber announces New BizTalk WCF Custom Adapters for Azure Services Platform coming soon in this 5/27/2009 post:

During my session at TechEd 2009 in Los Angeles last week, I demonstrated an implementation of a pretty compelling S+S scenario, during which I connected on-premise ESB and BizTalk Server 2009 components to the Microsoft Azure Services Platform using the new BizTalk LiveMesh Adapter and BizTalk .NET Services (ISB) Adapter.

TechEd attendees can watch the recorded session at the following link: Introducing the Microsoft Integration Server: BizTalk Server 2009.

After the session, talking to many of the attendees who watched my demo, I've decided not only to extend the BizTalk "Cloud" Adapters to include other Azure Services components such as SQL Data Services and .NET Routing and Queue Services, but also to make these adapters available to the public through our open source community (CodePlex) or similar.

Azure Infrastructure

<Return to section navigation list> 

••• Simon Davies points to a list of certificates of ISO/IEC 27001:2005 conformance for many Microsoft data centers in his Securing Microsoft’s Cloud Infrastructure post of 5/29/2009. The list includes the Quincy, WA (USA Northwest) and San Antonio, TX (USA Southwest) Azure data centers.

The certificates are issued by the British Standards Institution (BSI Group) and are effective 5/15/2009. BSI Group has an ISO/IEC 27001:2005 Information Security page with a link to Steps to certification. Wikipedia describes the certification process and provides links to additional resources in its ISO/IEC 27001 article:

Organizations may be certified as compliant with ISO/IEC 27001 by a number of Accredited Registrars worldwide. Certification against any of the recognized national variants of ISO/IEC 27001 (e.g. JIS Q 27001, the Japanese version) by an accredited certification body is functionally equivalent to certification against ISO/IEC 27001 itself. Certification audits are usually conducted by ISO/IEC 27001 Lead Auditors.

In some countries, the bodies which verify conformity of management systems to specified standards are called "certification bodies", in others they are known as "registration bodies", "assessment and registration bodies", "certification/ registration bodies", and sometimes "registrars".

Simon also points out that “another white paper that discusses security and reliability as it relates to Microsoft Online Services has been published here.”

Reuven Cohen proposes Reintroducing the Universal Compute Unit & Compute Cycle, which he calls “A standard Cloud Performance Rating System” in this 5/31/2009 post. Ruv writes:

Recently I posed a question: is there an opportunity to create a common or standard Cloud Performance Rating System? And if so, how might it work? The feedback has been staggering. With all the interest in a Standardized Cloud Performance Rating System concept, I thought it was time to reintroduce the Universal Compute Unit & Compute Cycle concept.

Last year I proposed the need for a "standard unit of measurement" for cloud computing similar to that of the International System of Units or better known as the metric system. This unit of cloud capacity is needed in order to ensure a level playing field as the demand and use of cloud computing becomes commoditized. …

••• Brian Doerr’s Cloud Computing: Enter the “Stacker” guest post of 5/31/2009 on GigaOm discusses the need for tools to support “Horizontal Aggregation” and vertical aggregation (“The Stacker”). Brian concludes:

The opportunity is now. The evolution of existing IT components toward the stacker and the separation of virtual and physical design forces provides the opportunity to incorporate these controls into the building blocks of the future service provider cloud. I’d like to see the industry accelerate efforts to harden and standardize newly emerging concepts and protocols in these areas.

This is the third post in a 3-part series. Please also see Part 1, Cloud Computing: A System of Control, and Part 2, Cloud Computing: Building Blocks for the Enterprise.

••• My forthcoming Cloud Computing with the Microsoft Azure Services Platform title for Wrox has official cover art, if you consider the below to be “art”:

The book is scheduled to be available for sale at PDC 2009 (mid-November 2009).

• Alan Wilensky continues his putdown of PaaS vendors in his Cloud insanity – the Shills come out of the woodwork post of 5/30/2009. Alan makes the following point regarding the insurability of cloud-based PaaS:

What I do not recommend is that mid market businesses that have CLOB (capital line of business) applications, hosted on their own racks, or managed by a conventional, stable vendor, change to a cloud solution until the PAAS and SAAS providers get industry rating and certifications. The SNA shops knew this, and went through the in house/ hosted rating travail. The result? An industry in which any business owner can get insurance for business continuity disruption that is caused by IT systems failures.

If you are a mid sized business with an internal server rack, distributed multisite architecture, or a hosted AS400 or new IBM architecture, you can insure your operations. You can insure any Redhat, Microsoft, BEA, Websphere, whatever installation, managed and rated SAS 70, or hosted in your unairconditioned broom closet, but it will cost a little more. A nice underwriter will come to your place or your managed host’s place, and write a policy. …

If your business lines are damaged, taking crucial cash flow out of your pocket and raising the potential for civil liability (in cases of service-critical business), then you are truly screwed doubly, as there are no lines of underwriting that will insure a PAAS solution for anything but the actual costs of the outage.

However, it doesn’t appear to me that Alan’s arguments apply to cloud-based PaaS (or IaaS) hosted by very large and highly skilled organizations, such as Microsoft, Amazon, Google or Oracle/Sun, if they’re certified ISO/IEC 27001:2005 compliant, have attestations of SAS 70 compliance, and offer an adequate Service Level Agreement.

I wrote about SAS 70 certification in my SAS 70 Audits for Windows Azure and SQL Data Services? post of 3/9/2009. Although several readers of the Amazon Web Services: Overview of Security Processes article of 9/5/2008, which I mentioned in my post, have requested information as recently as 5/22/2009 regarding when AWS might complete SAS 70 certification, their questions remain unanswered.

••• Chris Hoff (@Beaker) casts a baleful eye on the federal-government cloud in his Incomplete Thought: Cloud Computing & Innovation - Government IT’s Version of Ethanol? post of 5/31/2009:

Ethanol was designed to resolve dependencies on straight fossil fuels.  It was supposed to cost less and deliver better performance at lower emissions.  It hasn’t quite worked out that way.  In reality, ethanol has produced many profound unanticipated impacts: financial, environmental, economic, political and social.  Has its production and governmentally-forced adoption kept better solutions from being properly investigated?

Despite my unbridled enthusiasm for Cloud Computing, I am conflicted as I examine it using a similar context to the ethanol example above.  I fully admit that I’m stretching the analog here and mixing metaphors, but the article got me thinking and some of that is playing out here.  It *is* an “incomplete thought,” after all.

••• Eran Feigenbaum, Director of Security, Google Apps, claims “Google Apps has satisfactorily completed a SAS 70 Type II audit” in his SAS 70 Type II for Google Apps post of 11/4/2008, but I don’t see any reference to the audit’s applicability to the Google App Engine.

••• Chenxi Wang’s Teleconference: A Close Look At Cloud Computing Security Issues PowerPoint presentation for a Forrester Research teleconference of 5/9/2009 provides detailed security recommendations, including SAS 70 and ISO/IEC 27001:2005 compliance.

•• Charlie McNerney’s Securing Microsoft’s Cloud Infrastructure post of 5/27/2009 announces:

The [24-page] white paper we’re releasing today describes how our coordinated and strategic application of people, processes, technologies, and experience with consumer and enterprise security has resulted in continuous improvements to the security practices and policies of the Microsoft cloud infrastructure.  The Online Services Security and Compliance (OSSC) team within the Global Foundation Services division that supports Microsoft’s infrastructure for online services builds on the same security principles and processes the company has developed through years of experience managing security risks in traditional software development and operating environments.

Independent, third-party validation of OSSC’s approach includes Microsoft’s cloud infrastructure achieving both [Statement on Auditing Standards] SAS 70 Type I and Type II attestations and ISO/IEC 27001:2005 certification. We are proud to be one of the first major online service providers to achieve ISO 27001 certification for our infrastructure. We have also gone beyond the ISO standard, which includes some 150 security controls. We have developed 291 security controls to date to account for the unique challenges of the cloud infrastructure and what it takes to mitigate some of the risks involved [Emphasis added].

Charlie is GM, Business & Risk Management, Microsoft Global Foundation Services. I would have expected more promo from the Azure group on this event.

•• Reuven Cohen’s A Standardized Cloud Performance Rating System post of 5/29/2009 asks “Is there a simple way to compare the performance, security and quality of various cloud computing providers?”

Unlike CPU or Storage, Cloud Computing is significantly more complex involving many different moving parts (deployment approaches, architectures and operating models). Defining one common standardized basis of comparison would be practically impossible. But within the various aspects of cloud computing there certainly are distinct areas that we may be able to quantify. The most likely starting point would be infrastructure related offerings such as compute and storage clouds.

•• Bill McNee analyzes the state of the SaaS market in his SaaS Vendors Starting to Feel the Effects of Tough Economy research report (site registration required) of 5/28/2009 for Saugatuck Technology:

Saugatuck recently developed two market baskets to help us compare how well SaaS companies are faring in the current economy relative to their on-premise software peers. While average Q109 revenue growth exceeded 25 percent for our small portfolio of SaaS firms, deferred revenues and billings have begun to slow (to 18 percent and 9 percent year-over-year, respectively), indicating that typical full-year revenue growth will dip below 20 percent for the year.

•• Andrea DiMaio’s The Four Facets of Web 2.0 in Government post of 5/29/2009 details these four facets:

    1. Internal (intra- or inter-government) collaboration
    2. Institutional presence on external social networks
    3. Open government data
    4. Employees on external social networks

and concludes:

Last but not least, the real power of web 2.0 will be realized when these four areas become one: when an employee will be networking with the citizens he or she serves as well as colleagues, and will be mashing up government and non-government content to post on his or her personal profile, which will become part of the institutional presence of government.

•• James Governor of Monkchips speculates in his Wherefore Java at Java One. Microsoft and OSS: Increment or Tipping Point? post of 5/29/2009 that Microsoft might announce in its Java One keynote that Azure will support Java:

Azure to support Java? The spectre of Google App Engine running Java looms large. From a Microsoft perspective the last thing it wants to see is the entire Java community swanning off into the arms of Google’s cloud. Enemy of my enemy and all that.

•• Sam Diaz provides a quick analysis of the White House’s cybersecurity event, with a Near Term Action Plan table outtake, in his Washington: Together, we can tackle cybersecurity post of 5/29/2009.

•• Melissa Hathaway, Cybersecurity Chief at the National Security Council, discusses securing our nation's digital future in her Securing Our Digital Future post of 5/29/2009:

… Protecting cyberspace requires strong vision and leadership and will require changes in policy, technology, education, and perhaps law.  The 60-day cyberspace policy review summarizes our conclusions and outlines the beginning of a way forward in building a reliable, resilient, trustworthy digital infrastructure for the future.  There are opportunities for everyone—individuals, academia, industry, and governments—to contribute toward this vision. 

During the review we engaged in more than 40 meetings and received and read more than 100 papers that informed our recommendations.   As you will see in our review there is a lot of work for us to do together and an ambitious action plan to accomplish our goals.  It must begin with a national dialogue on cybersecurity and we should start with our family, friends, and colleagues. …

This appears to be related to Pres. Obama’s announcement that he’s creating a cyberczar position to safeguard government computer security. Thanks to @Beaker for the heads-up.

•• Reuven Cohen’s Cloud Computing: Weapons of mass disruption post of 11/28/2008, which popped up as new in my RSS reader today, gives props to Microsoft’s hybrid Software + Services approach:

The idea of a hybrid computing environment, where some software aspects remain on your desktop and others are farmed out to the cloud, seems to resonate with a lot of the people I've been talking to lately. Microsoft's Photosynth is a prime example; they refer to this as Software + Services, and it actually makes a lot of sense.

Microsoft describes their Software + Services philosophy as "a combination of local software and Internet services interacting with one another. Software makes services better and services make software better. And by bringing together the best of both worlds, we maximize choice, flexibility and capabilities for our customers." Microsoft, at least from a “talk is cheap” point of view, seems to get it.

•• The Windows Azure Team announces May CTP of Windows Azure SDK Released, Including Visual Studio 2010 Support with the following new features in the SDK and Tools:

    • Enhanced robustness and stability
    • Improved Visual Studio integration with Development Fabric and Development Storage
    • Microsoft Visual Studio 2010 Beta 1 support
    • Update for Visual Studio 2008 support
    • Improved integration with the Development Fabric and Storage services to improve the reliability of debug and run of Cloud Services from Visual Studio

Both are available now on http://dev.windowsazure.com.

•• Craig Balding has doubts about Wayne Horkan’s proposal to set up a UK-specific cloud security forum, which he details in his No Country Left Behind: Sun UK CTO Pushes For UK Cloud Security Group post of 5/29/2009.

•• Tom Espiner’s Sun CTO to form cloud security forum post of 5/26/2009 for ZDNet UK describes Wayne Horkan’s attempt to “set up a cross-sector forum to resolve cloud-computing security issues.”

Cloud-computing systems could become as important as the UK critical national infrastructure, and they need to be secured in an appropriate manner, Wayne Horkan told ZDNet UK on Thursday. The Sun executive said he is working on setting up the forum alongside organisations such as the CBI, Microsoft and Accenture; government departments such as Berr, Dius and the Treasury; and the government's chief scientific advisor, Professor John Beddington.

"I'm concerned about the security of the supply," Horkan said at the Cloud Expo Europe conference in London. "If cloud computing becomes a utility, it's important to me that the UK as a nation state has good security of supply. It's important that the UK has the appropriate capability in cloud computing."

•• Leena Rao reports Ray Ozzie Asserts Microsoft’s Position In The Cloud at the J.P. Morgan Technology, Media and Telecom Conference held in Boston on 5/20/2009. Leena writes:

Microsoft’s Chief Software Architect Ray Ozzie made some interesting predictions on the future of cloud computing at J.P. Morgan’s Technology, Media and Telecom Conference in Boston today (see below for the full transcript). Ozzie says that while the IT community is in the very early stages of cloud computing innovations, the future for companies’ data hosting will be in the mixture of cloud computing and on-premise data centers.

Her post contains a complete transcript of Ray’s presentation.

•• Eric Chabrow’s DISA's Cloud Computing Initiatives post of 5/28/2009 contains the transcript of an interview with Henry Sienkiewicz, technical program advisor in the Defense Information Systems Agency's Computing Services Directorate:

Cloud computing is among the hottest topics in the federal government, with its efficiencies promising to save agencies - and eventually taxpayers - money. Despite its attractiveness, few agencies have implemented any type of cloud computing initiative, mostly because of IT security concerns.

The Defense Information Systems Agency is among the few government agencies actively involved in cloud computing.

Helping lead its efforts is Henry Sienkiewicz, technical program advisor in DISA's Computing Services Directorate. He sees cloud computing as another way information technology can serve the nation's war fighters by finding appropriate innovations and introducing them as rapidly as they can be secured.

•• James Urquhart posits Cloud is an operations model, not technology in this 5/28/2009 post:

When you run an application in a public or private cloud, there is no "cloud layer" that your software must pass through in order to leverage the physical infrastructure available to it. In the vast majority of cases, there is probably some virtualization involved, but the existence of hypervisors clearly does not make your data center resources into a cloud. Nor is the fact that Amazon EC2 uses Xen hypervisors the reason that they are a cloud.

What makes a cloud a cloud is the fact that the physical resources involved are operated to deliver abstracted IT resources on-demand, at scale, and (almost always) in a multi-tenant environment. It is how you use the technologies involved. For the most part, cloud computing uses the same management tools, operating systems, middleware, databases, server platforms, network cabling, storage arrays, and so on, that we have come to know and love over the last several decades.

•• Adrian Seccombe and Jim Reavis announced Jericho Forum and Cloud Security Alliance Join Forces to Address Cloud Computing Security in this 5/27/2009 press release:

London and San Francisco, 27 May 2009 – Jericho Forum, the high level independent security expert group, and the Cloud Security Alliance, a not-for-profit group of information security and cloud computing security leaders, announced today that they are working together to promote best practices for secure collaboration in the cloud.  Both groups have a single goal: to help business understand the opportunity posed by cloud computing and encourage common and secure cloud practices. Within the framework of the new partnership, both groups will continue to provide practical guidance on how to operate securely in the cloud while actively aiming to align the outcomes of their work.  

Joe McKendrick’s Insurers May Not Fully Embrace Cloud, But They Can Help Protect It post of 5/26/2009 to Insurance Networking News begins:

In a post last month, I talked about issues that may hold insurance companies back from fully embracing the cloud computing model for various parts of their businesses, especially where complex workflow transactions are involved.

However, there is a vast business opportunity in cloud computing for the industry, one that will play a vital role in building confidence in this emerging computing model. The Hartford, for example, now offers insurance to companies using cloud-based services.

and continues with an analysis of insurance requirements for “businesses to protect themselves against ‘information malpractice.’"

Peter Mell updated the new NIST Computer Security Division, Computer Security Resource Center’s Cloud Computing site on 5/22/2009. The page has links to official versions of NIST’s cloud computing definition documents.

This material is public domain although attribution to NIST is requested. It may be freely duplicated and translated.

Comments on the definition can be sent to the email address: "cloud" at "nist" with a dot "gov" at the end.

• Alan Wilensky puts down Those Sincere yet hilarious PAAS People! in his 5/26/2009 post:

Assuring people that your cloud, PaaS, or SaaS solution is just great is no reassurance at all – it MAY work great, and MAY be reliable MOST of the time, but if the company and the application are not rated and certified, and if your business’ books are not open to any third party (so as to ascertain liquidity), such reassurances are just whitewash. See the original WorkXpress blog post here.

Alan continues with a reply to the WorkXpress post.

• Bill Brenner’s Google FAIL and the Fog Over Cloud Security article of 5/26/2009 for ITWorld begins:

Late last year, when I interviewed Google Apps senior security manager Eran Feigenbaum and his marketing partner, Adam Swidler, they talked up Google's place in cloud computing and how it was in a prime position to make a difference with cloud security. [ Four Questions On Google App Security]

But when Google suffers a massive outage as it did last week -- followed by another one Monday -- people can't help but have their doubts.

• Kenneth Oestereich writes in his Building a Real-World IaaS Cloud Foundation post of 5/26/2009:

Whenever we see a "stack diagram" of cloud architectures, most conversation centers on higher-level layers like defining what "Platform-as-a-Service" or "Software-as-a-Service" is. But underneath all of these diagrams is always a foundation layer called "Infrastructure-as-a-Service" (IaaS) or sometimes Hardware-as-a-Service (HaaS). IaaS doesn't get much attention, but it's really the critical factor to being able to provide reliability and basic services to all of the other layers. 

Andrea DiMaio asks (and answers) Could the cloud kill innovation? So what? in this 5/25/2009 post. He concludes:

While services available through a public cloud provided by a vendor are likely to be constantly innovated as a consequence of competitive pressures, those that governments embed in their own private clouds may not have enough incentives to be innovated over time. This argument has been used already in the past for government shared services, but becomes even more pertinent for cloud services.

But, after all, is that a bad thing? On the path to increasing commoditization, the best future of a private government cloud may not be a better private government cloud: it may actually be a public cloud.

Dana Gardner says “Suddenly, cloud computing is the dominant buzzword of the day, but the current confluence of trends includes much more” in his What's Next in IT: Corporate Flat Line or Next Renaissance? post of 5/25/2009, in which a

[P]anel of analysts … help dig into this current and budding new era of IT: Jim Kobielus, senior analyst at Forrester Research; Tony Baer, senior analyst at Ovum; Brad Shimmin, senior analyst, Current Analysis; Joe McKendrick, independent analyst and ZDNet blogger, and Ron Schmelzer, senior analyst at ZapThink. The chat is moderated by [Dana], as usual.

The post includes a link to a full transcript of the podcast.

Avi Cohen debunks The Cloud-Computing Myth for Forbes.com’s “Intelligent Investing” column in this 5/25/2009 post, subtitled “There's always a new era of ‘network computing’ around the corner, but we won't reach it soon.” Avi opines:

Let me save you some suspense: I believe the hype of cloud computing will once again taper off, even with advancements in Internet applications and improvements in connectivity the past few years. Bandwidth constraints and the growing cost of incremental traffic will partly be to blame (this is not a trivial hurdle; carriers grapple with an inability to charge by usage). The concept will also fail because of the complexity of maintaining and supporting so many remote devices and the costs of unavoidable outages; because part of a business' competitive advantage is its operational customization of off-the-shelf software; and because of regulatory hurdles and restrictions not often considered part of the discussion.

Rick Mullin concludes in his The New Computing Pioneers article of 5/25/2009 for Chemical & Engineering News: “With in-house information technology burdened to the breaking point, the traditionally conservative drug industry is putting cloud computing to the test.” Rick continues:

Having spent the past five years catching up to other industries in the deployment of enterprise software systems that link researchers and laboratories companywide, big drug firms are now starting to push data storage and processing onto the Internet to be managed for them by companies such as Amazon, Google, and Microsoft on computers in undisclosed locations.

Pfizer, Eli Lilly & Co., Johnson & Johnson, and Genentech are among the drugmakers that are piloting into an emerging area of IT services called cloud computing, in which large, consumer-oriented computing firms offer time on their huge and dispersed infrastructures on a pay-as-you-go basis. These drug companies are among the first to gauge the cost- and time-saving pros and the potential management and security cons in this largely uncharted territory.

I spent 10 years in the specialty chemicals business (polyurethane foams and elastomers) and read C&EN faithfully every week.

William Hurley’s The battle between public and private clouds post of 5/25/2009 for InfoWorld asks “When companies start building private clouds, will cloud providers eventually lose out?” Hurley answers:

I think companies like Amazon will start seeing less and less opportunity in the enterprise space, while hardware vendors like Cisco and their "unified computing" offering see more and more opportunities to make clouds rather than manage them. I think the large storage companies will win out. Cloud computing's biggest challenges at the enterprise level are the various rulings, regulations, and good ol' corporate bureaucracy surrounding where data resides. Many enterprises interested in the cloud's benefits are hesitant to make the switch due to the legal ramifications. With that in mind, I see the large storage vendors like EMC and NetApp playing an obvious role, but I also see a role for some of the newer startups like Fusion-io and Storspeed.

Darryl Plummer questions Can the Cloud Return Us to Growth? in this 5/24/2009 Gartner blog post.

In a recent article published at Forbes.com, HP executive Russ Daniels penned an interesting piece called A Cloud In Every Garage. I have to admit that on reading the title, I thought I was in for a train wreck. The article looked to be positioned to follow the same misguided notions of “a cloud” as just another piece of infrastructure that is becoming so commonplace with vendors and the customers who listen to them (i.e., the customers who will follow them like lemmings right off the cliff into the next generation of vendor lock-in). I sat back to read it and was ready to write a rebuttal that explained that if cloud computing is just about next-generation infrastructure (and buying into vendor “clouds”) then what is the big deal? I mean, advanced data centers have done that for quite a while now. And even more, virtualization customers have had this capability for some time as well.

Alin Irimie announces COM and Windows Azure. Good News! in this 3/24/2009 post. Alin then moderates his title:

Some good news here, or bad, depending on how you look at it: COM is not supported in Windows Azure. Azure VMs don’t have the COM runtime installed. Currently, the only native code supported in Windows Azure is the standard C++ library and the standard Win32 API. There’s no MFC, COM, or COM+.

And explains how to expose the C++ library to an Azure service.

Cloud Computing Events

<Return to section navigation list> 

Sun Microsystems’ CommunityOne conference offers 15 cloud computing sessions on 6/1/2009. As you’d expect, most are presented by Sun luminaries, such as Tim Bray. My recommendation is Standardizing the Cloud: Balancing Lock-in and Innovation in the Cloud at 5:00 PM by RedMonk’s James Governor and Stephen O’Grady.

Watch live streaming video of six of the sessions here:

  • 10:50 - 11:40 am: Project OpenSolaris™ Dynamic Service Containers and Nimsoft Service-Level Management – Jason Carolan and Robert Holt, Sun Microsystems, Inc.
  • 11:50 am - 12:40 pm: Programming Languages and the Cloud – Ted Leung, Sun Microsystems, Inc.
  • 1:40 - 2:30 pm: Sun Cloud APIs Birds of a Feather – Tim Bray and Craig McClanahan, Sun Microsystems, Inc.
  • 2:40 - 3:30 pm: Practical Cloud Computing Patterns – Scott Mattoon, Ken Pepple, and John Stanford, Sun Microsystems, Inc.
  • 4:00 - 4:50 pm: Navigating a World of Many Clouds – Kevin Clark and Rajesh Ramchandani, Sun Microsystems, Inc.
  • 5:00 - 5:50 pm: Standardizing the Cloud: Balancing Lock-in and Innovation in the Cloud – Stephen O'Grady, RedMonk

When: 6/1 – 6/3/2009 
Where: Moscone Center, San Francisco

Sun Microsystems’ JavaOne conference offers 24 cloud-related sessions on 6/2 – 6/5/2009. My recommendation is Using Java™ Technology in the Windows Azure Cloud via the Metro Web Services Stack with Sun’s Harold Carr and Microsoft’s Clemens Vasters on 6/3/2009 from 11:05 AM to 12:05 PM.

When: 6/2 – 6/5/2009 
Where: Moscone Center, San Francisco

IEEE Malta Section announces that The Third International Conference on Advanced Engineering Computing and Applications in Sciences, to be held October 11 – 16, 2009 at Sliema, Malta, will include a Cloud Computing Track. Microsoft Research is one of the sponsors.

When: 10/11 – 10/16/2009 
Where: Sliema, Malta (Nice venue!)

••• Diego Mariño says Welcome to CloudCamp Barcelona - June 15th, 2009:

CloudCamp is an unconference where early adopters of Cloud Computing technologies exchange ideas. With the rapid change occurring in the industry, we need a place we can meet to share our experiences, challenges and solutions. At CloudCamp, you are encouraged to share your thoughts in several open discussions, as we strive for the advancement of Cloud Computing. End users, IT professionals and vendors are all encouraged to participate.

Our second CloudCamp in Barcelona will be June 15th, 2009

When: 6/15/2009
Where: BarcelonaActiva (link), Carrer de la Llacuna, 162 (map) 08018, Barcelona, Spain

Brent Stineman announces a Clouds and Collaboration training event/seminar in Minneapolis-St. Paul for “People defining Business and IT strategy, CIOs or people reporting to the CIO, Enterprise Architects, IT people interested in innovation”:

Turbulent times are also times of Opportunity. Companies are looking to survive but also to innovate and compete. Increasing the opportunities for innovation and improving productivity are essential. Improving Collaboration AND using Cloud computing will help you address these needs. At this event we welcome you to hear about these trends, and learn about what it can mean for your business.

When: 6/3/2009 3:00 PM CDT 
Where: Greater Minneapolis-St. Paul, MN 55437 US

Greg Ness moderates the FIRE Conference: Today's Networks Need to Embrace Automation according to his FIRE Infrastructure 2.0 Panel Now Viewable Online post of 5/28/2009:

This panel features Richard Kagan from Infoblox, Mark Thiele from VMware, Erik Giesa from F5 Networks and Doug Gourlay from Cisco. Infoblox’s Greg Ness is the moderator. It is about 35 minutes long.

Key message: Today’s networks are run like yesterday’s businesses and factories because they have not embraced automation. Those high-cost networks, today managed by legions of clerks, will become chokepoints as enterprises embrace virtualization and cloud. Jobs and wealth will follow automation, just as they followed investments in just-in-time manufacturing.

Cisco's Gourlay: Cloud could break the Internet if not deployed on capable networks.


David Pallman announces May 2009 Orange County Azure User Group Meeting Tonight in his 5/28/2009 post:

The May 2009 Azure User Group meeting in Orange County is Thursday May 28th and will be about Azure Design Patterns.

The topic for the next Azure User Group meeting is Design Patterns for cloud computing on the Azure platform. As we've seen in prior months, Azure provides oodles of functionality that spans application hosting and storage to enterprise-grade security and access control. Design patterns help you think about these capabilities in the right way and how they can be combined into composite applications. We'll cover design patterns for hosting, data, communication, synchronization, and security as well as composite application patterns that combine them. We'll be doing hands-on code demos of a number of composite applications, including a grid computing application.

If attending, please be sure to RSVP at the link below so we can properly plan ordering of pizza and beverages. We hope to see you there!

RSVP Link: http://www.clicktoattend.com?id=138087

When: 5/28/2009 6:00 to 8:00 PM PDT
Where: QuickStart Intelligence, 16815 Von Karman Ave, Suite 100, Irvine, CA

•• Tim Long announces a Free Microsoft Web & Cloud Computing technical event in Swansea, June 10th 2009:

UK ISV Developer Evangelism Team : Announcing free Microsoft Web & Cloud Computing technical event in Swansea, June 10th 2009

The briefing will include the following technologies and topics: Building rich, interactive and scalable web applications with ASP.NET and Silverlight; SQL Server, Data Mining, Business Intelligence, and designing high performing data access systems; Windows Azure and Microsoft’s utility cloud computing platform.

When: 6/10/2009
Where: Conference Centre, Technium 2, Kings Road, Swansea, SA1 8PH

Nuno Filipe Godinho summarizes the presentation by Luis Martins, Architect Advisor from Microsoft Portugal, in his Cloud Computing Conference 2009 – Overview of the Windows Azure and the Azure Services Platform post of 5/28/2009. He also has links to similar summaries of other sessions at the Cloud Computing Conference 2009 being held in Porto, Portugal 5/28 – 5/29/2009.

When: 5/28 – 5/29/2009
Where: Porto, Portugal

SYS-CON’s SYS-CON Announces Government IT Conference & Expo post of 5/27/2009 describes the new addition to their conference roster:

SYS-CON Events announced today the latest event in its innovative series of real-world technology conferences, Government IT Conference & Expo, a two-day deep dive into the new wave of Internet-based technologies that are changing the way that Federal agencies leverage, procure and utilize IT. GovITExpo, which is being held October 5-6, 2009 in Washington, DC, builds on the success of SYS-CON's Cloud Computing Expo, the fastest-growing conference anywhere in the world devoted to the delivery of massively scalable IT as a service using Internet technologies. Data storage, security and software services are among the major themes of the technical program.

When: October 5-6, 2009
Where: Washington, D.C.

• David Pallman will be Speaking at San Diego .NET User Group 5/26 about “the Azure developer experience. Directions can be found here: http://www.sandiegodotnet.com/.”

Bob Evans’ Interop Raises CIO-Level Interest In Cloud, Virtualization: Video post of 5/25/2009 begins:

While last week's Interop event in Las Vegas showcased a wide range of innovative enterprise technologies including mobility, WAN optimization, security, storage, smartphones, data center infrastructure, servers, and more, the biggest CIO-level activity was centered around cloud computing, whose acceptance and potential are growing rapidly, and virtualization, which has already become a cornerstone in 21st-century enterprise IT strategy.

Jeffrey M. Kaplan performs a post-mortem of the two-day Enterprise Cloud Summit at Interop in his Cloud Computing: Recapping a Week of IT Events post of 5/25/2009 to the Seeking Alpha site. Jeffrey concludes:

I came away from the event with the firm impression that cloud computing has captured the attention of the mainstream of IT/network professionals who are the mainstay attendees of Interop. They recognize that cloud computing can have an impact on the way they acquire and utilize computing resources. They see it as a potential threat and opportunity in their infrastructure environment, and want to better understand how it works and how to maximize its benefits.

Other Cloud Computing Platforms and Services

<Return to section navigation list> 

••• Dana Gardner’s The Cloud Can Set Coders Free post of 5/31/2009, subtitled “BriefingsDirect Analysts Unpack PaaS, Predict Future Impact on Enterprises and Developers,” provides links to a podcast and a full transcript of the discussion:

In a podcast featuring a panel of industry thought leaders, moderated by yours truly, we offer new insight into the current status of cloud offerings and the future need for open standards and governance. Who is using the cloud for what -- and where this trend is going -- are discussed as the podcast panelists unpack the Platform as a Service (PaaS) concept in BriefingsDirect Analyst Insights Edition, Volume 40.

Brady Forrest reports Amazon Hosts TIGER Mapping Data in this 5/29/2009 post to O’Reilly Radar:

Amazon is now hosting all United States TIGER Census data in its cloud. We just finished moving 140 gigs of shapefiles of U.S. states, counties, districts, parcels, military areas, and more over to Amazon. This means that you can now load all of this data directly onto one of Amazon’s virtual machines, use the power of the cloud to work with these large data sets, generate output that you can then save on Amazon’s storage, and even use Amazon’s cloud to distribute what you make. …

The TIGER data is one of the first Public Data Sets to be moved off of S3 and onto an EBS volume. Running as an EBS volume means users can mount it on an EC2 instance as a drive and easily run their processes (like rendering tiles with Mapnik) against the data remotely. If you're a geo-hacker, this makes a rich set of geo data readily available to you without consuming your own storage resources or dealing with the normally slow download process.

Reuven Cohen delivers a concise description of Google Wave in his Google Jumps into the Cloud Wave (AJAX over XMPP) post of 5/28/2009.

James Hamilton provides a comprehensive list of his Perspectives articles in his Select Past Perspectives Postings post of 5/28/2009. Categories are:

  • Talks and Presentations
  • Data Center Architecture and Efficiency
  • Service Architectures
  • Storage
  • Server Hardware
  • High-Scale Service Optimizations, Techniques, & Random Observations

Altogether an amazing set of resources.

Reuven Cohen’s unconfirmed RumorMill: Amazon to Open Source Web Services API's post of 5/28/2009 begins:

I usually try to avoid posting rumors, but this one is particularly interesting: I first heard about it a few weeks back but recently had independent confirmation. Word is Amazon's legal team is currently "investigating" open sourcing their various web services APIs, including EC2, S3, etc. (The rumor has not been officially confirmed by Amazon, but my sources are usually pretty good.)

If true, this move makes a lot of sense for a number of reasons. First and foremost, it would help foster the adoption of Amazon's APIs, which are already the de facto standards used by hundreds of thousands of AWS customers around the globe, thus solidifying Amazon's position as the market leader.

Lydia Leong “catches up on some commentary” with her Amazon’s CloudWatch and other features post of 5/28/2009:

Amazon recently introduced three new features: monitoring, load-balancing, and auto-scaling. (As usual, Werner Vogels has further explanation, and RightScale has a detailed examination.)

The monitoring service, called CloudWatch, provides utilization metrics for your running EC2 instances. This is a premium service on top of the regular EC2 fee; it costs 1.5 cents per instance-hour. The data is persisted for just two weeks, but is independent of running instances. If you need longer-term historical graphing, you’ll need to retrieve and archive the data yourself. There’s some simple data aggregation, but anyone who needs real correlation capabilities will want to feed this data back into their own monitoring tools.
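The per-instance-hour pricing makes monitoring costs easy to estimate. A back-of-envelope sketch (the $0.015 rate comes from the post above; the fleet sizes are made-up examples):

```python
# Back-of-envelope CloudWatch monitoring cost, using the 1.5 cents
# per instance-hour rate quoted above. Pure arithmetic; no AWS calls.

CLOUDWATCH_RATE_PER_INSTANCE_HOUR = 0.015  # USD, per the post

def monthly_monitoring_cost(instances, hours_per_day=24, days=30):
    """Cost of monitoring a fleet of EC2 instances for one month."""
    hours = instances * hours_per_day * days
    return round(hours * CLOUDWATCH_RATE_PER_INSTANCE_HOUR, 2)

# One instance monitored around the clock: 720 hours x $0.015
print(monthly_monitoring_cost(1))   # 10.8
print(monthly_monitoring_cost(10))  # 108.0
```

So continuous monitoring roughly adds $10.80 per instance per month on top of the base EC2 fee.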

James Urquhart adds his view about the Google-Salesforce partnership in his Google App Engine gets the Force.com post of 5/27/2009:

With this announcement, Salesforce.com has gained a Java platform to complement its own Force.com platform, which relies on the proprietary APEX programming language. Given the high penetration of Java in the enterprise applications market, this could open significant new doors for Salesforce.com among developers.

It should also be noted that this is not the first joint deliverable between Salesforce.com and Google, with connectivity between Force.com and Google Apps being announced last month. I'm keeping an eye on the two companies, as they are increasingly building a tight partnership, and have the potential of blurring the line between their respective partner and customer communities, creating potentially the largest cloud developer platform ecosystem by far.

Krishnan Subramanian’s Platform Interoperability: Force.com Talks To Google App Engine post of 5/28/2009 posits:

Yesterday, Salesforce.com announced a new version of Force.com for Google App Engine. This is a set of developer tools that helps developers build web apps on Google App Engine and then tap into the data on Salesforce.com. This opens up interesting possibilities for the developers and, especially, to the enterprises. With the release of Java support in Google App Engine, more and more enterprise applications are on the horizon. If these applications can use the data stored in the Salesforce.com Cloud, it will immensely benefit the enterprise customers.

Reuven Cohen reports Web Hosts Start to Feel Cloud-Based Revenue Leakage in this 5/27/2009 post, which also announces that “Enomaly ECP Cloud Service Provider Edition Will Launch June 1st.” The gist:

Recently we've had the chance to speak with a broad group of traditional web hosting and managed data center providers about the opportunities for cloud computing and infrastructure as a service platforms within their existing environments. In those conversations it has become interesting to see how our pitch has evolved.

Lately in order to prove our point, all we need to do is tell the web hosters to monitor any traffic to and from Amazon Web Services. What this AWS traffic represents is revenue leakage or lost revenue opportunities. It has become obvious that for hosting companies the cloud has little to do with efficiency or data center optimization but more to do with recapturing revenue lost to Amazon and other cloud infrastructure providers.
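Reuven’s “monitor any traffic to and from Amazon Web Services” test is straightforward to sketch. This is a minimal, hypothetical example: the CIDR blocks and log entries below are illustrative placeholders, not Amazon’s actual published address ranges.

```python
import ipaddress

# Hypothetical example ranges standing in for AWS's address space;
# a real deployment would substitute observed or published AWS blocks.
AWS_RANGES = [ipaddress.ip_network(c) for c in ("72.21.192.0/19", "174.129.0.0/16")]

def is_aws_traffic(ip: str) -> bool:
    """Flag a flow endpoint that falls inside the (assumed) AWS ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in AWS_RANGES)

# Tally "revenue leakage" from a NetFlow-style log of destination IPs
flows = ["174.129.10.5", "192.0.2.44", "72.21.200.1"]
leaked = [ip for ip in flows if is_aws_traffic(ip)]
print(len(leaked))  # 2
```

Any destination matching the ranges represents a customer workload that left the hoster for Amazon's cloud.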

NASA’s Nebula Cloud Computing Platform Web site was updated on 5/20/2009 and NASA is now accepting e-mail registrations for a limited beta. A Nebula architecture diagram is buried in the Services section of their site.

John Treadway wrote in his NASA’s NEBULA - Enterprise Cloud Computing for Rocket Scientists and Us… post of 5/26/2009:

The core virtualization and cloud services layer are provided by Eucalyptus, the Amazon Web Services open source clone.  Storage is provided by the open source Lustre clustering file system, while the core application development framework is the Python-based Django project.  Note that Google took serious bashing for releasing their AppEngine framework initially with only Python support.  Their IDE is an integrated stack of Subversion (source code control), Trac (bug tracking) and Agilio “agile development” project management tool set.   Lastly, the content repository is searchable using the Solr framework on top of the Apache Lucene search engine.

When this is released to the general public as an open source project, will this be solid competition vs. commercial enterprise cloud frameworks such as 3tera’s AppLogic?  Is this even a valid question?  I’ll see if I can find out…

• Scott Morrison’s SOA Governance Determines Success in the Cloud post of 5/26/2009 on the GigaOm blog warns:

The high-profile success of services such as Salesforce.com and Amazon Web Services has led many businesses to undertake cloud computing initiatives. Moving to “the cloud,” however, entails a variety of security, management and compliance risks that corporate executives may be unwilling to assume without having the proper governance mechanisms in place.

Anthony Franco asks Could Apple Enter the Cloud Computing Market? (And if so, could buying Adobe help?) in this 5/25/2009 post. Anthony continues:

This is total speculation, but I believe it would be very possible for Apple to enter the cloud computing space. A recent article in the Charlotte Observer (found via a Mac Rumors article) claims North Carolina is offering Apple a huge tax break to set up a billion-dollar server farm in the state.

Apple has an interesting competitive advantage here. They have loads of cash in reserve ($29 billion) and manufacture hardware, OS, and software. They have already proven their ability to scale services (iTunes). I would find it ridiculous if Apple has not at least talked about it internally. Analysts are already touting cloud computing as the future of computing – and Apple is usually a leader in computing trends.
