Wednesday, September 16, 2009

Windows Azure and Cloud Computing Posts for 9/14/2009+

Windows Azure, Azure Data Services, SQL Azure Database and related cloud computing topics now appear in this weekly series.

•• Update 9/16/2009: New Microsoft private cloud sites, live Azure apps, PHR articles, MapReduce API for Windows, security and privacy for personally identifiable info (PII)
• Update 9/15/2009: New live Azure apps, infrastructure stories, security suggestions, events, FTC orders Web PHR data breach reporting, Google Government Cloud and Google App Engine’s datastore gets new replication features

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use these links, click the post title to display the single article you want to navigate.

Azure Blob, Table and Queue Services

Anne Thomas Manes takes on the REST-* crowd that’s supporting new REST-based standards for middleware in her REST-* (I've got a bad feeling about this) post of 9/16/2009:

… What really concerns me about this effort is Bill [Burke]'s perspective on REST. As he said in his "What REST has to be" blog post:

I really don’t care in the end if any of the architectural principles of Roy’s thesis are broken as long [as] these requirements [simplicity, low footprint, interoperability, and flexibility] are met.  Pragmatism has to be the most important thing here.  We can’t fall into religious and academic debates on the purity of a distributed interface.

I believe in being pragmatic, but if you don't adhere to the REST principles (everything is a resource with a uniform addressing scheme [i.e., a URL], interactions using representations, uniform methods, stateless interactions, using hypermedia as the engine of state), you won't produce RESTful systems, and you won't attain the desirable RESTful characteristics (scalability, serendipity, network effects, etc) that REST is supposed to enable. …

What really turns me off about this effort is the attempt to define a specification for REST-* transactions, especially with two-phase commit. It’s difficult to swallow a spec for transactions in a supposedly stateless architecture.
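To make Anne’s checklist concrete, here’s a minimal sketch (my own illustration, with a hypothetical endpoint) of a stateless, uniform-interface interaction in C#:

```csharp
// A stateless RESTful read: the order is a resource with a URL, the
// method is the uniform GET, and no server-side session is assumed.
// The endpoint is hypothetical.
using System;
using System.IO;
using System.Net;

class RestGetSample
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create(
            "http://example.com/orders/42");    // everything is a resource
        request.Method = "GET";                 // uniform method
        request.Accept = "application/xml";     // ask for a representation

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            // The returned representation carries the links (hypermedia)
            // that drive the next state transition; the server keeps no
            // conversational state between requests.
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}
```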

Mike Amundsen’s What level of visibility do you really need? post of 9/15/2009 addresses RESTful pragmatism:

[M]y buddy subbu (@sallamar) said something to me yesterday that lit a bulb above my head:

“In REST, the line between a purist and a pragmatist is "visibility". Everything else is implementation detail.”

[T]his is a powerful statement. …

Mike concludes:

[I]f you can confidently answer the question "what level of visibility do i really need?" you are a good part of the way toward building a pragmatic, successful RESTful implementation.

Arjan Einbu eliminates reliance on the StorageClient library in his Diving deeper into Windows Azure table storage post of 9/14/2009:

In my previous posts about Windows Azure Table Storage, I relied on the StorageClient project in the Azure SDK samples. This feels a bit strange, and raises the question: Am I expected to include references to sample projects and be using Microsoft.Samples.whatever namespaces in my future projects?

This raises a couple of questions about license, copyright, support and more. Instead of digging into those questions, I came up with some alternate questions:

  • What does this sample project give us?
  • How does it work?
  • Can we do these things ourselves?

A lot of the searching was done in the sample code, since most of the other articles about accessing Windows Azure Table Storage depend on the same sample files. I was disappointed to see that even the Windows Azure SDK help file shows some partial code calling into the sample project. Little help there…

So, looking at that previous post again: all my entity classes inherit from TableStorageEntity, and I use TableStorageDataContext as a base class for accessing entities in Azure Table Storage. …
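For readers wondering what “doing these things ourselves” might look like, here’s a minimal sketch (my illustration, not Arjan’s code) that queries a table with the plain ADO.NET Data Services client instead of the StorageClient sample; the SharedKeyLite signing that Table Storage requires is stubbed out:

```csharp
// Querying Azure Table Storage without the StorageClient sample: a plain
// entity class plus the ADO.NET Data Services (Astoria) client. The
// development-storage URI and the "Contacts" table are assumptions, and
// the SharedKeyLite signature is omitted to keep the sketch short.
using System;
using System.Data.Services.Client;
using System.Data.Services.Common;
using System.Linq;

[DataServiceKey("PartitionKey", "RowKey")]
public class ContactEntity
{
    // The three system properties every table entity carries.
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public DateTime Timestamp { get; set; }
    public string Name { get; set; }
}

public static class TableQuerySample
{
    public static void Main()
    {
        var ctx = new DataServiceContext(
            new Uri("http://127.0.0.1:10002/devstoreaccount1"));

        ctx.SendingRequest += (sender, e) =>
        {
            // Real code would add the x-ms-date header here and sign the
            // request with SharedKeyLite (HMAC-SHA256 over the date and
            // the canonicalized resource).
        };

        var friends = ctx.CreateQuery<ContactEntity>("Contacts")
                         .Where(c => c.PartitionKey == "friends")
                         .ToList();

        foreach (var c in friends)
            Console.WriteLine(c.Name);
    }
}
```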

Johannes Vermorel’s Thinking the Table Storage of Windows Azure post of 9/14/2009 expresses concern about Azure Table storage:

… At this point, I feel very uncertain about Table Storage, not in the sense that I do not trust Microsoft to end up with a finely tuned product, but rather at the patterns and practices level. …

So far, I got the feeling that many developers feel attracted toward the Table Storage for the wrong reasons. In particular, Table Storage is not a substitute for your old plain SQL tables:

  • No support for transactions.
  • No support for keys (let alone foreign keys).
  • No possible refactoring (properties are frozen at setup).

If you are looking for those features, you’re most likely betting on the wrong horse. You should be considering SQL Azure instead. [Emphasis Johannes’.]

I agree in principle with Johannes’ conclusion, but SQL Azure needs some best practice recommendations for sharding from the patterns & practices group.
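Pending that guidance, here’s the sort of naive pattern I mean (entirely my own sketch with hypothetical connection strings, not a patterns & practices recommendation): route each sharding key to one of a fixed set of SQL Azure databases.

```csharp
// Naive hash-based sharding across a fixed set of SQL Azure databases.
// Real guidance would also have to cover stable hashing, rebalancing,
// fan-out queries, and transactions that span shards.
using System;
using System.Data.SqlClient;

static class ShardRouter
{
    // Placeholder connection strings; credentials omitted.
    static readonly string[] Shards =
    {
        "Server=tcp:server0.database.windows.net;Database=App_Shard0;User ID=...;Password=...",
        "Server=tcp:server1.database.windows.net;Database=App_Shard1;User ID=...;Password=...",
    };

    public static SqlConnection OpenFor(string customerId)
    {
        // String.GetHashCode isn't stable across runtimes; production
        // code would substitute a deterministic hash of its own.
        int shard = Math.Abs(customerId.GetHashCode()) % Shards.Length;
        var conn = new SqlConnection(Shards[shard]);
        conn.Open();
        return conn;
    }
}
```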

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Ryan Barrett explains Google App Engine’s Migration to a Better Datastore in this detailed post of 9/14/2009 about Megastore:

Megastore is an internal library on top of Bigtable that supports declarative schemas, multi-row transactions, secondary indices, and recently, consistent replication across datacenters. The App Engine datastore uses Megastore liberally. We don't need all of its features - declarative schemas, for example - but we've been following the consistent replication feature closely during its development.

Megastore replication is similar to Bigtable replication in that it replicates data across multiple datacenters, but it replicates at the level of entire entity group transactions, not individual Bigtable column values. Furthermore, transactions on a given entity group are always replicated in order. This means that if Bigtable in datacenter A becomes unhealthy, and we must take the extreme option to switch to B before all of the data in A has flushed, B will be consistent and usable. Some writes may be stuck in A and unavailable in B, but B will always be a consistent recent snapshot of the data in A. Some scattered entity groups may be stale, i.e., they may not reflect the most recent updates, but we'd at least be able to start serving from B immediately, as opposed to waiting for A to recover. …

Will SQL Azure’s forthcoming (and still secretive) Secure DataHub do the same thing? So far, I haven’t heard the SQL Azure story about replication of database copies between datacenters.

<Return to section navigation list> 

.NET Services: Access Control, Service Bus and Workflow

No significant new posts on this topic today.

<Return to section navigation list> 

Live Windows Azure Apps, Tools and Test Harnesses

Greg O’Connor contends that Cloud Computing + POC = ‘Obvious’ ISV Revenue Growth in this 9/16/2009 post:

One thing is clear to me - ISVs are doing POCs in the cloud.

The obvious thing here is ISVs should reduce POC installation and configuration time to near zero. It is obvious, isn't it? And easy to understand. As it turns out, it's also easy to do.

How do I know? Because that's what we do at AppZero. Delivering an application as a pre-installed, pre-configured virtual application appliance with zero OS components shrinks install, set-up, and configuration time to minutes. Yes, even large German ERP applications. (You knew I had to get at least one company mention in, didn't you?) In fact, I'd worked up a nifty Excel-based calculator to let potential ISV customers of ours quickly bang out the financial impact of having zero POC install time. I ran the calculator by some of our ISV clients to give it a real-world sanity check.

Talking with one AppZero-using ISV with revenue just under $1B in 2008, I was looking for just a few inputs to make my case: # of POCs, average deal size, average time invested per POC, win rate, how many SEs and the cost of an SE. Turns out that if you're the SVP that owns responsibility for that almost $1B worth of business, you know the win/loss metrics. Cold.

Greg is CEO of AppZero (formerly Trigence) which develops software tools to help organizations move applications to Cloud computing environments.

Katherine Hobson brings PHRs to mainstream media with her Time to Switch to an Online Personal Health Record? feature-length article of 9/16/2009 for U.S. News & World Report:

If you’re like most people, your personal medical record is a multiheaded beast: pieces of information scattered among the offices of multiple physicians, prescription data at a handful of different drugstores, and a manila folder full of receipts and lab reports in an overstuffed file cabinet at home. Now that it's possible to tame the beast, should you? A host of Web-based personal health records, or PHRs, have been rolled out over the past few years, including offerings from Internet heavyweights Google and Microsoft. The pitch: a central repository for all your health information—from family history to lab results to cholesterol readings—gathered from all those disparate sources, and ways to share it with doctors or other people that you deem appropriate. Plus, cool tools that draw on your information to alert you, for example, if you are taking medications that might interact, or to help you track weight loss. But there are cons as well as pros to putting all your personal health information online.

Katherine mentions Google Health, Microsoft HealthVault, Revolution Health, and PassportMD in her article. It should be noted that PassportMD’s online PHR service is free for Medicare members.

Peter Neupert claims that In the Health Reform Recipe, the Missing Ingredient Is the Consumer in his 9/15/2009 post in response to this week’s question from the Washington Post RX Blog:

I applaud the administration for shining a bright light on health reform. The government -- as buyer, regulator and leader -- must be a part of any solution. The political calculus has created a real sense of urgency to do something about this complex system, which touches everyone and accounts for one sixth of our economy. The consequence, however, through a lack of transparency and understanding, has been to reduce the public dialogue to "public insurance options" vs. "death panels." Framing the debate this way and consuming available public attention on "wedge" issues won't lead to a sustainable future system. …

Peter is Microsoft’s Corporate Vice President, Health Solutions Group.

Dennis Palmer’s Facebook Application Cloud Call Me Wins the Twilio + Windows Azure Developer Contest according to this 9/15/2009 Twilio post:

Congratulations to Dennis Palmer, who worked right up until the deadline to deliver (just 12 minutes before midnight) Cloud Call Me - a Facebook application for calling your friends using Twilio and Azure.

We were impressed to see Dennis (@CoderDennis) tweeting late at night on Sunday as he fought hard to complete his app on time, along with support from his wife Destiny (@destinypalmer).  As it turns out, she was his sole beta tester.

• Bill Crouse, MD suggests that you Learn more about “Clinical Groupware” and “services in the cloud” in this 9/14/2009 post to the Microsoft HealthBlog:

On my HealthBlog post of June 24th I posed the question, “Is it time for Clinical Groupware?”  Well, apparently the answer is YES!  Since June, a number of my colleagues have come together to form what is now known as the Clinical Groupware Collaborative.  The Collaborative’s mission is to promote lower-cost, flexible, easier-to-use and implement healthcare ICT solutions. Although the pure vision for clinical groupware is to deliver many of the ICT solutions a medical practice, clinic or hospital might need as “services in the cloud”, a blended model of software on local servers or PCs plus services in the cloud is probably more realistic. [Emphasis added.]

For those of you interested in learning more about the Clinical Groupware Collaborative and what you can do to participate, some of my colleagues will be hosting an informal get together on Tuesday, September 22, 2009, between 6:00 and 7:30 PM at the Hilton San Diego Bayfront Hotel, Aqua Room 304. The gathering will take place immediately following the close of the DMAA conference (Disease Management Association of America).

• Kevin Leneway describes his Rosetta Phone sample application in Windows Azure + Twilio Example App: Rosetta Phone by Kevin Leneway post of 9/10/2009 to the Twilio blog:

Hello, my name is Kevin Leneway and I work at Microsoft as a Windows Azure community evangelist.  I’m a huge fan of Twilio and it’s great to see that this week’s netbook contest is focused on Twilio apps built on Windows Azure.

A few weeks back I put together a fun little application called Rosetta Phone using Twilio and Windows Azure. The idea of the application is to turn any ordinary phone into a translation device.  To see this app in action, check out this quick video demonstration [from the post.]

Johannes Vermorel’s O/C mapper - object to cloud post of 9/14/2009 describes “a new open source project called Lokad.Cloud that would isolate all the pieces of our cloud infrastructure that weren’t specific to Lokad”:

… The project was initially subtitled Lokad.Cloud - .NET execution framework for Windows Azure, as its primary goal was to provide a cloud equivalent of the plain old Windows Services. We quickly ended up with QueueServices, which happen to be quite handy for designing horizontally scalable apps.

But more recently, the project has taken a new orientation, becoming more and more an O/C mapper (object to cloud) inspired by the terminology used by O/R mappers. When it comes to horizontal scaling, a key idea is that data and data processing cannot be considered in isolation anymore. … [Emphasis Johannes’.]
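The QueueService idea reads roughly like the following sketch (the type names are my own, not Lokad.Cloud’s actual API): the framework owns polling, serialization and deletion, while application code supplies only a typed Process method, so scaling out is just running more workers.

```csharp
// Conceptual sketch of an O/C-mapper-style queue service. IQueue<T> is a
// hypothetical abstraction over an Azure queue plus serialization.
public interface IQueue<T>
{
    bool TryDequeue(out T message);
    void Delete(T message);
}

public abstract class QueueService<T>
{
    private readonly IQueue<T> _queue;

    protected QueueService(IQueue<T> queue) { _queue = queue; }

    // The only thing application code writes.
    protected abstract void Process(T message);

    // Called in a loop by each worker role instance; adding instances
    // scales the processing horizontally.
    public void RunOnce()
    {
        T message;
        if (_queue.TryDequeue(out message))
        {
            Process(message);
            _queue.Delete(message); // delete only after successful processing
        }
    }
}
```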

Alan Smith offers his CloudCasts Behind the Scenes Part 1 - Development and Deployment Webcast video of 9/13/2009, which he describes as follows:

In this webcast I’ll run through the deployment of a new version of the CloudCasts website. I’ll start by showing the project structure and how I have configured the application to use local or cloud-based storage in the development phase. I’ll give you a few tips on hosting your Azure applications in IIS to speed up development. I’ll then run through the deployment of the new version of the site to a staging environment, and finally take it into production.
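The local-versus-cloud storage switch Alan describes usually comes down to ServiceConfiguration.cscfg settings along these lines (a sketch with placeholder names and values; projects vary):

```xml
<!-- Development profile points at the local development storage fabric;
     swap in the commented value (and a real account name/key) for the
     cloud deployment. -->
<ServiceConfiguration serviceName="CloudCasts"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="AccountName" value="devstoreaccount1" />
      <Setting name="AccountSharedKey" value="[dev storage key]" />
      <Setting name="BlobStorageEndpoint" value="http://127.0.0.1:10000/" />
      <!-- Cloud:
      <Setting name="BlobStorageEndpoint" value="http://youraccount.blob.core.windows.net/" />
      -->
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```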

<Return to section navigation list> 

Windows Azure Infrastructure

•• The Microsoft Careers site returned 48 job openings dating from 8/23/2009 for the keyword Azure on 9/16/2009. But you must click Find Your Fit or another link before searching; otherwise Azure returns 0 hits.

•• Ezra Klein reports that Doctors Support the [Healthcare] Public Option in this 9/14/2009 article for the Washington Post:

That's the conclusion of a national poll conducted by the Robert Wood Johnson Foundation. The survey included more than 5,000 doctors spread across an array of specialties, and asked two sets of questions. The first gauged support for a public health-care system, a system with public and private options, and a solely private system. The second measured attitudes towards Medicare. The results should be generally cheering to reformers.

Doctors overwhelmingly support either a public option or a public system. Indeed, when you add the two groups together, it's more than 70 percent of respondents. There were some differences across specialties, but not a lot: about 75 percent of primary care doctors favored a public option or public system, while about 67 percent of surgeons felt similarly.

@ezraklein is blogging his way through the Baucus bill at www.ezraklein.com.

John Foley reports about IW500: Cracks In The Enterprise Software Model in this 9/15/2009 post to InformationWeek’s Plug into the Cloud blog:

A panel of software industry veterans at the InformationWeek 500 conference today argued that the old, on-premises enterprise software model is nearing the end of the line. Taking its place: Web-based software and cloud computing.

"We're at the end of one technical lifecycle and at the beginning of another," said Aneel Bhusri, president of software-as-a-service vendor Workday. Before co-founding Workday with Dave Duffield, Bhusri was an executive with PeopleSoft, an old-guard software company acquired by Oracle in 2005.

Of course, the argument that SaaS is supplanting on-premises software isn't new, but Bhusri and the other panelists pointed to growing signs of discontent among business and technology managers and a new willingness, even requirement, to shift to SaaS and cloud computing. "We're starting to see cracks" in the old model, said Ray Wang, a software analyst with Altimeter Group. … [Emphasis added.]

Geva Perry continues his The Purpose-Driven Cloud series with The Purpose-Driven Cloud: Part 2 - Technology Frameworks post of 9/14/2009:

This is the second post in the series I described in The Purpose-Driven Cloud. The series is an attempt to categorize the various cloud platforms available today (and that may be available in the future). Each installment examines one of the dimensions that differentiate cloud platforms. The first in the series discussed the Usability-Driven Cloud. This post is about:

The Framework-Based Cloud

Framework-based cloud platforms are designed to be used with a specific programming language, product or technology stack. They enable developers who subscribe to that particular programming model to focus on the innovative aspects of their application and let the platform handle the plumbing problems of scalability, reliability and performance (and in some cases, integration).

Geva continues with analyses of Google App Engine, Heroku and Tibco Silver with only a brief reference to Azure.

• James Urquhart asks Does cloud computing affect innovation? and answers in this 9/14/2009 post to his Wisdom of Clouds CNet News blog:

Earlier this summer, Jonathan Zittrain wrote a New York Times OpEd piece that discussed his concerns with the cloud computing paradigm. Zittrain stated that, while it may seem on the surface that cloud adoption is as "inevitable as the move from answering machines to voice mail," he sees some real dangers.

Zittrain covered the usual concerns about data ownership, privacy, and the access that data placed in the cloud gives governments all over the world--a concern I certainly share. He went on to point out that these problems are solvable "with a little effort and political will," a view that I also adhere to.

To Zittrain, however, the biggest threat the cloud posed to computing wasn't privacy, but innovation--or a lack thereof[.] …

Steve Hamm quotes former Cassatt CEO Bill Coleman in his A Positive Spin on Consolidation of the Computing Industry article for Business Week’s Globe Spotting section:

One of the great things about being a reporter is the surprises you find when you’re putting together a story. When I was developing my recently published story about Oracle Corp. and consolidation in the corporate computing world, I called up a couple of former executives of software companies expecting to hear them decry the trend. Instead, I found two fans of consolidation.

First, I talked to Bill Coleman, a one-time executive at Sun, now being taken over by Oracle, and founder of BEA Systems, bought by Oracle last year, and of Cassatt, bought by CA this year. “I believe we’re in the final phase of the maturation of the IT industry,” he told me. “There isn’t any more disruptive innovation. The functionality of the systems isn’t disruptive. That’s when things commoditize. It’s happening in all of the IT industry and Oracle is a sign of it.”

My immediate reaction was: Well, isn’t the oncoming shift to Cloud Computing the biggest disruption of all?

The way Coleman sees things, Cloud is the ultimate expression of commoditization and consolidation. When corporations buy most of their services from the cloud, the data center, hardware, software, communications, and services are consolidated. There will be a lot of incremental innovation in e-commerce and social networking services built on top of the Cloud technology infrastructure. …

Late in our conversation, Coleman amended what he said earlier about disruptive innovation being over in IT. “Cloud computing will be the final major disruptive innovation,” he said. … (Emphasis added.)

James Urquhart tries Clarifying Internal Cloud versus Private Cloud in this 9/14/2009 post to the Cisco Datacenter blog:

Many companies are rapidly evolving toward cloud computing, though from different starting points and not without debate as to the best direction or computing model. For example, advocates of public cloud computing sometimes advise not owning any software or hardware or employing any IT administrators and instead relying on professional providers of IT applications, platforms, infrastructure, and services.

On the other side of the debate are those who have spent years building IT infrastructures, whose concerns must be addressed since reliability, security, SLAs, and interoperability will determine the success of the various cloud computing models within the enterprise. …

In this video, James Urquhart, market strategist for cloud computing and data center virtualization at Cisco, provides a high-level breakdown on the distinctions between internal and external clouds vs. private and public clouds, and some of the benefits.

James said in a 9/14/2009 Tweet: “Oh, I know I'll get comments on this one... ;)” but there weren’t any comments when I read the post two hours later.

Kevin Hoffman explains The Difference Between Web Hosting and Cloud Computing in this 9/13/2009 post:

Azure completely changes the way people think about building scalable and reliable web applications available on the Internet.

Yesterday a friend of mine was asking me what I’ve been doing lately in my spare time. When I mentioned that I’d been doing a lot of messing around with Windows Azure, he was naturally curious. After explaining what Azure is, he asked me what the difference was between Windows Azure, a cloud computing environment, and traditional web hosting scenarios.

On a really high level, he’s got a valid point: With Azure you can develop your application offline locally and then when you’re done you can publish it to a remote host. To the casual observer, this looks exactly like what you might do with a web hosting company that provides space on an IIS box and lets you use ASP.NET and maybe even a little SQL Server database.

The Windows Azure Team wants you to Get Started with Windows Azure TODAY! by offering instant invitation tokens according to this 9/13/2009 post. …

Jamal Mazhar’s The Benefits and Challenges of Virtualization, Private and Public Clouds post of 9/14/2009 proclaims “The biggest advantage of using public cloud is reduction in fixed costs” with the help of the following graphic:

After my earlier blog post discussing the evolution of IT, I had several discussions on the benefits and challenges of virtualization, private, and public clouds.  The following bar chart is an attempt to capture the benefits and challenges of various phases of IT evolution, from the days of having dedicated physical servers for each application to the use of public cloud.


The chart is self-explanatory; some key points to note are:

  • Going from virtualization to private cloud is basically a step to provide self-service capabilities to the application owners.  It increases flexibility but also increases management complexity, as it adds another layer of abstraction.
  • The biggest advantage of using public cloud is the reduction in fixed costs, as you don’t have to build capacity for exceptional peak cases.  Several IT activities have elastic demand: development, functional testing, load testing, end-of-quarter processing, etc.

<Return to section navigation list> 

Cloud Security and Governance

•• Mary Jo Foley reports Microsoft launches a 'private cloud' blog in her 9/16/2009 post to her All About Microsoft blog for ZDNet:

For those wondering what Microsoft has up its sleeve, in terms of its “private-cloud” strategy, there’s a new resource worth tracking: The company’s private-cloud blog.

The new blog — which I discovered via another blog (Microsoft’s increasingly prolific Nexus System Center blog) — so far seems to be little more than a site for the Dynamic Data Alliance, a group for Microsoft hosting partners that are building around Microsoft’s Dynamic Datacenter Toolkit for Hosters.

But here’s how Jeff Wettlaufer, Senior Technical Product Manager for System Center, described the new blog:

“Hey everyone, I just had a chat with our friends over in the Private Cloud Computing Team, and they wanted to let all of you know they now have a team blog. Their blog is intended to be a hub to highlight their partners, customers and internal Microsoft personalities; as well as to promote announcements related to the Dynamic Data Center Toolkits for Hosters & Enterprises, and the Dynamic Data Center Alliance.”

There’s also a http://www.microsoft.com/privatecloud URL that redirects to Microsoft Virtualization’s Microsoft Cloud Computing Infrastructure site with Private Cloud and Public Cloud pages.

•• Jamal Mazhar’s Building a Private Cloud Within a Public Cloud post of 9/16/2009 claims that “users within the corporate firewalls can access the server in the cloud seamlessly”:

One of our customers wanted to establish site-to-site connectivity between their datacenter and the public cloud (Amazon EC2) and then have a private network within Amazon EC2, with their own custom IP addresses for their servers in the cloud.

Basically the idea here is to augment the internal datacenter resources with the resources in the public cloud securely so that the servers in the cloud appear as if they are part of their own private corporate network.  The idea here is to isolate the servers used by the customer in the cloud from the rest of the servers in the cloud using private network, just like the corporate internal datacenters are isolated using private network with private routers routing the internal traffic.

[The] Kaavo team set up the required network using OpenSwan and OpenVPN; see the figure below.

Private Cloud within a Public Cloud
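A site-to-site tunnel of this kind typically reduces to a routed VPN pair; this fragment (my sketch, not Kaavo’s actual configuration) shows the cloud-side OpenVPN server handing out private addresses and advertising the corporate subnet, so EC2 instances look like just another part of the internal network:

```
# Hypothetical server.conf on an EC2 gateway instance. Subnets, port and
# certificate file names are placeholders.
port 1194
proto udp
dev tun                                  # routed (tun) rather than bridged
server 10.8.0.0 255.255.255.0            # private pool for cloud servers
push "route 192.168.1.0 255.255.255.0"   # advertise the corporate subnet
keepalive 10 120
ca ca.crt
cert server.crt
key server.key
dh dh1024.pem
```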

•• Marianne McGee’s Health IT Is Part Of Senate Committee's New Healthcare Reform Bill post of 9/16/2009 to InformationWeek’s Healthcare Blogs reports:

The Senate Finance Committee today released a mark-up version of its new healthcare reform bill. The America’s Healthy Future Act has several technology provisions, including a proposal for bonus payments related to health IT programs.

One proposal calls for new performance bonuses for Medicare Advantage providers using health IT, "including electronic health records, clinical decision support and other tools to facilitate data collection and ensure patient-centered, appropriate care."

In addition to the health IT incentive proposal, the healthcare reform bill also includes IT related provisions for niche providers of healthcare that aren't covered in the $20 billion health IT stimulus programs of the American Recovery and Reinvestment Act passed earlier this year.

Those new proposals include a demonstration project for nursing homes to use IT to improve residents' care.

•• Joseph Goedert reports Advisory Panel OKs Privacy Standards in this 9/16/2009 post to the Health Data Management blog:

The HIT Standards Committee, a federal advisory panel created under the American Recovery and Reinvestment Act, has approved recommendations of its privacy and security workgroup. That moves the recommendations one step closer to being final, although they still face several more hurdles. …

The adopted privacy and security recommendations are available at http://healthit.hhs.gov. Click on Federal Advisory Committees, then Health IT Standards Committee, then scroll down to the Sept. 15 meeting documents.

Or click here and scroll down.

•• Reuven Cohen asks Is Privacy in The Cloud an Illusion? in this 9/15/2009 post:

With the announcement today that Adobe is acquiring Omniture, one of the largest web analytics firms -- something occurred to me. This is probably obvious to most, but the move to cloud-based services has some pretty big potential ramifications when it comes to privacy and the risks of unknowingly agreeing to have your private information data mined. The Adobe Omniture deal makes this very apparent. More than ever, your privacy is becoming an illusion.

Like it or not, when it comes to terms of use and privacy policies, most people don't read them. I'm a little bit of a nut when it comes to them, but I fear I am in a minority. I also realize that when it comes to free and/or low-cost cloud services, you are typically asked to give up some privacy in return for the service. After all, nothing is truly free. So the question now becomes how much of my personal information should I be prepared to give away, be it for a free cloud service or even for a paid service. What is an acceptable amount? None or some?

•• Chris Hoff (@Beaker) asks a Quick Question: Any Public Cloud Providers Using Intel TXT? in this 9/15/2009 post:

Does anyone know of any Public Cloud Provider (or Private for that matter) that utilizes Intel’s TXT?  [Trusted Execution Technology]

Specifically, does anyone know if Amazon makes use of Intel’s TXT via their Xen-derivative VMM?

Anyone care to share whether they know of any Cloud provider that PLANS to?

• David Linthicum offers The truth about lock-in and cloud computing: “You can't get both distinct features and portability across cloud platforms -- just as you can't with on-premise platforms” in this 9/15/2009 post to InfoWorld’s Cloud Computing blog:

Much of the static around cloud computing has come from the notion of portability, or the ability to move data and code from one cloud to another or back to the enterprise. Word on the street is that the emerging cloud computing platforms are "proprietary," thus portability, and in many cases interoperability, is going to be an issue. However, the core issue is a trade-off of features versus portability.

The portability concern is sound. Chances are that if you write an application on one of the existing cloud platform providers (such as Amazon Web Services or Google App Engine), moving that application to other cloud platforms or perhaps to an enterprise server is going to be a pain in the neck. There are just too many differences between the cloud platform providers, so many rewrites of code and reworking of data will have to be on your to-do list. …

Lori MacVittie describes The Cloud Metastructure Hubbub in this 9/15/2009 post with a “How Infrastructure 2.0 might leverage publish-subscribe technology like PubSubHubbub to enable portability of applications” deck:

One of the topics surrounding cloud computing that continues to rear its ugly head is the problem of portability across clouds.

Avoiding vendor lock-in has been problematic since the day the first line of proprietary code was written and cloud computing does nothing to address this. If anything, cloud makes this worse because one of its premises is that users (that’s you, IT staff) need not concern themselves with the underlying infrastructure. It’s a service, right, so you just use it and don’t worry about it.

Tower of Babel by Pieter Bruegel the Elder

Let’s assume for a moment that you can easily move applications from data center to cloud to cloud. Plenty of folks are working on that, but very few of them address the “rest of the story”: the metastructure.

Metastructure contains the metadata that describes the network, application network, and security infrastructure providing all those “don’t worry about” services cloud providers offer. Load balancing, firewalls, IPS, IDS, application acceleration, secure remote access. If you’ve spent time with your cloud provider tweaking those services – or configuring them yourself – then moving to a new cloud provider is not only a huge investment in time, it’s actually going to be painful because you’re essentially going to have to recreate every metastructure configuration again.
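For what it’s worth, the publisher half of PubSubHubbub is almost trivially small; a hypothetical metastructure feed could ping a hub like this sketch (URLs are made up), leaving subscribed providers to re-fetch the changed configuration topic:

```csharp
// Minimal PubSubHubbub-style publish ping: tell the hub that a feed of
// metastructure changes has been updated. Hub and feed URLs are
// hypothetical.
using System;
using System.Collections.Specialized;
using System.Net;

class MetastructurePing
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            var form = new NameValueCollection
            {
                { "hub.mode", "publish" },
                { "hub.url", "http://example.com/metastructure/atom.xml" }
            };
            client.UploadValues("http://hub.example.com/", form);
        }
    }
}
```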

Andreas M. Antonopoulos explains why “cloud computing makes auditors cringe” in his Cloud security through control vs. ownership article of 9/15/2009 for NetworkWorld:

Cloud computing makes auditors cringe. It's something we hear consistently from enterprise customers: it was hard enough to make virtualization "palatable" to auditors; cloud is going to be even harder. By breaking the links between hardware and software, virtualization liberates workloads from the physical constraints of a single machine. Cloud takes that a step further making the physical location irrelevant and even obscure.

Traditionally, control of information flows directly from ownership of the underlying platform. In the traditional security model location implies ownership, which in turn implies control. You build the layers of trust with the root of trust anchored to the specific piece of hardware. Virtualization breaks the link between location and application. Cloud (at least "public cloud") further breaks the link between ownership and control. …

• Tim Green’s Cloud security survey can help shape best practices post of 9/15/2009 to NetworkWorld’s Cloud Security Report suggests:

You can make a difference in deciding what aspects of cloud security get the most attention in upcoming recommendations about best practices.

The Cloud Security Alliance is soliciting businesses to fill out a brief questionnaire about their perceptions of cloud services and their concerns about using such services. It is also asking what their use of cloud technology and services is likely to be in the near term.

Once the CSA has had the chance to digest the results, it will use them to help shape educational programs it plans to offer. It will also use them to refine the areas of guidance it will offer in its effort to provide best practices in a range of areas.

CSA has broken down its efforts into 15 domains ranging from legal to risk management to storage and virtualization.

• Unisys won’t surprise anyone with their Unisys Poll Shows Security Concerns as Leading Cause of User Hesitancy in Adopting Cloud Computing press release of 9/15/2009:

Of the 312 respondents, 51 percent cited security and data privacy concerns when answering the question, “What do you see as your greatest barrier to moving to cloud?” The next-highest barrier to adoption of cloud computing, cited by 21 percent of the respondents, was integration of cloud-based applications with existing systems.

Concerns about the ability to bring systems back in-house and regulatory/compliance issues were cited by 18 percent and 10 percent of respondents, respectively.

The poll results corroborated the outcome of another quick poll Unisys conducted in a June 2009 webinar on protecting data in the cloud, which was attended by more than 100 major firms and government agencies. Answering the question, "What is your greatest concern about moving workloads to the cloud?” 72 percent of the respondents to that poll cited security concerns. Other considerations ranked significantly lower on the scale of urgency – for example, integration issues were next highest at 34 percent.

• Help Net Security briefly reviews New book: Cloud Computing: Implementation, Management, and Security in this 9/15/2009 post:

Cloud Computing: Implementation, Management, and Security provides an understanding of what cloud computing really means, explores how disruptive it may become in the future, and examines its advantages and disadvantages. It gives business executives the knowledge necessary to make informed, educated decisions regarding cloud initiatives.

The authors first discuss the evolution of computing from a historical perspective, focusing primarily on advances that led to the development of cloud computing. They then survey some of the critical components that are necessary to make the cloud computing paradigm feasible.

They also present various standards based on the use and implementation issues surrounding cloud computing and describe the infrastructure management that is maintained by cloud computing service providers. After addressing significant legal and philosophical issues, the book concludes with a hard look at successful cloud computing vendors.

Helping to overcome the lack of understanding currently preventing even faster adoption of cloud computing, this book arms readers with guidance essential to make smart, strategic decisions on cloud initiatives.

Kevin Jackson’s NCOIC Officially Launches Cloud Computing Working Group post of 9/15/2009 claims “Organizationally this new working group will operate as part of the Specialized Frameworks Functional Team”:

The initial CCWG charter provides the following direction to the group:

  • Collaboration & engagement with government Cloud activities, standard bodies, vendors and NCOIC member companies to look at standards-based Cloud Computing to achieve capabilities such as peer-to-peer interoperability, improved usability and trust of the cloud, and portability across clouds.
  • Document the current state of best practices, architectures and blueprints for commercially available implementations, including examining security implications and how to implement an internal cloud:
    • “In-Field”, Edge, and Enterprise Clouds - dynamic integration of capability, resource, and accepted configurations to solve a business/operational need
    • “Infrastructure cloud” standards to develop consensus across vendors to reduce lock-in to a given vendor or platform
    • Develop Net-Centric Patterns on well-developed instances
    • Layered Quality of Service for Cloud Computing

  • Explore the interactions between the cloud computing paradigm and other NCOIC deliverables like NIF, NCAT, etc., including adding our Cloud Computing taxonomy to the NCOIC Lexicon.

Kevin Jackson (Dataline, LLC) will serve as the group’s interim chairman while Robert Marcus (SRI International) was designated as the interim Vice-Chairman. Formal elections for all leadership positions will be scheduled within the month.

Andrew Yee asks Is Cloud Computing Killing Open Source? in this 9/14/2009 CloudTalk post:

Gartner's Andrea Di Maio seems to think so in his blog post a couple of weeks ago.

According to Di Maio, the primary advantages of open source - vendor independence (since you have the source code) and cost (it's "free" plus you get to leverage the collective strength of the community) - are no longer valid...at least not for government agencies. He argues that cloud computing offers a better proposition for these agencies and, by extension, other large enterprises.

Is Di Maio right? Is it true that cloud computing is an alternative to open source, rather than a complementary technology? Di Maio has probably overstated the cloud computing effect on open source but he may be on to something.

The Federal Trade Commission’s FTC Issues Final Breach Notification Rule for Electronic Health Information notice of 8/17/2009 reports, in part:

Congress directed the FTC to issue the rule as part of the American Recovery and Reinvestment Act of 2009. The rule applies to both vendors of personal health records – which provide online repositories that people can use to keep track of their health information – and entities that offer third-party applications for personal health records. These applications could include, for example, devices such as blood pressure cuffs or pedometers whose readings consumers can upload into their personal health records. Consumers may benefit by using these innovations, but only if they are confident that their health information is secure and confidential.

Many entities offering these types of services are not subject to the privacy and security requirements of the Health Insurance Portability and Accountability Act (HIPAA), which applies to health care service providers such as doctors’ offices, hospitals, and insurance companies. The Recovery Act requires the Department of Health and Human Services to conduct a study and report by February 2010, in consultation with the FTC, on potential privacy, security, and breach-notification requirements for vendors of personal health records and related entities that are not subject to HIPAA. In the meantime, the Act requires the Commission to issue a rule requiring these entities to notify consumers if the security of their health information is breached. The Commission announced a proposed rule in April 2009, collected public comments until June 1, and is issuing the Final Rule today.

The Final Rule requires vendors of personal health records and related entities to notify consumers following a breach involving unsecured information. In addition, if a service provider to one of these entities has a breach, it must notify the entity, which in turn must notify consumers. The Final Rule also specifies the timing, method, and content of notification, and in the case of certain breaches involving 500 or more people, requires notice to the media. Entities covered by the rule must notify the FTC, and they may use a standard form, which can be found along with additional information about the rule at www.ftc.gov/healthbreach.

The new rule undoubtedly covers HealthVault, PassportMD, RelayHealth, and similar existing PHR services with Web UIs, as well as any new cloud-based PHR services.

Kelly Jackson Higgins reports DNS Cloud Security Services Arrive: “OpenDNS offers new subscription-based secure DNS service, other vendors' DNS services to follow” on 9/4/2009 for InformationWeek’s DarkReading column:

One of the first cloud-based secure DNS services was launched today amid intensified concerns over locking down vulnerable Domain Name Service servers.

OpenDNS, which provides a free DNS service for consumers and schools, now is offering a subscription-based commercial service for enterprises. Other vendors, such as Nominum, are considering offering secure DNS cloud services as well.

DNS security has gotten more attention than ever in the wake of the discovery of a major hole in DNS that was revealed by researcher Dan Kaminsky, and was later patched by several vendors. The so-called cache-poisoning flaw could allow an attacker to guess the transaction ID of a Web query and let the attacker hijack queries. Meanwhile, the Internet community has stepped up efforts to adopt the DNSSEC standard for protecting the DNS translation process from being compromised.

<Return to section navigation list> 

Cloud Computing Events

Steve Evans and Dave Nielsen announce Welcome to CloudCamp @ SVCodeCamp - Sunday, Oct 4th:

CloudCamp is an unconference where early adopters of Cloud Computing technologies exchange ideas. With the rapid change occurring in the industry, we need a place where we can meet to share our experiences, challenges and solutions. At CloudCamp, you are encouraged to share your thoughts in several open discussions, as we strive for the advancement of Cloud Computing. End users, IT professionals and vendors are all encouraged to participate.

Planning for CloudCamp @ CodeCamp is underway. If you are interested in helping, contact Steve Evans.

When: 9:00 AM to Whenever   
Where: Foothill College, 12345 El Monte Road (Parking Lot 5), Los Altos Hills, CA 94022, USA (co-located with Silicon Valley Code Camp)

The Clinical Groupware Collaborative will host “an informal get together on Tuesday, September 22, 2009, between 6:00 and 7:30 PM at the Hilton San Diego Bayfront Hotel, Aqua Room 304.  The gathering will take place immediately following the close of the DMAA conference (Disease Management Association of America).”

See the Live Windows Azure Apps, Tools and Test Harnesses section for more details.

When: 9/22/2009, 6:00 to 7:30 PM PDT   
Where: Hilton San Diego Bayfront Hotel, Aqua Room 304, San Diego, CA

David Chappell’s Azure World Tour continues in the following locations:

Click the appropriate link to sign up with Microsoft Events. Microsoft’s Partner Evangelist says:

David is a well-respected authority on cloud computing and one of the most gifted presenters I have had the pleasure of working with and learning from. David will offer his insights on the Windows Azure platform and what it means for System Integrators (SIs), Independent Software Vendors (ISVs), custom software development firms and enterprises. The events are in high demand and since we only have a limited number of seats, please take a minute to register now.

When: See above   
Where: See above

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Andrea DiMaio’s Vendors Catch the Apps.gov Wind post of 9/16/2009 says:

Shortly after the official launch of the GSA cloud storefront Apps.gov, vendors that feature multiple times in the business and productivity application sections of the web site have leveraged their position.

Salesforce.com issued a press release about its inclusion in Apps.gov, while Google announced Google Public Sector, a directory of existing Google offerings packaged with information that speaks to a public sector audience. As reported by Infoworld yesterday, in conjunction with the Apps.gov launch, Google also announced that cloud services designed specifically for US government agencies will be available early next year (by which time it aims to have the required FISMA certification).

I expect players like Microsoft, Oracle, SAP and others to move rapidly in this space to establish their role as cloud application service providers. How ready federal clients are to embrace these new models for mission-critical applications remains to be determined.

Also, looking at the current plans of the federal administration, I cannot find much about platform as a service. This is key to address application integration in a scenario that becomes more rather than less complicated, with a combination of public and private cloud infrastructure, SaaS and on-premise applications. I am looking forward to seeing where IBM, Microsoft, Tibco and other aPaaS solutions fit into the US feds’ scheme of things.

I’m also interested in seeing where Microsoft’s Windows Azure and SQL Azure “fit into the US feds’ scheme of things” (See below.)

•• David Linthicum explains briefly on 9/16/2009 Why Apps.gov is a game-changer: “The announcement of Apps.gov is the first step in a long journey to the clouds by the U.S. government”:

… I'm not sure that anyone was surprised by the announcement; the U.S. government has a huge interest in cloud computing, and the GSA is leading the way. Existing cloud computing providers see the opportunity as well with Google creating a government version of its cloud offering. The rest of the large and small cloud computing minions are looking hard at the emerging government marketplace too. We could be heading to a time where the U.S. government is actually leading the way with an emerging technology, creating best practices well ahead of the commercial world.

Scott Watermasysk describes Qizmt – MapReduce On Windows from MySpace in this 9/15/2009 post:

A very interesting announcement by the folks at MySpace:

Today, we are open-sourcing Qizmt, an internally developed framework for distributed computation created by the Data Mining team here at MySpace. Qizmt can be used for many operations that require processing large amounts of data such as collaborative filtering for recommendations and analytics.

Contrary to what many of the tech pundits are reporting, this is not a recommendation engine. It is an implementation of MapReduce written for Windows.

This opens up a nice set of options which are generally not available to developers on Windows. Heck, even [B]ing appears to leverage Hadoop.

Scott continues with a list of features, which begins:

  • Rapidly develop mapreducer jobs in C#.Net
  • Easy Do-It-Yourself Installer
  • Built-in IDE/Debugger
    • Automatically colors heap allocations in red
    • Autocomplete for rapid mapreducer development
    • Step through and debug mapreducer jobs directly on target cluster …

Is Qizmt pronounced kizmet?
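For anyone new to the model, the canonical word-count job looks like this as plain, single-process C# (Qizmt’s distributed job API differs; this just shows the map/reduce shape it exposes):

```csharp
// Word count illustrating the MapReduce shape: Map emits (word, 1) pairs,
// the framework groups by key, and Reduce sums each group. A real Qizmt
// job would run the same logic distributed across a cluster.
using System;
using System.Collections.Generic;
using System.Linq;

static class WordCount
{
    static IEnumerable<KeyValuePair<string, int>> Map(string line)
    {
        foreach (var word in line.Split(' ', '\t'))
            if (word.Length > 0)
                yield return new KeyValuePair<string, int>(word, 1);
    }

    static int Reduce(string word, IEnumerable<int> counts)
    {
        return counts.Sum();
    }

    static void Main()
    {
        string[] input = { "the quick brown fox", "the lazy dog" };

        var results = input
            .SelectMany(Map)                    // map phase
            .GroupBy(p => p.Key, p => p.Value)  // shuffle/group by word
            .Select(g => new { Word = g.Key, Count = Reduce(g.Key, g) });

        foreach (var r in results)
            Console.WriteLine("{0}: {1}", r.Word, r.Count);
    }
}
```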

Andrea DiMaio’s US Government Launches Cloud Application Store, But The Toughest Questions Remain Unanswered post of 9/16/2009 casts a jaundiced eye on Apps.gov:

Yesterday, in a speech given at the NASA Ames Research Center (watch it on YouTube), the US federal CIO Vivek Kundra announced the launch of Apps.gov, the GSA storefront to give federal agencies access to cloud services. This has been in the making for a while and the launch was originally scheduled during the Gov 2.0 Summit on 9-10 September.

The launch has created a big buzz on the Internet, with hundreds of tweets, blog posts, articles mostly supportive of this new initiative. The storefront provides access to four categories of services: business applications (including CRM, ERP), productivity applications (such as collaboration, office suites), infrastructure services (such as storage, virtual machines, web hosting) and social software.

This is part of a three-legged strategy that also includes a budgetary focus (to support a number of pilots on cloud computing for collaboration and lightweight workflows in 2010 and to provide guidance to agencies on adopting cloud computing in 2011) and policy planning & architecture (addressing centralized certification, security, privacy, etc.).

The social software section is of particular interest, as it finally reveals to the public the amended terms of use that GSA agreed to with some of the social software providers (see here for the Facebook example). …

Andrea continues with several on-target “initial observations about areas for improvement and clarification.”

•• Neil Versel claims Apple to make a push into healthcare in this 9/11/2009 post:

It seems inevitable, given the success of the iPhone in healthcare, but I'm hearing that Apple is getting ready to make a full-scale push into healthcare. I understand that the company invited several vendors to a meeting at an Apple office in Chicago this week. I have no further details on what was said or who was present, but I know that there are a couple of EMR vendors out there who have tailored their products for Macintosh, even if it's just optimizing the view over the Internet for the Safari browser.

There is this little matter of the billions of dollars in federal money being funneled into health IT over the next eight years, and Steve Jobs would be an idiot if he didn't go after some of the cash. Steve Jobs is no idiot.

Matthew Glotzbach announces in his Google Apps and Government post of 9/15/2009 to the Google Public Policy Blog (in conjunction with Vivek Kundra’s announcement at NASA Ames in Mountain View, CA of the launch of the Apps.gov online storefront for cloud computing apps and services):

… We also want to do our part to make it easier for government to transition to cloud computing. We recognize that government agencies have unique regulatory and compliance requirements for IT systems, and cloud computing is no exception. So we've invested a lot of time in understanding government's needs and how they relate to cloud computing. To help meet those requirements we're taking two important steps:

  • FISMA certification for Google Apps. In July, we announced our intent to secure certification for Google Apps to demonstrate compliance with the Federal Information Security Management Act (FISMA), the law defining security requirements that must be met by all US Federal government information systems. Our FISMA process is nearing completion. We will submit a Certification and Accreditation (C&A) package to the U.S. Government before the end of this year. Upon review and approval of the Google Apps C&A package, agencies will be able to deploy Google Apps knowing that it is authorized to operate under FISMA.
  • Dedicated Google cloud for government customers in the US. Today, we're excited to announce our intent to create a government cloud, which we expect to become operational in 2010. Offering the same services and features as our existing commercial cloud (such as Google Apps), this dedicated environment within existing Google facilities in the US will serve the unique needs of US federal, state, and local governments. It is similar to a "Community Cloud" as defined by the National Institute of Standards and Technology. The government cloud will allow Google to manage and meet additional government policy requirements beyond FISMA.

We look forward to working with governments across the country on these exciting initiatives in the months ahead.

Matthew Glotzbach is Director of Product Management for Google Enterprise; Vivek Kundra is the new CIO of the US Federal Government.

• Werner Vogels tries to get Amazon a seat on the Federal gravy train by Expanding the Cloud: Amazon Web Services to support the Federal Government on 9/15/2009:

… Next to the ability to focus more on delivering value instead of managing infrastructure, the other benefits of cloud computing will also become of great importance to the public sector: the cost savings that they will be able to achieve can be immediately applied to truly meaningful programs and the self-service and elasticity of the cloud will help them bring programs to market much faster than they were ever able to do before. On stage with me at Gov 2.0 Casey mentioned that at this moment about 45% of Federal Computing projects could be considered for powering by the cloud, so the opportunities for the government to reduce cost and become more agile are significant.

I am looking forward to working closely with the Federal CIOs to make sure our services can meet the requirements that can make them successful in their quest. …

Daniel Terdiman provides an independent report of Kundra’s visit to NASA’s Ames Research Center in Google’s home town in this White House unveils cloud computing initiative article of 9/15/2009 for CNet News:

The Obama administration on Tuesday announced a far-reaching and long-term cloud computing policy intended to cut costs on infrastructure and reduce the environmental impact of government computing systems.

Speaking at NASA's Ames Research Center here, federal CIO Vivek Kundra unveiled the administration's first formal efforts to roll out a broad system designed to leverage existing infrastructure and in the process, slash federal spending on information technology, especially expensive data centers.

According to Kundra, the federal government today has an IT budget of $76 billion, of which more than $19 billion is spent on infrastructure alone. And within that system, he said, the government "has been building data center after data center," resulting in an environment in which the Department of Homeland Security alone, for example, has 23 data centers.

 

But while some of the benefits of the administration's cloud computing initiative are on display today--mainly at the brand new Apps.gov Web site--Kundra's presentation was short on specifics and vague about how long it may take the government to transition fully to its new paradigm.

Indeed, Kundra hinted that it could take as much as a decade to complete the cloud computing "journey." …

A 00:30:18 high-quality video of Kundra’s NASA Ames presentation is on YouTube here. Kundra starts speaking at 00:04:17.

This article is in the Other Cloud Computing Platforms and Services section because no one from Microsoft or the Azure team has contributed their two cents worth to the thread.

• Mary Hayes Weier reports on dogfooding by Salesforce.com’s IT department in her InformationWeek 500: SaaS Leader Salesforce.com Lives Cloud Computing article of 9/15/2009 for InformationWeek:

Marc Benioff has been predicting the death of software since he formed Salesforce.com (NYSE: CRM) in 1999. But until a few years ago, when Salesforce began moving its applications and some IT infrastructure into the cloud, the company ran its own operations using on-premises software.

Now, by living cloud computing and not just preaching it, Salesforce is showing the transformational benefits--and not just the cost cuts that come with paring down data centers and dispensing with software licensing fees. The company's move into the cloud has brought profound changes to Salesforce's internal IT organization, including how developers work with business execs and even who develops the software. …

Randy Bias’ VMware vs. Amazon … ROUND ONE … FIGHT! post of 9/15/2009 begins:

More and more it’s becoming apparent that VMware and Amazon are headed for a serious collision.  Amazon is eager to capture more of the enterprise business market, VMware’s bread and butter.  Meanwhile, VMware is actively supporting a new crop of Amazon competitors with its recent vCloud Express release.  More importantly, what perhaps neither has realized or, at least as far as I can tell, Amazon hasn’t realized, is that the battle isn’t ultimately about so-called ‘public’ or ‘private’ clouds[1], but about standards, de facto or otherwise.

Randy then continues with a detailed analysis of the following topics:

    • Infrastructure History & Standards
    • The Contenders & Their Market Positions
    • The Standards Collision
    • What Happens When Worlds Collide?

Marianne McGee reports that Hospitals Are Helping Docs Defray EMR Costs, Challenges in a 9/14/2009 post to the InformationWeek Healthcare blog:

Are hospital-sponsored e-medical records the best way to get lots of doctors using these systems in their offices quickly and affordably? …

In fact, some hospitals in Boston and elsewhere seem to think these changes in federal regulations--combined with cloud computing models--provide a better shot at getting docs using EMRs in their practices vs. the doctors purchasing, deploying and supporting these systems themselves. [Emphasis added.]

In Boston, the latest hospital to roll out EMRs for its affiliated physician practices is Tufts Medical Center. The hospital is working with Dell and EMR vendor eClinicalWorks, which is hosting the EMR and practice management software used by the docs via a SaaS model. Tufts joins at least one other Boston area hospital--Beth Israel Deaconess Medical Center--in sponsoring EMR rollouts to help reduce costs and ease support worries of hundreds of their affiliated physicians' offices in digitizing patient records.

In a recent webcast sponsored by Dell, Tufts Medical Center CIO Bill Shickolovich said the hospital's IT organization is working "as a services provider" in helping hundreds of physician offices in the Boston community deploy EMRs "as swiftly and as safely as possible" with "the least amount of capital expenditure." …

<Return to section navigation list> 
