Monday, September 19, 2011

Windows Azure and Cloud Computing Posts for 9/19/2011+

A compendium of Windows Azure, SQL Azure Database, AppFabric, Windows Azure Platform Appliance and other cloud-computing articles.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:


Azure Blob, Drive, Table and Queue Services

Avkash Chauhan reminded readers of a Great new feature introduced in Windows Azure Storage: Geo Replication in a 9/18/2011 post:

Now anyone using Windows Azure Storage can rest a little easier, knowing that their data in Windows Azure Storage (blobs, tables and queues) has another safe copy replicated within the same region. I am awake at 10:20 PM for other reasons, but I am breathing a sigh of relief knowing that a copy of my data is replicated.

The best part: it is FREE!!!

A few terms to know about:

  • Geo-Failover
  • Geo-Replication Transaction Consistency
  • Primary & Secondary Location
  • Disabling Geo-Replication
  • Re-Bootstrap Storage Account
  • Cross-PartitionKey relationships

To learn about the terms above and many more, please visit: http://blogs.msdn.com/b/windowsazurestorage/archive/2011/09/15/introducing-geo-replication-for-windows-azure-storage.aspx


Preps2 posted a Comparison of Windows Azure Storage Queues and Service Bus Queues on 9/17/2011:

| Feature | Windows Azure Storage Queues | Service Bus Queues | Comments |
| --- | --- | --- | --- |
| Programming Models |  |  |  |
| Raw REST/HTTP | Yes | Yes |  |
| .NET API | Yes (Windows Azure Managed Library) | Yes (AppFabric SDK) |  |
| Windows Communication Foundation (WCF) binding | No | Yes |  |
| Windows Workflow Foundation (WF) integration | No | Yes |  |
| Protocols |  |  |  |
| Runtime | REST over HTTP | REST over HTTP; bi-directional TCP | The Service Bus managed API leverages the bi-directional TCP protocol for improved performance over REST/HTTP. |
| Management | REST over HTTP | REST over HTTP |  |
| Messaging Fundamentals |  |  |  |
| Ordering Guarantees | No | First-In-First-Out (FIFO) | Guaranteed FIFO requires the use of sessions. |
| Message processing guarantees | At-Least-Once (ALO) | At-Least-Once (ALO); Exactly-Once (EO) | The Service Bus generally supports the ALO guarantee; however, EO can be supported by using SessionState to store application state and using transactions to atomically receive messages and update the SessionState. The AppFabric workflow uses this technique to provide EO processing guarantees. |
| Peek Lock | Yes (visibility timeout: default = 30 s, max = 2 h) | Yes (lock timeout: default = 30 s, max = 5 m) | Windows Azure queues offer a visibility timeout to be set on each receive operation, while Service Bus lock timeouts are set per entity. |
| Duplicate Detection | No | Yes (send-side duplicate detection) | The Service Bus will remove duplicate messages sent to a queue/topic (based on MessageId). |
| Transactions | No | Partial | The Service Bus supports local transactions involving a single entity (and its children). Transactions can also include updates to SessionState. |
| Receive Behavior | Non-blocking, i.e., returns immediately if there are no messages | REST/HTTP: long poll based on a user-provided timeout. .NET API: three options: blocking, blocking with timeout, non-blocking. |  |
| Batch Receive | Yes (explicit) | Yes, either (a) implicitly using prefetch, or (b) explicitly using transactions |  |
| Batch Send | No | Yes (using transactions) |  |
| Receive and Delete | No | Yes | Ability to reduce operation count (and associated cost) in exchange for lowered delivery assurance. |
| Advanced Features |  |  |  |
| Dead lettering | No | Yes | Windows Azure queues offer a ‘dequeue count’ on each message, so applications can choose to delete troublesome messages themselves. |
| Session Support | No | Yes | Ability to have logical subgroups within a queue or topic. |
| Session State | No | Yes | Ability to store arbitrary metadata with sessions. Required for integration with Workflow. |
| Message Deferral | No | Yes | Ability for a receiver to defer a message until it is prepared to process it. Required for integration with Workflow. |
| Scheduled Delivery | No | Yes | Allows a message to be scheduled for delivery at some future time. |
| Security |  |  |  |
| Authentication | Windows Azure credentials | ACS roles | ACS allows for three distinct roles: admin, sender and receiver. Windows Azure has a single role with total access, and no ability for delegation. |
| Management Features |  |  |  |
| Get Message Count | Approximate | No | Service Bus queues offer no operational insight at this point, but plan to in the future. |
| Clear Queue | Yes | No | Convenience function to clear a queue efficiently. |
| Peek / Browse | Yes | No | Windows Azure queues offer the ability to peek a message without locking it, which can be used to implement browse functionality. |
| Arbitrary Metadata | Yes | No | Windows Azure queues allow an arbitrary set of <key, value> pairs on queue metadata. |
| Quotas/Limits |  |  |  |
| Maximum message size | 64 KB | 256 KB | See Neil MacKenzie’s correction note below. |
| Maximum queue size | Unlimited | 5 GB | Specified at queue creation, with specific values of 1, 2, 3, 4 or 5 GB. |
| Maximum number of entities per service namespace | n/a | 10,000 |  |


Neil MacKenzie noted that “Windows Azure SDK v1.5 increased maximum message size to 64KB” in a 9/18/2011 comment. (Corrected above.)
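

To make the peek-lock row above concrete, here is a minimal C# sketch (mine, not from Preps2’s post) contrasting the two receive models. It assumes a storage connection string, a Service Bus namespace with its default issuer and key, and an already-created queue named "orders" on each service; error handling is omitted:

using System;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class ReceiveSemantics
{
    static void Main()
    {
        // --- Windows Azure Storage queue: get + explicit delete ---
        var account = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...");
        CloudQueue storageQueue = account.CreateCloudQueueClient().GetQueueReference("orders");
        storageQueue.CreateIfNotExist();

        CloudQueueMessage cqm = storageQueue.GetMessage();   // hidden for the visibility timeout (default 30 s)
        if (cqm != null)
        {
            // ...process the message...
            storageQueue.DeleteMessage(cqm);                 // delete it, or it reappears after the timeout
        }

        // --- Service Bus queue: PeekLock receive + Complete/Abandon ---
        var settings = new MessagingFactorySettings
        {
            TokenProvider = TokenProvider.CreateSharedSecretTokenProvider("owner", "yourKey")
        };
        MessagingFactory factory = MessagingFactory.Create(
            ServiceBusEnvironment.CreateServiceUri("sb", "yourNamespace", string.Empty), settings);

        QueueClient sbQueue = factory.CreateQueueClient("orders", ReceiveMode.PeekLock);
        BrokeredMessage bm = sbQueue.Receive(TimeSpan.FromSeconds(5));
        if (bm != null)
        {
            try { /* ...process the message... */ bm.Complete(); }  // settles the lock
            catch { bm.Abandon(); }                                  // releases the message for redelivery
        }
        sbQueue.Close();
        factory.Close();
    }
}

The Service Bus half assumes the queue already exists; creating it with NamespaceManager is shown in Rick Garibay’s post later in this section.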


<Return to section navigation list>

SQL Azure Database and Reporting

Rob Tiffany (@robtiffany) posted Sync Framework v4 is now Open Source, and ready to Connect any Device to SQL Server and SQL Azure on 9/11/2011 (missed when published):

The profound effects of the Consumerization of IT (CoIT) are blurring the lines between consumers and the enterprise. The fact that virtually every type of mobile device is now a candidate to make employees productive means that cross-platform, enabling technologies are a must. Luckily, Microsoft has brought the power to synchronize data with either SQL Server on-premise or SQL Azure in the cloud to the world of mobility. If you’ve ever synched the music on your iPhone with iTunes, the calendar on your Android device with Gmail, or the Outlook email on your Windows Phone with Exchange, then you understand the importance of sync. In my experience architecting and building enterprise mobile apps for the world’s largest organizations over the last decade, data sync has always been a critical ingredient.

The new Sync Framework Toolkit found on MSDN builds on the existing Sync Framework 2.1′s ability to create disconnected applications, making it easier to expose data for synchronization to apps running on any client platform. Where Sync Framework 2.1 required clients to be based on Windows, this free toolkit allows other Microsoft platforms to be used for offline clients such as Silverlight, Windows Phone 7, Windows Mobile, Windows Embedded Handheld, and new Windows Slates. Additionally, non-Microsoft platforms such as iPhones, iPads, Android phones and tablets, Blackberries and browsers supporting HTML5 are all first-class sync citizens. The secret is that we no longer require the installation of the Sync Framework runtime on client devices. When coupled with use of an open protocol like OData for data transport, no platform or programming language is prevented from synchronizing data with our on-premise and cloud databases. When the data arrives on your device, you can serialize it as JSON, or insert it into SQL Server Compact or SQLite depending on your platform preferences.

The Sync Framework Toolkit provides all the features enabled by the Sync Framework 4.0 October 2010 CTP. We are releasing the toolkit as source code samples on MSDN, with the source code utilizing Sync Framework 2.1. Source code provides the flexibility to customize or extend the capabilities we have provided to suit your specific requirements. The client-side source code in the package is released under the Apache 2.0 license and the server-side source code under the MS-LPL license. The Sync Framework 2.1 is fully supported by Microsoft, and the mobile-enabling source code is yours to use, build upon, and support for the apps you create.

Now some of you might be wondering why you would use a sync technology to move data rather than SOAP or REST web services. The reason has to do with performance and bandwidth efficiency. Using SOA, one would retrieve all the needed data to the device in order to see what has changed in SQL Server. The same goes for uploading data. Using the Sync Framework Toolkit, only the changes, or deltas, are transmitted over the air. This boosts performance and reduces bandwidth usage, which saves time and money in a world of congested mobile data networks with capped mobile data plans. You also get a feature called batching, which breaks up the data sent over wireless networks into manageable pieces. This not only prevents you from blowing out your limited bandwidth, but it also keeps you from using too much RAM on both the server and your memory-constrained mobile device. When combined with conflict resolution and advanced filtering, I’m sold!

I think you’ll find the Sync Framework Toolkit to be an immensely valuable component of your MEAP solutions for the enterprise as well as the ones you build for consumers.

Keep Synching,


<Return to section navigation list>

MarketPlace DataMarket and OData

Glenn Gailey (@ggailey777) posted Creating “Cool Looking” Windows Phone Apps on 9/19/2011:

So far, I’ve written several Windows Phone 7 apps, including the OData and Windows Phone quickstart, a Northwind-based app for my MVVM walkthrough, and a couple of others—mostly consuming (as you might guess) OData. Since I created these apps to supplement documentation, they have never been much to look at—I never intended to publish them to the Marketplace. However, now that I have my Samsung Focus unlocked for development, I figured it was time to try to create a “real-world” app that I can a) get certified for the Marketplace and b) be proud of having on my mom’s phone.

Fortunately, I’ve been working on an Azure-based OData content service project that can integrate very nicely with a Windows Phone app, so for the past few weeks I have been coding and learning what makes these apps look so cool. I’ll try to share some of what I’ve learned in this post.

Coolest Windows Phone Apps (IMHO)

Just to give you some idea of where I’m coming from, here’s a partial list of some of the cooler-looking apps that I’ve seen on the Windows Phone platform (feel free to add your favs in comments to this post):

  • Pictures – this built-in app features a Panorama control that displays layers of pictures, which makes even the most mundane snaps seem cooler.
  • IMDB – OK, I admit that I must use IMDB when I watch movies; I’m weird that way. Plus this app is slick, with its use of multiple levels of Panorama controls and tons of excellent graphics (image is everything in Hollywood, right?)—if anything there may be too much bling.
  • Ferry Master – A nifty little app written by someone on the phone team for folks who take Washington State ferries, including background images of Puget Sound scenes.
  • Mix11 Explorer – this app, written for the Mix 2011 conference, uses a simple layering of basic shapes that mirrors the design of the Mix 2011 web site.
  • Kindle – Amazon’s classy eReader on the phone, with obviously high production value and graphics.

I would love to show screenshots of these, but most screens aren’t even available outside of the phone (just search the Marketplace for these app names and check out their screenshots).

Find Classy Graphics

I know that the general design principles from the phone folks are that “modern design in a touch application is undecorated, free of chrome elements, and minimally designed.” However, graphics is the kind of content that really pops on the phone. Make sure that the graphics that you use are clean and have impact, and that you have the rights to use them or you could get blocked in certification (and you don’t want to get hassled by the owner after you publish the app). Many professional apps leverage some of the brand images and design themes of their corporate web sites.

Use Expression Blend

Up to this point, I’ve used exclusively Visual Studio Express to write my Windows Phone apps. As I mentioned, my apps have mostly involved programming the OData Client Library for Windows Phone. While much better for writing code (IMHO) and essential for debugging, the design facilities in Visual Studio are, shall we say, limited. Unless you are an expert in the powerful-but-labyrinthine XAML expression language, you are going to need some extra help. Fortunately, Microsoft’s Expression Blend is designed specifically for XAML (WPF, Silverlight, Windows Phone), and it even supports animations. Here’s a rundown of Expression Blend versus Visual Studio Express:

Expression Blend

While I’m still not completely comfortable with the UI, Expression Blend has proven very adept at these design aspects:

  • Applying styles – Expression is much more intuitive and visual than VS, even making it easy to create gradients
  • Working with graphics – very easy to add background images to elements
  • Animations – I haven’t even tried any of these yet, but good luck creating a storyboard in VS
  • Entering text – XAML is weird with multi-line text in text-blocks, and Expression will generate this code automagically
  • Preview in both orientations and in both light and dark themes – this is akin to previewing web pages in multiple browsers—make sure you take advantage of this functionality
Visual Studio Express

Visual Studio is hands-down best with these programming aspects:

  • Solution/project management – I prefer to set up the project and add resources in VS
  • Build/debug – both will launch the emulator, but in VS you can actually debug in the emulator—or the device, which is much better
  • Add Service Reference – I don’t think that Expression has anything like this tool, which is very important for OData (especially in Mango when it actually works)

Because each IDE has its own strengths, I’ve found myself keeping my project open in both Expression and VS, and flipping back and forth during development.

Use Layering

In XAML, you can control the layer (z-order) of elements in the display as well as the opacity of elements. This lets you create a nice, modern layered look (think of today’s Windows 7 versus XP), with background elements partly visible through the elements in front, giving the screen a more “composed” feel and a sense of depth.

User Interaction

Smartphones are interactive devices, and you can make your app respond to orientation changes, motion, location, and even jiggling. Plus, navigating a page by swiping with your finger is the best part. In fact, swiping is so cool that Windows 8 features it heavily in the newly announced Windows Metro (which looks a lot like Windows Phone 7). Make sure that you include some of this in your apps; at the very least, handle the orientation change from portrait to landscape, which is easier than you think if you do your XAML correctly.
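
As a rough illustration of that last point, here is a minimal C# sketch of orientation handling on a Windows Phone 7 page. The page and element names (MainPage, TitlePanel) are assumptions borrowed from the default project template, not from Glenn’s app:

using System.Windows;
using Microsoft.Phone.Controls;

public partial class MainPage : PhoneApplicationPage
{
    public MainPage()
    {
        InitializeComponent();

        // Opt in to both orientations; the default template is portrait-only.
        SupportedOrientations = SupportedPageOrientation.PortraitOrLandscape;
    }

    protected override void OnOrientationChanged(OrientationChangedEventArgs e)
    {
        base.OnOrientationChanged(e);

        // If the XAML uses fluid layout (Grids with star sizing), rotation mostly
        // "just works"; tweak only what must differ per orientation.
        bool isLandscape = (e.Orientation & PageOrientation.Landscape) == PageOrientation.Landscape;
        TitlePanel.Visibility = isLandscape ? Visibility.Collapsed : Visibility.Visible;
    }
}

If the layout is built with proportional rows and columns rather than fixed pixel sizes, an override like this is often all the orientation-specific code a page needs.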

Panorama Control is Cool

As a key component of nearly all of my favorite apps, the Panorama control features all of the aspects we just discussed: leveraging graphics, using layers, and user interaction. It’s basically a long horizontal control that, unlike the Pivot control, displays a single, contiguous background image across multiple screen-sized items. As the user flips between items, there is a nice layered motion effect (like in those 1930s Popeye cartoons) where the individual items, and their graphics, move faster than the background image, giving an illusion of depth to the app.

Follow the Guidelines

To support the general “coolness” of the platform, the Windows Phone team has published an extensive set of design guidelines. Of course, they recommend that you stick as closely as possible to the phone-driven themes (which the IMDB app doesn’t do too well), but most of the guidance is meant to promote easy-to-use apps and a more uniform platform. (I’m not sure how this compares to, say, iPhone apps, since I’ve never had an iPhone.)


At any rate, since I am planning to go through the entire Marketplace process, I will probably post another blog with the results of my adventure.


<Return to section navigation list>

Windows Azure AppFabric: Apps, Access Control, WIF and Service Bus

Clemens Vasters (@clemensv) posted Service Bus Topics and Queues – Advanced on 9/18/2011:

This session is a follow-up to the Service Bus session that I did at the BUILD conference and explains advanced usage patterns:

Clemens later posted Securing Service Bus with the Access Control Service on 9/19/2011:

This session explains how to secure Service Bus using the Access Control Service. This is also an extension session for my session at BUILD, but watching the BUILD session is not a strict prerequisite.

 


Abishek Lal described Windows Azure Service Bus Queues, Topics / Subscriptions in a 9/14/2011 post (missed when published):

We just released to production the Queues and Publish-Subscribe (Topics/Subscriptions) features that were earlier showcased in the Community Technology Preview of Service Bus. For an introductory period these Brokered Messaging features are free, but you continue to pay for Azure data transfer. Also, the Relay features continue to be charged as before. Following are the steps to quickly get started, and links to additional resources:

The steps below show how to download and run a simple application that showcases the publish-subscribe capabilities of Service Bus. You will need Visual Studio and NuGet installed. We will start by creating a new console application in VS:


Open the Project –> Properties and change the Target framework to .NET Framework 4


From the References node in Solution Explorer click on the context menu item for Add Library Package Reference. This will show only if you have NuGet Extension installed. (To learn more about NuGet see this TechEd video).


Search for “AppFabric” and select the Windows Azure AppFabric Service Bus Samples – PublishSubscribe item. Then complete the Install and close this dialog.


Note that the required client assemblies are now referenced and some new code files are added.


Add the following line to the main method in Program.cs and hit F5

 Microsoft.Samples.PublishSubscribe.Program.Start(null);

At this point you will be prompted to provide a ServiceNamespace and Issuer Name and Key. You can create your own namespace from https://windows.azure.com/


Once a new namespace has been created you can retrieve the key from the Properties section by clicking View under Default Key:


These values can now be used in the Console application:


At this point you can run through the different scenarios showcased by the sample. Following are some additional resources:

Looking forward to your feedback / questions / concerns / suggestions.


Rick Garibay (@rickggaribay) reported Azure AppFabric Service Bus Brokered Messaging GA & Rude CTP Diffs on 9/14/2011 (missed when posted):

Today at the Build conference in Anaheim, California, Satya Nadella, President of the Server and Tools business, announced general availability of the production release of AppFabric Queues and Topics, otherwise known as Brokered Messaging.

Brokered Messaging introduces durable queue capabilities and rich, durable pub-sub with topics and subscriptions that complement the existing Relayed Messaging capabilities.

I covered Brokered Messaging following the May CTP release of Queues and followed up shortly with an overview and exploration of Topics (please see some other great resources at the end of this post).

Since then, there was a June CTP release which included the new AppFabric Application and no visible changes to Brokered Messaging; however, since its release, the AppFabric Messaging team has been hard at work refining the API and behaviors based on feedback from Advisors, MVPs and the community at large.

Since I’ve already covered Queues and Topics in the aforementioned posts, I’ll dive right into some terse examples which demonstrate the API changes. Though not an exhaustive review of all of the changes, I’ve covered the types that you’re most likely to come across, and will cover Queues, Topics and Subscriptions extensively in my upcoming article in CODE Magazine, which will also include more in-depth walk-throughs of the .NET Client API, REST API and WCF scenarios.

Those of you who have worked with the CTPs will find some subtle and not so subtle changes, but all in all I think all of the refinements are for the best and I think you’ll appreciate them as I have. For those new to Azure AppFabric Service Bus Brokered Messaging, you’ll benefit most from reading my first two posts based on the May CTP (or any of the resources at the end of this post) to get an idea of the why behind queues and topics and then come back here to explore the what and how.

A Quick Note on Versioning

In the CTPs that preceded the release of the new Azure AppFabric Service Bus features, a temporary assembly called “Microsoft.ServiceBus.Messaging.dll” was added to serve as a container for new features and deltas that were introduced during the development cycle. The final release includes a single assembly called “Microsoft.ServiceBus.dll” which contains all of the existing relay capabilities that you’re already familiar with as well as the addition of support for queues and topics. If you are upgrading from the CTPs, you’ll want to get ahold of the new Microsoft.ServiceBus.dll version 1.5, which includes everything plus the new queue and topic features.

The new 1.5 version of the Microsoft.ServiceBus.dll assembly targets the .NET 4.0 framework. Customers using .NET 3.5 can continue using the existing Microsoft.ServiceBus.dll assembly (version 1.0.1123.2) for leveraging the relay capabilities, but must upgrade to .NET 4.0 to take advantage of the latest features presented here.

.NET Client API

Queues


Below is a representative sample for creating, configuring, sending and receiving a message on a queue:

Administrative Operations

// Configure and create NamespaceManager for performing administrative operations
NamespaceManagerSettings settings = new NamespaceManagerSettings();
TokenProvider tokenProvider = settings.TokenProvider = TokenProvider.CreateSharedSecretTokenProvider(issuer, key);

NamespaceManager manager = new NamespaceManager(ServiceBusEnvironment.CreateServiceUri("sb", serviceNamespace, string.Empty), settings);

// Check for existence of queues on the fabric
var qs = manager.GetQueues();

var result = from q in qs
             where q.Path.Equals(queueName, StringComparison.OrdinalIgnoreCase)
             select q;

if (result.Count() == 0)
{
    Console.WriteLine("Queue does not exist");

    // Create Queue
    Console.WriteLine("Creating Queue...");

    manager.CreateQueue(new QueueDescription(queueName) { LockDuration = TimeSpan.FromSeconds(5.0d) });
}

Runtime Operations

// Create and Configure Messaging Factory to provision QueueClient
MessagingFactorySettings messagingFactorySettings = new MessagingFactorySettings();
messagingFactorySettings.TokenProvider = settings.TokenProvider = TokenProvider.CreateSharedSecretTokenProvider(issuer, key);
MessagingFactory messagingFactory = MessagingFactory.Create(ServiceBusEnvironment.CreateServiceUri("sb", serviceNamespace, string.Empty), messagingFactorySettings);

QueueClient queueClient = messagingFactory.CreateQueueClient(queueName, ReceiveMode.PeekLock);

Order order = new Order();
order.OrderId = 42;
order.Products.Add("Kinect", 70.50M);
order.Products.Add("XBOX 360", 199.99M);
order.Total = order.Products["Kinect"] + order.Products["XBOX 360"];

// Create a Brokered Message from the Order object
BrokeredMessage msg = new BrokeredMessage(order);

/***********************
*** Send Operations  ***
************************/

queueClient.Send(msg);

/**************************
*** Receive Operations ***
***************************/

BrokeredMessage recdMsg;
Order recdOrder;

// Receive and lock message
recdMsg = queueClient.Receive();

if (recdMsg != null)
{
    // Convert from BrokeredMessage to native Order
    recdOrder = recdMsg.GetBody<Order>();

    Console.ForegroundColor = ConsoleColor.Green;
    Console.WriteLine("Received Order {0} \n\t with Message Id {1} \n\t and Lock Token:{2} \n\t from {3} \n\t with total of ${4}", recdOrder.OrderId, recdMsg.MessageId, recdMsg.LockToken, "Receiver 1", recdOrder.Total);
    recdMsg.Complete();
}
queueClient.Close();

Note that MessageSender and MessageReceiver are now optional. Here’s an example that shows PeekLocking a message, simulating an exception and trying again:

// Alternate receive approach using the agnostic MessageReceiver
MessageReceiver receiver = messagingFactory.CreateMessageReceiver(queueName);

// Receive, complete, and delete message from the fabric
try
{
    // Receive and lock message
    recdMsg = receiver.Receive();

    // Convert from BrokeredMessage to native Order
    recdOrder = recdMsg.GetBody<Order>();

    // Complete read, release and delete message from the fabric
    receiver.Complete(recdMsg.LockToken);

    Console.ForegroundColor = ConsoleColor.Green;
    Console.WriteLine("Received Order {0} \n\t with Message Id {1} \n\t and Lock Token:{2} \n\t from {3} \n\t with total of ${4} \n\t at {5}", recdOrder.OrderId, recdMsg.MessageId, recdMsg.LockToken, "Receiver 2", recdOrder.Total, DateTime.Now.Hour + ":" + DateTime.Now.Minute + ":" + DateTime.Now.Second);
}
catch
{
    // Should processing fail, release the lock from the fabric and make the message available for later processing.
    if (recdMsg != null)
    {
        receiver.Abandon(recdMsg.LockToken);

        Console.ForegroundColor = ConsoleColor.Red;
        Console.WriteLine("Message could not be processed.");
    }
}
finally
{
    receiver.Close();
}

As shown below, this sample results in order 42 being received by the QueueClient:


Topics


Below is a representative sample for creating, configuring, sending and receiving a message on a topic:

Administrative Operations

// Configure and create NamespaceManager for performing administrative operations
NamespaceManagerSettings settings = new NamespaceManagerSettings();
settings.TokenProvider = TokenProvider.CreateSharedSecretTokenProvider(issuer, key);
NamespaceManager manager = new NamespaceManager(ServiceBusEnvironment.CreateServiceUri("sb", serviceNamespace, string.Empty), settings);

// Check for existence of topics on the fabric
var topics = manager.GetTopics();

var result = from t in topics
             where t.Path.Equals(topicName, StringComparison.OrdinalIgnoreCase)
             select t;

if (result.Count() == 0)
{
    Console.WriteLine("Topic does not exist");

    // Create Topic
    Console.WriteLine("Creating Topic...");

    TopicDescription topic = manager.CreateTopic(topicName);
}

// Create Subscriptions for InventoryServiceSubscription and CreditServiceSubscription and associate them with OrdersTopic:
SubscriptionDescription inventoryServiceSubscription = new SubscriptionDescription(topicName, "InventoryServiceSubscription");
SubscriptionDescription creditServiceSubscription = new SubscriptionDescription(topicName, "CreditServiceSubscription");

// Set up Filters for NorthAmericaFulfillmentServiceSubscription
RuleDescription northAmericafulfillmentRuleDescription = new RuleDescription();
northAmericafulfillmentRuleDescription.Filter = new SqlFilter("CountryOfOrigin = 'USA' OR CountryOfOrigin ='Canada' OR CountryOfOrigin ='Mexico'");
northAmericafulfillmentRuleDescription.Action = new SqlRuleAction("set FulfillmentRegion='North America'");

// Create Subscriptions
SubscriptionDescription northAmericaFulfillmentServiceSubscription = new SubscriptionDescription(topicName, "NorthAmericaFulfillmentServiceSubscription");

// Delete existing subscriptions
try { manager.DeleteSubscription(topicName, inventoryServiceSubscription.Name); } catch { };
try { manager.DeleteSubscription(topicName, creditServiceSubscription.Name); } catch { };
try { manager.DeleteSubscription(topicName, northAmericaFulfillmentServiceSubscription.Name); } catch { };

// Add Subscriptions and Rules to Topic
manager.CreateSubscription(inventoryServiceSubscription);
manager.CreateSubscription(creditServiceSubscription);
manager.CreateSubscription(northAmericaFulfillmentServiceSubscription, northAmericafulfillmentRuleDescription);

Runtime Operations

// Create and Configure Messaging Factory to provision TopicClient
MessagingFactorySettings runtimeSettings = new MessagingFactorySettings();
runtimeSettings.TokenProvider = TokenProvider.CreateSharedSecretTokenProvider(issuer, key);
MessagingFactory messagingFactory = MessagingFactory.Create(ServiceBusEnvironment.CreateServiceUri("sb", serviceNamespace, String.Empty), runtimeSettings);

// Create Topic Client for sending messages to the Topic:
TopicClient client = messagingFactory.CreateTopicClient(topicName);

/***********************
*** Send Operations ***
***********************/

// Prepare BrokeredMessage and corresponding properties
Order order = new Order();
order.OrderId = 42;
order.Products.Add("Kinect", 70.50M);
order.Products.Add("XBOX 360", 199.99M);
order.Total = order.Products["Kinect"] + order.Products["XBOX 360"];

// Set the body to the Order data contract
BrokeredMessage msg = new BrokeredMessage(order);

// Set properties for use in RuleDescription
msg.Properties.Add("CountryOfOrigin", "USA");
msg.Properties.Add("FulfillmentRegion", "");

// Send the message to the OrdersTopic
client.Send(msg);
client.Close();

/**************************
*** Receive Operations ***
***************************/

BrokeredMessage recdMsg;
Order recdOrder;

// Inventory Service Subscriber
SubscriptionClient inventoryServiceSubscriber = messagingFactory.CreateSubscriptionClient(topicName, "InventoryServiceSubscription", ReceiveMode.PeekLock);

// Read the message from the OrdersTopic
while ((recdMsg = inventoryServiceSubscriber.Receive(TimeSpan.FromSeconds(5))) != null)
{
    // Convert from BrokeredMessage to native Order
    recdOrder = recdMsg.GetBody<Order>();

    // Complete read, release and delete message from the fabric
    inventoryServiceSubscriber.Complete(recdMsg.LockToken);

    Console.ForegroundColor = ConsoleColor.Green;
    Console.WriteLine("Received Order {0} \n\t on {1} \n\t with Message Id {2} \n\t and Lock Token {3}.", recdOrder.OrderId, "Inventory Service Subscriber", recdMsg.MessageId, recdMsg.LockToken);
}
inventoryServiceSubscriber.Close();

// Credit Service Subscriber
SubscriptionClient creditServiceSubscriber = messagingFactory.CreateSubscriptionClient(topicName, "CreditServiceSubscription");

// Read the message from the OrdersTopic
recdMsg = creditServiceSubscriber.Receive();

// Convert from BrokeredMessage to native Order
recdOrder = recdMsg.GetBody<Order>();

// Complete read, release and delete message from the fabric
creditServiceSubscriber.Complete(recdMsg.LockToken);

Console.ForegroundColor = ConsoleColor.Green;
Console.WriteLine("Received Order {0} \n\t on {1} \n\t with Message Id {2} \n\t and Lock Token {3}.", recdOrder.OrderId, "Credit Service Subscriber", recdMsg.MessageId, recdMsg.LockToken);

creditServiceSubscriber.Close();

// Fulfillment Service Subscriber for the North America Fulfillment Service Subscription
SubscriptionClient northAmericaFulfillmentServiceSubscriber = messagingFactory.CreateSubscriptionClient(topicName, "NorthAmericaFulfillmentServiceSubscription");

// Read the message from the OrdersTopic for the North America Fulfillment Service Subscription
recdMsg = northAmericaFulfillmentServiceSubscriber.Receive(TimeSpan.FromSeconds(5));

if (recdMsg != null)
{
    // Convert from BrokeredMessage to native Order
    recdOrder = recdMsg.GetBody<Order>();

    // Complete read, release and delete message from the fabric
    northAmericaFulfillmentServiceSubscriber.Complete(recdMsg.LockToken);

    Console.ForegroundColor = ConsoleColor.Green;
    Console.WriteLine("Received Order {0} \n\t on {1} \n\t with Message Id {2} \n\t and Lock Token {3}.", recdOrder.OrderId, "North America Fulfillment Service Subscriber", recdMsg.MessageId, recdMsg.LockToken);
}
else
{
    Console.ForegroundColor = ConsoleColor.Yellow;
    Console.WriteLine("No messages for North America found.");
}

northAmericaFulfillmentServiceSubscriber.Close();

When running this sample, you’ll see that I have received Order 42 on my Inventory, Credit and North America Fulfillment Service subscriptions:


WCF

One of the great things about the WCF programming model is that it abstracts much of the underlying communication details; as such, other than dropping in a new assembly and refactoring the binding and configuration, it is not greatly affected by the API changes from the May/June CTP to GA.

As I mentioned, one thing that has changed is that the ServiceBusMessagingBinding has been renamed to NetMessagingBinding. I’ll be covering an end-to-end example of using the NetMessagingBinding in my upcoming article in CODE Magazine.
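
In the meantime, here is a minimal, hedged sketch of what a one-way WCF sender over the renamed NetMessagingBinding might look like. The contract name, queue name, namespace and key are placeholders, and the credential wiring via TransportClientEndpointBehavior.TokenProvider reflects my reading of the 1.5 API rather than Rick’s forthcoming article:

using System.ServiceModel;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

[ServiceContract]
public interface IOrderService
{
    // Queued messaging is inherently one-way.
    [OperationContract(IsOneWay = true)]
    void SubmitOrder(Order order);
}

public interface IOrderServiceChannel : IOrderService, IClientChannel { }

public static class QueueSender
{
    public static void Send(string serviceNamespace, string issuer, string key)
    {
        // Address the queue itself, e.g. sb://yournamespace.servicebus.windows.net/orders
        var address = new EndpointAddress(
            ServiceBusEnvironment.CreateServiceUri("sb", serviceNamespace, "orders"));

        var factory = new ChannelFactory<IOrderServiceChannel>(new NetMessagingBinding(), address);
        factory.Endpoint.Behaviors.Add(new TransportClientEndpointBehavior
        {
            TokenProvider = TokenProvider.CreateSharedSecretTokenProvider(issuer, key)
        });

        IOrderServiceChannel channel = factory.CreateChannel();
        channel.SubmitOrder(new Order { OrderId = 42 });
        channel.Close();
        factory.Close();
    }
}

On the receive side the same binding is used with a ServiceHost listening on the queue’s URI; those details are exactly what Rick defers to the CODE Magazine article.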

REST API

The REST API is key to delivering these new capabilities across a variety of client platforms and remains largely unchanged; however, one key change is how message properties are handled. Instead of individual headers for each, there is now one header with all the properties JSON-encoded. Please refer to the updated REST API Reference doc for details. I’ll also be covering an end-to-end example of using the REST API to write and read to/from a queue in my upcoming article in CODE Magazine.

More Coming Soon

As I mentioned, in my upcoming article in CODE Magazine I’ll cover the Why, What, and How behind Azure AppFabric Service Bus Brokered Messaging, including end-to-end walkthroughs with the .NET Client API, REST API and WCF Binding. The November/December issue should be on newsstands (including Barnes and Noble) or in your mailbox towards the end of October. You can also find the article online at http://code-magazine.com

Resources

You can learn more about this exciting release as well as download the GA SDK version 1.5 by visiting the following resources:


<Return to section navigation list>

Windows Azure VM Role, Virtual Network, Connect, RDP and CDN

No significant articles today.


<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Brian Swan (@brian_swan) explained How to Get Diagnostics Info for Azure/PHP Applications–Part 1 in a 9/19/2011 post to the Windows Azure’s Silver Lining blog:

In this post, I’ll look at why getting diagnostic data for applications in the cloud is important for all applications (not just PHP), provide a short overview of what Windows Azure Diagnostics is, and show how to get diagnostics data for PHP applications (although much of what I look at is applicable regardless of language). Here I’ll focus on how to use a configuration file (the diagnostics.wadcfg file) to get diagnostics, while in Part 2 I’ll look at how to do this programmatically.

Why get diagnostic data?

There are lots of reasons to collect and analyze data about applications running in the cloud (or any application, for that matter). One obvious reason is to help with troubleshooting. Without data about your application, it’s very difficult to figure out why something is broken when it breaks. However, gathering diagnostic data for applications running in the cloud takes on added importance. Without diagnostic data, it becomes difficult (perhaps impossible) to take advantage of scalability, a basic value proposition of the cloud. In order to know when to add or subtract instances from a deployment, it is essential to know how your application is performing. This is what Windows Azure Diagnostics allows you to do. (Look for more posts soon about how to scale an application based on diagnostics.)

What is Windows Azure Diagnostics?

In understanding how to get diagnostic data for my PHP applications running in Azure, it first helped me to understand what this thing called “Windows Azure Diagnostics” is. So, here’s the 30,000-foot view that, hopefully, will provide context for the how-to section that follows…

Windows Azure Diagnostics is essentially an Azure module that monitors the performance (in the broad sense) of role instances. When a role imports the Diagnostics module (which is specified in the ServiceDefinition.csdef file for a role), the module does two things:

  1. The Diagnostics module creates a configuration file for each role instance and stores these files in your Blob storage (in a container called wad-control-container). Each file contains information (for example) about what diagnostic information should be gathered, how often it should be gathered, and how often it should be transferred to your Table storage. You can specify these settings (and others) in a diagnostics.wadcfg file that is part of your deployment (the Diagnostics module will look for this file when it starts). Regardless of whether you include the diagnostics.wadcfg file with your deployment, you can create or make changes programmatically after deployment (the Diagnostics module will create and/or update the configuration files that are in Blob storage).
  2. According to settings in the configuration files in Blob storage, the Diagnostics module begins writing data to local storage and transferring it to Table storage (to enable the transfer you have to supply storage connection string information in your ServiceConfiguration.cscfg file).

So that’s the high-level view (import a module, configure via a file or programmatically, diagnostics info is written to your Azure storage account). Now for the details…
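
As a side note before the config-file walkthrough: the programmatic route mentioned in point 1 above (and covered properly in Part 2) can be sketched roughly in C# with the diagnostics management API. This is my own illustration, not from Brian’s post; the deployment ID, role name and counter are placeholders, and the exact constructor and method signatures should be checked against the SDK you are using:

using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.Diagnostics.Management;

class UpdateDiagnosticsConfig
{
    static void Main()
    {
        string connectionString =
            "DefaultEndpointsProtocol=https;AccountName=your_account_name;AccountKey=your_account_key";

        // Targets the per-instance configuration blobs in wad-control-container.
        var deployment = new DeploymentDiagnosticManager(connectionString, "your-deployment-id");

        foreach (var instanceManager in
                 deployment.GetRoleInstanceDiagnosticManagersForRole("PhpWebRole"))
        {
            DiagnosticMonitorConfiguration config = instanceManager.GetCurrentConfiguration();

            // Add a counter and schedule transfers to the WADPerformanceCountersTable.
            config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration
            {
                CounterSpecifier = @"\Processor(_Total)\% Processor Time",
                SampleRate = TimeSpan.FromMinutes(1)
            });
            config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

            // The Diagnostics module picks up the change on its next configuration poll.
            instanceManager.SetCurrentConfiguration(config);
        }
    }
}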

How to get diagnostic data using a configuration file

There are two basic ways you can get diagnostics data for a Windows Azure application: using a configuration file and programmatically. In this post, I’ll examine how to use a configuration file (I’ll look at how to get diagnostics programmatically in Part 2). I’ll assume you have followed this tutorial, Build and Deploy a Windows Azure PHP Application, and have a ready-to-package PHP application.

Note: Although I will walk through this configuration in the context of a PHP application, the configuration steps are the same for any Azure deployment, regardless of language.

Step 1: Import the Diagnostics Module

To specify that a role should import the Diagnostics module, you need to edit your ServiceDefinition.csdef file. After you run the default scaffolder in the Windows Azure SDK for PHP (details in the tutorial link above), you will have a skeleton Azure PHP application with a directory structure like this:

[Screenshot: the scaffolder-generated project structure, which includes the ServiceDefinition.csdef, ServiceConfiguration.cscfg and diagnostics.wadcfg files alongside the PHP application files.]

Open the ServiceDefinition.csdef file in your favorite XML editor and make sure the <Imports> element has this child element:

<Import moduleName="Diagnostics"/>

That’s all that’s necessary to import the Diagnostics module.

Step 2: Enable transfer to Azure storage

To allow the Diagnostics module to transfer data from local storage to your Azure storage account, you need to provide your storage account name and key in the ServiceConfiguration.cscfg file. Again in your favorite XML editor, open this file and make sure the <ConfigurationSettings> element has the following child element:

<Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" 
         value="DefaultEndpointsProtocol=https;AccountName=your_account_name;AccountKey=your_account_key"/>

Obviously, you’ll have to fill in your storage account name and key.

Step 3: Edit the configuration file (diagnostics.wadcfg)

Now you can specify what diagnostic information you’d like to collect. Notice that included in the directory structure shown above is the diagnostics.wadcfg file. Open this file in an XML editor and you’ll begin to see exactly what you can configure. I’ll point out some of the important pieces and provide one example.

In the root element (<DiagnosticMonitorConfiguration>), you can configure two settings with the configurationChangePollInterval and overallQuotaInMB attributes:

  1. The frequency with which the Diagnostic Monitor looks at the configuration files in your blob storage for updates (see the What is Windows Azure Diagnostics section above for more information) is controlled by the configurationChangePollInterval.
  2. The amount of local storage set aside for storing diagnostic information is controlled by the overallQuotaInMB attribute. When this quota is reached, old entries are deleted to make room for new ones.

In the example here, these settings are 1 minute and 4GB respectively:

<DiagnosticMonitorConfiguration xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration" 
    configurationChangePollInterval="PT1M" 
    overallQuotaInMB="4096">

Note: if you want to increase the value of the overallQuotaInMB setting, you may also need to edit your ServiceDefinition.csdef file prior to deployment – details here.

The following table describes what each of the child elements in the default diagnostics.wadcfg file does. (For more detailed information, see Windows Azure Diagnostics Configuration Schema.)

| Element | Description |
| --- | --- |
| <DiagnosticInfrastructureLogs> | Logs information about the diagnostic infrastructure, the RemoteAccess module, and the RemoteForwarder module. |
| <Directories> | Defines the buffer configuration for file-based logs that you can define. |
| <Logs> | Defines the buffer configuration for basic Windows Azure logs. |
| <PerformanceCounters> | Logs information about how well the operating system, application, or driver is performing. For a list of performance counters that you can collect, see List of Performance Counters for Windows Azure Web Roles. |
| <WindowsEventLog> | Logs events that are typically used for troubleshooting application and driver software. |

The bufferQuotaInMB attribute on these elements controls how much local storage is set aside for each diagnostic (the sum of which cannot exceed the value of the overallQuotaInMB attribute). If the value is set to zero, no cap is set for the particular diagnostic, but the total collected at any one time will not exceed the overall quota.

The scheduledTransferPeriod attribute controls the interval at which information is transferred to your Azure storage account.

As an example, this simple configuration file collects information about specified performance counters (the sampleRate attribute specifies how often the counter should be collected):

<DiagnosticMonitorConfiguration xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration" 
    configurationChangePollInterval="PT1M" 
    overallQuotaInMB="4096">
   <PerformanceCounters bufferQuotaInMB="0" scheduledTransferPeriod="PT5M">
      <PerformanceCounterConfiguration 
         counterSpecifier="\Processor(_Total)\% Processor Time" sampleRate="PT1M" />
      <PerformanceCounterConfiguration 
         counterSpecifier="\Memory\Available Mbytes" sampleRate="PT1M" />
      <PerformanceCounterConfiguration 
         counterSpecifier="\TCPv4\Connections Established" sampleRate="PT1M" />
   </PerformanceCounters>
  </DiagnosticMonitorConfiguration>
Step 4: Package and deploy your project

Now, you are ready to package and deploy your project. Instructions for doing so are here: Build and Deploy a Windows Azure PHP Application.

Step 5: Collect diagnostic information

After you deploy your application to Windows Azure, the Diagnostics module will begin writing data to your storage account (it may take several minutes before you see the first entry). In the case of performance counters (as shown in the example configuration file above), diagnostic information is written to a table called WADPerformanceCountersTable in Table storage. To get this information, you can query the table using the Windows Azure SDK for PHP like this:

define("STORAGE_ACCOUNT_NAME", "Your_storage_account_name");
define("STORAGE_ACCOUNT_KEY", "Your_storage_account_key");
$table = new Microsoft_WindowsAzure_Storage_Table('table.core.windows.net', STORAGE_ACCOUNT_NAME, STORAGE_ACCOUNT_KEY);
$metrics = $table->retrieveEntities('WADPerformanceCountersTable');
$i = 1;
foreach($metrics AS $m) {   
    echo $i.". ".$m->DeploymentId . " - " . $m->RoleInstance . " - " . $m->CounterName . ": " . $m->CounterValue . "<br/>"; 
    $i++;
}

Of course, getting this data can be somewhat trickier if you have several roles writing data to your storage account. And this raises the question: what do I do with this data now that I’ve got it? So, it looks like I have plenty to cover in future posts.


Michael Washam (MWashamMS) described Windows Azure Diagnostics and PowerShell – Performance Counters in a 9/19/2011 post:

With the introduction of Windows Azure PowerShell Cmdlets 2.0 we have added a lot of functionality around managing Windows Azure Diagnostics. This is the second article in a series that covers various aspects of diagnostics and PowerShell. Click here to see the previous article on using tracing and the Windows Azure Log.

Just as in the other articles, you will need to add the PowerShell snap-in (or module):

  Add-PsSnapin WAPPSCmdlets

Next I have a handy helper function called GetDiagRoles that I use for all of my diagnostic examples:

Initialization and Helper Function

  $storageAccountName = "YourStorageAccountName"
  $storageAccountKey = "YourStorageAccountKey"
  $deploymentSlot = "Production"
  $serviceName = "YourHostedService"
  $subscriptionId = "YourSubscriptionId"
  # Thumbprint for your cert goes below
  $mgmtCert = Get-Item cert:\CurrentUser\My\D7BECD4D63EBAF86023BB4F1A5FBF5C2C924902A 

  function GetDiagRoles {
    Get-HostedService -ServiceName $serviceName -SubscriptionId $subscriptionId -Certificate $mgmtCert | `
        Get-Deployment -Slot $deploymentslot | `
        Get-DiagnosticAwareRoles -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
   }

I have written another helper function that simply creates a new PerformanceCounterConfiguration object. You will need an array of these to pass to the Set-PerformanceCounter cmdlet. The function takes the performance counter specifier (more on this shortly) and the sample rate in seconds for each counter.

Helper Function that Initializes a Counter Object

function CPC($specifier, $SampleRate) {
   $perfCounter = New-Object Microsoft.WindowsAzure.Diagnostics.PerformanceCounterConfiguration
   $perfCounter.CounterSpecifier = $specifier
   $perfCounter.SampleRate = [TimeSpan]::FromSeconds($SampleRate)
   return $perfCounter
}

From there it is up to you to pass the performance counters you are interested in. I’ve written another function that wraps up the functionality of creating a PowerShell array, populating it with the counters I am interested in, and setting them for the diagnostic-aware role I pass in from GetDiagRoles. Note: I’m also comparing the current role name to the name of one of my web roles so I’m only adding web-related counters on instances that I care about. I retrieved the counter names by typing in typeperf.exe /qx while logged into my Windows Azure roles via RDP. This way I can actually get counters specific to the machines in the service (and the processes, such as RulesEngineService.exe, my sample NT service). Finally, it makes a call to the Set-PerformanceCounter cmdlet with the array of performance counters, and configures diagnostics to transfer these to Windows Azure Storage every 15 minutes.

Helper Function that Configures Counters for a Role

 function SetPerfmonCounters {
    $sampleRate = 15
    $counters = @()
    $counters += CPC "\Processor(_Total)\% Processor Time" $sampleRate
    $counters += CPC "\Memory\Available MBytes" $sampleRate
    $counters += CPC "\PhysicalDisk(*)\Current Disk Queue Length" $sampleRate
    $counters += CPC "\System\Context Switches/sec" $sampleRate
    # Process specific counters (retrieved via typeperf.exe /qx)
    $counters += CPC "\Process(RulesEngineService)\% Processor Time" $sampleRate
    $counters += CPC "\Process(RulesEngineService)\Private Bytes" $sampleRate
    # $RoleName is populated from a call to GetDiagRoles in the next snippet.
    if($RoleName -eq "SomeWebRole") {
        $counters += CPC "\W3SVC_W3WP(*)\Requests / Sec" $sampleRate
        $counters += CPC "\ASP.NET\Requests Queued" $sampleRate
        $counters += CPC "\ASP.NET\Error Events Raised" $sampleRate
        $counters += CPC "\ASP.NET\Request Wait Time" $sampleRate
        # Process specific counters (retrieved via typeperf.exe /qx)
        $counters += CPC "\Process(W3WP)\% Processor Time" $sampleRate
        $counters += CPC "\Process(W3WP)\Private Bytes" $sampleRate
     }
     $input | Set-PerformanceCounter -PerformanceCounters $counters -TransferPeriod 15
}

Now linking all of this together to configure performance logging is pretty simple. In the snippet below I call GetDiagRoles, which returns a collection of all of the diagnostic-aware roles in my service, and then pipe each one to the SetPerfmonCounters function created above.

Set Counters for each Role

  GetDiagRoles | foreach {
       $_ | SetPerfmonCounters
  }

Now I am successfully logging perfmon data and transferring it to storage. What’s next? Well, analysis and cleanup, of course! To analyze the data we have added another cmdlet called Get-PerfmonLog that downloads the data and optionally converts it to .blg for analysis. The snippet below creates a perfmon log for each role in your service using the BLG format. Note the Get-PerfmonLog cmdlet also supports -From and -To (or -FromUTC and -ToUTC) so you can selectively download performance counter data.

Download Perfmon Logs for Each Role

   GetDiagRoles | foreach {
     $Filename = "c:\DiagData\" + $_.RoleName + "perf.blg"
     $_ | Get-PerfmonLog -LocalPath $Filename -Format BLG
   }

Once the data is downloaded, you can use the Clear-PerfmonLog cmdlet to clean up the previously downloaded perfmon counters from Windows Azure Storage. Note that all of the Clear-* diagnostic data cmdlets support -From and -To parameters to allow you to manage which data to delete.

Delete Perfmon Data from Storage for Each Role

 GetDiagRoles | foreach {
     $_ | Clear-PerfmonLog -From "9/19/2011 6:00:00 AM" -To "9/19/2011 9:00:00 AM"
}

Finally, one last example. If you ever have the need to completely clear out and reset your perfmon counters there is a -Clear switch that gives you this functionality.

Reset Perfmon Logs for Each Role

  GetDiagRoles | foreach {
       $_ | Set-PerfmonLog -Clear
  }

See Michael’s earlier post below.


HPC in the Cloud reported Microsoft Platform Powers Mission-Critical Operations in Financial Services on 9/19/2011:

TORONTO, September 19 -- Today at SIBOS 2011, Microsoft Corp. announced that a growing number of financial services customers are benefiting from the significant gains in agility, operational efficiency and cost savings achieved by moving to the high-performance Windows Server operating system and Microsoft SQL Server database.

These customers have not only reduced the cost of running their core processes, they have also realized substantial benefits by making these core processes part of a dynamic IT infrastructure that enables them to understand and serve customers better, bring new products to market more quickly, continually improve operations, and collaborate with an evolving set of partners in their global value chains.

"Microsoft is making a long-term commitment to supporting the mission-critical business applications of our financial services customers," said Karen Cone, general manager, Worldwide Financial Services, Microsoft. "Our customers are testament to this commitment to delivering a solid foundation for mission-critical workloads, with the dependability, performance and flexibility required to achieve sustainable competitive advantage in today's financial services industry."

Legacy System Modernization
Skandinavisk Data Center (SDC), which services the banking businesses of more than 150 financial institutions in Denmark, Sweden, Norway and the Faroe Islands, is expecting to save $20 million annually by moving its core banking system from its mainframe platform to Windows Server and SQL Server. With client growth and cost reduction at the forefront of SDC's business imperatives, it required a solution to help minimize spending while retaining and gaining new member banks. By migrating from the mainframe to a Windows platform, it is estimated that SDC will reduce operational costs for its core banking system by 30 percent, giving it a competitive edge. The migration of online transactions in the core banking system was completed this fall, reducing operational costs by more than 20 percent annually. As the next and final step, the systems database will be migrated from DB2 to SQL Server.

In addition, with partner Temenos Group AG, Microsoft is supporting banks across the globe with the TEMENOS T24 (T24) core banking system optimized for SQL Server. Microsoft and Temenos recently completed a high-performance benchmark that measured the high-end scalability of T24 on SQL Server and Windows Server 2008 Datacenter. The model bank environment, created to reflect tier-one retail banking workloads, consisted of 25 million accounts and 15 million customers across 2,000 branches. At peak performance, the system processed more than 3,400 transactions per second in online testing and averaged more than 5,200 interest accrual and capitalizations per second during close of business processing. The testing demonstrated near-linear scalability (95 percent) in building up toward the final hardware configuration. Banks such as Sinopac in Taiwan and Rabobank Australia and New Zealand are among the first to benefit from expertise and capabilities developed at the Microsoft and Temenos competency center. Furthermore, Microsoft and Temenos recently announced that Mexican financial institutions are live on T24 on Microsoft's cloud platform, Windows Azure. Banks that select T24 on a Microsoft platform for on-premises deployment today can therefore be confident of a road map to the cloud. [Emphasis added.]

In the past, only mainframes could run and maintain mission-critical trading solutions that require a lot from the database infrastructure: high-availability, redundancy, transaction and data integrity, consistency, predictability, and the balancing of proactive prevention with effective recovery. One such solution, the SunGard Front Arena, is a global capital markets solution that delivers electronic trading and position control across multiple asset classes and business lines. Integrating sales and distribution, trading and risk management, and settlement and accounting, Front Arena helps capital markets and businesses around the world improve performance, transparency and automation. Front Arena was designed to handle very large data flows and, in high-volume environments such as equities exchange trading, Front Arena customers routinely enter as many as 130,000 trades per day. As a result, in March and April 2011, engineers from SunGard and Microsoft worked together to confirm the performance and scalability of Front Arena on Microsoft SQL Server 2008 R2 at the Microsoft Platform Adoption Center in Redmond, Wash. The team designed a benchmark test to emulate a real-world, enterprise-class financial workload, running the software on industry-standard hardware typically found in datacenters today. Front Arena running on SQL Server 2008 R2 exceeded the goals set by the team, confirming that SQL Server 2008 R2 delivers the performance, scalability and value companies demand from their trading platform.

Mitigating Risk
The Financial Crime Risk Management solution suite from Fiserv Inc., built on the Microsoft platform, provides a comprehensive portfolio of fraud risk mitigation and anti-money laundering capabilities. AML Manager, a key component of the solution suite, will adopt the newest release of Microsoft SQL Server, code-named "Denali," in the future.

Transforming the Reconciliation Process
Saxo Bank, a specialist in trading and investment, has been able to handle increasing transaction volumes with the help of SunGard's Ambit Reconciliation solution, deployed on Windows Server and SQL Server. The solution provides a real-time matching and reconciliation platform on which Saxo Bank has consolidated its reconciliation and exception management operations across all trading platforms at the bank. According to Saxo Bank, it is now able to process more transactions per day compared with four years ago, with fewer staff to manage the increase in transaction volumes.

Improving Payments Efficiency and Reliability
In a recent report based on its Uptime Meter, a six-month availability aggregate, Stratus demonstrated that its fault-tolerant servers running Windows Server and S1 Corp.'s payments solutions are achieving "six nines" (99.9999 percent) of availability. Both hardware- and software-related incidents are included in the measurement. The combination of mission-critical technologies from S1, Stratus and Microsoft provides a compelling alternative to more costly mainframe solutions.

The success of Microsoft's mission-critical strategy depends not only on the technology and guidance provided by Microsoft, but also on the products and services provided by its large ecosystem of partners. A new generation of solutions from independent software vendors -- across banking, capital markets and insurance -- combined with Microsoft's development tools and technologies, is now delivering the dynamic environment needed to realize the benefits of next-generation, mission-critical platforms from on-premises to the cloud.


Michael Washam (MWashamMS) described Getting Started with Windows Azure Diagnostics and PowerShell in a 9/16/2011 post from the //BUILD/ Windows conference:

imageIn my BUILD session Monitoring and Troubleshooting Windows Azure Applications I mention several code samples that make implementing diagnostics with the Windows Azure PowerShell Cmdlets 2.0 much easier. This is the first in a series of posts that walks you through configuring end-to-end diagnostics for your Windows Azure application with PowerShell.

imageTo get started with the Windows Azure PowerShell Cmdlets you will first need to add the WAPPSCmdlets snapin to your script:

Add-PsSnapin WAPPSCmdlets

After the snapin is loaded you will need to get access to the roles that you want to configure or retrieve diagnostic information about.

 $storageAccountName = "YourStorageAccountName"
 $storageAccountKey = "YourStorageAccountKey"
 $deploymentSlot = "Production"
 $serviceName = "YourHostedService"
 $subscriptionId = "YourSubscriptionId"
 # Thumbprint for your cert goes below
 $mgmtCert = Get-Item cert:\CurrentUser\My\D7BECD4D63EBAF86023BB4F1A5FBF5C2C924902A 

 function GetDiagRoles {
   Get-HostedService -ServiceName $serviceName -SubscriptionId $subscriptionId -Certificate $mgmtCert | `
   Get-Deployment -Slot $deploymentSlot | `
   Get-DiagnosticAwareRoles -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
 }

GetDiagRoles performs the following steps:
1) Retrieves your hosted service
2) Retrieves the correct deployment (staging or production)
3) Returns a collection of all of the roles that have diagnostics enabled.

For example the following code will print all of the role names that are returned:

  GetDiagRoles | foreach {
    write-host $_.RoleName
  }

Note: The $_ variable refers to the “current” object in the collection being piped.

Of course we want to configure diagnostics and not just print out role names.
The following example sets the diagnostic system to transfer any trace logs that have a loglevel of Error to storage every 15 minutes.

  GetDiagRoles | foreach {
    # Sets Windows Azure Diagnostics to transfer any trace messages from your code
    # that are errors to storage every 15 minutes.
    $_ | Set-WindowsAzureLog -LogLevelFilter Error -TransferPeriod 15
  }

This makes it extremely useful to instrument your code to capture errors, or simply to trace execution and look for logic errors:

try
{
  // something that could throw an exception
}
catch(Exception exc)
{
  System.Diagnostics.Trace.TraceError(DateTime.Now + " " + exc.ToString());
}

How do you retrieve the data? Once the diagnostic system has had time to transfer the trace logs to storage, it is as simple as the following examples:

# Dump out all tracelogs in storage to the console
GetDiagRoles | Get-WindowsAzureLog
# Dump all tracelogs in storage that
# have exception in the message to the console
GetDiagRoles | Get-WindowsAzureLog | `
	  Where { $_.Message -like "*exception*" }
# Save all trace logs to the file system (as .csv)
GetDiagRoles | foreach {
  $FileName = "c:\DiagData\" + $_.RoleName + "azurelogs.csv"
  $_ | Get-WindowsAzureLog -LocalPath $FileName
}

Of course, there are -From and -To arguments for all of the Get-* diagnostic cmdlets, so you can filter by specific time ranges.
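For example, here is a minimal sketch (reusing the GetDiagRoles helper defined above, and assuming the parameters accept standard DateTime values) that pulls only the trace entries written during the last two hours and filters them for exceptions:

# Retrieve only trace logs written during the last two hours
$from = (Get-Date).AddHours(-2)
$to = Get-Date
GetDiagRoles | Get-WindowsAzureLog -From $from -To $to | `
	  Where { $_.Message -like "*exception*" }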
A few other goodies we have added are the ability to clear your diagnostic settings (reset them to none) and also the ability to clean out your diagnostic storage.

# Clear all settings for the Windows Azure Log
GetDiagRoles | foreach {
  $_ | Set-WindowsAzureLog -Clear
}
# Clear all data for the Windows Azure Log
Clear-WindowsAzureLog -StorageAccountKey $storageAccountKey -StorageAccountName $storageAccountName

Look for more posts from me on how to do more advanced Windows Azure diagnostics such as perfmon logs and file system logging.


Michael Washam (MWashamMS) posted Announcing the release of Windows Azure Platform PowerShell Cmdlets 2.0 from the //BUILD/ Windows conference on 9/16/2011:

imageEven while we are here at BUILD we are working hard to make deploying and managing your Windows Azure applications simpler. We have made some significant improvements to the Windows Azure Platform PowerShell Cmdlets and we are proud to announce we are releasing them today on CodePlex: http://wappowershell.codeplex.com/releases.

imageIn this release we focused on the following scenarios:

  • Automation of Deployment Scenarios
  • Windows Azure Diagnostics Management
  • Consistency and Simpler Deployment

As part of making the cmdlets more consistent and easier to deploy, we have renamed a few cmdlets and enhanced others with the intent of following the PowerShell cmdlet design guidelines more closely. From a deployment perspective, we have merged the Access Control Service PowerShell Cmdlets with the existing Windows Azure PowerShell Cmdlets so that both are available in a single installation.

In addition to those changes we have added quite a few new and powerful cmdlets in this release:


Toddy Mladenov posted Demystifying physicalDirectory or How to Configure the Site Entry in the Service Definition File for Windows Azure on 9/11/2011 (missed when posted):

imageIf you played a bit more with the sites configuration in Windows Azure, you may have discovered some inconsistent behavior between what Visual Studio does and what the cspack.exe command line does with respect to the physicalDirectory attribute. I certainly did! Here is the problem I encountered while trying to deploy PHP on Windows Azure.

Project Folder Structure

imageI was following the instructions on Installing PHP on Windows Azure leveraging Full IIS Support but decided to leverage the help of Visual Studio instead of building the package by hand. Not a good idea for this particular scenario :( After creating my cloud solution in Visual Studio I ended up with the following folder structure:

+ PHPonAzureSol
  + PHPonAzure
    - ServiceConfiguration.cscfg
    - ServiceDefinition.csdef
  + PHPRole
    + bin
      - install-php.cmd
      - install-php-azure.cmd
    + PHP-Azure
      - php-azure.dll
    + Sites
      + PHP
        - index.php
    + WebPI-cmd

Where:

  • PHPonAzureSol was my VS solution folder
  • PHPonAzure was my VS project folder containing the CSDEF and CSCFG files
  • and PHPRole was my VS project folder containing the code for my web role

The PHPRole folder contained the WebPI command-line tool needed to install PHP in the cloud, stored in the WebPI-cmd subfolder; the PHP extensions for Azure in the PHP-Azure subfolder; the installation scripts in the bin subfolder; and, most importantly, my PHP pages in the Sites\PHP subfolder (in this case a simple index.php page containing phpinfo()).

Configuring the Site Entry in the CSDEF File

Of course, my goal was to configure the site to point to the folder where my PHP files were stored. In this particular case this was the PHPonAzureSol\PHPRole\Sites\PHP folder if you follow the structure above. This is simply done by adding the physicalDirectory attribute to the Site tag in the CSDEF. Here is what my Site tag looked like:

<Site name="Web" physicalDirectory="..\PHPRole\Sites\PHP">
<Bindings>
<Binding name="Endpoint1" endpointName="Endpoint1" />
</Bindings>
</Site>

My expectation was that, with this setting in the CSDEF, IIS would be configured to serve the content from the physicalDirectory folder. Hence, if I typed the URL of my Windows Azure hosted service, I should be able to see the index.php page (i.e. http://[my-hosted-service].cloudapp.net should point to my PHP code).

Visual Studio handling of physicalDirectory attribute

Of course, when I used Visual Studio to pack and deploy my web role, I was unpleasantly surprised. It seems Visual Studio ignores the physicalDirectory attribute from your CSDEF file and points the site to your web role's approot folder (the content of the PHPRole folder if you follow the structure above). Thus, if I wanted to access my PHP page, I had to type the following URL:

http://[my-hosted-service].cloudapp.net/Sites/PHP/index.php

Not exactly what I wanted :(

The reason for this is that Visual Studio calls cspack.exe with additional options (either /sitePhysicalDirectories or /sites) that overwrite the physicalDirectory attribute from CSDEF. As of now I am not aware of a way to change this behavior in VS.

Update (9-12-2011): It seems VS ignores the physicalDirectory attribute ONLY if your web site is called Web (i.e. name="Web" as in the example above). If you rename the site to something else (name="PHPWeb" for example), you will end up with the expected behavior described below. Unfortunately name="Web" is the default setting, and this may result in unexpected behavior for your application.

cspack.exe handling of physicalDirectory attribute

The solution to the problem is to call cspack.exe from the command line (without the above-mentioned options, of course :)).

There are a few gotchas about how you call cspack.exe using the folder structure that Visual Studio creates. After a few trial-and-error attempts in which I received several errors like this:

Error: Could not find a part of the path '[some-path-here]'.

I figured out that you should call cspack.exe from the solution folder (PHPonAzureSol in the above structure). Once I did this everything worked fine and I was able to access my index.php by just typing my hosted service’s URL.
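For reference, here is a rough sketch of what such a call can look like, using the folder and role names from the structure above. Treat it purely as an illustration -- the /role and /out values depend entirely on how your own project is laid out, and you may need additional switches for your role:

# Run from the solution folder (PHPonAzureSol) so the relative
# physicalDirectory path in the CSDEF resolves correctly.
# Note that /sites and /sitePhysicalDirectories are NOT passed,
# so the physicalDirectory value from the CSDEF is honored.
cspack PHPonAzure\ServiceDefinition.csdef /role:PHPRole;PHPRole /out:PHPonAzure.cspkg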

How the physicalDirectory attribute works

For those of you interested in how the physicalDirectory attribute works, here is a simple explanation.

MSDN documentation for How to Configure a Web Role for Multiple Web Sites points out that the physicalDirectory attribute value is relative to the location of the Service Configuration (CSCFG) file. This is true in the majority of cases; however, I think the following two clarifications are necessary:

  1. Because the attribute is present in the Service Definition (CSDEF) file, the correct statement is that the physicalDirectory attribute value is relative to the location of the Service Definition (CSDEF) file. Of course, if you use Visual Studio to build your folder structure, you can always assume that the Service Configuration (CSCFG) and the Service Definition (CSDEF) files are placed in the same folder. If you build your project manually, you should be careful how you set the physicalDirectory attribute. This is important if you want to use relative paths in the attribute.
  2. This one I think is much more important than the first one, and it states that you can use absolute paths in the physicalDirectory attribute. The physicalDirectory attribute can contain any valid absolute path on the machine where you build the package. This means that you can point cspack.exe to include any random folder from your local machine as your site’s root.

Here is how this works.

What cspack.exe does is take the content of the folder configured in the physicalDirectory attribute and copy it under the [role]/sitesroot/[num] folder in the package. Here is what my package structure looked like (follow the path in the address line):

image

During deployment IIS on the cloud VM is configured to point the site to sitesroot\[num] folder, and serve the content from there. Here is how it is deployed in the cloud:

image

And here is the IIS configuration for this cloud VM:

image



<Return to section navigation list>

Visual Studio LightSwitch and Entity Framework 4.1+

Paul Patterson posted An Introduction to LightSwitch – Edmonton Dot Net User Group Presentation – September 12, 2011 on 9/13/2011 (missed when posted):

The following slide deck was presented at the September 12, 2011 Edmonton .Net User Group meetup…

…please forgive me for the shameless header image of me at the presentation. I am usually quite shy :)

Slide 1

  • So, as mentioned, my name is Paul Patterson.
  • I work for an organization named Quercus Solutions Inc. We are a technology based company specializing in Microsoft technologies such as .Net, SharePoint, Office 365, and Microsoft Azure.
  • First off, by show of hands, how many of you have had an opportunity to play with LightSwitch at all?
  • Cool. So most, if not all, of the information I have is probably already familiar to you – which is good because if I fail on something, I will call on you to bail me out. :)
  • So, what I am going to present to you is a summary of what Microsoft Visual Studio LightSwitch is, and what it may mean to you as a professional software developer, as well as what it may mean to you as a “non-programmer”. Probably more so for the non-programmer.
  • With this presentation, I will be talking from the perspective of the non-professional software developer – those departmental people who may look at LightSwitch as an option for solving a business problem, for example.

Slide 2

  • So, just so you know what to expect, here is a quick look at an agenda.
  • I’ll first give you a quick introduction to LightSwitch, which will include a peek at the technologies that make LightSwitch tick.
  • The introduction will include a quick demo where I’ll put some of what is presented into practice.
  • Next I’ll touch on some extensibility points about LightSwitch.
  • And then, with time permitting, I’ll skim over a few deployment points, and possibly show a quick deployment scenario for you.
  • As far as questions go; at the risk of not having enough time to get through the presentation and demos, if we could hold on to the questions until the end, we might be able to get through the entire presentation.
  • Having said that, I have to caveat that LightSwitch is a huge topic. We could easily spend an entire day going through all the fine details of LightSwitch. Given the 1 hour time box here, I have to cover the higher level points, so I totally expect some questions; just if we can hold off until the end, that would serve us best – thanks.

Slide 3

  • So, according to Microsoft… in using Microsoft Visual Studio LightSwitch 2011, we, “…will be able to build professional quality business applications quickly and easily, for the desktop and the cloud.”
  • Okay! First off, I am not a designated LightSwitch “Champion”. I am just a curious fella who happened onto something that tweaked my interest a few years back.
  • Before you start laying into me with some subjectivity, you should first understand where I’m coming from, and how I think.
  • I generally like to find things that help me take care of a task in as short a time as possible.
  • This probably came from the various roles I’ve had as the technical go-to guy within the non-technical departments I’ve worked in.
  • A lot of my early experience to solving business problems involved the use of Excel and Access.
  • When I first read about this tool that Microsoft was working on, one that had the potential to do something fast and easy; I was intrigued.
  • I started watching the development of LightSwitch very early on, even before the first beta was released.
  • I believe it was sometime in 2008, maybe 2007, that Microsoft made people aware that it was working on this new tool codenamed KittyHawk.
  • Rumour had it that the KittyHawk team included some former FoxPro people – which would make sense because of the timing of FoxPro’s retirement.
  • Anyways it was sometime early last year that I read about this LightSwitch tool that Microsoft was readying for beta testing.
  • After culling through some forums and interweb rumour mills, I took it to task to keep a diligent eye on this thing – hence the start of my blog PaulSPatterson.com.
  • So, since early last year I have been keeping my ear to the ground, listening and watching how this product has evolved into what it is today.

Slide 4

  • So why did I tell you all that!
  • I am going on my intuition and gut instincts that LightSwitch is going to have a relatively large impact on the industry. Maybe not tomorrow, or even within the next year, but something is telling me to keep an eye on this tool.
  • Software developers tend to keep some technologies close to their chest.
  • All I am saying is to keep an open mind about LightSwitch, and don’t discount the obvious – such as the value proposition the product has.

Slide 5

  • Back to the agenda, let’s talk about the technology behind LightSwitch…

Slide 6

  • LightSwitch is a part of the Visual Studio family of products.
  • It essentially sits as a SKU between Visual Studio Pro and the free Express products.
  • I believe the current retail price for LightSwitch is about $200.00.
  • When you install LightSwitch, if you already have Visual Studio 2010 (Professional or better), it automagically integrates with Visual Studio, making its templates available for selection from the new project templates dialogs.
  • If you don’t have Visual Studio installed, LightSwitch installs as a stand-alone tool; using the same familiar Visual Studio IDE.
  • Using LightSwitch, it is possible to create and deploy an application without writing a single line of code.
  • As such, you can already begin to imagine the value proposition that this will have with the non-developer types.
  • Like I said before, LightSwitch is data centric, and all that someone has to do is provide some data, select to add some screens for the data, and presto, you have an application ready to show off to all your work buddies.
  • It really is just that easy! (That is Shell Busey, home improvement guy!). Yes, I just dated myself

Slide 7

  • LightSwitch uses “best practices” in how it creates applications.
  • For example, LightSwitch applications are built on a classic three-tier architecture where each tier runs independently of the others and performs a specific role in the application.
  • Here is an example 3-tier architecture model: the presentation tier (or “UI”); the logic tier, which is the liaison between the presentation tier and the data storage tier; and the data storage tier, which is responsible for the application data.

Slide 8

  • We can map specific technologies used in LightSwitch to this architecture.
  • The presentation tier is a Silverlight 4.0 application.
  • It can run as a Windows desktop application or be hosted in any browser that supports Silverlight.
  • Which by the way can be done on Mac, I’ve done it, I just can’t remember if it was Chrome or Safari that I got it to work in.
  • For the logic tier, WCF RIA DomainServices is used.
  • The logic tier process can be hosted locally (on the end-user’s machine), on an IIS server, or in Windows Azure.
  • For the data tier, a LightSwitch application’s primary application storage (for development) is SQL Server (SQL Express) technologies.
  • This database access is accomplished via an Entity Framework provider and custom-built WCF RIA DomainServices.
  • There are also opportunities to consume other data sources, typically exposed via WCF RIA Services – OData is a good example.
  • The idea is that LightSwitch removes the complexity of building three-tier applications by making specific technology choices for you so that you can concentrate on the business logic and not the plumbing.
  • Silverlight!! LightSwitch builds out applications using Silverlight.
  • When you create an application using LightSwitch, you are essentially creating an application that uses Silverlight technologies.

Slide 9

  • Back to the agenda…
  • So, next I want to talk a little about what is meant by Screens over Data.

Slide 10

  • It all starts with the data.
  • Data is the heart and foundation of developing with LightSwitch.
  • Most everything we do in LightSwitch revolves around the data.
  • In a nutshell, you tell LightSwitch what to use, and then you create the “screens” that fit over the data. More to come on that later…
  • A LightSwitch application can connect to two types of data: local or internal data and external data.
  • With local data, SQL Server Express is used behind the scenes.
  • When you start designing entities in LightSwitch, which I’ll show you an example of in a second, you are using SQL Express.
  • External data can be consumed from SQL Server databases, SharePoint lists or any WCF RIA Service exposed data.
  • With external data sources, LightSwitch can perform data management, such as CRUD operations, however it cannot make any schema changes on the data source.
  • Note that WCF RIA Services can expose a lot of different types of data sources.
  • If you can wrap a data source in a WCF RIA Service, chances are you can consume the data source in LightSwitch; such as OData, for example – which I have done, as shown in my blog where I consumed some City of Edmonton data to view bus stop information.
  • LightSwitch can also connect to more than one data source at a time and internally define relationships between external data sources and internal data entities, if any.

Slide 11

  • So where are we at with the agenda?


Slide 12

DEMO TIME!!

  • Launching LightSwitch
  • The IDE – Just like Visual Studio (because it IS Visual Studio)
  • New Project Dialog
  • LightSwitch Start Screen – shows how data is the center of attention
    • Create new table and Attach to external Data Source items
  • Create a new table.
    • Example Customer Table…
    • Explore the Table Designer
      • Field Name
      • Different Data Types
      • Required Checkbox
      • Explore the field properties panel
        • General
          • Choice List
          • Appearance
            • Custom Validation
            • Create custom validation on date field…
            • Private Sub DateAdded_Validate(results As EntityValidationResultsBuilder)
                If DateAdded.HasValue Then
                  If DateAdded > Date.Today Then
                    results.AddPropertyError("The date added must be today or in the past.")
                  End If
                End If
              End Sub
  • Add a screen for the Customer table
    • Explore the add new screen dialog
    • Select List and Details Screen template
      • Select Customers data.
        • Note how the data is “pluralized”
    • Explore the Screen Designer
    • Not your familiar GUI designer.
    • Screen Members List
    • Screen Content Tree
    • Run the application
      • Show Customers screen
      • Add a customer
        • Show phone number formatting
        • Show phone number drop down selection
        • Show date validation
        • Show save data feature
        • Show debug mode designer features…
          • Demonstrate real-time customization
            • Change labels of detail fields
            • Add Description to field to show field tooltip.
  • Explore the Solution Explorer.
    • Folder structure
      • Data Sources
      • Screens
      • Add a new “Address” Table
      • Update the AddressType field to use a ChoiceList
      • Add a relationship to the Customer table.
        • Demonstration the add relationship dialog
        • Show resulting screen designer with the relationship.
  • Delete the existing screen
  • Add a new List and Details screen
    • Select the Customers data for the screen.
    • Show how the Customer addresses is available for selection.
    • Select the addresses to show on the screen.
  • Review the screen designer showing the additional entity collection for the addresses.
  • Run the application and review the new address collection on the screen.
  • Create a new table named City, using just the CityName as a field.
  • Edit the Address table by removing the City field, and then add the relationship to the new City table.
  • Open the CustomerListDetail screen and show how the screen has removed the City field.
    • Drag and drop the field from the Addresses collection to the Addresses data grid.
    • Also, move the Address Type item to the top of the list of items.
  • Create a new Editable Grid screen that will be used to maintain the list of cities.
  • Run the application.
    • Ask if anyone notices anything about what gets displayed…
      • Two things, the address type field, and
      • The address record that was added earlier was deleted.
        • This is because of the edit to the address entity. There was a field removed, and a relationship created, which basically refreshed the model.
  • Show the available navigation on the left Task menu.
  • Select the editable Cities grid and then add some cities.
  • Go back to the Customer List Detail and add some addresses for the customer.
  • Close the app
  • Review the project properties
    • General Properties
      • Shell: The placement and behaviour of elements on a screen
      • Theme: The look and feel (colors and things like that). CSS for the most part.
      • Extensions
        • Used for implementing custom shells, themes, controls, business entities, and whatever your creative heart desires.
        • Access Control
          • Review the different types of authentication that can be implemented
          • Application Types
            • Desktop and Web
            • Application Server.
            • Screen Navigation
  • Create a new group named Lookups
    • Add the Editable Cities Grid screen to the Lookups group
    • Remove the Editable Cities Grid screen from the Tasks group
  • Run the application and demonstrate what has changed.

Slide 13

  • Agenda

Slide 14

  • Now, instead of talking about developing with LightSwitch, we’ll talk about developing for LightSwitch.
  • Thinking about the target market for LightSwitch, extensions are meant to provide the means for LightSwitch developers to enable additional capabilities, with very little effort.
  • Extensions can be created to include additional features and capabilities for;
    • Themes and Shells,
    • Screen Templates,
    • Custom Business Types,
    • Data Sources,
    • Controls, And
    • Starter Kits and Toolkits
  • Extension development is a topic on its own, and worthy of another presentation if you like.
  • There is already a market for extensions, and the extension ecosystem is growing.
  • There are plenty of free extensions available, as well as paid for premium extensions.
  • The major vendors, including our sponsor Telerik, already have products that can be implemented into LightSwitch development.

Slide 15

  • Agenda

Slide 16

DEMO TIME AGAIN

  • Open Extension Manager in Visual Studio
    • Open Online Gallery in extension manager and filter by “LightSwitch”
  • Install a Theme extension
    • Enable the extension and use it and show it in the application as it runs

Slide 17

  • Agenda

Slide 18

  • LightSwitch applications can be deployed as an application that runs on a desktop, or via the web.

Slide 19

  • Agenda

Slide 20

  • Open EDMUGLSDemoapplication and show Application types
    • Review Desktop Client
    • Web Client
    • Application Server
  • From Application Type, choose Web client.
    • Click the Publish button
      • Client Configuration: Web
      • Application Server Configuration: IIS Server
      • Publish Output: Remote Publish to a Server Now
      • Database Connections:
      • Specify a Certificate: nothing
      • Summary: Click the publish button.
  • Go to the server and review IIS
  • Review the newly installed SQL database.
  • Launch the application from a web browser
  • Add some customers.
  • Now let’s add some security…
  • Back in project, select the Access Control tab.
  • Select to use Forms Authentication
  • Go back into the Application Type tab and select Publish
  • Review the new Authentication information.
    • Select Yes, create the Application Administrator….
      • UserName: Application.Administrator
      • Password: P@ssw0rd
      • Select Publish
  • Go to the web site and login using the application administrator
  • Create two users, Paul.Patterson and Joe.User.
  • Assign Paul.Patterson with the administrator permission.
  • Close the browser and then reopen and login as Paul.Patterson
  • Note the administrator feature in the task bar
  • Close the browser and then reopen and log in as Joe.User.
  • Note that the administrator feature is no longer there.
  • Time Permitting; Create role based security
    • Create an AddressType table.
    • Create an EditableAddressTypesGrid
    • Create a new LookupTables permission in the AccessControl tab of the application properties.
    • Edit the CanRun code:
      Private Sub EditableAddressTypesGrid_CanRun(ByRef result As Boolean)
        result = User.HasPermission(Permissions.LookupTables)
      End Sub
  • Publish the application
  • Open the application in the web browser(login as Paul.Patterson)
  • Create a Role named Application Settings
    • Assign the LookupTables permission to the role
  • Save
  • Assign the Application Settings role to the Paul.Patterson user.
  • Close the browser, and then launch it again and log in as Paul.Patterson.
  • The screen should now be available.

Slide 21

  • Question and Answer time.


Slide 22

  • My blog at http://www.PaulSPatterson.com
  • Microsoft Visual Studio LightSwitch 2011 Site – http://www.microsoft.com/visualstudio/lightswitch
  • MSDN LightSwitch site: http://msdn.microsoft.com/lightswitch
  • Michael Washington’s web site: http://lightswitchhelpwebsite.com
  • Many more resources and references on my own blog.

LightSwitch posts have a tendency to run long, as do the OakLeaf blog’s.


<Return to section navigation list>

Windows Azure Infrastructure and DevOps

Bruce Kyle reported White Paper Series Helps You Get a Start on Windows Azure Pricing in a 9/19/2011 post to the US ISV Evangelism blog:

imageThe Windows Azure Platform Pricing Calculator is designed to help application developers get a rough initial estimate of their Windows Azure usage costs. But where to start? What sort of numbers do you plug in?

A series of white papers from MSDEV helps give you a starting point for figuring it all out. Start with Getting a Start on Windows Azure Pricing and pick a scenario that is similar to your app. Then, use simple sliders in the pricing calculator to plug information about your application —like how many compute instances your application will use, the size of your database, the amount of data transferred to and from the application, and how much storage the application will need. Set the sliders and the calculator kicks out the estimated monthly cost to run the application on a pay-as-you-go model or through special offers.

image

imageThe series is based on several scenarios that will give you a start on figuring out pricing for your application :

  • imageWindows Azure Pricing Scenario: Asset Tracking Application. If you’re developing an asset tracking application to run on Windows Azure, how do you estimate its monthly cost? The Windows Azure team recently launched a new Azure pricing calculator designed to help developers estimate the cost of running their applications on Azure.
  • Windows Azure Pricing Scenario: E-Commerce Web Site. Estimating the monthly cost of running your e-commerce application on Windows Azure just got easier. The Windows Azure team recently launched a new pricing calculator designed to help developers estimate the cost of running their applications on Azure. You use simple sliders to plug information about your application into the calculator—like how many compute instances your application will use, the size of your database, the amount of data transferred to and from the application, and how much storage the application will need.
  • Windows Azure Pricing Scenario: Sales Training Application. If you’re developing a sales training application rich in video and other media, how do you estimate its monthly cost to run on Windows Azure? The Windows Azure team recently launched a new pricing calculator designed to help developers estimate the cost of running their applications on Azure. You use simple sliders to plug information about your application into the calculator—like how many compute instances your application will use, the size of your database, the amount of data transferred to and from the application, and how much storage the application will need.
  • Windows Azure Pricing Scenario: Social Media Application. Estimating the monthly cost of running your social media applications on Windows Azure just got easier. The Windows Azure team recently launched a new pricing calculator designed to help developers estimate the cost of running their applications on Azure. You use simple sliders to plug information about your application into the calculator—like how many compute instances your application will use, the size of your database, the amount of data transferred to and from the application, and how much storage the application will need.
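If you want a quick sanity check before opening the calculator, the arithmetic behind the sliders is straightforward. Here is a back-of-the-envelope sketch in PowerShell; the rates are placeholders chosen purely for illustration, not official prices, so always take the real numbers from the calculator or the pricing page:

# Rough monthly estimate for a small web application -- placeholder rates only
$computeHours = 2 * 730          # two small compute instances running all month
$computeRate  = 0.12             # assumed $/hour per small instance
$sqlAzure     = 9.99             # assumed monthly charge for a 1 GB database
$egressGB     = 50               # outbound data transfer in GB
$egressRate   = 0.15             # assumed $/GB
$storageGB    = 10
$storageRate  = 0.15             # assumed $/GB per month

$estimate = ($computeHours * $computeRate) + $sqlAzure + `
            ($egressGB * $egressRate) + ($storageGB * $storageRate)
"Estimated monthly cost: {0:N2} USD" -f $estimate   # about 194.19 with these inputs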

The SD Times on the Web blog posted Zeichick's Take: With Windows 8, Microsoft may have its mojo back on 9/16/2011:

imageSomething funny happened to me down at Microsoft’s Build conference, held this week in Anaheim. Something rare. Something unusual.

I wanted what I saw on the keynote stage, and I wanted it bad.

I’m talking about the new look-and-feel of Windows 8. The Metro user interface. The seamless transition that it encourages between devices in many different form factors: desktops, servers, tablets and phones. The user experience looks fresh and compelling, and frankly is the most innovative update that I’ve seen to a Microsoft desktop operating system since Windows 95.

imageAs mentioned above, it’s rare for me to have that type of reaction. I didn’t have it upon seeing the first iPhone, for example. In fact, Apple has only done that to me twice, with the MacBook Air and the iPad. (Both of which I purchased promptly when they appeared in stores.)

In fact, I can only think of a few other times I had that reaction. Upon seeing the launch of a particular version of Mathematica (I forget which version). The launch of the Cobalt Cube, an innovative small-business server that Sun Microsystems acquired and killed. Steve Jobs demonstrating the second-generation NeXT pizza-box workstation. Not many others.

Downloading and installing the Windows Developer Preview, including tools, onto one of my lab machines is on my to-do list. (Microsoft gave every paid attendee at Build a Samsung tablet with the Win8 beta and tools preinstalled, but those were not offered to press attendees like yours truly.)

What about the developer angle? Microsoft appears to be making it easy to retrofit existing Windows applications to behave nicely within the new Metro user experience; in fact, the company claims that every app that runs under Windows 7 will run under Windows 8. (Presumably, that’s for Intel x32/x64 apps and not for ARM applications.) The Metro experience is driven by JavaScript with HTML, but can also be implemented using C#, C++ or Visual Basic using XAML. No rocket science there. ...



Scott M. Fulton III (@SMFulton3) reported from Build 2011: Windows 8 Scales the Cloud Down to Fit in a Tablet in a 9/16/2011 post to the ReadWriteCloud blog:

imageIn a way, Azure was the star of Build 2011 and folks here in Anaheim didn't even really know it. Whatever form the Metro apps delivery system takes in the final shipping version of Windows 8 (with a likely timeframe now of Q1 2013), its most impressive and maybe the most important aspect is the inclusion of apps that learn what functions they can provide to the user from the cloud in real-time, and then manage those functions locally on the user's behalf. Put more simply: adaptive apps. [Emphasis added.]

110913 Keynote 11.jpg
ReadWriteWeb at Microsoft Build 2011

imageChris Jones, Microsoft's Senior Vice President for Windows Live, may become the company's newest star if he can pull this off. Windows Live has had trouble scratching out an identity for itself; but as Jones perceives it, Windows 8 could give Live new life, as a kind of cloud-based servant for the operating system. Jones began his Day 1 keynote demo on Tuesday by showing off services such as mail and scheduling.

110913 Keynote 10.jpg

imageIf you weren't paying much attention to that point, you might have thought how ordinary it seems to have a mobile platform run mail and scheduling. If so, you would miss the underlying meanings here:

1. Microsoft is at least experimenting with the idea of folding Outlook from Office into Windows. As Jones said repeatedly, and showed directly, this mail app has Exchange ActiveSync built in. So do Windows Phones, of course, but making that feature meaningful only to folks who use Outlook and Exchange, as opposed to folks who use Windows, is thinking too narrowly.

110913 Keynote 08.jpg

2. Windows Live is experimenting with the notion of providing services directly to Windows without thinking it has to shove its brand into everyone's face. The brand distinctions (Windows 7, Windows Phone, Windows Live) aren't working for Microsoft as well as its services. Perhaps the way to get people to use Windows Live is to fold it into Windows 8 along with Outlook.

Metro-style mail, Jones told attendees, is entirely HTML and JavaScript-based. What he could have said is, the same expertise used to make Windows Live Mail into a Web page has been put to better use making an app.

Here's one of those revelations from Chris Jones that folks may have missed: "All my mail accounts [are] in one place, and because they're all stored in the cloud, I just type my Live ID into this PC and they all just come down into the system. I don't have to worry about setting things up any more, because all of the settings are done through Live."

That's an allusion to Microsoft's innovative Access Control System for Windows 8, which is facilitated through a connection to the Azure Portal where ACS runs.

Jones took this connection one big step further with his demo of the Photos app. Again, there's no "Windows Live" branding here; the brand is you. As the photo at the top of this article shows, the services with which a Windows 8 user shares photos are branded with one of those photos - Jones uses his own family (lovely, by the way) as an example. Those connections with Facebook, Flickr and whatever else are all done in the background because the user logged in with the Live ID first, and because ACS handles all the rest of the authentication process in the background. It's single sign-on, but this time only once.

So when Jones happens to share photos from his phone with Facebook, those photos appear on the Windows 8 PC - even as the "Facebook" category itself. No manual syncing; a zero-click process. You've shared your photos once, and there they are.

Then Jones extended the notion of sharing photos to sharing entire folders - access to remote PCs via SkyDrive without having to go through SkyDrive.

110913 Keynote 09.jpg

"Every Windows 8 user's got a SkyDrive," said Jones. "Every Windows Phone user's got a SkyDrive. In fact, if you've got a Live ID, you've got a SkyDrive and it's there for you to put your personal files and the things you want to share. It's also accessible to developers, and that's an important thing because it lets you as a developer access SkyDrive the way you might have accessed the local file system." Photos that happen to be on a user's SkyDrive simply appear in the Photos app, again without manually syncing.

After years of wondering what Windows Live services should eventually become, this may finally be it: the background service that rises to the foreground.


<Return to section navigation list>

Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds

image

No significant articles today.


<Return to section navigation list>

Cloud Security and Governance

image

No significant articles today.


<Return to section navigation list>

Cloud Computing Events

Jim O’Neil (@jimoneil) reported Special Multi-User Group Meeting in Waltham–Sept. 21 in a 9/19/2011 post:

imageThree of our local user groups are joining forces this week to host an FOTC (Friend of the Community) and one of my Evangelist predecessors – Thom Robbins – as he presents “A Case Study in Building for Today’s Web – Kentico CMS” on Wednesday, September 21st, at the Microsoft Office on Jones Road in Waltham.

You can RSVP here.

image

It’s a perfect topic to unite the interests of our Boston Azure User Group, New England ASP.NET Professionals, and the Boston .NET Architecture Study Group:

Building software is a set of smart choices to meet the needs of your customers and the possibilities of technology. Today’s Web demands that customers have a choice in how they deploy their applications. With over 7,000 websites in 84 countries, Kentico CMS for ASP.NET is delivered as a single code base for use as a cloud, hosted, or on-premises solution. With over 34 out-of-the-box modules and everything built on a SQL Server back end – how did we do it? What tradeoffs did we make? In this session we will answer that question and look at how to build a rich and compelling website using the Windows Azure cloud.

Thom Robbins is the Chief Evangelist for Kentico Software. He is responsible for evangelizing Kentico CMS for ASP.NET with Web developers, Web designers and interactive agencies. Prior to joining Kentico, Thom joined Microsoft Corporation in 2000 and served in a number of executive positions. Most recently, he led the Developer Audience Marketing group that was responsible for increasing developer satisfaction with the Microsoft platform. Thom also led the .NET Platform Product Management group responsible for customer adoption and implementation of the .NET Framework and Visual Studio. Thom was also a Principal Developer Evangelist working with developers across New England implementing .NET based solutions. A regular speaker and writer, he currently resides in Seattle with his wife and son.

Special thanks to Thom for taking time out of his schedule to present on his old turf, and also to Bill Wilder, Naziq Huq, Dean Serrentino, Teresa DeLuca, and Robert Hurlbut for coordinating their groups to make this happen. I’m looking forward to both the talk and the new connections that attendees of the various groups will undoubtedly make at the meeting – hope to see you there!


Rob Tiffany (@robtiffany) reported on 9/15/2011 the Windows Azure at Seattle Interactive Conference to be held 11/2 and 11/3/2011 at The Conference Center at the WA State Convention Center, Downtown Seattle:

Join the Windows Azure team at the Seattle Interactive Conference (Nov 2-3, 2011) for two days of technical content and one-on-one advice and assistance from product experts. The Cloud Experience track is for experienced developers who want to learn how to leverage the cloud for mobile, social and web app scenarios. No matter what platform or technology you choose to develop for, these sessions will provide you with a deeper understanding of cloud architecture, back-end services and business models so you can scale for user demand and grow your business.

image

Learn more about the Cloud Experience Track at SIC, and view the speaker list. Registration for the Seattle Interactive Conference is $350, and includes full access to conference sessions and activities.

SIC is developing a world-class speaker roster comprised of online technology’s most successful and respected personalities, alongside earlier-stage entrepreneurs who are establishing themselves as the leaders of tomorrow. SIC isn’t just about telling a story, it’s about truly sharing a story in ways that provide all attendees with a thought provoking experience and actionable lessons from the front lines.

Our confirmed speakers include:

Wade Wegner, Microsoft

Wade Wegner is a Technical Evangelist at Microsoft, responsible for influencing and driving Microsoft’s technical strategy for the Windows Azure Platform.

Rob Tiffany, Microsoft

Rob Tiffany is an Architect at Microsoft focused on combining wireless data technologies, device hardware, mobile software, and optimized server and cloud infrastructures together to form compelling solutions.

Steve Marx, Microsoft

Steve Marx is a Technical Product Manager for Windows Azure.


Nick Harris, Microsoft

Nick Harris is a Technical Evangelist at Microsoft specializing in Windows Azure.

Scott Densmore, Microsoft

Scott Densmore works as a Senior Software Engineer at Microsoft.


Nathan Totten, Microsoft

Nathan Totten is a Technical Evangelist at Microsoft specializing in Windows Azure and web development.

I hope to see everyone there!


<Return to section navigation list>

Other Cloud Computing Platforms and Services

Matthew Weinberger (@MattNLM) reported Zenoss Launches ZenPack Server Monitoring Tool for OpenStack in a 9/19/2011 post to the TalkinCloud blog:

imageOpenStack, the free, open source cloud computing platform, is building a lot of momentum in the industry. It seems as though small service providers, multinational data center service providers and technology giants such as Intel alike are finding a lot of value in an open cloud standard. Unsurprisingly, a cloud developer community has sprung up around the platform with an eye on filling the gaps.

Take Zenoss, for example, which just launched a monitoring solution for OpenStack servers. Zenoss specializes in virtual, physical and cloud infrastructure monitoring and management, and this release extends its reach to OpenStack. Administrators using the Zenoss ZenPack for OpenStack can see server health, performance and inventory, ensuring application stability.

Oh, and did I mention ZenPack is free? And according to the press release, it gives the ability to monitor servers across providers, infrastructures and deployment types (physical, virtual, etc.).

According to Zenoss, it’s just meeting the demand of the rising number of OpenStack users. But my question is this: Right now, OpenStack has the attention and affections of the FOSS community. But will there be an opportunity for cloud ISVs to develop a business around building out the OpenStack experience and feature set?

Read More About This Topic

Jeff Barr (@jeffbarr) reported Now Available: Windows Server 2008 R2 Cluster Compute and Cluster GPU in a 9/18/2011 post:

imageYou can now run Microsoft Windows Server 2008 R2 on the EC2 Cluster Compute and Cluster GPU instances. Just to reiterate, here are the specs for these compute-intensive beasts:

Cluster Compute Quadruple Extra Large:

  • image23 GB of memory
  • 33.5 EC2 Compute Units (2 x Intel Xeon X5570, quad-core “Nehalem” architecture)
  • 1690 GB of instance storage
  • 64-bit platform
  • I/O Performance: Very High (10 Gigabit Ethernet)

Cluster GPU Quadruple Extra Large:

  • image22 GB of memory
  • 33.5 EC2 Compute Units (2 x Intel Xeon X5570, quad-core “Nehalem” architecture)
  • 2 x NVIDIA Tesla “Fermi” M2050 GPUs
  • 1690 GB of instance storage
  • 64-bit platform
  • I/O Performance: Very High (10 Gigabit Ethernet)

These instances provide you with plenty of RAM, cycles, and network performance for heavy-duty workloads. With this release, you can now run Microsoft Windows on every one of the eleven EC2 instance types, from the Micro on up.

You can select the Windows Server 2008 R2 AMI from the AWS Management Console:


Jeff Barr announced FISMA Moderate for AWS in a 9/15/2011 post (missed when posted):

imageOur compliance team has been working non-stop to make sure that AWS qualifies for and receives a number of important certifications and accreditations. In the last year or so I have blogged about SAS 70 Type II, FISMA Low, and ISO 27001.

imageAfter receiving our FISMA Low level certification and accreditation, we took the next step and started to pursue the far more stringent FISMA Moderate level. This work has been completed and the door is now open for a much wider range of US Government agencies to use AWS as their cloud provider. Based on detailed security baselines established by the National Institute of Standards and Technology (NIST), FISMA Moderate certification and accreditation required us to address an extensive set of security configuration and controls.

We receive requests for many different types of reports and certifications and we are doing our best to prioritize and to respond to as many of them as possible. Please let me know (comments are fine) which certifications would let you make even better use of AWS.

You can read about our security certifications and practices at the AWS Security Center. To learn more about how our team works with agencies of the federal government, visit our Federal Government page.

Relevant AWS jobs include:


Derrick Harris (@derrickharris) reported Facebook gives devs easy access to Heroku cloud on 9/15/2011 (missed when posted):

A new integration from Facebook and Heroku gives Facebook developers direct access to Heroku’s cloud Platform-as-a-Service offering for hosting their applications. The goal is to make life as easy as possible for developers by eliminating hassles associated with actually running an application once it’s written. And it’s likely just a first step for Heroku when it comes to integrating with popular specialized development platforms.

Writing Facebook applications is actually easy enough, Heroku co-founder Adam Wiggins told me during an interview, but Facebook has been working to ease the burden of running them. When Heroku launched its Facebook App Package last year, I asked if it could become the official cloud of Facebook. Maybe it has done just that.

Now there’s an option within the Facebook development platform to launch an app on Heroku. In fact, with the push of a button, apps are up and running on Heroku without ever taking developers off the Facebook site. It’s only afterward that developers have to log in to Heroku to set a password and add additional tools or services.

As with all things cloud computing, the benefit for developers is not just automating the hosting process but also the promise of being able to handle unexpected traffic spikes. For individual developers, Wiggins said, the integration will let their apps keep up should they suddenly get popular. Businesses get the same benefit in terms of elasticity, he added, which lets them make social media inroads without buying and provisioning pools of servers in advance.

Facebook, of course, has a large developer community and represents a potentially significant source of new customers for Heroku, but Wiggins said it’s probably just a stepping stone for more-specialized integrations. He noted mobile apps as an area with particular promise — a possibility I highlighted in a recent GigaOM Pro report (subscription required) — but added that specialized offerings, in general, are easier to sell to specific developer bases than are general-purpose offerings.

Facebook developers can start using the Heroku integration on Thursday, and Heroku personnel will be at next week’s F8 conference to address questions or issues that arise in the meantime.


<Return to section navigation list>
