It turns out that using table storage in Windows Azure 1.0 requires surprisingly little configuration. The new Azure cloud service Visual Studio project template is significantly simpler to configure than what I ended up with in a previous blog post, where I got the new WCF RIA Services working with the Windows Azure 1.0 November 2009 release and the AspProvider code from the updated demo.
And while that project runs as expected on my own machine, I still can't get it to run in the cloud, and I have not solved that problem. Since the Azure control panel gives me little insight into why it fails, I've decided to start over from the beginning, taking each step to the cloud as I go rather than building entirely against the local simulation environment and only testing in the cloud at the end.
I started with a clean solution, changing nothing except adding some static text to the web role's Default.aspx page. Publish to the cloud, and twenty to thirty minutes later (too long, in my opinion) the page comes up: deployment is complete and the equivalent of a "hello world" app is running in the Azure cloud.
The next step I want to experiment with is the simplest possible use of Azure table storage. The previous project required a lot of configuration; this new one requires very little. I wondered how much of the configuration in the previous project, largely imported from the updated AspProviders demo project, was an unnecessary legacy of the CTP bits. It turns out, nearly all of it was.
Here's the only configuration you need for accessing Azure storage, all of it in ServiceConfiguration.cscfg:
<?xml version="1.0"?>
<ServiceConfiguration serviceName="MyFilesCloudService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="MyFiles">
    <Instances count="1" />
    <ConfigurationSettings>
      <!-- Add your storage account information and uncomment this to target Windows Azure storage.
      <Setting name="DataConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myacct;AccountKey=heygetyourownkey" />
      <Setting name="DiagnosticsConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myacct;AccountKey=heygetyourownkey" />
      -->
      <!-- local settings -->
      <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
      <Setting name="DiagnosticsConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
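For completeness, each of those settings also has to be declared, by name only, in the service definition file; the values themselves stay in ServiceConfiguration.cscfg. Here is a sketch of what the matching ServiceDefinition.csdef might look like; the endpoint details are assumptions based on the default template rather than copied from my actual project:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="MyFilesCloudService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="MyFiles">
    <InputEndpoints>
      <!-- Assumed default HTTP endpoint from the template -->
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </InputEndpoints>
    <ConfigurationSettings>
      <!-- Names only; the values live in ServiceConfiguration.cscfg -->
      <Setting name="DataConnectionString" />
      <Setting name="DiagnosticsConnectionString" />
    </ConfigurationSettings>
  </WebRole>
</ServiceDefinition>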
The thumbnails sample does provide one important piece of code that does not come with the standard Visual Studio template. It goes in the WebRole.cs file and makes it possible to use the CloudStorageAccount class's static FromConfigurationSetting method, which returns the needed CloudStorageAccount instance. Before that method can be called, SetConfigurationSettingPublisher must already have been called, hence this code in WebRole.cs:
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        DiagnosticMonitor.Start("DiagnosticsConnectionString");

        // For information on handling configuration changes
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
        RoleEnvironment.Changing += RoleEnvironmentChanging;

        #region Setup CloudStorageAccount Configuration Setting Publisher

        // This code sets up a handler to update CloudStorageAccount instances when their corresponding
        // configuration settings change in the service configuration file.
        CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
        {
            // Provide the configSetter with the initial value
            configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));

            RoleEnvironment.Changed += (sender, arg) =>
            {
                if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
                    .Any((change) => (change.ConfigurationSettingName == configName)))
                {
                    // The corresponding configuration setting has changed, propagate the value
                    if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
                    {
                        // In this case, the change to the storage account credentials in the
                        // service configuration is significant enough that the role needs to be
                        // recycled in order to use the latest settings. (for example, the
                        // endpoint has changed)
                        RoleEnvironment.RequestRecycle();
                    }
                }
            };
        });

        #endregion

        return base.OnStart();
    }

    private void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
    {
        // If a configuration setting is changing
        if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
        {
            // Set e.Cancel to true to restart this role instance
            e.Cancel = true;
        }
    }
}
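One side note: if you skip the publisher setup, FromConfigurationSetting throws at runtime rather than falling back to reading the setting directly. If you would rather not wire up the publisher, a sketch of the more direct alternative, which gives up the change-notification behavior above, looks like this:

// Read the setting yourself and parse it; no publisher required.
// This loses the automatic refresh when the service configuration changes.
var account = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));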
Once you have set the configuration setting publisher, you can use the FromConfigurationSetting method when creating your CloudTableClient, and then, in your repository code, check for the existence of a table and create it if it does not already exist. Here's a minimal example that I used and published successfully to my Azure account.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using Microsoft.WindowsAzure.StorageClient;
using Microsoft.WindowsAzure;
using System.Data.Services.Client;

namespace MyDemo
{
    public class AdventureRow : TableServiceEntity
    {
        public string IpAddress { get; set; }
        public DateTime When { get; set; }
    }

    public class AdventureRepository
    {
        private const string _tableName = "Adventures";
        private CloudStorageAccount _account;
        private CloudTableClient _client;

        public AdventureRepository()
        {
            // Depends on the SetConfigurationSettingPublisher call made in WebRole.OnStart
            _account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
            _client = new CloudTableClient(_account.TableEndpoint.ToString(), _account.Credentials);
            _client.RetryPolicy = RetryPolicies.Retry(3, TimeSpan.FromSeconds(1));
            _client.CreateTableIfNotExist(_tableName);
        }

        public AdventureRow InsertAdventure(AdventureRow row)
        {
            // Every entity needs a PartitionKey and RowKey before it can be saved
            row.PartitionKey = "AdventurePartitionKey";
            row.RowKey = Guid.NewGuid().ToString();

            var svc = _client.GetDataServiceContext();
            svc.AddObject(_tableName, row);
            svc.SaveChanges();
            return row;
        }

        public List<AdventureRow> GetAdventureList()
        {
            var svc = _client.GetDataServiceContext();
            DataServiceQuery<AdventureRow> queryService = svc.CreateQuery<AdventureRow>(_tableName);
            var query = (from adv in queryService select adv).AsTableServiceQuery();
            IEnumerable<AdventureRow> rows = query.Execute();
            List<AdventureRow> list = new List<AdventureRow>(rows);
            return list;
        }
    }
}
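To exercise the repository, I called it from the web role's default page. Here's a sketch of what that code-behind might look like; the page class name, the AdventureGrid control, and the use of Request.UserHostAddress are my own assumptions for illustration, not part of the sample above:

using System;

namespace MyDemo
{
    public partial class _Default : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            var repository = new AdventureRepository();

            if (!IsPostBack)
            {
                // Record a row for this visit (hypothetical payload).
                repository.InsertAdventure(new AdventureRow
                {
                    IpAddress = Request.UserHostAddress,
                    When = DateTime.UtcNow
                });
            }

            // AdventureGrid is an assumed GridView declared in Default.aspx.
            AdventureGrid.DataSource = repository.GetAdventureList();
            AdventureGrid.DataBind();
        }
    }
}

On the design side, creating the table in the repository constructor keeps the sample short; in a real application you would probably make the CreateTableIfNotExist call once at startup rather than on every instantiation.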
Just as the StorageClient has moved from "sample" to first-class library in the Azure SDK, I suspect the AspProviders sample may already be on that same path to glory. In its current form, it mixes something new with something old; the old has not been entirely cleaned up, and it lacks the elegance it deserves and will likely get, either from Microsoft or from some other enterprising Azure dev team.
As for me, I will keep working out, one step at a time, why the Adventure code I blogged about previously runs locally but not in the cloud as expected. It could be as simple as a single configuration gotcha, but figuring it out step by step will be a great learning opportunity, and as I find the time, I'll share what I learn here.