Logging in Azure: Part 1

Migrating an existing application to Windows Azure can be fairly simple. Many of the technologies used on-premises (like IIS or SQL Server) can be transferred directly to Azure services. There are, however, a few components of the application's architecture that need to be revisited, and one of those components may be logging. Several application diagnostics features work the same in the cloud, such as Tracing, Profiling, and Windows Event Logs, with the help of the Diagnostics module. One diagnostic feature that is often customized is Logging.

Many applications simply use a rolling file logger that writes to a local directory or a shared drive. That approach works well when you have a persistent server, but in Windows Azure all Web and Worker Roles are stateless, so writing to the local file system becomes a problem.

There are several solutions to this problem. Over the next few posts I am going to walk through those solutions, how to implement them, and the trade-offs between them. All of them will use a common logging framework, log4net.

So far the solutions will include (subject to change):

1. Synchronizing a log file to blob storage
2. Using a custom log4net appender to write directly to table storage
3. Logging to the Trace log and synchronizing to table storage

These solutions are listed in order of complexity (in my opinion). Each post will outline the structure of the solution, how to implement it, and its pros and cons compared to the others.

1. Synchronizing a log file to blob storage

This solution seems like the simplest because much of the work is taken care of for you. That being said, there were a few “gotchas” the first time I implemented this.

Setting up log4net

I used a basic Worker Role as an example. To start, create a new Azure project with a Worker Role.

To obtain the log4net .dll you can simply install the log4net NuGet package. Once you have that, add the following configuration to the app.config file.

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <configSections>
    <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler, log4net" />
  </configSections>
  <log4net>
    <appender name="FileAppender" type="log4net.Appender.FileAppender">
      <file value="logslog-file.txt" />
      <appendToFile value="true" />
      <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
      </layout>
    </appender>
    <root>
      <level value="INFO" />
      <appender-ref ref="FileAppender" />
    </root>
  </log4net>
</configuration>

This will set up a basic file appender that writes the log4net logs to a file. To test the logger, replace the Run method in the WorkerRole.cs file with the following.

// Requires: using log4net; using log4net.Config; using System.Threading;
private ILog _logger;

public override void Run()
{
    // Read the log4net configuration from app.config.
    XmlConfigurator.Configure();

    _logger = LogManager.GetLogger(GetType());

    _logger.Info("Starting Worker Role");

    // Write a log entry every 10 seconds so there is something to synchronize.
    while (true)
    {
        Thread.Sleep(10000);
        _logger.Info("Waiting 10 Seconds");
    }
}

Setting up the Azure Diagnostics Module

Now for the tricky part: configuring the Windows Azure Diagnostics module.

public override bool OnStart()
{
    // The local directory the FileAppender writes to (the "logs" folder
    // relative to the role's current directory).
    var logFilePath = Path.Combine(Environment.CurrentDirectory, "logs");

    // Transfer this directory to the "wad-log4net" blob container,
    // keeping at most 100 MB of files locally.
    var logDir = new DirectoryConfiguration
                     {
                         Container = "wad-log4net",
                         DirectoryQuotaInMB = 100,
                         Path = logFilePath
                     };

    var diagnostics = DiagnosticMonitor.GetDefaultInitialConfiguration();
    diagnostics.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
    diagnostics.Directories.DataSources.Add(logDir);

    // Use the storage account configured for the Diagnostics plugin.
    CloudStorageAccount account = CloudStorageAccount.Parse(
        RoleEnvironment.GetConfigurationSettingValue(
            "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"));

    DiagnosticMonitor.Start(account, diagnostics);

    return base.OnStart();
}

So the Diagnostics module will synchronize a directory (in this case /logs) to blob storage. Whatever files exist in that directory will be uploaded every minute (as defined by the ScheduledTransferPeriod) and will replace the files previously in the blob container. If the diagnostics configuration completed successfully, you should have a file in the wad-control-container blob container with a directory node that matches this configuration (note: there will also be a crash dump directory that is configured by default). Once one minute has passed, diagnostics should create a “wad-log4net” container (or the name of your choosing) with the synced files.
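If you want to confirm the transfer from code rather than through the portal or a storage explorer, here is a minimal sketch. It assumes the Microsoft.WindowsAzure.StorageClient library that ships with the same SDK, and the class and method names are just illustrative, not part of the original sample.

using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public static class LogBlobChecker
{
    // Lists every blob the Diagnostics module has transferred so far.
    public static void ListTransferredLogBlobs(string connectionString)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudBlobClient client = account.CreateCloudBlobClient();

        // The container name must match DirectoryConfiguration.Container in OnStart.
        CloudBlobContainer container = client.GetContainerReference("wad-log4net");

        foreach (IListBlobItem blob in container.ListBlobs())
        {
            Console.WriteLine(blob.Uri);
        }
    }
}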

The contents of the directory will not be duplicated in the blob container. For this reason I'd suggest looking into log4net's RollingFileAppender (a rough example follows below) so you do not lose any logs when one file becomes too large.
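As a rough example, a size-based RollingFileAppender could replace the FileAppender above like this (the size and backup counts are placeholder values to tune for your application); remember to update the appender-ref in the root element to match.

<appender name="RollingFileAppender" type="log4net.Appender.RollingFileAppender">
  <file value="logs\log-file.txt" />
  <appendToFile value="true" />
  <rollingStyle value="Size" />
  <maxSizeRollBackups value="10" />
  <maximumFileSize value="1MB" />
  <staticLogFileName value="true" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
  </layout>
</appender>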

If you are having issues with the diagnostics setup on your local machine, watch the emulator console for “[Diagnostics]” and “[MonAgentHost]” trace logs.

Note: I used a Worker Role for simplicity. If you are using a Web Role, the log4net configuration call should go in the Global.asax.cs file (sketched below) and the Diagnostics module configuration should stay in OnStart. There is no dependency on log4net; you can use any framework that generates log files.
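For reference, a minimal Global.asax.cs sketch for the Web Role case might look like the following. It assumes the same log4net NuGet package and a matching log4net section in web.config; the class and log messages are illustrative only.

using System;
using log4net;
using log4net.Config;

public class Global : System.Web.HttpApplication
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(Global));

    protected void Application_Start(object sender, EventArgs e)
    {
        // Read the log4net section from web.config instead of app.config.
        XmlConfigurator.Configure();
        Log.Info("Web application starting");
    }
}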

Pros / Cons

Pros: This solution can be set up for applications that currently use file logging with minimal effort and almost no code changes. It also relies on the Diagnostics module, which provides a rich set of synchronization features and capabilities that you will not have to write yourself.

Cons: The logs themselves are contained in files, which can make issue discovery more difficult. The following solutions write messages to table storage, which can be queried easily for specific messages.

Stay tuned this week for the remaining logging solutions.

Please refer to this GIST for the source code: https://gist.github.com/2395186

Thank you.
