Azure Security Logging – part I: defining your logging strategy

In this first blog post in a series about Azure Security Logging, we will give a general overview of the types of logs available for Azure services including their storage options. We will also discuss how to define a security logging strategy in Azure. In the upcoming blog posts, we will go into detail about the logging capabilities for specific Azure resources and what security related data they contain.

Logging in Azure

In Azure, adequate security logging is not enabled by default, even though it is crucial for doing forensics, incident response and threat hunting in the cloud.

All resources in Azure generate activity logs and, when enabled, diagnostic logs, both of which contain important security-related data. Compute resources (the resources your application runs on, such as Azure App Service or Virtual Machines) can additionally generate application logs.

Activity logs

Subscription-level control-plane events on the Azure platform are recorded in the activity log and provide insight into what exactly has occurred. All write operations (PUT, POST, DELETE) against the Azure API are logged, but read operations (GET) are not. Activity logs are enabled by default and retained in the Azure Activity Log for 90 days unless they are exported.
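As a quick way to see what the activity log contains, it can be queried with the Azure CLI. This is a sketch, not a definitive setup: it assumes an authenticated session (`az login`) with a default subscription selected.

```shell
# List control-plane events from the last 7 days: who did what, and the outcome.
# Requires an authenticated Azure CLI session (az login).
az monitor activity-log list \
  --offset 7d \
  --query "[].{time:eventTimestamp, operation:operationName.localizedValue, caller:caller, status:status.value}" \
  --output table
```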

Diagnostic logs

Each Azure service provides detailed data about the operation of that service and when enabled, this is logged to diagnostic logs. The content of these logs is different for each Azure service and resource type. Diagnostic logs are not enabled by default.
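Because the available log categories differ per resource type, it is worth listing what a given resource supports before enabling anything. A sketch with the Azure CLI, where the resource ID is a placeholder for your own resource:

```shell
# List the diagnostic log (and metric) categories a resource supports.
# <resource-id> is a placeholder for a full Azure resource ID.
az monitor diagnostic-settings categories list \
  --resource "<resource-id>" \
  --output table
```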

Application logs

Applications running on Azure compute resources should also generate security logs, but these will not be covered in this blog series since the scope is the Azure infrastructure itself.


The possibilities for storing and exporting logs vary for each service, but in general there are three options. These options are discussed below.

Export to a storage account

Exporting logs to a storage account is available for most Azure resources. When enabled, it writes activity or diagnostic logs to a Storage Account in the same or a different subscription. It automatically creates a storage container in Azure Blob Storage and stores the logs in JSON format per subscription and resource. A retention period of up to one year can be chosen to rotate logs automatically, or they can be stored indefinitely.
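A minimal sketch of enabling such an export with the Azure CLI. The resource ID, storage account ID and log category are placeholders that depend on your environment; check the categories your resource actually supports first.

```shell
# Export a resource's diagnostic logs to a storage account,
# rotating them automatically after 365 days.
# <resource-id>, <storage-account-id> and AuditLogs are placeholders.
az monitor diagnostic-settings create \
  --name "export-to-storage" \
  --resource "<resource-id>" \
  --storage-account "<storage-account-id>" \
  --logs '[{"category": "AuditLogs", "enabled": true, "retentionPolicy": {"enabled": true, "days": 365}}]'
```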

This is the cheapest option for storing logs, at €0.0085 per GB for locally redundant cold storage, which makes it ideal for security log files that are only accessed when doing DFIR.

Export to Event Hub

Azure Event Hub is one of the message bus services available on Azure and is designed for streaming time-ordered lightweight events that can be consumed by multiple consumers.

It can be used to make logs available for consumption by other Azure services, such as a Log Analytics workspace in another subscription, or by an external log analytics platform such as an ELK stack.
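Streaming diagnostic logs to an Event Hub uses the same diagnostic settings mechanism as the storage export. A sketch, where all values are placeholders; the rule ID refers to a Shared Access Policy on the Event Hub namespace:

```shell
# Stream a resource's diagnostic logs to an Event Hub so an external
# platform (e.g. an ELK stack) can consume them.
# <resource-id>, <event-hub-name>, the authorization rule ID and the
# AuditLogs category are all placeholders.
az monitor diagnostic-settings create \
  --name "stream-to-eventhub" \
  --resource "<resource-id>" \
  --event-hub "<event-hub-name>" \
  --event-hub-rule "<event-hub-authorization-rule-id>" \
  --logs '[{"category": "AuditLogs", "enabled": true}]'
```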

Export to Log Analytics

Azure Log Analytics is a part of the Azure Monitor service and focuses on storage and analysis of log data using its own query language. Data is stored in a Log Analytics Workspace where custom queries can be created. Alerts can be created on top of these queries, which in turn can trigger actions such as emails, Logic Apps or Automation Runbooks.
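As an illustration of that query language (Kusto), the sketch below summarizes a day of activity log events in a workspace. The workspace GUID is a placeholder, and the AzureActivity table only exists if the activity log has been connected to the workspace.

```shell
# Summarize the last day of activity log events by operation name.
# <workspace-guid> is a placeholder; requires an authenticated session.
az monitor log-analytics query \
  --workspace "<workspace-guid>" \
  --analytics-query "AzureActivity | where TimeGenerated > ago(1d) | summarize count() by OperationName" \
  --output table
```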

Visualization of the data in a workspace can be done with Power BI by importing the results of a log query into a Power BI dataset.

This is the most expensive of the options at €2.522 per GB for data ingestion and €0.11 per GB per month for data retention.
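Using the per-GB prices quoted in this post (verify them against current Azure pricing for your region), a back-of-the-envelope comparison for 100 GB of logs per month:

```shell
# Rough monthly cost for 100 GB of logs, using the per-GB prices above.
gb=100
storage=$(awk "BEGIN { printf \"%.2f\", $gb * 0.0085 }")          # Storage Account, cold storage
la=$(awk "BEGIN { printf \"%.2f\", $gb * 2.522 + $gb * 0.11 }")   # Log Analytics: ingestion + 1 month retention
echo "Storage Account: EUR $storage"
echo "Log Analytics:   EUR $la"
```

The roughly 300x price difference is why long-term retention belongs in a Storage Account, with Log Analytics reserved for data you actively query.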


Logging Strategy

Depending on your goals for logging and the structure of your Azure subscriptions, the choice of log storage options can differ.

Using multiple Azure subscriptions is a best practice for isolating access to Azure resources and application data between teams or projects. A separate subscription for the security team is recommended. However, if you want to collect all security-related logs in this subscription, there are some limitations. Exporting logs to a Storage Account in a different subscription is possible, but exporting them to a Log Analytics Workspace is not. An Event Hub and a Logic App can make the log data available to other subscriptions.1

For doing effective Digital Forensics and Incident Response (DFIR) in the cloud, the retention time of all logs should be at least one year. This data doesn’t have to be immediately available and searchable, as it would be in a Log Analytics Workspace. Using cheap storage in a Storage Account is the best option for long-term storage, and you can import this data into a Log Analytics Workspace (or any other log analysis platform2) when needed.

When you also want to start threat hunting in the cloud, you need all recent log data available for querying. Azure Log Analytics is a good cloud-native, scalable option, but the data does not have to be available indefinitely; one month should be enough. Older data can be reimported from the Storage Account when necessary.3

Before developing your own threat hunting queries, have a look at Azure Security Center.4 Microsoft already did a good job creating threat hunting capabilities based on Azure Log Analytics. If this does not meet the requirements for your organisation, you can start creating your own by querying the data in a Log Analytics Workspace.

Security logging in Azure is only effective if every critical Azure resource has it properly configured, and logs are aggregated in one place. Make sure it is enabled in all subscriptions by creating an Azure Policy to check the compliance of this security control.
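Microsoft ships built-in policy definitions for auditing diagnostic settings. Rather than hard-coding a definition name here (display names change over time), a sketch that discovers the relevant built-ins with the Azure CLI:

```shell
# Find built-in Azure Policy definitions that deal with diagnostic settings.
# The JMESPath contains() filter is case-sensitive, so this matches names
# like "Audit diagnostic setting"; adjust the search string as needed.
# Requires an authenticated Azure CLI session.
az policy definition list \
  --query "[?contains(displayName, 'diagnostic setting')].{name:name, displayName:displayName}" \
  --output table
```

A matching definition can then be assigned to a subscription or management group with `az policy assignment create` to report on non-compliant resources.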

Conclusion

Azure activity and diagnostic logs for all (non-development) Azure resources need to be stored for at least one year for doing effective DFIR. At a minimum, the data should be stored in a Storage Account located in a separate subscription only accessible by the security team.

If you want to do threat hunting, the data needs to be available in a log analytics platform. When using Azure Log Analytics, all data needs to be collected in one security workspace across all subscriptions. Since this export option is not available out of the box, Azure Event Hub and Logic Apps need to be used, which adds additional cost. Depending on the knowledge within your organisation, or if you have a multi-cloud strategy, a third-party log analytics platform might be a better option.

Audit the compliance of your logging strategy with Azure Policy on all your Azure subscriptions to ensure it is implemented according to your corporate standard.

In the upcoming blog posts in this series, we will go into more detail about the logging capabilities for specific Azure resources and what security related data they contain! Stay tuned!


1 https://docs.microsoft.com/en-us/azure/azure-monitor/platform/collect-activity-logs-subscriptions

2 https://docs.microsoft.com/en-us/azure/network-watcher/network-watcher-visualize-nsg-flow-logs-open-source-tools

3 https://azure.microsoft.com/en-us/blog/query-azure-storage-analytics-logs-in-azure-log-analytics/

4 https://docs.microsoft.com/en-us/azure/security-center/security-center-detection-capabilities
