Store Terraform State Securely in Azure

Terraform uses its state file to store information about what it has deployed. This is how it knows what to do when you run commands like plan or destroy: it uses the state file to determine what it has created so far and what impact your request will have. One downside of this state file, however, is that you will end up storing secrets in it, and these will be in plain text. Passwords and keys you pass into your Terraform script will be stored there, and any resources you create with your template may also have their credentials written to the state file.

Ideally, in my view, Terraform would provide a way either to not store these secrets in the state file at all, or to encrypt specific items in the file so they are not in plain text. The conversation around changing Terraform to do this has been ongoing for a long time now; you can find it on GitHub here if you’re interested.

In the meantime, we are stuck with the current implementation, so we need to find ways to secure this state file and ensure it is protected at all times.

A Note on Terraform Cloud

If you talk to HashiCorp, they would likely tell you that using Terraform Cloud to store your state file is the answer to the problem. While this may be one way to resolve the issue, many people will not be able to outsource the storage of this file to Terraform Cloud. If Terraform Cloud is something you can make use of, it may very well help resolve this issue, but for the rest of this article we will assume that is not the case and focus on Azure-specific solutions.

Remote State

The first step to securing the state file is recognising that you need to move to remote state. By default, Terraform stores the state file on your machine, next to your templates. This may work for simple workloads, but storing the state file on your computer has two key downsides:

  1. Security - your state file, full of secrets, is only as secure as your laptop. If your machine is compromised, or you accidentally put the state file in a location where others can get it, or even check it into GitHub, then your secrets will be exposed
  2. Collaboration - using a local state file only works if you’re the only one working on a project. As soon as someone else wants to work on it, you need to be able to share the state file.

Given this, most people quickly find they need to look at using remote state: storing the state file outside of their machine, on a shared resource. Terraform provides many different options for where you can store your state file; you can find them listed here. For an Azure-based setup, there is a single option, which is to use Azure Blob Storage. By using Azure Blob Storage as your backend, you place your state file in a storage container, which can then be referenced by anyone using the templates, so long as they have credentials to access it.

Once you have decided to move to remote state in Azure Storage, you need to ensure it is configured securely.
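
As a sketch, a minimal azurerm backend block looks like the following; the resource group, storage account, and container names here are placeholders you would replace with your own:

```hcl
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-terraform-state"  # placeholder names throughout
    storage_account_name = "sttfstate001"
    container_name       = "tfstate"
    key                  = "prod.terraform.tfstate"  # blob name for this workspace's state
  }
}
```

Running terraform init after adding this block migrates your local state into the named blob.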


Encryption

As mentioned, we cannot encrypt individual items in the state file, so instead we will want to look at encrypting the state file as a whole. This does not stop someone who gains access to the storage account from downloading and reading the file, but it does protect the data should the underlying storage media be compromised. Azure Storage supports encryption at rest with either a Microsoft-managed key or your own key. Storage encryption is now enabled by default, but you should make sure it is on, and if you want to use your own key, enable that.
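
If you want to bring your own key, you can point the storage account at a key held in Key Vault. A sketch using the Azure CLI, where the account, vault, key name, and version are all placeholders:

```shell
# Switch the state storage account to a customer-managed key in Key Vault
az storage account update \
  --name sttfstate001 \
  --resource-group rg-terraform-state \
  --encryption-key-source Microsoft.Keyvault \
  --encryption-key-vault https://kv-tfstate.vault.azure.net \
  --encryption-key-name tfstate-key \
  --encryption-key-version <key-version>
```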

We also want to make sure that the state data is encrypted in transit, so we want to force the use of HTTPS on the storage account. This is achieved through the “Require secure transfer” option. Again, this is enabled by default unless you are using a custom domain, but you should make sure it is on. Azure Storage requires the use of TLS 1.2 by default.
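
Both settings can be enforced from the Azure CLI; the account and resource group names below are placeholders:

```shell
# Enforce HTTPS-only traffic and a minimum TLS version on the state storage account
az storage account update \
  --name sttfstate001 \
  --resource-group rg-terraform-state \
  --https-only true \
  --min-tls-version TLS1_2
```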

Container Configuration

Your state files need to be stored in a container within the storage account. When you create this container, make sure it is created with private access (the default). Allowing anonymous public access would make the state readable by anyone.
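
Creating the container with private access explicitly, using placeholder names:

```shell
# Create the state container with no anonymous public access
az storage container create \
  --name tfstate \
  --account-name sttfstate001 \
  --public-access off
```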

Network Access Restrictions

The next step is to ensure that access to the storage account over the network is restricted to only those who need to use it. Storage accounts can be locked down using two methods:

  1. Service Endpoints/Storage Firewall
  2. Private Link

Private Link is going to provide you with the most secure option, as it allows you to effectively join your storage account to your virtual network and access it using a private IP address. You can then block all other access to the storage account, so the only way it can be reached is over your virtual network. However, this assumes that all the clients who want to deploy the Terraform templates are on your vNet, or connected to it using VPN or ExpressRoute.

If your clients are not all connected to your private networks, then you can instead use Service Endpoints. This allows you to apply a firewall to your storage account and lock it down to specific IP addresses. Traffic is still connecting to an internet-facing endpoint on the storage account, but it is at least restricted to specific IPs.
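
A sketch of locking the storage firewall down with the Azure CLI; the names and the IP range are examples only:

```shell
# Deny all traffic by default, then allow an example office IP range
az storage account update \
  --name sttfstate001 \
  --resource-group rg-terraform-state \
  --default-action Deny

az storage account network-rule add \
  --account-name sttfstate001 \
  --resource-group rg-terraform-state \
  --ip-address 203.0.113.0/24
```

Note that once the default action is Deny, anyone running Terraform from an address outside your rules will be unable to reach the state, so add your deployment agents' addresses before flipping the switch.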

User Access Restrictions

Now that we have restricted which networks can access the state storage, we will also want to restrict which users can access the state. Because the container we are using is configured as private, the user running the deployment will need to authenticate to the storage service. Terraform supports the following means of authentication:

  1. Authenticating using the CLI as a user or a Service Principal
  2. Managed Service Identity
  3. Storage Account Key
  4. SAS Token

I would recommend you avoid the storage account key option: the account key grants full admin access to the storage account, which is far more than is needed for these operations. You can find full details of how to configure the backends here.

User or Service Principal

This method relies on you having already authenticated with the Azure CLI in the same session where you are running Terraform. The account you use needs at least the “Storage Account Key Operator Service Role”, because behind the scenes Terraform retrieves the storage account key to access the resource.
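
The flow is simply to sign in before initialising; the subscription ID below is a placeholder:

```shell
az login                                              # sign in interactively (or with a service principal)
az account set --subscription "<subscription-id>"     # select the subscription holding the state account
terraform init                                        # the backend block supplies the storage details; no key in config
```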

Currently, Terraform does not support the use of the newer Azure AD authentication to a storage account.

Managed Service Identity

If you are automating your Terraform deployments, then you may want to look at using Managed identity. You can assign an identity to the machine you are running your deployments from. Using this method, you can avoid having to supply credentials to your Terraform template.
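
A sketch of a backend block using a managed identity; the names and IDs are placeholders, and this assumes the identity has been granted access to the storage account:

```hcl
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-terraform-state"  # placeholder names and IDs
    storage_account_name = "sttfstate001"
    container_name       = "tfstate"
    key                  = "prod.terraform.tfstate"
    use_msi              = true  # authenticate with the machine's managed identity
    subscription_id      = "00000000-0000-0000-0000-000000000000"
    tenant_id            = "00000000-0000-0000-0000-000000000000"
  }
}
```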

SAS Token

If none of the above methods work for you, then you can use a SAS token to access the storage account. You will want to ensure that your token is explicitly scoped to the container you are using for your deployments, and create it with as short a lifetime as possible.

If you are going to use a SAS token, you want to avoid storing it in your configuration file. Instead, store it in an environment variable on the machine you run your deployments from, setting it only when you need it.
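
A sketch of generating a short-lived, container-scoped token and passing it to Terraform through the environment. The account and container names are placeholders, and the GNU date invocation assumes a Linux shell:

```shell
# Generate a SAS token scoped to the state container, valid for two hours
end=$(date -u -d "+2 hours" '+%Y-%m-%dT%H:%MZ')
sas=$(az storage container generate-sas \
  --account-name sttfstate001 \
  --name tfstate \
  --permissions rlcw \
  --expiry "$end" \
  --output tsv)

# The azurerm backend picks the token up from this environment variable
export ARM_SAS_TOKEN="$sas"
terraform init
```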

Monitoring and Audit

Now that you have your storage account configured for secure access, you will want to configure monitoring to detect any unexpected changes that may indicate a security breach.

Azure Security Center

Azure Security Center can provide advanced threat protection for Azure Storage, which can detect unusual and potentially harmful attempts to access your storage account.
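
This can be switched on per storage account from the CLI; the names below are placeholders:

```shell
# Enable advanced threat protection on the state storage account
az security atp storage update \
  --resource-group rg-terraform-state \
  --storage-account sttfstate001 \
  --is-enabled true
```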

Activity Logs

Azure Activity Logs record the management actions that occur against a storage account, allowing you to keep track of any changes and determine who made them and what they did.
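
As an example of reviewing recent activity, the query below lists the last week of operations touching the state account (names are placeholders; the JMESPath filter is one possible shape):

```shell
# Show who did what to the state storage account over the last 7 days
az monitor activity-log list \
  --resource-group rg-terraform-state \
  --offset 7d \
  --query "[?contains(resourceId, 'sttfstate001')].{op:operationName.value, who:caller, when:eventTimestamp}" \
  --output table
```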