Auto-Generating Terraform State Storage in Azure

In my latest Azure/Terraform post, I touched on how I solved the “Chicken and Egg” problem with Terraform: you need cloud resources in which to store Terraform state, but you can’t use Terraform to create those resources, because it has nowhere to put their state yet. This post details the solution to that problem.

The Problem

The “Chicken and Egg” problem with Terraform, for me, can be succinctly defined with four points:

  1. I want to write Terraform to generate resources in a cloud provider.
  2. I don’t want to run Terraform locally.
  3. I want to store my Terraform state in the same cloud provider.
  4. How do I generate the cloud provider’s resources that will store my Terraform state?

The Solution

Instead of relying on Terraform to generate the cloud provider’s resources that will store your Terraform state, use the cloud provider’s native tooling to generate and manage those resources. As I am currently working in Azure, I will use Azure’s az CLI. The same approach applies to AWS’s aws CLI and GCP’s gcloud CLI, if you want to use those instead.

The Script

In each of my repositories that house Terraform definitions (which is every repository at the end of the day), I have an extra script: create-storage.sh.

This is a Bash script because I develop using .NET Core, and my build/deploy machines are all Linux-based in Azure DevOps. This could just as easily be a PowerShell script if it needed to be executed in a Windows environment.

Heads Up!: If you want the most recent version of the script, please go here instead of trusting this static website.

#!/bin/sh

# Heads up! You need to define the following environment variables:
# RESOURCE_GROUP_NAME for the resource group that will contain the Azure Storage Account that will house your Terraform state files
# STORAGE_ACCOUNT_NAME for the name of the Azure Storage Account
# KEYVAULT_NAME to store the Storage Account's access key, so you don't have to manually keep track of it
# LOCATION for the location of the Azure resources
# KEYVAULT_SECRET_NAME for the name of the secret in Key Vault that will hold the Storage Account's access key
# CONTAINER_NAME for the Azure Blob Storage's container that will hold the Terraform state file(s)

# Fall back to example values when the environment variables above are not set
RESOURCE_GROUP_NAME=${RESOURCE_GROUP_NAME:-mm-terraform-rg}
STORAGE_ACCOUNT_NAME=${STORAGE_ACCOUNT_NAME:-mmterraform}
KEYVAULT_NAME=${KEYVAULT_NAME:-mm-terraform-kv}

# Create resource group
az group create --name ${RESOURCE_GROUP_NAME} --location ${LOCATION}

# Create storage account
az storage account create --resource-group ${RESOURCE_GROUP_NAME} --name ${STORAGE_ACCOUNT_NAME} --sku Standard_LRS --encryption-services blob

# Get storage account key
ACCOUNT_KEY=$(az storage account keys list --resource-group ${RESOURCE_GROUP_NAME} --account-name ${STORAGE_ACCOUNT_NAME} --query '[0].value' -o tsv)

# Create Key Vault
az keyvault create --name ${KEYVAULT_NAME} --resource-group ${RESOURCE_GROUP_NAME} --location ${LOCATION}

# Store account key in secret
az keyvault secret set --name ${KEYVAULT_SECRET_NAME} --vault-name ${KEYVAULT_NAME} --value "${ACCOUNT_KEY}"

# Create blob container
az storage container create --name ${CONTAINER_NAME} --account-name ${STORAGE_ACCOUNT_NAME} --account-key "${ACCOUNT_KEY}"
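
For reference, a one-off local run of the script (outside of any pipeline, after an az login) might look like the following. All of the values here are placeholders of mine, not names from a real subscription:

LOCATION=eastus \
RESOURCE_GROUP_NAME=myapp-terraform-rg \
STORAGE_ACCOUNT_NAME=myappterraform \
KEYVAULT_NAME=myapp-terraform-kv \
KEYVAULT_SECRET_NAME=d-storage-account-key \
CONTAINER_NAME=dterraform \
./create-storage.sh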

Environment Variables

The script above requires a few environment variables, listed below.

RESOURCE_GROUP_NAME: The name of the resource group that will house all of the resources generated for the Terraform state storage.
LOCATION: The location of the resource group (and of the resources created within it).
STORAGE_ACCOUNT_NAME: The name of the Azure Storage Account in which the blob storage will be created.
CONTAINER_NAME: The name of the container within the Storage Account's blob storage. This is what actually holds the Terraform state files.
KEYVAULT_NAME: The name of the Azure Key Vault to create for storing the Azure Storage Account key. This lets us automate running Terraform without having to track the key ourselves.
KEYVAULT_SECRET_NAME: The name of the Key Vault secret that will store the Azure Storage Account key.

Note that we are generating an Azure Key Vault and a Key Vault secret to manage the storage of the Azure Storage Account’s key. This isn’t strictly necessary, since we could query the Storage Account itself for the key any time we need it (as seen in the “Get storage account key” section of the script), but Azure DevOps has tight integration with Azure Key Vault, and this step simplifies the later deployment of Terraform resources.
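
To make the trade-off concrete, here is a quick sketch of the two ways the key can be retrieved later, using the same variable names as the script: querying the Storage Account directly, or reading back the secret the script created.

# Option 1: query the Storage Account directly
ACCOUNT_KEY=$(az storage account keys list --resource-group ${RESOURCE_GROUP_NAME} --account-name ${STORAGE_ACCOUNT_NAME} --query '[0].value' -o tsv)

# Option 2: read the key back out of Key Vault
ACCOUNT_KEY=$(az keyvault secret show --vault-name ${KEYVAULT_NAME} --name ${KEYVAULT_SECRET_NAME} --query value -o tsv)

In a pipeline, though, the Key Vault route means the AzureKeyVault task can hand the key to later steps as a variable, with no extra scripting.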

The Process

The script itself isn’t special; it’s mostly the same as what Azure provides in their docs. The interesting bit is the exporting of the Storage Account’s key into Key Vault for later use in an Azure Pipeline at deploy time.

The Pipeline

Having this script exist is useful, but on its own it doesn’t tie together the delivery of the resources and the “auto-generating” part in a real-world scenario. To accomplish that, I use Azure Pipelines in the Azure DevOps offering.

Build Pipeline

As Azure Pipelines can use YAML to define the build pipelines, I’ve got an azure-pipelines.yml file in the root of each repository. Within that YAML file, I’ve got the following snippet:

variables:
  AZURE_SUBSCRIPTION: 'xxx'
  APPLICATION: 'xxx'
  CONTAINER_NAME: $(ENVIRONMENT_PREFIX)terraform
  KEYVAULT_NAME: $(APPLICATION)-terraform-kv
  KEYVAULT_SECRET_NAME: $(ENVIRONMENT_PREFIX)-storage-account-key
  LOCATION: 'eastus'
  RESOURCE_GROUP_NAME: $(APPLICATION)-terraform-rg
  STORAGE_ACCOUNT_NAME: $(APPLICATION)terraform
  TF_IN_AUTOMATION: 'true'

stages:
- stage: Setup
  jobs:
  - job: SetupDevelopmentStorage
    variables:
      ENVIRONMENT_PREFIX: 'd'
      ENVIRONMENT_NAME: 'development'
    displayName: 'Setup Development Storage'
    steps:
    - task: AzureCLI@1
      displayName: 'Run Setup Script'
      inputs:
        azureSubscription: $(AZURE_SUBSCRIPTION)
        scriptPath: './create-storage.sh'
  - job: SetupStagingStorage
    variables:
      ENVIRONMENT_PREFIX: 's'
      ENVIRONMENT_NAME: 'staging'
    displayName: 'Setup Staging Storage'
    steps:
    - task: AzureCLI@1
      displayName: 'Run Setup Script'
      inputs:
        azureSubscription: $(AZURE_SUBSCRIPTION)
        scriptPath: './create-storage.sh'

  - job: SetupProductionStorage
    variables:
      ENVIRONMENT_PREFIX: 'p'
      ENVIRONMENT_NAME: 'production'
    displayName: 'Setup Production Storage'
    steps:
    - task: AzureCLI@1
      displayName: 'Run Setup Script'
      inputs:
        azureSubscription: $(AZURE_SUBSCRIPTION)
        scriptPath: './create-storage.sh'

Nothing too special in this build pipeline, to be clear. It is just executing the create-storage.sh script with different environment-specific parameters being passed in. You can see that I use the environment-specific parameters in the definition of the other environment variables (e.g., CONTAINER_NAME has $(ENVIRONMENT_PREFIX) at the start of its definition). This lets me have separate containers in the same Azure Storage Account that house environment-specific Terraform state files. Alternatively, I could just use differently-named Terraform state files, but I like consistency for the file names themselves.
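
As a concrete illustration, if APPLICATION were set to myapp (a placeholder, not a real project name), the three jobs would produce something along these lines:

Storage account: myappterraform
  container dterraform -> development state
  container sterraform -> staging state
  container pterraform -> production state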

Deploy Pipeline

This is where the magic happens. As of the time of this writing, I don’t currently use multi-stage pipelines, so this is all done within the Classic Release Pipelines web UI.

The first step in the Release Pipeline is to retrieve the Key Vault secret that was stored by the create-storage.sh script, using the AzureKeyVault step. The secret is then exposed as an Azure Pipelines variable named after the secret itself. For example, if the secret name is d-storage-account-key, the Azure Pipelines variable will also be d-storage-account-key. We will see this in use in a later step.
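
I configure this step through the Classic UI, but for anyone using YAML, the equivalent task would look roughly like the snippet below. It assumes the release side defines the same AZURE_SUBSCRIPTION, KEYVAULT_NAME, and ENVIRONMENT_PREFIX variables that the build pipeline does:

- task: AzureKeyVault@1
  displayName: 'Fetch Terraform state storage key'
  inputs:
    azureSubscription: $(AZURE_SUBSCRIPTION)
    KeyVaultName: $(KEYVAULT_NAME)
    SecretsFilter: '$(ENVIRONMENT_PREFIX)-storage-account-key'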

For the Terraform-specific steps, I use the very wonderful Terraform Build and Release Tasks extension by Charles Zipp, found in the Visual Studio Marketplace.

The first Terraform step I use is merely to install Terraform to the agent, using the TerraformInstaller step. As of this writing, I am using Terraform v0.12.3, but I’m sure that’ll change soon.
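
In YAML form, that installer step would look roughly like this, pinned to the version mentioned above:

- task: TerraformInstaller@0
  displayName: 'Install Terraform'
  inputs:
    terraformVersion: '0.12.3'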

The second Terraform step is to run terraform init by using the TerraformCLI step. I pass in the following Command Options: -backend-config="access_key=$(d-storage-account-key)" -backend-config="storage_account_name=$(APPLICATION)terraform" -backend-config="container_name=$(ENVIRONMENT_PREFIX)terraform" -backend-config="key=$(APPLICATION).tfstate".
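
For those -backend-config values to land anywhere, the Terraform configuration needs to declare an azurerm backend. Because every setting is supplied at init time, the block can stay empty; this is Terraform's partial backend configuration. A minimal sketch:

terraform {
  backend "azurerm" {}
}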

As I mentioned previously, the $(d-storage-account-key) Azure Pipelines variable is what supplies the access key here. Straightforward, but something to keep in mind if you’re building this out yourself.

The final Terraform step is to run terraform apply by using the TerraformCLI step again. The only Command Options I pass in are the environment-specific Terraform variables: -var-file="./environments/$(ENVIRONMENT_NAME)/terraform.tfvars". The usage of these variables can be seen in the previous Azure/Terraform post.
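
The layout of those environment-specific files is nothing fancy; a hypothetical example follows (the directory names match the ENVIRONMENT_NAME values above, but the variable contents are purely illustrative):

environments/
  development/terraform.tfvars
  staging/terraform.tfvars
  production/terraform.tfvars

# environments/development/terraform.tfvars (illustrative contents)
environment = "development"
location    = "eastus"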

tl;dr

Use the cloud provider’s native CLI tools to generate the Terraform state storage and to stash the access keys for later use in build and deploy pipelines. Pretty straightforward, yeah?