
A reusable GitHub Actions workflow for Terraform


Writing a reusable workflow in GitHub is a great way to DRY (Don’t Repeat Yourself).

I recently wanted to write a reusable workflow that could handle a given set of Terraform configuration files. I will call such a collection of Terraform configuration files a Terraform project, or simply a project, from now on. At the time I had several Terraform projects in a single git repository that I wanted to deploy individually.

Disclaimer: in my experience this is not a common approach; you would usually combine all your Terraform projects into a single project and let Terraform handle the relationships between all your resources. In this particular situation, however, I wanted to split things up into several projects.

The challenge with a reusable workflow is in handling inputs and outputs. To be fair, there are other challenges as well, but this is the one I will focus on in this article. Every Terraform project has its own set of input variables and its own set of output values; each project is unique. My goal in the rest of this article is to demonstrate one possible way to set up a reusable workflow that can handle arbitrary inputs and outputs. Let us get started!

The Terraform projects

First of all, let us create our Terraform projects. For demonstration purposes each project will be simple. I will use two projects; the first one I call architecture01.

// main.tf
terraform {
  required_version = "> 1.3"
  required_providers {
    random = {
      source  = "hashicorp/random"
      version = "3.4.3"
    }
  }
}

resource "random_pet" "first_pet" {
  prefix = var.first_pet_prefix
}

resource "random_pet" "second_pet" {
  prefix = var.second_pet_prefix
}
// variables.tf
variable "first_pet_prefix" {
  type = string
}

variable "second_pet_prefix" {
  type = string
}
// outputs.tf
output "infrastructure_output" {
  value = jsonencode({
    first_pet_id  = random_pet.first_pet.id
    second_pet_id = random_pet.second_pet.id
  })
}

The second project I call (unsurprisingly) architecture02.

// main.tf
terraform {
  required_version = "> 1.3"
}
// variables.tf
variable "pet_ids" {
  type = list(string)
}
// outputs.tf
output "infrastructure_output" {
  value = jsonencode({
    message = "Hello ${var.pet_ids[0]} and ${var.pet_ids[1]}!"
  })
}

The first point I want to make has to do with the outputs from each Terraform project. In the two snippets above you can see that each project defines only a single output, and it has the same name in both projects: infrastructure_output. This convention is what allows the reusable workflow to handle arbitrary outputs.
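To make this concrete, here is roughly what the encoded value of infrastructure_output looks like for architecture01 once it has been applied; the pet IDs below are made up for illustration.

# jsonencode() collapses all the values into a single JSON string, so one
# well-known output name can carry whatever a given project produces.
echo '{"first_pet_id":"snuffles-cuddly-mole","second_pet_id":"puffles-brave-kiwi"}' | jq .
# {
#   "first_pet_id": "snuffles-cuddly-mole",
#   "second_pet_id": "puffles-brave-kiwi"
# }

Now we are ready to start writing our GitHub Actions workflows!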

The workflows

The reusable Terraform workflow looks like this:

name: Reusable Terraform workflow

on:
  workflow_call:
    inputs:
      working-directory:
        type: string
        required: true
      parameters:
        type: string
        required: false
    outputs:
      infrastructure_output:
        description: Output from Terraform
        value: ${{ jobs.terraform.outputs.infrastructure_output }}

jobs:
  terraform:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ${{ inputs.working-directory }}
    steps:
      - uses: actions/checkout@v3
      - uses: hashicorp/setup-terraform@v2
        with:
          terraform_wrapper: false
      - name: Create terraform.tfvars file
        run: echo '${{ inputs.parameters }}' >> terraform.tfvars
      - run: terraform init
      - run: terraform validate
      - run: terraform plan -no-color -out "plan.out"
      - run: terraform apply -no-color plan.out
      - name: Handle output
        id: output-step
        run: |
          outputs=$(terraform output infrastructure_output)
          echo "infrastructure_output=$outputs" >> $GITHUB_OUTPUT          
    outputs:
      infrastructure_output: ${{ steps.output-step.outputs.infrastructure_output }}

Since this is a reusable workflow we add the workflow_call trigger. The trigger is defined with two inputs: one for the working directory and one for the input parameters to Terraform. We also add an output for the workflow, and as you can see I use the common output name infrastructure_output that I discussed above.

The steps in the workflow are basic Terraform steps, and they are not the focus of this article. However, there are two steps that are important for the point I am trying to make. The first step I want to highlight is named Create terraform.tfvars file:

- name: Create terraform.tfvars file
  run: echo '${{ inputs.parameters }}' >> terraform.tfvars

In this step I take the value of the incoming parameters input and write it to a file called terraform.tfvars. Terraform automatically picks this file up in later steps and uses it to populate the Terraform variables with values. Dynamic input handled, check!
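As a rough sketch of what the step produces, assume the caller passes two variable assignments in parameters (the same values I will use further down). The runner then ends up with a plain terraform.tfvars file:

# A local stand-in for the step; the heredoc plays the role of the
# ${{ inputs.parameters }} expression that GitHub substitutes into the script.
cat >> terraform.tfvars <<'EOF'
first_pet_prefix="snuffles"
second_pet_prefix="puffles"
EOF

# Terraform automatically loads terraform.tfvars from the working directory.
cat terraform.tfvars
# first_pet_prefix="snuffles"
# second_pet_prefix="puffles"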

The second important step that I want to highlight is the one named Handle output:

- name: Handle output
  id: output-step
  run: |
    outputs=$(terraform output infrastructure_output)
    echo "infrastructure_output=$outputs" >> $GITHUB_OUTPUT    

In this step I use the terraform output command to retrieve the value of the single output that all of my Terraform projects define. I write the value to the file referenced by the $GITHUB_OUTPUT environment variable, which allows me to export it as an output from the workflow. Dynamic output handled, check! Well, almost; we are not quite done yet.
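Here is a minimal local sketch of what the step does, reusing the illustrative pet IDs from before. Since the setup-terraform wrapper is disabled, terraform output prints a string value in a quoted, escaped form, roughly like this:

# Retrieve the single well-known output (the value shown is illustrative).
outputs=$(terraform output infrastructure_output)
echo "$outputs"
# "{\"first_pet_id\":\"snuffles-cuddly-mole\",\"second_pet_id\":\"puffles-brave-kiwi\"}"

# In the workflow the same line is appended to the file that $GITHUB_OUTPUT
# points at, which turns it into a step output (and later a job output).
echo "infrastructure_output=$outputs" >> "$GITHUB_OUTPUT"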

Now we need to use our reusable workflow from a calling workflow. Let me show you the complete workflow first and then discuss the important details.

name: Run a complex Terraform deployment

on:
  push:
    branches:
      - "main"
    paths:
      - "terraform/**.tf"
  workflow_dispatch:

jobs:
  architecture01:
    name: Set up architecture01
    uses: ./.github/workflows/reusable.yml
    with:
      working-directory: terraform/architecture01
      parameters: |
        first_pet_prefix="snuffles"
        second_pet_prefix="puffles"        

  architecture01-outputs:
    name: Parse outputs from architecture01
    needs: architecture01
    runs-on: ubuntu-latest
    steps:
      - id: output-step
        run: |
          first_pet_id=$(echo ${{ needs.architecture01.outputs.infrastructure_output }} | jq -r .first_pet_id)
          echo "first_pet_id=$first_pet_id" >> $GITHUB_OUTPUT

          second_pet_id=$(echo ${{ needs.architecture01.outputs.infrastructure_output }} | jq -r .second_pet_id)
          echo "second_pet_id=$second_pet_id" >> $GITHUB_OUTPUT          
    outputs:
      first_pet_id: ${{ steps.output-step.outputs.first_pet_id }}
      second_pet_id: ${{ steps.output-step.outputs.second_pet_id }}

  architecture02:
    name: Set up architecture02
    uses: ./.github/workflows/reusable.yml
    needs: architecture01-outputs
    with:
      working-directory: terraform/architecture02
      parameters: |
        pet_ids=[
          "${{ needs.architecture01-outputs.outputs.first_pet_id }}",
          "${{ needs.architecture01-outputs.outputs.second_pet_id }}"
        ]        

The first job (called architecture01) uses my reusable workflow to set up the first Terraform project.

architecture01:
  name: Set up architecture01
  uses: ./.github/workflows/reusable.yml
  with:
    working-directory: terraform/architecture01
    parameters: |
      first_pet_prefix="snuffles"
      second_pet_prefix="puffles"      

I provide the working directory for the project, and I define the expected Terraform input in the parameters parameter (naming things is hard). This is all we need to do to set up the first project.

Now here comes a tricky part: how do we get the output from the first project so that we can use it in the second project? We could access the output directly in another job by stating that the other job needs the first job, and then reference it with ${{ needs.architecture01.outputs.infrastructure_output }}. However, looking back at the definition of the output for my project, we see the following:

// outputs.tf
output "infrastructure_output" {
  value = jsonencode({
    key = "value",
    ...
  })
}

The output is JSON. So in my calling workflow I decided to add a dedicated job that parses the output JSON.

architecture01-outputs:
  name: Parse outputs from architecture01
  needs: architecture01
  runs-on: ubuntu-latest
  steps:
    - id: output-step
      run: |
        first_pet_id=$(echo ${{ needs.architecture01.outputs.infrastructure_output }} | jq -r .first_pet_id)
        echo "first_pet_id=$first_pet_id" >> $GITHUB_OUTPUT

        second_pet_id=$(echo ${{ needs.architecture01.outputs.infrastructure_output }} | jq -r .second_pet_id)
        echo "second_pet_id=$second_pet_id" >> $GITHUB_OUTPUT        
  outputs:
    first_pet_id: ${{ steps.output-step.outputs.first_pet_id }}
    second_pet_id: ${{ steps.output-step.outputs.second_pet_id }}

This job uses jq to retrieve the individual output values from my output JSON. Nothing more, nothing less. It then exposes outputs of its own that I can reference in later jobs.
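In isolation the jq invocation does nothing more than pull a single field out of that JSON document. A quick sketch with a made-up value:

# The JSON exported by architecture01 (illustrative value).
infrastructure_output='{"first_pet_id":"snuffles-cuddly-mole","second_pet_id":"puffles-brave-kiwi"}'

# -r prints the raw string without surrounding quotes.
echo "$infrastructure_output" | jq -r .first_pet_id
# snuffles-cuddly-mole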

The last job in the workflow is called architecture02, and it also uses my reusable workflow (see, we're DRYing). It works exactly the same way as architecture01; I just wanted to demonstrate that we can feed the output from our first project (architecture01) into our second project (architecture02).
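To close the loop, by the time architecture02 runs the reusable workflow has already written its parameters input to terraform.tfvars, so the second project sees something along these lines (again with made-up pet IDs):

# Sketch of the tfvars file the reusable workflow writes for architecture02,
# with the IDs produced by architecture01 substituted in.
cat terraform.tfvars
# pet_ids=[
#   "snuffles-cuddly-mole",
#   "puffles-brave-kiwi"
# ]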

The full source code is available at my GitHub:

mattias-fjellstrom/terraform-github-actions

Mattias Fjellström
Cloud architect · Author · HashiCorp Ambassador