I continue my exploration of the new Terraform testing framework in this post, where I demonstrate how to run permutations of a given test using GitHub Actions. Specifically, I dive into an example where our module depends on a different module, and we are required to support a number of recent versions of that module.
This post follows up on my comprehensive guide to testing in Terraform. The testing framework is new and untested (pun not intended), but it already offers a lot of possibilities that will lead to better Terraform modules. In this post I will explore all the options that are available in the new run block.
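To illustrate the kind of options the run block offers, here is a minimal sketch of a Terraform test file (the file name, variable values, and output names are hypothetical; the `command`, `variables`, and `assert` options shown are part of the testing framework):

```hcl
# tests/example.tftest.hcl — a minimal sketch of a run block
run "validates_name_output" {
  # run a plan only, instead of the default apply
  command = plan

  # input variables for this test run (hypothetical values)
  variables {
    name = "demo"
  }

  # assert on the resulting plan
  assert {
    condition     = output.name == "demo"
    error_message = "expected the name output to equal the input variable"
  }
}
```

A test file like this is executed with `terraform test`, which runs each run block in order.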
What to do when you are not allowed to use recursive module calls in your declarative infrastructure-as-code? Use imperative infrastructure-as-code!
HashiCorp Developer AI is here in private beta. What is the Developer AI? Think ChatGPT for HashiCorp. It lives inside the documentation pages on developer.hashicorp.com and can assist you with any questions you might have related to HashiCorp products. In this post I take a quick first look at how Developer AI works and what it can do.
A Comprehensive Guide to Testing in Terraform: Keep your tests, validations, checks, and policies in order
You have many options when it comes to testing and validating your Terraform configurations and modules. The newest addition to these options is the native Terraform testing framework. This, in combination with custom conditions, check blocks, and policies, allows you to create robust infrastructure-as-code. This post is a comprehensive guide to testing and validation in Terraform.
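As a taste of one of these validation options, here is a minimal sketch of a check block (the check name, URL, and data source are hypothetical; check blocks run assertions during plan and apply without blocking them):

```hcl
# A minimal sketch of a check block with a scoped data source
check "endpoint_health" {
  # this data source is evaluated as part of the check
  data "http" "endpoint" {
    url = "https://example.com/health"
  }

  # a failing assertion produces a warning, not an error
  assert {
    condition     = data.http.endpoint.status_code == 200
    error_message = "the endpoint did not return a healthy status"
  }
}
```

Unlike preconditions and postconditions, a failed check reports a warning rather than halting the operation.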
This week I attended HashiConf in San Francisco. In this post I summarize my experience and what I thought about the conference!
In this post I compare how to import resources into your Terraform state using the traditional CLI-driven method with the new method using import blocks. Import blocks are a new feature in Terraform 1.5 and, as we will see, an improvement over the traditional method.
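For reference, the two methods look roughly like this (a minimal sketch; the resource name, subscription ID, and resource group values are hypothetical). The traditional method is a one-off CLI command, while the import block lives in your configuration and is applied as part of a plan:

```hcl
# The new declarative method (Terraform 1.5+): an import block
# next to the resource it imports (hypothetical names and ID).
import {
  to = azurerm_resource_group.example
  id = "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/rg-example"
}

resource "azurerm_resource_group" "example" {
  name     = "rg-example"
  location = "westeurope"
}
```

The traditional equivalent would be running `terraform import azurerm_resource_group.example <id>` from the CLI; with import blocks you can instead preview the import with `terraform plan` before anything touches the state.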
What’s up with the new Terraform Cloud pricing model?
Azure Bicep is constantly evolving. The latest release brought a preview of user-defined functions, but in this post I am using modules as a substitute for user-defined functions. I will create Kubernetes deployments through a module, where the module is as close to a function as you can get. I will also be using another experimental feature of Bicep: user-defined types.
In this post I will go through an example of how you can promote releases between different GitOps environments. The sample application I use consists of a simple Helm chart. I will use Argo CD as my GitOps operator. Automation workflows are created using GitHub Actions.