19th August, 2021
Based on the HashiCorp documentation, we should follow the general naming conventions for Terraform modules.
With that in mind, all Terraform modules should follow this ruleset:
• All source code is kept in git
All Terraform modules should follow the structure shown below.
.
├── .gitignore
├── .markdownlint.json
├── .pre-commit-config.yaml
├── LICENSE
├── README.md
├── VERSION
├── examples
│   ├── complete
│   │   ├── main.tf
│   │   ├── outputs.tf
│   │   ├── variables.tf
│   │   └── versions.tf
│   └── minimal
│       ├── main.tf
│       ├── outputs.tf
│       ├── variables.tf
│       └── versions.tf
├── main.tf
├── outputs.tf
├── test
│   ├── go.mod
│   ├── go.sum
│   └── terraform_module_gcp_dns_test.go
├── variables.tf
└── versions.tf
line 2: .gitignore – should contain all extra files, to keep the git repository clean (required)
line 3: .markdownlint.json – linting configuration for Markdown (optional)
line 4: .pre-commit-config.yaml – configuration for the pre-commit framework (required)
line 5: LICENSE – states under which license this module is distributed
line 6: README.md – readme with examples of module usage, requirements, input parameters, outputs, all other important information, and warnings if applicable (required)
line 7: VERSION – version history for the module with release notes (required)
line 8: examples – examples for the module; should contain at least minimal (line 14) (required)
line 19: main.tf – the main module file; contains the resources, locals and data sources that create all resources (required)
line 20: outputs.tf – contains outputs from the resources created in main.tf (required)
line 21: test – all unit/integration tests for the Terraform module (required)
line 25: variables.tf – contains declarations of variables used in main.tf (required)
line 26: versions.tf – contains version constraints for Terraform itself and for the providers used in the module (required)
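For illustration, a minimal versions.tf for a module like this might look as follows (the provider and version constraints are assumptions, not part of the ruleset):

terraform {
  # Illustrative constraints – pin what your module actually needs
  required_version = ">= 1.0"

  required_providers {
    google = {
      source  = "hashicorp/google"
      version = ">= 4.0"
    }
  }
}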
Do not repeat the resource type in the resource name, neither partially nor completely.
Bad
resource "aws_instance" "webserver_aws_instance" {}
Bad
resource "aws_instance" "webserver_instance" {}
Good
resource "aws_instance" "webserver" {}
A resource should be named this if there is no more descriptive and general name available, or if the module creates a single resource of this type (e.g. there is a single resource of type aws_nat_gateway and multiple resources of type aws_route_table, so aws_nat_gateway should be named this and the aws_route_table resources should get more descriptive names – like private, public, database).
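A short sketch of this convention (the resources and their arguments are illustrative only):

# A single NAT gateway in the module – use the generic name "this"
resource "aws_nat_gateway" "this" {
  # arguments omitted
}

# Multiple route tables – use descriptive names instead
resource "aws_route_table" "private" {
  # arguments omitted
}

resource "aws_route_table" "public" {
  # arguments omitted
}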
Place count as the first argument inside the resource block, separated from the other arguments by a newline.
Bad

resource "aws_instance" "webserver" {
  ami   = "123456"
  count = 2
}

Good

resource "aws_instance" "webserver" {
  count = 2

  ami = "123456"
}
The count value can also be an expression or a variable, for example:

count = var.instances_per_subnet * length(module.vpc.private_subnets)

or

count = var.instance_count
Tags for resources are required.
Bad

resource "aws_instance" "webserver" {
  count = 2

  tags = {
    Name = "…"
  }
}

Good

resource "aws_instance" "webserver" {
  count = 2

  subnet_id              = "…"
  vpc_security_group_ids = "…"

  tags = {
    Name = "…"
  }
}
When writing a module that accepts variable inputs, make sure to use the same names as the upstream to avoid confusion and ambiguity.
Avoid introducing naming styles commonly found in other languages, such as camelCase or PascalCase. For consistency we want all variables to look uniform. This is also in line with the HashiCorp naming conventions.
All variable inputs that enable/disable a setting should be named ..._enabled (e.g. encryption_enabled). It is acceptable for default values to be either false or true.
All modules should incorporate feature flags to enable or disable functionality. All feature flags should end in _enabled and should be of type bool.
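A minimal sketch of such a feature flag (the variable name, default and description are illustrative):

variable "encryption_enabled" {
  type        = bool
  default     = true # secure by default
  description = "Set to false to disable encryption on the resources created by this module"
}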
All variable inputs need a description field. When the field is provided by an upstream provider (e.g. terraform-aws-provider), use the same wording as the upstream docs.
Modules should be as turnkey as possible. Default values should ensure the most secure configuration (e.g. with encryption enabled).
All variable inputs for secrets must never define a default value. This ensures that Terraform is able to validate user input. The exception to this is if the secret is optional and will be generated for the user automatically when left null or "" (empty).
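A sketch of a secret input without a default (the name is illustrative; the sensitive flag assumes Terraform 0.14 or newer):

variable "master_password" {
  type        = string
  description = "Master password for the database"
  sensitive   = true # redacts the value in plan output
  # no default – the caller must provide a value
}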
Using <<-EOT (as opposed to <<EOT without the -) ensures the code can be indented in line with the other code in the project. Note that EOT can be any uppercase string (e.g. CONFIG_FILE).
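A small sketch of an indented heredoc (the local name and script content are only assumptions):

locals {
  # With <<- Terraform trims the common leading indentation from every line,
  # so the heredoc body can stay aligned with the surrounding code.
  user_data = <<-EOT
    #!/bin/bash
    echo "bootstrapping webserver"
  EOT
}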
For JSON, YAML, IAM policy documents and configuration files, however, there are better ways to achieve the same outcome using Terraform interpolations and data sources, as sketched below:
For JSON, use a combination of a local and the jsonencode function.
For YAML, use a combination of a local and the yamlencode function.
For IAM policy documents, use the native aws_iam_policy_document data source.
For configuration files, use the template_file data source (or the templatefile function) and move the configuration to a separate template file.
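A minimal sketch of the JSON and IAM-policy patterns (the config keys, bucket name and actions are illustrative assumptions):

locals {
  # Build the JSON from native HCL instead of embedding it in a heredoc
  app_config_json = jsonencode({
    log_level = "info"
    retries   = 3
  })
}

data "aws_iam_policy_document" "read_only" {
  statement {
    actions   = ["s3:GetObject"]
    resources = ["arn:aws:s3:::example-bucket/*"] # illustrative bucket
  }
}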
Using proper datatypes in terraform makes it easier to validate inputs and document usage.
All outputs must have a description set. The description should be based on (or adapted from) the upstream terraform provider where applicable. Avoid simply repeating the variable name as the output description.
Avoid introducing naming styles commonly found in other languages, such as camelCase or PascalCase. For consistency we want all outputs to look uniform. It also makes code more consistent when using outputs together with the terraform_remote_state data source to access those settings from other modules.
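A brief sketch of a described output, reusing the illustrative webserver resource from the examples above:

output "instance_ids" {
  description = "IDs of the EC2 instances"
  value       = aws_instance.webserver[*].id
}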
Secrets should never be outputs of modules. Rather, they should be written to secure storage such as AWS Secrets Manager, AWS SSM Parameter Store with KMS encryption, or S3 with KMS encryption at rest. Our preferred mechanism on AWS is using the SSM Parameter Store. Values written to SSM are easily retrieved by other terraform modules, or even on the command-line using tools like chamber by Segment.io.
We are very strict about this in “root” modules (or the top-most module), because these sensitive outputs are easily leaked in CI/CD pipelines (see tfmask for masking secrets in output only as a last resort). We are less sensitive to this in modules that are typically nested inside of other modules.
Rather than outputting a secret, you may output plain text indicating where the secret is stored, for example RDS master password is in SSM parameter /rds/master_password. You may also want to have another output just for the key for the secret in the secret store, so the key is available to other programs which may be able to retrieve the value given the key.
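A minimal sketch of this pattern, assuming the password is generated with the random provider and using the illustrative parameter name mentioned above:

resource "random_password" "master" {
  length  = 20
  special = true
}

resource "aws_ssm_parameter" "master_password" {
  name  = "/rds/master_password"
  type  = "SecureString" # encrypted with KMS
  value = random_password.master.result
}

output "master_password_ssm_parameter" {
  description = "Name of the SSM parameter that stores the RDS master password"
  value       = aws_ssm_parameter.master_password.name
}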
We prefer to keep terraform outputs symmetrical with the upstream resource or module as much as possible, with the exception of prefixes. This reduces the amount of entropy in the code and possible ambiguity, while increasing consistency. For example, if the other IAM user outputs in the upstream module are prefixed with user_, we should borrow the upstream's output name secret_access_key and expose it as user_secret_access_key, rather than inventing a new name.
Using remote state is required for Terraform.
We recommend not commingling state in the same bucket. This could cause the state to get overridden or compromised. Note that the state contains cached values of all outputs. Wherever possible, keep stages 100% isolated with physical barriers (separate buckets, separate organizations).
Versioning is required for the remote state storage.
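A minimal sketch of such a backend configuration, assuming an S3 bucket with versioning enabled and a DynamoDB lock table that already exist (all names are illustrative):

terraform {
  backend "s3" {
    bucket         = "example-terraform-state"       # bucket with versioning enabled
    key            = "prod/network/terraform.tfstate"
    region         = "us-east-1"
    encrypt        = true
    dynamodb_table = "example-terraform-locks"       # state locking
  }
}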
Checkov is a static code analysis tool for infrastructure-as-code.
It scans cloud infrastructure provisioned using Terraform, Terraform plan, CloudFormation, Kubernetes, Dockerfile, Serverless or ARM Templates and detects security and compliance misconfigurations using graph-based scanning.
Checkov also powers Bridgecrew, the developer-first platform that codifies and streamlines cloud security throughout the development lifecycle. Bridgecrew identifies, fixes, and prevents misconfigurations in cloud resources and infrastructure-as-code files.
Github
tfsec uses static analysis of your terraform templates to spot potential security issues. Now with terraform CDK support.
Github
terraform-compliance is a lightweight, security and compliance focused test framework against terraform to enable negative testing capability for your infrastructure-as-code.
Github
Terrascan is a static code analyzer for Infrastructure as Code. It detects compliance and security violations across Infrastructure as Code to mitigate risk before provisioning cloud native infrastructure.
Github
Regula is a tool that evaluates CloudFormation and Terraform infrastructure-as-code for potential AWS, Azure, and Google Cloud security and compliance violations prior to deployment.
Docs
Github
Find security vulnerabilities, compliance issues, and infrastructure misconfigurations early in the development cycle of your infrastructure-as-code with KICS by Checkmarx.
Docs
Github
A pluggable Terraform linter.
TFLint is a framework where each feature is provided by plugins; the key features are finding possible errors (like invalid instance types) for the major cloud providers, warning about deprecated syntax and unused declarations, and enforcing best practices and naming conventions.
Github
A command line tool to validate configuration files using rules specified in YAML. The configuration files can be one of several formats: Terraform, JSON, YAML, with support for Kubernetes. There are built-in rules provided for Terraform, and custom files can be used for other formats.
Github
Testing HashiCorp Terraform
Testing experiment for Terraform
Chef InSpec is an open-source framework for testing and auditing your applications and infrastructure. Chef InSpec works by comparing the actual state of your system with the desired state that you express in easy-to-read and easy-to-write Chef InSpec code. Chef InSpec detects violations and displays findings in the form of a report, but puts you in control of remediation.
Moving Security and Sanity Left by Testing Terraform with InSpec
Hashicorp blog
Github
Terratest is a Go library that provides patterns and helper functions for testing infrastructure, with 1st-class support for Terraform, Packer, Docker, Kubernetes, AWS, GCP, and more.
Docs
Github
Once again, but more about BDD testing.
Docs
The creation of rspec-terraform was initially intended to smooth the creation and sharing of common Terraform modules. Some sort of basic testing would ensure a stable and clearly defined interface for each module.
Github
A declarative test framework for Terraform
Github
A framework for managing and maintaining multi-language pre-commit hooks. Main site
Installation
From pip
pip install pre-commit
Non-administrative installation:
curl https://pre-commit.com/install-local.py | python -
In a Python project, add the following to your requirements.txt (or requirements-dev.txt):
pre-commit
From brew
brew install pre-commit
Initializing
cd tf/module/path
Create a .pre-commit-config.yaml in the module path.
A simple example for Terraform:
# pre-commit framework
---
default_language_version:
  python: python3
repos:
  - repo: git://github.com/pre-commit/pre-commit-hooks
    rev: v3.2.0
    hooks:
      - id: check-json
      - id: check-merge-conflict
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-added-large-files
      - id: pretty-format-json
        args:
          - --autofix
      - id: detect-aws-credentials
      - id: detect-private-key
  - repo: git://github.com/Lucas-C/pre-commit-hooks
    rev: v1.1.9
    hooks:
      - id: forbid-tabs
        exclude_types:
          - python
          - javascript
          - dtd
          - markdown
          - makefile
          - xml
        exclude: binary|\.bin$
  - repo: git://github.com/jameswoolfenden/pre-commit-shell
    rev: 0.0.2
    hooks:
      - id: shell-lint
  - repo: git://github.com/igorshubovych/markdownlint-cli
    rev: v0.23.2
    hooks:
      - id: markdownlint
  - repo: git://github.com/adrienverge/yamllint
    rev: v1.24.2
    hooks:
      - id: yamllint
        name: yamllint
        description: This hook runs yamllint.
        entry: yamllint
        language: python
        types: [file, yaml]
  - repo: git://github.com/jameswoolfenden/pre-commit
    rev: v0.1.33
    hooks:
      - id: terraform-fmt
      - id: checkov-scan
        language_version: python3.7
      - id: tf2docs
        language_version: python3.7
Now you can init git, if you haven't already:
git init
Now you need to install the hooks from .pre-commit-config.yaml; run:
pre-commit install
If you want to run pre-commit without committing your changes:
git add .
pre-commit run -a
After running the command above you should see output like:
Check JSON...........................................................Passed
Check for merge conflicts............................................Passed
Trim Trailing Whitespace.............................................Passed
Fix End of Files.....................................................Passed
Check Yaml...........................................................Passed
Check for added large files..........................................Passed
Pretty format JSON...................................................Passed
Detect AWS Credentials...............................................Passed
Detect Private Key...................................................Passed
No-tabs checker......................................................Passed
Shell Syntax Check................................(no files to check)Skipped
markdownlint.........................................................Passed
yamllint.............................................................Passed
terraform fmt........................................................Passed
checkov..............................................................Passed
Terraform version manager inspired by rbenv
Github
The tfswitch command line tool lets you switch between different versions of terraform. If you do not have a particular version of terraform installed, tfswitch lets you download the version you desire. The installation is minimal and easy. Once installed, simply select the version you require from the dropdown and start using terraform.
Github
A Terrafile is a simple YAML config that gives you a single, convenient location that lists all your external module dependencies.
The idea is modelled on similar patterns in other languages – e.g. Ruby with its Gemfile (technically provided by the bundler gem).
Additionally, Terrafile supports modules from the Terraform Registry, as well as local modules and modules from git.
terraform-google-lb:
  source: "GoogleCloudPlatform/lb-http/google"
  version: "4.5.0"

terraform-aws-vpc:
  source: https://github.com/terraform-aws-modules/terraform-aws-vpc.git
  version: v2.64.0
  provider: aws
Github
The Terrafile is where you can define additional modules to add to your Terraspace project. The modules can be from your own git repositories, other git repositories, or the Terraform Registry.
Docs
Terraspace is a Terraform Framework that optimizes for infrastructure-as-code happiness. It provides an organized structure, conventions over configurations, keeps your code DRY, and adds convenient tooling. Terraspace makes working with Terraform easier and more fun.
Docs
Github
Terragrunt is a thin wrapper that provides extra tools for keeping your configurations DRY, working with multiple Terraform modules, and managing remote state.
Docs
Github
Pretf is a completely transparent, drop-in Terraform wrapper that generates Terraform configuration with Python. It requires no configuration and no changes to standard Terraform projects to start using it.
Docs
CDK (Cloud Development Kit) for Terraform allows developers to use familiar programming languages to define cloud infrastructure and provision it through HashiCorp Terraform.
Github
Scaffolding / Boilerplate generator for new Terraform module projects
Github
Generate CloudFormation / Terraform / Troposphere templates from your existing AWS resources
Github
A CLI tool that generates tf/json and tfstate files based on existing infrastructure (reverse Terraform).
Github
Blast Radius is a tool for reasoning about Terraform dependency graphs with interactive visualizations.
Docs
Github
Terraform is intricate but extremely powerful if you know all the ins-and-outs.
Our job at GlobalDots is to do that in order to provide the best possible DevOps service to our fast-growing business customers.
Contact us to learn how we can help you grow lean and smart, while keeping your precious resources focused on coding.