terraform import reverse engineering
Terraform is able to import existing infrastructure. This allows you to take resources you've created by some other means and bring them under Terraform management.
Bringing existing infrastructure under Terraform’s control involves five main steps:
- Identify the existing infrastructure to be imported.
- Import the infrastructure into your Terraform state.
- Write a Terraform configuration that matches that infrastructure.
- Review the Terraform plan to ensure that the configuration matches the expected state and infrastructure.
- Apply the configuration to update your Terraform state.
```
terraform import [options] ADDRESS ID
```

Example:

```
terraform import -var project=${PROJECT_ID} google_storage_bucket.media ${PROJECT_ID}-media
terraform import -var project=${PROJECT_ID} google_storage_bucket.source ${PROJECT_ID}-source
```
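For these two imports to succeed, matching resource blocks must already exist in the configuration. A minimal sketch; the bucket locations are assumptions and must match the real buckets:

```hcl
variable "project" {
  type = string
}

# Resource blocks the two imports above map onto; terraform import only
# populates state, so these blocks have to be written by hand first.
resource "google_storage_bucket" "media" {
  name     = "${var.project}-media"
  location = "US" # assumption: use the actual bucket's location
}

resource "google_storage_bucket" "source" {
  name     = "${var.project}-source"
  location = "US" # assumption
}
```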
Import will find the existing resource from `ID` and import it into your Terraform state at the given `ADDRESS`. `ADDRESS` must be a valid resource address; because any resource address is valid, the import command can import resources into modules as well as directly into the root of your state. `ID` depends on the resource type being imported.
The current implementation of `terraform import` can only import resources into the state; it does not generate configuration. A future version of Terraform will also generate configuration. Because of this, prior to running `terraform import` it is necessary to manually write a resource configuration block for the resource, to which the imported object will be mapped.
- To import a resource, first write a resource block for it in your configuration, establishing the name by which it will be known to Terraform:

```hcl
resource "aws_instance" "example" {
  # ...instance configuration...
}
```
- Then `terraform import` can be run to attach an existing instance to this resource configuration:

```
terraform import aws_instance.example i-abcd1234
```
This command locates the AWS EC2 instance with ID i-abcd1234 and attaches the existing settings of the instance, as described by the EC2 API, to the name aws_instance.example. In this example the address implies the root module. Finally, the mapping is saved in the Terraform state.
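The configuration generation mentioned above has since shipped: Terraform 1.5 introduced declarative `import` blocks, and `terraform plan -generate-config-out=<file>` can write the matching configuration for you. A minimal sketch reusing the example instance:

```hcl
# Declarative alternative to the imperative command above (Terraform >= 1.5).
# Running `terraform plan -generate-config-out=generated.tf` writes a
# resource "aws_instance" "example" block for the real instance.
import {
  to = aws_instance.example
  id = "i-abcd1234"
}
```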
Import into a module: import an AWS instance into the `aws_instance` resource named `bar` inside a module named `foo`:

```
terraform import module.foo.aws_instance.bar i-abcd1234
```
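For this to work, the module must already declare the target resource. A minimal sketch, assuming a local module directory:

```hcl
# Root module: instantiate the module that will own the imported instance.
module "foo" {
  source = "./foo" # assumption: a local module path
}

# Inside ./foo/main.tf, the block that the address
# module.foo.aws_instance.bar resolves to:
# resource "aws_instance" "bar" {
#   # ...instance configuration...
# }
```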
Import into a resource configured with `count`:

```
terraform import 'aws_instance.baz[0]' i-abcd1234
```
Import into a resource configured with `for_each`:

```
terraform import 'aws_instance.baz["example"]' i-abcd1234    # Linux
terraform import 'aws_instance.baz[\"example\"]' i-abcd1234  # PowerShell
```
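The indexed addresses above only exist if the resource block uses `count` or `for_each`. A sketch of the matching configurations, with the instance settings elided:

```hcl
# Makes aws_instance.baz[0] a valid import address.
resource "aws_instance" "baz" {
  count = 1
  # ...instance configuration...
}

# Alternatively, for the for_each variant the key must be in the set:
# resource "aws_instance" "baz" {
#   for_each = toset(["example"]) # makes aws_instance.baz["example"] valid
#   # ...instance configuration...
# }
```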
Reverse engineering steps in Terraform (VMware example):
- Use the VMware MOB (Managed Object Browser) as a source of truth to get the point-in-time values of the VM object (https://<vcenter-host>/mob).
- Identify the mandatory and optional objects for a typical VMware environment; these are central to the reverse-engineering approach.
- Write an import-logic script to programmatically fetch these object values from the MOB and create a file in a specified format (HCL or JSON) from these objects, which Terraform can understand. The file generated in this step is called a configuration file; a sketch of such a file appears after these steps.
- Do a `terraform import` with the configuration file and validate that the import was successful.
- After a successful import, perform a change on the VMware VM through Terraform automation by changing one of the parameters in the configuration file.
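To make the flow concrete, here is a hypothetical fragment of the kind of file such a script might emit; every value is an assumption standing in for a point-in-time read from the MOB:

```hcl
# Hypothetical output of the import-logic script (abridged).
resource "vsphere_virtual_machine" "imported_vm" {
  name             = "app-vm-01"   # VM name as shown in the MOB
  resource_pool_id = "resgroup-42" # managed object reference IDs from the MOB
  datastore_id     = "datastore-11"
  num_cpus         = 2
  memory           = 4096

  network_interface {
    network_id = "network-7"
  }

  disk {
    label = "disk0"
    size  = 40
  }
}
```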
gcloud beta resource-config
This ready-made tool has reverse-engineering logic built in: it talks to the Google Cloud Platform, fetches the required Terraform configuration of the resources running inside a GCP project, and thereby lets IT administrators use Terraform in their day-to-day IT automation.
The utility allows you to bulk export project resources into a *.tf file, which can be readily used for further import via the Terraform tool.
Here are the importing steps:
- Define the execution environment and set the GCP project, region, and zone.
- Here is the command for the bulk export of the Terraform configuration files for resources in a GCP project:

```
gcloud beta resource-config bulk-export --project=<Project Name> --path=./instances \
  --resource-format=terraform --resource-types=compute.cnrm.cloud.google.com/ComputeInstance -q
```
- From this bulk export, copy the *.tf file for the required resource to another directory where you want to run the import.
- Run `terraform init`.
- Import your required resource with this command (a sketch of the exported file and a matching import appears after these steps):

```
terraform import google_compute_instance.<VMName> <ProjectName>/<Zone>/<VMName>
```
- Run `terraform plan` and verify that no changes are suggested; if no changes are suggested, the import was successful.
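For illustration, a heavily abridged sketch of the kind of *.tf file the bulk export produces; all names and values here are assumptions:

```hcl
resource "google_compute_instance" "my_vm" {
  name         = "my-vm"
  machine_type = "e2-medium"
  zone         = "us-central1-a"

  boot_disk {
    initialize_params {
      image = "debian-cloud/debian-12" # assumption
    }
  }

  network_interface {
    network = "default"
  }
}
```

With this block in place, `terraform import google_compute_instance.my_vm my-project/us-central1-a/my-vm` attaches the real VM to it (project, zone, and name are placeholders).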
There are open-source and Microsoft-supported tools available that facilitate this job for administrators on Azure. The following are a few tools that allow the easy import of existing resources into Terraform:
- Azure Terrafy (aztfy): With Azure Terrafy, you can quickly and easily turn existing Azure infrastructure into Terraform HCL and import it into a Terraform state. After you have finished importing your infrastructure, you can manage it with your standard IaC processes. Learn more here: https://techcommunity.microsoft.com/t5/azure-tools-blog/announcing-azure-terrafy-and-azapi-terraform-provider-previews/ba-p/3270937. The GitHub source code for the tool is at https://github.com/Azure/aztfy. Here is a sample CLI command that imports a resource by its resource ID:
```
aztfy resource <resource id>

Test:~$ aztfy resource /subscriptions/XXXXX-XXXX-XXXX-XXXX-XXXXXXX/resourcegroups/test_group/providers/Microsoft.Compute/virtualMachines/test
Test:~$ ls
main.tf  provider.tf  terraform.tfstate
```
- Azure Export: Another tool offered by Microsoft is called Azure Export for Terraform. This tool is designed to reduce friction when translating between Azure and Terraform concepts. For scenarios related to escape hatches, or the removal of escape hatches, Azure Export for Terraform allows the use of Azure preview features with Terraform. To learn more about the tool, visit: https://learn.microsoft.com/en-us/azure/developer/terraform/azure-export-for-terraform/export-terraform-overview
Here is the Azure Export GitHub page: https://github.com/Azure/aztfexport/releases
```
aztfexport [command] [option] <scope>
```
By default, Azure Export for Terraform collects telemetry data to improve the user experience; it does not collect any private or personal data. You can easily disable this with the following command:
```
aztfexport config set telemetry_enabled false
```
There are open-source tools that allow reverse engineering on the AWS platform as well. One example of such a tool is Terraformer. This tool was developed by Waze, a subsidiary of Google; however, it is not an official Google product. Terraformer is an open-source tool that can be modified and used across all major platforms, including AWS, Google Cloud, IBM Cloud, and Alibaba Cloud.
Terraformer uses Terraform providers and is designed to easily support
newly added resources. To upgrade resources with new fields, all you need
to do is upgrade the relevant Terraform providers.
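Because of this design, picking up newly added resource fields is just a provider version bump. A sketch of the pinning involved; the version constraint is an assumption:

```hcl
# Terraformer generates code against a regular Terraform provider; raising
# the version constraint is how generated resources gain new fields.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = ">= 5.0" # assumption: pick the release with the fields you need
    }
  }
}
```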
Here is the link to the GitHub source code for the tool:
https://github.com/GoogleCloudPlatform/terraformer
Here are the steps:
- Install Terraformer following the instructions at the provided GitHub link.
- Clone the GitHub repository and change into the terraformer directory.
- Build the modules with the provider you choose.
- Run the import command to start importing.
Here is a sample CLI command to import all EC2 instances in the us-east-1 region of AWS:

```
terraformer import aws --resources=ec2_instance --regions=us-east-1
```
Terraformer imports the existing AWS resources directly into Terraform and creates the state file. Here are the capabilities of the Terraformer open-source tool:
- It can generate the TF/JSON + tfstate file from the existing infrastructure for the supported objects of each resource on the respective platform.
- The state file it generates can be uploaded directly to a cloud bucket.
- The import is supported by the resource name and its type.
- Users can save TF/JSON files using a custom folder tree pattern.
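As an illustration of the first capability, the output of the `terraformer import aws` run above lands in a per-provider, per-service tree; the exact file names below are assumptions based on Terraformer's default `generated/{provider}/{service}` layout:

```
generated/
└── aws/
    └── ec2_instance/
        ├── instance.tf
        ├── outputs.tf
        ├── provider.tf
        └── terraform.tfstate
```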