terraform import reverse engineering - ghdrako/doc_snipets GitHub Wiki

Terraform is able to import existing infrastructure. This allows you to take resources you've created by some other means and bring them under Terraform management.

Bringing existing infrastructure under Terraform’s control involves five main steps:

  • Identify the existing infrastructure to be imported.
  • Import the infrastructure into your Terraform state.
  • Write a Terraform configuration that matches that infrastructure.
  • Review the Terraform plan to ensure that the configuration matches the expected state and infrastructure.
  • Apply the configuration to update your Terraform state.
terraform import [options] ADDRESS ID
terraform import -var project=${PROJECT_ID} google_storage_bucket.media ${PROJECT_ID}-media
terraform import -var project=${PROJECT_ID} google_storage_bucket.source ${PROJECT_ID}-source
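For the two commands above to succeed, matching resource blocks must already exist in the configuration; a minimal sketch (the bucket locations are assumptions for illustration):

variable "project" {}

resource "google_storage_bucket" "media" {
  name     = "${var.project}-media"
  location = "US" # assumed location; use the real bucket's settings
  project  = var.project
}

resource "google_storage_bucket" "source" {
  name     = "${var.project}-source"
  location = "US" # assumed location; use the real bucket's settings
  project  = var.project
}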

Import will find the existing resource from ID and import it into your Terraform state at the given ADDRESS.

ADDRESS must be a valid resource address. Because any resource address is valid, the import command can import resources into modules as well as directly into the root of your state.

ID is dependent on the resource type being imported.

The current implementation of terraform import can only import resources into the state; it does not generate configuration (a future version of Terraform is expected to generate configuration as well). Because of this, before running terraform import it is necessary to manually write a resource configuration block for the resource, to which the imported object will be mapped.

  1. To import a resource, first write a resource block for it in your configuration, establishing the name by which it will be known to Terraform
resource "aws_instance" "example" {
  # ...instance configuration...
}
  2. Run terraform import to attach an existing instance to this resource configuration:
terraform import aws_instance.example i-abcd1234

This command locates the AWS EC2 instance with ID i-abcd1234. It then attaches the existing settings of the instance, as described by the EC2 API, to the name aws_instance.example of the root module (the address contains no module path). Finally, the mapping is saved in the Terraform state.
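One way to confirm the result before editing the configuration is to inspect what was imported and iterate on the plan, for example:

terraform state show aws_instance.example   # list the attributes recorded in state
terraform plan                               # adjust the config until no changes are proposed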

Import into Module - import an AWS instance into the aws_instance resource named bar inside a module named foo

terraform import module.foo.aws_instance.bar i-abcd1234
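For this address to exist, the root configuration needs a module block named foo whose source defines the aws_instance resource bar; a minimal sketch (the module source path is an assumption):

module "foo" {
  source = "./foo" # assumed local module path
}

# inside ./foo/main.tf
resource "aws_instance" "bar" {
  # ...instance configuration...
}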

Import into Resource configured with count

terraform import 'aws_instance.baz[0]' i-abcd1234
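The index [0] refers to the first instance of a resource that declares count; a minimal sketch:

resource "aws_instance" "baz" {
  count = 1 # the import above targets index 0
  # ...instance configuration...
}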

Import into Resource configured with for_each

terraform import 'aws_instance.baz["example"]' i-abcd1234 # linux
terraform import 'aws_instance.baz[\"example\"]' i-abcd1234 # powershell
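Here the key "example" must exist in the resource's for_each collection; a minimal sketch:

resource "aws_instance" "baz" {
  for_each = toset(["example"]) # the import above targets the "example" key
  # ...instance configuration...
}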

Reverse engineering steps in Terraform:

  1. Use the VMware Managed Object Browser (MOB) as a source of truth to get the point-in-time values of the VM object (https:///mob).
  2. Identify the key mandatory and optional objects for a typical VMware environment. These are key for our reverse-engineering approach.
  3. Write an import logic script to programmatically fetch these object values from the MOB and create a file in a specified format (HCL or JSON) from these objects, which Terraform can understand. The file generated in this step is called a configuration file.
  4. Do a terraform import with the configuration file and validate if the import was successful.
  5. After a successful import, perform a change on the VMware VM leveraging Terraform automation by changing one of the parameters in the configuration file.
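As a sketch of steps 4 and 5, the generated configuration would contain a vsphere_virtual_machine block that the import is run against; the resource label, the datacenter path, and the import ID format (the VM's inventory path) are assumptions to check against the vSphere provider documentation:

resource "vsphere_virtual_machine" "imported_vm" {
  name = "test-vm" # point-in-time value read from the MOB
  # ...remaining attributes generated from the MOB values...
}

terraform import vsphere_virtual_machine.imported_vm /Datacenter/vm/test-vm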

Google Cloud Utility for Terraform Import

 gcloud beta resource-config

This ready-made tool has reverse-engineering logic built in: it talks to the Google Cloud Platform and fetches the required Terraform configuration for the resources running inside a GCP project, allowing IT administrators to use Terraform in their day-to-day IT automation.

The utility allows you to bulk export project resources into a *.tf file, which can be readily used for further import via the Terraform tool.

Here are the importing steps:

  1. Define the execution environment and set the GCP project, region, and zone.
  2. Here is the command for the bulk export of the Terraform configuration files for resources in a GCP project:
gcloud beta resource-config bulk-export --project=<Project ID> --path=./instances \
  --resource-format=terraform --resource-types=compute.cnrm.cloud.google.com/ComputeInstance -q
  3. From this bulk export, copy the required resource's *.tf file to another directory where you want to run the import.
  4. Run terraform init.
  5. Import your required resource with this command:
terraform import google_compute_instance.<VMName> <ProjectName>/<Zone>/<VMName>
  6. Run terraform plan and verify that no changes are suggested; a plan with no changes signifies a successful import.
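Putting steps 3 through 6 together with assumed placeholder values (project my-project, zone us-central1-a, a VM and exported resource both named webserver; the exported file layout is also an assumption), the sequence might look like this:

cp ./instances/ComputeInstance/webserver.tf ./import-demo/   # assumed export layout and file name
cd ./import-demo
terraform init
terraform import google_compute_instance.webserver my-project/us-central1-a/webserver
terraform plan   # a successful import proposes no changes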

Microsoft Azure Cloud Utility for Terraform Import

There are open-source and Microsoft-supported tools available that facilitate this job for administrators. The following are a few tools that allow the easy import of existing resources into Terraform:

aztfy resource <resource id>
Test:~$ aztfy resource /subscriptions/XXXXX-XXXX-XXXX-XXXX-XXXXXXX/resourcegroups/test_group/providers/Microsoft.Compute/virtualMachines/test
Test:~$ ls
main.tf  provider.tf  terraform.tfstate

Here is the Azure Export for Terraform (aztfexport) GitHub releases page: https://github.com/Azure/aztfexport/releases

aztfexport [command] [option] <scope>
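For example, assuming an existing resource group named my-resource-group, the whole group can be exported in one run with the resource-group subcommand:

aztfexport resource-group my-resource-group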

By default, Azure Export for Terraform collects telemetry data to improve the user experience; it does not collect any private or personal data. You can disable telemetry with the following command:

aztfexport config set telemetry_enabled false

Amazon AWS Cloud Utility for Terraform Import

There are open-source tools that allow reverse engineering on the AWS platform as well. One example is Terraformer, a tool developed by Waze, a subsidiary of Google; however, it is not an official Google product. Terraformer is open source, can be modified, and works across all major platforms, including Amazon AWS, Google Cloud, IBM Cloud, and Alibaba Cloud. It uses Terraform providers and is designed to easily support newly added resources; to pick up resources with new fields, all you need to do is upgrade the relevant Terraform providers. Here is the link to the GitHub source code for the tool: https://github.com/GoogleCloudPlatform/Terraformer

Here are the steps:

  1. Install Terraformer following the instructions at the provided GitHub link.
  2. Clone the GitHub repository and go to Terraformer.
  3. Build the modules with the provider you choose.
  4. Run the import command to start importing.

Here is a sample CLI command to import all EC2 instances in the ap-southeast-2 region on AWS:

terraformer import aws --resources=ec2_instance --regions=ap-southeast-2

Terraformer imports existing AWS resources directly into Terraform and creates the state file. Here are the capabilities of the Terraformer open-source tool:

  • It can generate the TF/JSON + tfstate file from the existing infrastructure for the supported objects of each resource on the respective platform.
  • The state file it generates can be uploaded to the cloud bucket directly.
  • Import can be performed by resource name and type.
  • Users can save TF/JSON files using a custom folder tree pattern.
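The same pattern extends to the other supported providers; for instance, a hedged sketch for Google Cloud, where the project and region are placeholder values:

terraformer import google --resources=instances --projects=my-project --regions=us-central1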