
CLI tool to generate terraform files from existing infrastructure (reverse Terraform). Infrastructure to Code

License: Apache License 2.0


Terraformer


A CLI tool that generates tf/json and tfstate files based on existing infrastructure (reverse Terraform).

  • Disclaimer: This is not an official Google product
  • Created by: Waze SRE

Demo GCP

(asciinema demo recording)

Capabilities

  1. Generate tf/json + tfstate files from existing infrastructure for all supported objects by resource.
  2. Remote state can be uploaded to a GCS bucket.
  3. Connect between resources with terraform_remote_state (local and bucket).
  4. Save tf/json files using a custom folder tree pattern.
  5. Import by resource name and type.
  6. Support terraform 0.13 (for terraform 0.11 use v0.7.9).

Terraformer uses Terraform providers and is designed to easily support newly added resources. To upgrade resources with new fields, all you need to do is upgrade the relevant Terraform providers.
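With the versions.tf workflow described in the Installation section, such a provider upgrade is usually just a re-initialization of the working folder (a sketch, assuming terraform is installed and a versions.tf is present there):

```shell
# Re-resolve provider versions and download the newest allowed plugin builds.
terraform init -upgrade
```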

Import current state to Terraform configuration from a provider

Usage:
   import [provider] [flags]
   import [provider] [command]

Available Commands:
  list        List supported resources for a provider

Flags:
  -b, --bucket string         gs://terraform-state
  -c, --connect                (default true)
  -С, --compact                (default false)
  -x, --excludes strings      firewalls,networks
  -f, --filter strings        compute_firewall=id1:id2:id4
  -h, --help                  help for google
  -O, --output string         output format hcl or json (default "hcl")
  -o, --path-output string     (default "generated")
  -p, --path-pattern string   {output}/{provider}/ (default "{output}/{provider}/{service}/")
      --projects strings
  -z, --regions strings       europe-west1, (default [global])
  -r, --resources strings     firewall,networks or * for all services
  -s, --state string          local or bucket (default "local")
  -v, --verbose               verbose mode
  -n, --retry-number int      number of retries to perform when a refresh fails
  -m, --retry-sleep-ms int    time in ms to sleep between retries

Use " import [provider] [command] --help" for more information about a command.

Permissions

The tool requires read-only permissions to list service resources.

Resources

You can use the --resources parameter to specify the services whose resources you want to import.

To import resources from all services, use --resources="*". If you want to exclude certain services, combine it with --excludes to skip the services you don't want to import, e.g. --resources="*" --excludes="iam".
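Spelled out as a full command (the project name below is illustrative):

```shell
# Import everything in the project except IAM resources.
terraformer import google --projects=my-project --resources="*" --excludes="iam"
```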

Filtering

Filters are a way to choose which resources terraformer imports. It's possible to filter resources by their identifiers or attributes. Multiple filter values are separated by :. If an identifier itself contains this symbol, the value should be wrapped in ', e.g. --filter=resource=id1:'project:dataset_id'. Identifier-based filters are executed before Terraformer tries to refresh the remote state.
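As a concrete sketch of the quoting rule (the resource type is real, the project and IDs are hypothetical):

```shell
# The second filter value contains ':' and is therefore wrapped in single quotes.
terraformer import google --projects=my-project --resources=bigquery \
  --filter=google_bigquery_dataset=plain_id:'project:dataset_id'
```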

Use Type when you need to filter only one of several types of resources. Multiple filters can be combined when importing different resource types. An example would be importing all AWS security groups from a specific AWS VPC:

terraformer import aws -r sg,vpc --filter Type=sg;Name=vpc_id;Value=VPC_ID --filter Type=vpc;Name=id;Value=VPC_ID

Notice how the Name is different for sg than it is for vpc.

Migration state version

For terraform >= 0.13, you can use replace-provider to migrate state from previous versions.

Example usage:

terraform state replace-provider -auto-approve "registry.terraform.io/-/aws" "hashicorp/aws"

Resource ID

Filtering is based on Terraform resource ID patterns. To find valid ID patterns for your resource, check the import part of the Terraform documentation.

Example usage:

terraformer import aws --resources=vpc,subnet --filter=vpc=myvpcid --regions=eu-west-1

Will only import the VPC with ID myvpcid. This form of filter helps when you need to select resources by their identifiers.

Field name only

It is possible to filter by a specific field name only, e.g. when you want to retrieve only resources with a specific tag key.

Example usage:

terraformer import aws --resources=s3 --filter="Name=tags.Abc" --regions=eu-west-1

Will only import the S3 resources that have the tag Abc. This form of filter helps when the field values are not important from a filtering perspective.

Field with dots

It is possible to filter by a field that contains a dot.

Example usage:

terraformer import aws --resources=s3 --filter="Name=tags.Abc.def" --regions=eu-west-1

Will only import the S3 resources that have the tag Abc.def.

Planning

The plan command generates a planfile that contains all the resources set to be imported. By modifying the planfile before running the import command, you can rename or filter the resources you'd like to import.

The rest of the subcommands and parameters are identical to the import command.

$ terraformer plan google --resources=networks,firewall --projects=my-project --regions=europe-west1-d
(snip)

Saving planfile to generated/google/my-project/terraformer/plan.json

After reviewing/customizing the planfile, begin the import by running import plan.

$ terraformer import plan generated/google/my-project/terraformer/plan.json

Resource structure

Terraformer by default separates each resource into a file, which is put into a given service directory.

The default path for resource files is {output}/{provider}/{service}/{resource}.tf and can vary for each provider.

It's possible to adjust the generated structure by:

  1. Using --compact parameter to group resource files within a single service into one resources.tf file
  2. Adjusting the --path-pattern parameter and passing e.g. --path-pattern {output}/{provider}/ to generate resources for all services in one directory

It's possible to combine --compact --path-pattern parameters together.
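For example, both adjustments combined (the project name is illustrative):

```shell
# One resources.tf per service, all services under a single provider directory.
terraformer import google --projects=my-project --resources=networks,firewall \
  --compact --path-pattern "{output}/{provider}/"
```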

Installation

Both Terraformer and a Terraform provider plugin need to be installed.

Terraformer

From a package manager

  • Homebrew users can use brew install terraformer.
  • MacPorts users can use sudo port install terraformer.
  • Chocolatey users can use choco install terraformer.

From releases. This installs all providers; set PROVIDER to one of google, aws, or kubernetes if you only need one.

  • Linux
export PROVIDER=all
curl -LO "https://github.com/GoogleCloudPlatform/terraformer/releases/download/$(curl -s https://api.github.com/repos/GoogleCloudPlatform/terraformer/releases/latest | grep tag_name | cut -d '"' -f 4)/terraformer-${PROVIDER}-linux-amd64"
chmod +x terraformer-${PROVIDER}-linux-amd64
sudo mv terraformer-${PROVIDER}-linux-amd64 /usr/local/bin/terraformer
  • MacOS
export PROVIDER=all
curl -LO "https://github.com/GoogleCloudPlatform/terraformer/releases/download/$(curl -s https://api.github.com/repos/GoogleCloudPlatform/terraformer/releases/latest | grep tag_name | cut -d '"' -f 4)/terraformer-${PROVIDER}-darwin-amd64"
chmod +x terraformer-${PROVIDER}-darwin-amd64
sudo mv terraformer-${PROVIDER}-darwin-amd64 /usr/local/bin/terraformer
  • Windows
  1. Install Terraform - https://www.terraform.io/downloads
  2. Download exe file for required provider from here - https://github.com/GoogleCloudPlatform/terraformer/releases
  3. Add the directory containing the exe file to your PATH environment variable

From source

  1. Run git clone <terraformer repo> && cd terraformer/
  2. Run go mod download
  3. Run go build -v to build with all providers, OR build with a single provider: go run build/main.go {google,aws,azure,kubernetes,etc}

Terraform Providers

Create a working folder and initialize the Terraform provider plugin. This folder will be where you run Terraformer commands.

Run terraform init against a versions.tf file to install the plugins required for your platform. For example, if you need plugins for the google provider, versions.tf should contain:

terraform {
  required_providers {
    google = {
      source = "hashicorp/google"
    }
  }
  required_version = ">= 0.13"
}
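From the working folder containing that versions.tf, a plain init then downloads the plugin (assuming terraform >= 0.13 is installed):

```shell
# Downloads the google provider plugin into the working folder's .terraform/ directory.
terraform init
```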

Or, copy your Terraform provider's plugin(s) from the list below to folder ~/.terraform.d/plugins/, as appropriate.

Links to download Terraform provider plugins:

  • Major Cloud
    • Google Cloud provider >2.11.0 - here
    • AWS provider >2.25.0 - here
    • Azure provider >1.35.0 - here
    • Alicloud provider >1.57.1 - here
  • Cloud
    • DigitalOcean provider >1.9.1 - here
    • Heroku provider >2.2.1 - here
    • LaunchDarkly provider >=2.1.1 - here
    • Linode provider >1.8.0 - here
    • OpenStack provider >1.21.1 - here
    • TencentCloud provider >1.50.0 - here
    • Vultr provider >1.0.5 - here
    • Yandex provider >0.42.0 - here
    • Ionoscloud provider >6.3.3 - here
  • Infrastructure Software
    • Kubernetes provider >=1.9.0 - here
    • RabbitMQ provider >=1.1.0 - here
  • Network
    • Myrasec provider >1.44 - here
    • Cloudflare provider >1.16 - here
    • Fastly provider >0.16.1 - here
    • NS1 provider >1.8.3 - here
    • PAN-OS provider >= 1.8.3 - here
  • VCS
    • GitHub provider >=2.2.1 - here
  • Monitoring & System Management
    • Datadog provider >2.1.0 - here
    • New Relic provider >2.0.0 - here
    • Mackerel provider > 0.0.6 - here
    • Pagerduty >=1.9 - here
    • Opsgenie >= 0.6.0 here
    • Honeycomb.io >= 0.10.0 - here
    • Opal >= 0.0.2 - here
  • Community
    • Keycloak provider >=1.19.0 - here
    • Logz.io provider >=1.1.1 - here
    • Commercetools provider >= 0.21.0 - here
    • Mikrotik provider >= 0.2.2 - here
    • Xen Orchestra provider >= 0.18.0 - here
    • GmailFilter provider >= 1.0.1 - here
    • Vault provider - here
    • Auth0 provider - here
    • AzureAD provider - here

Information on provider plugins: https://www.terraform.io/docs/configuration/providers.html

High-level steps to add a new provider

  • Initialize provider details in cmd/root.go and create a provider initialization file in the terraformer/cmd folder
  • Create a folder under terraformer/providers/ for your provider
  • Create two files under this folder
    • <provider_name>_provider.go
    • <provider_name>_service.go
  • Initialize all of the provider's supported services in the <provider_name>_provider.go file
  • Create a script for each supported service in the same folder

Contributing

If you have improvements or fixes, we would love to have your contributions. Please read CONTRIBUTING.md for more information on the process we would like contributors to follow.

Developing

Terraformer was built so you can easily add new providers of any kind.

Process for generating tf/json + tfstate files:

  1. Call the GCP/AWS/other API and get a list of resources.
  2. Iterate over the resources and take only the ID (we don't need mapping fields!).
  3. Call the provider for read-only fields.
  4. Call the infrastructure and take tf + tfstate.

Infrastructure

  1. Call to provider using the refresh method and get all data.
  2. Convert refresh data to go struct.
  3. Generate HCL file - tf/json files.
  4. Generate tfstate files.

All resource mapping is done by the providers and Terraform. Upgrades are needed only for providers.

GCP compute resources

For GCP compute resources, use generated code from providers/gcp/gcp_compute_code_generator.

To regenerate code:

go run providers/gcp/gcp_compute_code_generator/*.go

Similar projects

Terraformer Benefits
  • Simpler to add new providers and resources - already supports AWS, GCP, GitHub, Kubernetes, and Openstack. Terraforming supports only AWS.
  • Better support for HCL + tfstate, including updates for Terraform 0.12.
  • If a provider adds new attributes to a resource, there is no need to change Terraformer code - just update the Terraform provider on your laptop.
  • Automatically supports connections between resources in HCL files.
Comparison

Terraforming gets all attributes from cloud APIs and creates HCL and tfstate files with templating. Each attribute in the API needs to map to an attribute in Terraform. Files generated from templates can be broken by illegal syntax. When a provider adds new attributes, the terraforming code needs to be updated.

Terraformer instead uses Terraform provider files for mapping attributes, HCL library from Hashicorp, and Terraform code.

Look at S3 support in terraforming here and compare it with the official S3 support: Terraforming lacks full coverage for resources - as an example, about 70% of S3 options are not supported.

Stargazers over time


terraformer's People

Contributors

anilkumarnagaraj, aqche, chenrui333, cucxabong, ddelnano, dependabot[bot], ghost---shadow, jmarhee, jsm222, juarezr, juno-yu, ktogo, magodo, mcbenjemaa, meshuga, noinarisak, okazu-dm, pratikmallya, reinoudk, rgreinho, rotemavni, sergeylanzman, skarimo, starptech, sunil29feb, t0rr3sp3dr0, teraken0509, therve, trois-six, yukirii


terraformer's Issues

Best Practices

Hi,
This is not an issue.
First, congratulations on the great tool!
I want to know if you have best practices or recommendations for better use with GCP.
For example, must I run the import with all the resources at the same time, or do you recommend separating, for example, networks from instances?

Infinite loop when trying to import nonexistent Stackdriver

When I try to import the monitoring resources of a project that never actually used them, I receive tons of repeating errors:

$ terraformer import google --projects projectname --resources monitoring
2019/05/10 08:53:38 google importing project projectname
2019/05/10 08:53:38 google importing... monitoring
2019/05/10 08:53:40 error with alert: rpc error: code = InvalidArgument desc = 'projects/projectname' is not a Stackdriver workspace.
2019/05/10 08:53:40 error with alert: rpc error: code = InvalidArgument desc = 'projects/projectname' is not a Stackdriver workspace.
2019/05/10 08:53:40 error with alert: rpc error: code = InvalidArgument desc = 'projects/projectname' is not a Stackdriver workspace.

And it repeats until I stop it with C-c.

Error 403: Required 'compute.instances.list' permission for 'projects/project', forbidden

Hi guys,
I'm having this issue running terraformer against a project different from the one that owns the VM running terraformer.

When I run terraformer against resources of the same project, it is OK.
But when I run terraformer against resources of another project, I get the following error:

Error 403: Required 'compute.instances.list' permission for 'projects/project', forbidden

The target project has compute.googleapis.com enabled and my user has Project Owner permissions.

Thanks for your help

Comparison w/ terraforming project

I see that you link to it as a similar project, but it would be helpful to understand what the pros/cons of terraformer vs. terraforming would be for a user.

import google `dns` resource always fails. (2019/5/5)

I tried to import the google dns resource, but it failed.
The console output is below:

cd /path/to/working_directory 

❯ ./terraformer import google --resources=dns --connect=true --projects=MY_PROJECT, --state=bucket
2019/05/05 13:35:52 google importing project MY_PROJECT
2019/05/05 13:35:52 google importing... dns
2019/05/05 13:35:55 project: required field is not set
2019/05/05 13:35:55 project: required field is not set
2019/05/05 13:35:55 project: required field is not set
2019/05/05 13:35:55 project: required field is not set
2019/05/05 13:35:55 project: required field is not set
2019/05/05 13:35:55 project: required field is not set
2019/05/05 13:35:55 project: required field is not set
2019/05/05 13:35:55 project: required field is not set
2019/05/05 13:35:55 project: required field is not set
2019/05/05 13:35:55 project: required field is not set
panic: interface conversion: interface {} is nil, not string

goroutine 1 [running]:
github.com/GoogleCloudPlatform/terraformer/gcp_terraforming.(*CloudDNSGenerator).PostConvertHook(0xc000168400, 0x2ff0980, 0xc000168400)
	/path/to/working_directory/gcp_terraforming/clouddns.go:125 +0x6a7
github.com/GoogleCloudPlatform/terraformer/cmd.Import(0x2ff0b60, 0xc0000b9e40, 0xc0004ae650, 0x1, 0x1, 0xc0000c6a50, 0x30, 0x2c20d86, 0x9, 0x7ffeefbff8c2, ...)
	/path/to/working_directory/cmd/import.go:88 +0x1ad
github.com/GoogleCloudPlatform/terraformer/cmd.newCmdGoogleImporter.func1(0xc0004c0780, 0xc000331a90, 0x0, 0x5, 0x0, 0x0)
	/path/to/working_directory/cmd/google.go:37 +0x34a
github.com/spf13/cobra.(*Command).execute(0xc0004c0780, 0xc000331a40, 0x5, 0x5, 0xc0004c0780, 0xc000331a40)
	/Users/ME/go/pkg/mod/github.com/spf13/[email protected]/command.go:762 +0x465
github.com/spf13/cobra.(*Command).ExecuteC(0xc0004c0280, 0xc0000b9440, 0xc0000b9400, 0xc00047bf88)
	/Users/ME/go/pkg/mod/github.com/spf13/[email protected]/command.go:852 +0x2ec
github.com/spf13/cobra.(*Command).Execute(...)
	/Users/ME/go/pkg/mod/github.com/spf13/[email protected]/command.go:800
github.com/GoogleCloudPlatform/terraformer/cmd.Execute(0x0, 0x0)
	/path/to/working_directory/cmd/root.go:34 +0x28
main.main()
	/path/to/working_directory/main.go:25 +0x22

I've upgraded the terraform google provider 1.20 -> 2.0, but import google dns fails in the same way.

go build -v fails

Running $ go build -v, I get these errors. Help will be much appreciated.

# github.com/terraform-providers/terraform-provider-openstack/openstack
../../../go/pkg/mod/github.com/terraform-providers/[email protected]/openstack/data_source_openstack_images_image_v2.go:162:3: unknown field 'Tag' in struct literal of type "github.com/gophercloud/gophercloud/openstack/imageservice/v2/images".ListOpts
../../../go/pkg/mod/github.com/terraform-providers/[email protected]/openstack/resource_openstack_blockstorage_volume_v1.go:212:3: cannot use d.Get("name").(string) (type string) as type *string in field value
../../../go/pkg/mod/github.com/terraform-providers/[email protected]/openstack/resource_openstack_blockstorage_volume_v1.go:213:3: cannot use d.Get("description").(string) (type string) as type *string in field value
../../../go/pkg/mod/github.com/terraform-providers/[email protected]/openstack/resource_openstack_blockstorage_volume_v2.go:223:3: cannot use d.Get("name").(string) (type string) as type *string in field value
../../../go/pkg/mod/github.com/terraform-providers/[email protected]/openstack/resource_openstack_blockstorage_volume_v2.go:224:3: cannot use d.Get("description").(string) (type string) as type *string in field value
../../../go/pkg/mod/github.com/terraform-providers/[email protected]/openstack/resource_openstack_blockstorage_volume_v2.go:286:27: not enough arguments in call to "github.com/gophercloud/gophercloud/openstack/blockstorage/v2/volumes".Delete
	have (*gophercloud.ServiceClient, string)
	want (*gophercloud.ServiceClient, string, "github.com/gophercloud/gophercloud/openstack/blockstorage/v2/volumes".DeleteOptsBuilder)
../../../go/pkg/mod/github.com/terraform-providers/[email protected]/openstack/resource_openstack_compute_instance_v2.go:567:3: undefined: availabilityzones.ServerExt
../../../go/pkg/mod/github.com/terraform-providers/[email protected]/openstack/resource_openstack_compute_secgroup_v2.go:175:3: cannot use d.Get("description").(string) (type string) as type *string in field value
../../../go/pkg/mod/github.com/terraform-providers/[email protected]/openstack/resource_openstack_dns_recordset_v2.go:185:26: cannot use d.Get("description").(string) (type string) as type *string in assignment
../../../go/pkg/mod/github.com/terraform-providers/[email protected]/openstack/resource_openstack_dns_zone_v2.go:191:26: cannot use d.Get("description").(string) (type string) as type *string in assignment
../../../go/pkg/mod/github.com/terraform-providers/[email protected]/openstack/resource_openstack_dns_zone_v2.go:191:26: too many errors

Plugin error during a try of importing project

I tried the binary release, and a binary built after cloning the repository. I get the same error:

$ ~/bin/terraformer-linux-amd64 import google --resources=gcs,firewalls,forwardingRules,networks --connect=true --projects=mygcpproject
2019/05/06 09:17:33 google importing project mygcpproject
2019/05/06 09:17:33 google importing... gcs
2019/05/06 09:17:34 plugin error: fork/exec : no such file or directory
2019/05/06 09:17:34 fork/exec : no such file or directory

plugin error: readdirent: invalid argument

Hello!
First things first: I love this project! This is a tool/feature I'm missing in terraform.

I ran into problems right away though.

I'm using the binary https://github.com/GoogleCloudPlatform/terraformer/releases/download/0.7/terraformer-darwin-amd64

with this gcp provider's binary for darwin
https://releases.hashicorp.com/terraform-provider-google/2.5.1/
(I tried the 2.5.0 too with the same result)

When running this or any other command on my project:

terraformer import google --resources=gcs,firewalls  --zone=europe-west1-b --projects=my-secret-project

I'm getting

2019/05/03 14:38:36 plugin error: readdirent: invalid argument
2019/05/03 14:38:36 readdirent: invalid argument

Any ideas?

Start use schema.Schema instead terraform.ResourceProvider

Today terraformer uses the terraform.ResourceProvider interface to get the ProviderSchema with Attributes.
With another interface we could get more Attributes from providers (like deprecation options).
We need to use schema.Schema from github.com/hashicorp/terraform/helper/schema.

must `import google subnetworks` set a ZONE?

Hi,
When I forgot the zone suffix (e.g. -a, -b), an error occurred. I think this is unexpected behavior.

$ terraformer import google --resources=subnetworks --connect=true --zone=asia-northeast1 --projects=prj
2019/06/13 12:17:29 google importing project prj
2019/06/13 12:17:29 google importing... subnetworks
2019/06/13 12:17:30 googleapi: Error 400: Invalid value for field 'region': 'asia'. Unknown region., invalid

When I added the zone suffix, it worked well.

$ terraformer import google --resources=subnetworks --connect=true --zone=asia-northeast1-a --projects=prj
2019/06/13 12:17:22 google importing project prj
2019/06/13 12:17:22 google importing... subnetworks
2019/06/13 12:17:26 google Connecting.... 
2019/06/13 12:17:26 google save subnetworks
2019/06/13 12:17:26 [DEBUG] New state was assigned lineage "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
2019/06/13 12:17:26 google save tfstate for subnetworks

The addresses resource produces the same error too.

Possible wrong formatting or mistype in import google --help

Import current State to terraform configuration from google cloud

Usage:
   import google [flags]
   import google [command]

Available Commands:
  list        Import current State to terraform configuration

Flags:
  -b, --bucket string        gs://terraform-state
  -c, --connect               (default true)
  -f, --filter strings       google_compute_firewall=id1:id2:id4
  -h, --help                 help for google
  -o, --path-output string    (default "generated")
  -p, --path-patter string   {output}/{provider}/custom/{service}/ (default "{output}/{provider}/{service}/")
      --projects strings     
  -r, --resources strings    firewalls,networks
  -s, --state string         local or bucket (default "local")
  -z, --zone string

Use " import google [command] --help" for more information about a command.

--path-patter looks incorrect to me.
I suspect an output formatting issue or just a simple typo in the option name.

Importing wildcard DNS records from AWS Route53 fails

Hello,

Terraformer's output is empty when importing wildcard DNS records from Route53.

terraformer import aws --resources=route53 --filter=aws_route53_record='myzoneid_*.example.com._CNAME' --regions=eu-west-3

There is no problem importing this record with terraform import.

Access denied for aws import

Hey there,

I am trying to test this cool tool to import some AWS resources into our tf project.
The problem is, we are using cross-account assumed roles with 2FA enabled. For example, my ~/.aws/credentials file looks like this:

[default]
aws_access_key_id = ...
aws_secret_access_key = ...

[profile1]
role_arn = arn:aws:iam::<account-id-1>:role/<role-name-1>
source_profile = default
mfa_serial = arn:aws:iam::<central-account-id>:mfa/<my-username>

[profile2]
role_arn = arn:aws:iam::<account-id-2>:role/<role-name-2>
source_profile = default
mfa_serial = arn:aws:iam::<central-account-id>:mfa/<my-username>

So when trying to import S3 buckets with this command: terraformer import aws --resources=s3 --connect=true --regions=eu-west-1
I am getting:

2019/05/08 13:13:47 aws importing region eu-west-1
2019/05/08 13:13:47 aws importing... s3
2019/05/08 13:13:47 AccessDenied: Access Denied
	status code: 403, request id: 7164....5B8F351, host id: 7q8nhD6LnVeniWww9arkIcBQX.................NdzDgMCNBhC610MZM=

Is my cross-account setup currently supported by terraformer?

Filtering route53 resources documentation

Hello,

I struggled to understand how to use filters with aws route53 resources. It may be useful to document how to fetch hosted zone IDs (with aws-cli but without the /hostedzone/ part) and that record IDs are made of <hosted zone id>_<fqdn>_<record type>.

I can make a PR if you tell me where this documentation belongs.
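Putting that ID format together, a record filter might look like this (the hosted zone ID and domain below are hypothetical):

```shell
# Record ID format: <hosted zone id>_<fqdn>_<record type>
terraformer import aws --resources=route53 --regions=eu-west-3 \
  --filter=aws_route53_record=ZHYPOTHETICAL_www.example.com._CNAME
```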

import google monitoring returns empty tf files.

Hi, I'm using a project owner account to import, but import google monitoring returns the empty .tf files below.

terraformer import google --resources=monitoring --connect=true --zone=asia-northeast1 --projects=my-gcp-prj
2019/06/11 15:56:57 google importing project my-gcp-prj
2019/06/11 15:56:57 google importing... monitoring
2019/06/11 15:57:00 project: required field is not set
2019/06/11 15:57:00 project: required field is not set
2019/06/11 15:57:00 project: required field is not set
2019/06/11 15:57:00 project: required field is not set
2019/06/11 15:57:00 project: required field is not set
2019/06/11 15:57:00 project: required field is not set
2019/06/11 15:57:00 project: required field is not set
2019/06/11 15:57:00 project: required field is not set
2019/06/11 15:57:00 project: required field is not set
2019/06/11 15:57:00 google Connecting.... 
2019/06/11 15:57:00 google save monitoring
2019/06/11 15:57:00 [DEBUG] New state was assigned lineage  "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
2019/06/11 15:57:00 google save tfstate for monitoring

monitoring_alert_policy.tf

resource "google_monitoring_alert_policy" "projects--my-gcp-prj--alertPolicies--12270733397295926724" {}

resource "google_monitoring_alert_policy" "projects--my-gcp-prj--alertPolicies--18416277687926950534" {}

resource "google_monitoring_alert_policy" "projects--my-gcp-prj--alertPolicies--2641331155153181887" {}

resource "google_monitoring_alert_policy" "projects--my-gcp-prj--alertPolicies--6165830597848982763" {}

monitoring_notification_channel.tf

resource "google_monitoring_notification_channel" "projects--my-gcp-prj--notificationChannels--14664454090038840730" {}

resource "google_monitoring_notification_channel" "projects--my-gcp-prj--notificationChannels--7956856499735477724" {}

resource "google_monitoring_notification_channel" "projects--my-gcp-prj--notificationChannels--9929777985012302021" {}

monitoring_uptime_check_config.tf

resource "google_monitoring_uptime_check_config" "projects--my-gcp-prj--uptimeCheckConfigs--uptime-check-for-api-1559734933" {}

resource "google_monitoring_uptime_check_config" "projects--my-gcp-prj--uptimeCheckConfigs--uptime-check-for-web-1559734602" {}

Do not create "empty" folders when no resources were found

When I do

terraformer import google --resources=addresses,dns

but no addresses resources are found in the project, I get an addresses folder with providers.tf and an "empty" tfstate file.

Can we avoid this by adding a CLI parameter like --empty-resource-folders=false?

Add support Grafana Provider

Grafana is the best open source tool for visualizing metrics.
The Grafana provider supports only 4 resources:

  • grafana_alert_notification
  • grafana_dashboard
  • grafana_data_source
  • grafana_organization

Grafana has an API to get the ID of each resource.
API docs: https://grafana.com/docs/http_api/
Maybe an API token needs to be passed in flags.
It can be useful for users with a large number of alerts and dashboards.

Delete optional numeric zero values

When an optional numeric value is missing, terraform puts a 0 in its state, and it won't be removed by the current delete-empty-values code block, which only removes empty strings.

As some optional numeric values have a must-be-positive/non-zero constraint, having zero for missing optional numeric values causes the resource to be rejected.

go.sum checksum mismatch (2019/5/4)

I got the error below:

❯ go build -v
go: downloading github.com/ryanuber/columnize v2.1.0+incompatible
go: downloading github.com/mitchellh/colorstring v0.0.0-20150917214807-8631ce90f286
go: downloading github.com/mitchellh/panicwrap v0.0.0-20170106182340-fce601fe5557
go: extracting github.com/ryanuber/columnize v2.1.0+incompatible
go: extracting github.com/mitchellh/colorstring v0.0.0-20150917214807-8631ce90f286
go: extracting github.com/mitchellh/panicwrap v0.0.0-20170106182340-fce601fe5557
go: downloading github.com/hashicorp/go-getter v0.0.0-20181119194526-bd1edc22f8ea
verifying github.com/hashicorp/[email protected]: checksum mismatch
	downloaded: h1:o+/o9FSMvgMmv5Ss+EUutVEwZfcFR1nJKzOqlVJGFPY=
	go.sum:     h1:B2Aqv5hbW6F55OFgXAcSqGhqKi4Jb6IRLkKoA5ohgzU=

Please check and correct go.sum.

(Workaround: rename go.sum to go.sum_, then run go build -v again.)

Not able to install terraformer

Hi, I am a newbie here.
OS: Windows 10
Following the Readme.md, I am able to clone the git repo. But after that I hit a complete roadblock.

Can you please add documentation for Windows as well?
I have installed Go as well,
but I'm not sure how to execute these commands:

Run GO111MODULE=on go mod vendor
Run go build -v
Copy your Terraform provider's plugin(s) to folder ~/.terraform.d/plugins/{darwin,linux}_amd64/, as appropriate.
