aws-samples / aws-codepipeline-terraform-cicd-samples
License: MIT No Attribution
There is a circular dependency: codepipeline_kms -> codepipeline_iam_role -> codepipeline_kms. The KMS module takes the IAM role ARN as an input, while the IAM role module takes the KMS key ARN:
module "codepipeline_kms" {
  source                = "./modules/kms"
  codepipeline_role_arn = module.codepipeline_iam_role.role_arn

  tags = {
    Project_Name = var.project_name
    Environment  = var.environment
    Account_ID   = local.account_id
    Region       = local.region
  }
}

module "codepipeline_iam_role" {
  source                     = "./modules/iam-role"
  project_name               = var.project_name
  create_new_role            = var.create_new_role
  codepipeline_iam_role_name = var.create_new_role == true ? "${var.project_name}-codepipeline-role" : var.codepipeline_iam_role_name
  source_repository_name     = var.source_repo_name
  kms_key_arn                = module.codepipeline_kms.arn
  s3_bucket_arn              = module.s3_artifacts_bucket.arn

  tags = {
    Project_Name = var.project_name
    Environment  = var.environment
    Account_ID   = local.account_id
    Region       = local.region
  }
}
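One common way to break such a cycle, sketched here under the assumption that the role name is deterministic (above it is derived from `var.project_name`), is to construct the role ARN from the known name instead of reading it from the iam-role module's output, so the KMS module no longer depends on that module:

```hcl
# Hypothetical sketch: derive the CodePipeline role ARN from the role
# name so module.codepipeline_kms does not depend on the iam-role module.
data "aws_caller_identity" "current" {}

locals {
  # The role name is known before the role exists, so this local carries
  # no dependency on module.codepipeline_iam_role.
  codepipeline_role_arn = "arn:aws:iam::${data.aws_caller_identity.current.account_id}:role/${var.project_name}-codepipeline-role"
}

module "codepipeline_kms" {
  source                = "./modules/kms"
  codepipeline_role_arn = local.codepipeline_role_arn
  # ... tags as before ...
}
```

With this, the dependency runs one way only (iam-role -> kms), and Terraform can order the creates without a cycle.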
Suppressing a finding for tfsec breaks the checkov suppression, and suppressing it for checkov breaks the tfsec suppression. How do I do both?
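Each tool only parses its own comment syntax and ignores the other's, so both suppressions can coexist on the same resource. A sketch, assuming the finding is S3 access logging; the rule IDs below (`aws-s3-enable-bucket-logging` for tfsec, `CKV_AWS_18` for checkov) are illustrative and should be replaced with the IDs from your scan output:

```hcl
# tfsec reads an ignore comment on the line above the offending block;
# checkov reads a skip comment inside the resource body. Neither tool
# is confused by the other's comment.
#tfsec:ignore:aws-s3-enable-bucket-logging
resource "aws_s3_bucket" "artifacts" {
  #checkov:skip=CKV_AWS_18:Access logging is handled centrally
  bucket = "example-artifacts-bucket"
}
```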
Getting the following error when setting up the project:
│ Error: creating S3 bucket ACL for tf-validate-project-rpl20230723095148611900000001: AccessControlListNotSupported: The bucket does not allow ACLs
│ status code: 400, request id: V6MDTSSF5QK5H1JT, host id: KME82KNLmvnEfdMNOsbUfC8+bf1F6KiSYfcppi3NnFOZgVwOjdd0qwoHh+Dfsv+sE7xDNIiten8=
│
│ with module.s3_artifacts_bucket.aws_s3_bucket_acl.replication_bucket_acl,
│ on modules/s3/main.tf line 120, in resource "aws_s3_bucket_acl" "replication_bucket_acl":
│ 120: resource "aws_s3_bucket_acl" "replication_bucket_acl" {
│
╵
╷
│ Error: creating S3 bucket ACL for tf-validate-project20230723100746984200000001: AccessControlListNotSupported: The bucket does not allow ACLs
│ status code: 400, request id: ABM8EN00AAZVR2W3, host id: loDvDdpHBCpKuVnLCzztPyQcLiH453qit9bh9rdyZdetBRoY08oFXOlDZmZuNc8Wl9oAo35QbnU=
│
│ with module.s3_artifacts_bucket.aws_s3_bucket_acl.codepipeline_bucket_acl,
│ on modules/s3/main.tf line 199, in resource "aws_s3_bucket_acl" "codepipeline_bucket_acl":
│ 199: resource "aws_s3_bucket_acl" "codepipeline_bucket_acl" {
│
╵
This aligns with the following issue: terraform-aws-modules/terraform-aws-s3-bucket#223
S3 buckets have had ACLs disabled by default since April 2023 (announced in December 2022): https://aws.amazon.com/about-aws/whats-new/2022/12/amazon-s3-automatically-enable-block-public-access-disable-access-control-lists-buckets-april-2023/
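A sketch of a possible fix, assuming the bucket ACLs are still wanted: enable an object-ownership setting that permits ACLs before the ACL resource is created. Resource names here mirror the error output but are illustrative:

```hcl
# With ACLs disabled by default, an aws_s3_bucket_acl resource fails with
# AccessControlListNotSupported unless object ownership allows ACLs.
resource "aws_s3_bucket_ownership_controls" "codepipeline_bucket" {
  bucket = aws_s3_bucket.codepipeline_bucket.id

  rule {
    object_ownership = "BucketOwnerPreferred"
  }
}

resource "aws_s3_bucket_acl" "codepipeline_bucket_acl" {
  # Ensure ownership controls exist before the ACL is applied.
  depends_on = [aws_s3_bucket_ownership_controls.codepipeline_bucket]
  bucket     = aws_s3_bucket.codepipeline_bucket.id
  acl        = "private"
}
```

Alternatively, dropping the `aws_s3_bucket_acl` resources entirely and relying on the default bucket-owner-enforced ownership avoids ACLs altogether.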
Setting the "create_new_role" variable to "false" in the terraform.tfvars file has no effect, because the variable is not passed to the "codepipeline_iam_role" module in main.tf:
module "codepipeline_iam_role" {
  source                     = "./modules/iam-role"
  project_name               = var.project_name
  codepipeline_iam_role_name = var.create_new_role == true ? "${var.project_name}-codepipeline-role" : var.codepipeline_iam_role_name
  source_repository_name     = var.source_repo_name
  kms_key_arn                = module.codepipeline_kms.arn
  s3_bucket_arn              = module.s3_artifacts_bucket.arn

  tags = {
    Project_Name = var.project_name
    Environment  = var.environment
    Account_ID   = local.account_id
    Region       = local.region
  }
}
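The fix is to forward the variable into the module call. A minimal sketch, assuming the iam-role module declares a `create_new_role` input:

```hcl
module "codepipeline_iam_role" {
  source          = "./modules/iam-role"
  create_new_role = var.create_new_role # forward the tfvars value so it takes effect
  # ... remaining arguments unchanged ...
}
```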
Artifact passing between CodeBuild Projects doesn't work and CodePipeline can fail.
Please see my PR.
The instructions say to run `cd examples/ci-cd/aws-codepipeline`, but I don't see that directory in the repository.
cp -r templates $YOUR_CODECOMMIT_REPO_ROOT
Problem statement: I am using the CodePipeline module to create a pipeline and setting the run_order value to run parallel actions in CodePipeline, but all the actions in the pipeline are created sequentially.
I have created my own pipeline module with some small changes to the module in this repository. Here are the Terraform files of my module.
terraform {
  required_version = "~> 1.4"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "5.4.0"
    }
  }
}

resource "aws_codepipeline" "deployment_pipeline" {
  name     = var.name
  role_arn = var.role_arn

  artifact_store {
    location = var.s3_bucket_name
    type     = "S3"
  }

  stage {
    name = "Source"

    action {
      name             = "Source"
      category         = "Source"
      owner            = "AWS"
      version          = "1"
      provider         = "CodeStarSourceConnection"
      output_artifacts = ["source_output"]

      configuration = {
        FullRepositoryId = var.source_repo_name
        BranchName       = var.source_repo_branch
        ConnectionArn    = var.ConnectionArn
      }
    }
  }

  dynamic "stage" {
    for_each = var.stages

    content {
      name = "Stage-${stage.value["name"]}"

      action {
        category         = stage.value["category"]
        name             = "Action-${stage.value["name"]}"
        owner            = stage.value["owner"]
        provider         = stage.value["provider"]
        input_artifacts  = [stage.value["input_artifacts"]]
        output_artifacts = [stage.value["output_artifacts"]]
        version          = "1"
        run_order        = stage.value["run_order"]

        configuration = {
          ProjectName = stage.value["project_name"]
        }
      }
    }
  }

  tags = var.tags
}
variable "name" {
  description = "Unique name for this project"
  type        = string
}

variable "source_repo_name" {
  description = "Source repo name of the CodeCommit repository"
  type        = string
}

variable "source_repo_branch" {
  description = "Default branch in the Source repo for which CodePipeline needs to be configured"
  type        = string
}

variable "ConnectionArn" {
  description = "GitHub Connection ARN"
  type        = string
}

variable "s3_bucket_name" {
  description = "S3 bucket name to be used for storing the artifacts"
  type        = string
}

variable "role_arn" {
  description = "ARN of the CodePipeline IAM role"
  type        = string
}

variable "tags" {
  description = "Tags to be attached to the CodePipeline"
  type        = map(any)
}

variable "stages" {
  description = "List of maps containing information about the stages of the CodePipeline"
  type        = list(map(any))
}
Here run_order is not a static value; it is passed in from the parent module.
I reference this module from another Terraform file, shown here:
module "deployment_pipeline" {
  source             = "../../modules/codepipeline"
  name               = "${var.namespace}-${var.environment}-terraform-pipeline"
  role_arn           = module.codepipeline_role.arn
  s3_bucket_name     = data.aws_ssm_parameter.artifact_bucket.value
  ConnectionArn      = data.aws_codestarconnections_connection.existing_github_connection.arn
  source_repo_name   = var.github_FullRepositoryId
  source_repo_branch = var.github_BranchName

  stages = [
    { name = "Bootstrap", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 2, project_name = module.initial_bootstrap.name },
    { name = "Networking", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 3, project_name = module.networking_module_build_step_codebuild_project.name },
    { name = "Database", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 4, project_name = aws_codebuild_project.rds_module_build_step_codebuild_project.name },
    { name = "Elasticache", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 4, project_name = module.elasticache_module_build_step_codebuild_project.name },
    { name = "Opensearch", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 4, project_name = module.opensearch_module_build_step_codebuild_project.name },
    { name = "ClientVPN", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 4, project_name = module.vpn_module_build_step_codebuild_project.name },
    { name = "IAMRole", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 5, project_name = module.iam_role_module_build_step_codebuild_project.name },
    { name = "EKS", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 6, project_name = module.eks_module_build_step_codebuild_project.name },
    { name = "EKS-Auth", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 7, project_name = module.eks_auth_module_build_step_codebuild_project.name },
    { name = "EKS-Istio", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 7, project_name = module.istio_module_build_step_codebuild_project.name },
    { name = "Observability", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 7, project_name = module.eks_observability_module_build_step_codebuild_project.name },
    { name = "Opensearch-Ops", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 7, project_name = aws_codebuild_project.os_ops_module_build_step_codebuild_project.name },
    { name = "Cognito", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 8, project_name = module.cognito_module_build_step_codebuild_project.name },
    { name = "ControlPlaneApplication", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 8, project_name = module.control_plane_module_build_step_codebuild_project.name },
    { name = "TenantCodebuilds", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 8, project_name = module.tenant_codebuild_module_build_step_codebuild_project.name },
    { name = "Billing", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 8, project_name = module.billing_module_build_step_codebuild_project.name }
  ]

  tags = module.tags.tags
}
Here I am giving the same run_order value to several actions (such as Database and Elasticache), so they should be created as parallel actions in the pipeline, but they are created as sequential actions.
P.S. The module does not produce any errors; it creates a pipeline containing all of the actions listed above.
For any other information, please let me know.
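A likely cause, given how CodePipeline evaluates run_order: run_order only parallelizes actions within a single stage, and stages themselves always execute sequentially. The dynamic "stage" block above creates one stage per list entry, each containing exactly one action, so the run_order values never have anything to group. A sketch of an alternative that puts all the build actions into one stage (names follow the module above; `output_artifacts` is omitted for brevity):

```hcl
stage {
  name = "Build"

  # One action per entry. Entries that share a run_order value now run in
  # parallel, because they live in the same stage.
  dynamic "action" {
    for_each = var.stages

    content {
      name            = "Action-${action.value["name"]}"
      category        = action.value["category"]
      owner           = action.value["owner"]
      provider        = action.value["provider"]
      input_artifacts = [action.value["input_artifacts"]]
      version         = "1"
      run_order       = action.value["run_order"]

      configuration = {
        ProjectName = action.value["project_name"]
      }
    }
  }
}
```

With this shape, Database and Elasticache (both run_order = 4) would execute concurrently, and the run_order values define the sequential groups within the single Build stage.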
The validation stage gives the output below, but in reality there are failures for both the checkov and tfsec validations.
## VALIDATION Summary ##
------------------------
Terraform Validate : 0
Terraform Format   : 0
Terraform checkov  : 1
Terraform tfsec    : 0
------------------------
The pipeline execution is failing at the terraform destroy stage.