awslabs / aws-refarch-cross-account-pipeline

These CloudFormation templates guide users through setting up a CodePipeline in Account-A, a CodeCommit repository in Account-B, and the deployment of a sample Lambda function in Account-C. They provide a reference for customers who want to use AWS CodePipeline as a centralized service for CI/CD across multiple accounts.

Home Page: https://aws.amazon.com/blogs/devops/aws-building-a-secure-cross-account-continuous-delivery-pipeline

License: Apache License 2.0

Languages: Shell 100.00%

aws-refarch-cross-account-pipeline's Issues

Deploying different lambdas without replacing the previously deployed lambda

I asked my question here: https://stackoverflow.com/questions/52390767/pipeline-replaces-previously-deployed-lambda-when-deploying-new-lambda and here: https://stackoverflow.com/questions/52428945/how-to-re-use-codepipeline-to-deploy-different-lambdas-without-replacing-an-exis

But basically: how do we deploy a second Lambda by reusing the same pipeline? To attempt this, I rerun steps 4 and 5 of the tutorial, but that creates a change set and replaces my previously deployed Lambda function, whereas I want to keep all previously deployed Lambdas, not just the latest one.

We also don't want to duplicate the pipeline for each Lambda we need to deploy. Is there something I'm missing in steps 4 and 5 so that it doesn't keep creating a change set that replaces the previously deployed Lambda, but instead deploys the new one next to it? Hope that makes sense.
I'm new to AWS CloudFormation and learning as much as I can; thank you for your prompt response.
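
For reference, one commonly suggested approach (a sketch, not part of the reference templates; the stack, action, and template names below are hypothetical) is to give each Lambda its own CloudFormation stack by using a distinct StackName and ChangeSetName per application in the deploy stage. The pipeline is reused, but a new deployment no longer replaces the stack that holds the previous function. The cross-account RoleArn settings and the execute-change-set actions from the original template would stay as they are:

      - Name: DeployToTest
        Actions:
          - Name: CreateChangeSetLambdaA
            ActionTypeId: {Category: Deploy, Owner: AWS, Provider: CloudFormation, Version: "1"}
            Configuration:
              ActionMode: CHANGE_SET_REPLACE
              ChangeSetName: sample-lambda-a-changeset
              StackName: sample-lambda-a              # each Lambda lives in its own stack
              TemplatePath: BuildOutput::samtemplate.yaml
            InputArtifacts: [{Name: BuildOutput}]
            RunOrder: 1
          - Name: CreateChangeSetLambdaB
            ActionTypeId: {Category: Deploy, Owner: AWS, Provider: CloudFormation, Version: "1"}
            Configuration:
              ActionMode: CHANGE_SET_REPLACE
              ChangeSetName: sample-lambda-b-changeset
              StackName: sample-lambda-b              # different stack, so sample-lambda-a is not replaced
              TemplatePath: BuildOutput::samtemplate-b.yaml
            InputArtifacts: [{Name: BuildOutput}]
            RunOrder: 1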

Insufficient Permissions on S3 Bucket in Source Stage

Hi,
nice article and sources.

I used the single click script and followed the indicated instructions.
But the "App" action in the "Source" stage fails with:

Insufficient permissions
The service role or action role doesn’t have the permissions required to access the Amazon S3 bucket named pre-reqs-artifactbucket-xxxxxxxxx. 
Update the IAM role permissions, and then try again.
Error: Amazon S3:AccessDenied:Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID:...

I can't see precisely which service role or action role is involved, because my account has restricted CloudTrail access.

The bucket policy in the tools account seems to be OK with the real account numbers (redacted here on GitHub as dev = 123456789012, tools = 234567890123):

{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::123456789012:role/ToolsAcctCodePipelineCodeCommitRole",
                    "arn:aws:iam::234567890123:role/sample-lambda-CodeBuildRole"
                ]
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::pre-reqs-artifactbucket-xxxxxxxxxxxx",
                "arn:aws:s3:::pre-reqs-artifactbucket-xxxxxxxxxxxx/*"
            ]
        }
    ]
}

And the policy of the role in the dev account (trusted entity: tools account 234567890123) looks OK too:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "codecommit:BatchGetRepositories",
                "codecommit:Get*",
                "codecommit:GitPull",
                "codecommit:List*",
                "codecommit:CancelUploadArchive",
                "codecommit:UploadArchive",
                "s3:*"
            ],
            "Resource": "*",
            "Effect": "Allow"
        },
        {
            "Action": [
                "kms:*"
            ],
            "Resource": "arn:aws:kms:eu-west-1:234567890123:alias/codepipeline-crossaccounts",
            "Effect": "Allow"
        }
    ]
}

Any advice on the way to go?
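
For reference, one frequent culprit for this kind of S3 403 in the reference architecture is not the bucket policy but the key policy on the codepipeline-crossaccounts CMK in the tools account: the artifact-bucket objects are encrypted with that key, so the dev-account role also needs permission to use it. A minimal sketch of the relevant key-policy statements (account numbers reused from above; Sids and property layout are assumed rather than copied from pre-reqs.yaml):

KMSKey:
  Type: AWS::KMS::Key
  Properties:
    Description: Cross-account artifact encryption key
    KeyPolicy:
      Version: "2012-10-17"
      Statement:
        # admin statement for the tools account itself
        - Sid: AllowAdministrationOfTheKey
          Effect: Allow
          Principal:
            AWS: arn:aws:iam::234567890123:root
          Action: kms:*
          Resource: "*"
        # the statement that matters here: allow principals in the dev account to use the key
        - Sid: AllowUseOfTheKeyFromDevAccount
          Effect: Allow
          Principal:
            AWS: arn:aws:iam::123456789012:root
          Action:
            - kms:Encrypt
            - kms:Decrypt
            - kms:ReEncrypt*
            - kms:GenerateDataKey*
            - kms:DescribeKey
          Resource: "*"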

Missing DependsOn: PipelinePolicy in the CodePipeline resource

There can be race conditions between the creation of the pipeline, the pipeline role, and the pipeline policy if the pipeline resource is missing a DependsOn on the policy.

This is documented at https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-iam-policy.html

If a policy has a Ref to a role and if a resource (such as AWS::ECS::Service) also has a Ref to the
same role, add a DependsOn attribute to the resource so that the resource depends on the policy.
This dependency ensures that the role's policy is available throughout the resource's lifecycle. For
example, when you delete a stack with an AWS::ECS::Service resource, the DependsOn attribute
ensures that the AWS::ECS::Service resource can complete its deletion before its role's policy is
deleted.
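
Applied to this template, the fix is a one-line DependsOn on the pipeline resource (logical IDs below are assumed to match those in ToolsAcct/code-pipeline.yaml):

Pipeline:
  Type: AWS::CodePipeline::Pipeline
  DependsOn: PipelinePolicy   # ensure the role's policy exists before the pipeline is created and outlives its deletion
  Properties:
    RoleArn: !GetAtt PipelineRole.Arn
    # remaining ArtifactStore / Stages properties as already defined in the template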

codepipeline failed @ codebuild stage

Hi,

Can someone help me with the CodeBuild stage failing with the error below?

Error: Unable to upload artifact None referenced by ImageUri parameter of HelloWorldFunction resource.
Image not found for ImageUri parameter of HelloWorldFunction resource.

I have attached the error screenshot for reference.

[screenshot: CodeBuild error]

Code pipeline fails in source stage from github

My AWS CodePipeline role has full S3 access. I have configured CodePipeline to download code from GitHub, but it fails in the source stage with this error:

The provided role does not have permissions to perform this action. Underlying error: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: BDA77A60ED10A069; S3 Extended Request ID: nQzv6LKkjAXeL0NjcgysjVj64G/7fVjvkidRS4IYjZrikJa+H1PUBdJXTmu4UD5N2x9zyAJGCdE=)

What makes the Pipeline point to another account?

I've been using CodePipeline for nearly two years now, but I've never created a cross-account pipeline. What I want to understand is exactly what makes the pipeline in the Tools account run the CloudFormation template in the Dev or Production account.
Looking through the template, the only thing I could think of was the RoleArn.
Is it the fact that the role belongs to another account, so the CloudFormation stack is created in the same account as the role?
Does this mean I can deploy to the production account in multiple regions by simply changing the role ARN to point to a different region?

If the question is confusing, here's the final attempt to clarify my question: looking at the following block of code (https://github.com/awslabs/aws-refarch-cross-account-pipeline/blob/master/ToolsAcct/code-pipeline.yaml#L251-L304), what makes the resource be deployed in a different account than the one running the pipeline itself?
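
For reference, it is indeed the RoleArn values in the deploy action that make this cross-account: the action-level RoleArn is a role in the target account that CodePipeline assumes to run the action, and the Configuration-level RoleArn is the role CloudFormation itself uses inside that account, so the stack is created wherever those roles live. Roles are account-scoped rather than region-scoped, so changing the role ARN alone does not change the region; the action still runs in the pipeline's region unless a cross-region action (with its own Region and artifact store) is configured. A trimmed sketch of such an action, with parameter and role names approximated rather than copied from the template:

          - Name: CreateChangeSetProd
            ActionTypeId: {Category: Deploy, Owner: AWS, Provider: CloudFormation, Version: "1"}
            Configuration:
              ActionMode: CHANGE_SET_REPLACE
              ChangeSetName: sample-lambda-changeset
              StackName: sample-lambda
              TemplatePath: BuildOutput::samtemplate.yaml
              # role CloudFormation assumes inside the production account
              RoleArn: !Sub arn:aws:iam::${ProductionAccount}:role/cloudformationdeployer-role
            InputArtifacts: [{Name: BuildOutput}]
            # role in the production account that CodePipeline assumes to run this action;
            # this is what points the deployment at the other account
            RoleArn: !Sub arn:aws:iam::${ProductionAccount}:role/ToolsAcctCodePipelineCloudFormationRole
            RunOrder: 1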

Cross-account region specification

Hello,

I've been looking into the repo and AWS tutorials on this cross-account matter. However, I wasn't able to find any reference to region specification when deploying cross-account.

The deployment environment of the account is treated as a single unit, when there could be multiple regions to deploy into.

Is there an example of how to change the cross-account deployment region rather than using the pipeline's default?

I've tried setting up a new region in the deployment stage, as well as another ArtifactBucket for the corresponding region.

    ArtifactStores:
        - Region: "eu-west-1"
          ArtifactStore:
            Type: S3
            Location: !Sub ${BucketName}-eu-west-1
            EncryptionKey:
              Id: !GetAtt KMSKey.Arn
              Type: KMS
        - Region: "ap-southeast-2"
          ArtifactStore:
            Type: S3
            Location: !Sub ${BucketName}-ap-southeast-2

But the end result is either:

  • Failed replicating 'Deployment Artifact': Invalid Arn (if I specify the KMS key in the new bucket)
  • Access Denied on S3 when the KMS key is only specified on the main artifact store, even with Role permissions and bucket policies extended to the new bucket.

Thanks.
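
One constraint that may explain the Invalid Arn result: each regional artifact store has to reference a KMS key that lives in that same region, so the eu-west-1 key cannot be reused for the ap-southeast-2 store. A sketch with a second, region-local key passed in as a parameter (the ApSoutheast2KMSKeyArn parameter name is hypothetical); the per-region bucket policies and role permissions then have to cover that bucket and key as well:

    ArtifactStores:
        - Region: eu-west-1
          ArtifactStore:
            Type: S3
            Location: !Sub ${BucketName}-eu-west-1
            EncryptionKey:
              Id: !GetAtt KMSKey.Arn                 # CMK created in eu-west-1
              Type: KMS
        - Region: ap-southeast-2
          ArtifactStore:
            Type: S3
            Location: !Sub ${BucketName}-ap-southeast-2
            EncryptionKey:
              Id: !Ref ApSoutheast2KMSKeyArn         # ARN of a CMK created in ap-southeast-2
              Type: KMS

The cross-region deploy action itself also needs its Region property set to ap-southeast-2 so CodePipeline replicates the artifact to that store.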

BuildProject Type linuxContainer deprecated

When running

$ cfn-lint -t ToolsAcct/code-pipeline.yaml --ignore-checks W

it returns

E3030 You must specify a valid value for Type (linuxContainer). Valid values are ["ARM_CONTAINER", "LINUX_CONTAINER", "LINUX_GPU_CONTAINER", "WINDOWS_CONTAINER", "WINDOWS_SERVER_2019_CONTAINER"] ToolsAcct/code-pipeline.yaml:100:9

because the value linuxContainer is deprecated (though the deployment still works).

Change it to LINUX_CONTAINER instead.
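
The affected block, adjusted (the Image value below is only illustrative; the template's existing image stays as it is):

      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        Image: aws/codebuild/standard:5.0   # illustrative image value
        Type: LINUX_CONTAINER               # replaces the deprecated linuxContainer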

The codebuild project is missing a KMSKey parameter

Please add the following line to the code-pipeline.yaml file in the CodeBuild project section:
EncryptionKey: !Ref CMKARN

If you do not specify this, AWS CodeBuild will use the default S3 encryption for the artifacts in the S3 bucket. This causes issues in the next step, when you deploy via CloudFormation in a different account (that account cannot have access to the default S3 KMS key).
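
In other words, on the AWS::CodeBuild::Project resource in code-pipeline.yaml, roughly the following (logical ID, name, role, and image are assumed; only the EncryptionKey line is the actual proposed change):

BuildProject:
  Type: AWS::CodeBuild::Project
  Properties:
    Name: !Sub ${ProjectName}-build
    ServiceRole: !GetAtt BuildProjectRole.Arn
    EncryptionKey: !Ref CMKARN          # encrypt build artifacts with the cross-account CMK instead of the default S3 key
    Environment:
      ComputeType: BUILD_GENERAL1_SMALL
      Image: aws/codebuild/standard:5.0
      Type: LINUX_CONTAINER
    Source:
      Type: CODEPIPELINE
    Artifacts:
      Type: CODEPIPELINE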

I get the issue below when I run my AWS CodePipeline.

AccessDenied: Access Denied status code: 403, request id: D18C063FBD, host id: q8QSdgeMXd9GzqiHAn9aR7e+Qh5TNjYfRjbXxQP73FZbZWjFj78IXQ= for primary source and source version arn:aws:s3:::pre-reqartifactbucket-1btcbp1ce/Infra-Prov/SourceOutp/EEB5s

I have given full S3 access but still get the access error, and I have tried a lot.
Please let me know how to avoid this.

Originally posted by @jeetugswm in #1 (comment)
