aws / aws-toolkit-azure-devops

AWS Toolkit for Azure DevOps

License: Other

TypeScript 72.21% JavaScript 0.33% PowerShell 27.25% Shell 0.21%
azure-devops cloudformation codedeploy amazon

aws-toolkit-azure-devops's Introduction

Overview


The AWS Toolkit for Azure DevOps adds tasks that make it easy for build and release pipelines in Azure DevOps (formerly VSTS) and Azure DevOps Server (previously known as Team Foundation Server (TFS)) to work with AWS services, including Amazon S3, AWS Elastic Beanstalk, AWS CodeDeploy, AWS Lambda, AWS CloudFormation, Amazon Simple Queue Service, and Amazon Simple Notification Service, and to run commands using the AWS Tools for Windows PowerShell module and the AWS CLI.

The AWS Toolkit for Azure DevOps is available from the Visual Studio Marketplace.

This is an open source project because we want you to be involved. We love issues, feature requests, code reviews, pull requests or any positive contribution. Please see the CONTRIBUTING guide for how to help, including how to build your own extension.

Highlighted Features

  • AWS CLI - Interact with the AWS CLI (Windows hosts only)
  • AWS PowerShell Module - Interact with AWS through PowerShell (Windows hosts only)
  • Beanstalk - Deploy Elastic Beanstalk applications
  • CodeDeploy - Deploy with CodeDeploy
  • CloudFormation - Create/Delete/Update CloudFormation stacks
  • ECR - Push an image to an ECR repository
  • Lambda - Deploy from S3, .NET Core applications, or any other language that builds on Azure DevOps
  • S3 - Upload/Download to/from S3 buckets
  • Secrets Manager - Create and retrieve secrets
  • SQS - Send SQS messages
  • SNS - Send SNS messages
  • Systems Manager - Get/set parameters and run commands

User Guide

The User Guide contains additional instructions for getting up and running with the extension.

NOTE: The user-guide source content that used to live in this folder has been moved to its own GitHub repository.

Credentials Handling for AWS Services

To enable tasks to call AWS services when run as part of your build or release pipelines, AWS credentials must either be configured for the tasks or be available in the host process for the build agent. Note that the credentials are used specifically by the tasks when run in a build agent process; they are not related to end-user logins to your Azure DevOps instance.

The AWS tasks support the following mechanisms for obtaining AWS credentials, tried in this order (a rough sketch of the fallback chain follows the list):

  • One or more service endpoints, of type AWS, can be created and populated with AWS access and secret keys, and optionally data for Assumed Role credentials.
  • If only the Assumed Role is defined, and neither an access key ID nor a secret key is supplied, the role is still assumed. This is useful when using instance profiles, where the profile only allows assuming a role.
    • Tasks reference the configured service endpoint instances by name as part of their configuration and pull the required credentials from the endpoint when run.
  • Variables defined on the task or build.
    • If tasks are not configured with the name of a service endpoint, they will attempt to obtain credentials, and optionally a region, from variables defined in the build environment. The variables are named AWS.AccessKeyID, AWS.SecretAccessKey and optionally AWS.SessionToken. To supply the ID of the region to make the call in, e.g. us-west-2, you can also use the variable AWS.Region. Optionally, a role to assume can be specified using the variable AWS.AssumeRoleArn. When assuming roles, the optional variables AWS.RoleSessionName and AWS.ExternalId can be provided to specify an identifier for the assumed-role session and an external ID to show in customers' accounts when assuming roles.
  • Environment variables in the build agent's environment.
    • If tasks are not configured with the name of a service endpoint, and credentials or region are not available from task variables, the tasks will attempt to obtain credentials, and optionally region, from standard environment variables in the build process environment. These variables are AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and optionally AWS_SESSION_TOKEN. To supply the ID of the region to make the call in, e.g. us-west-2, you can also use the environment variable AWS_REGION.
  • EC2 instance metadata, for build hosts running on EC2 instances.
    • Both credential and region information can be automatically obtained from the instance metadata in this scenario.
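
As a rough illustration of the fallback order described above (a sketch only, not the toolkit's actual implementation), credential resolution with the JavaScript aws-sdk could look like the following; the endpoint shape and getTaskVariable are hypothetical stand-ins for whatever the task library provides.

import * as AWS from 'aws-sdk';

// Hypothetical helper: resolve credentials in the documented order.
function resolveCredentials(
  endpoint?: { accessKeyId: string; secretAccessKey: string },
  getTaskVariable?: (name: string) => string | undefined
): AWS.Credentials {
  // 1. A service endpoint configured on the task.
  if (endpoint) {
    return new AWS.Credentials(endpoint.accessKeyId, endpoint.secretAccessKey);
  }
  // 2. Build/task variables: AWS.AccessKeyID, AWS.SecretAccessKey, AWS.SessionToken.
  const accessKey = getTaskVariable?.('AWS.AccessKeyID');
  const secretKey = getTaskVariable?.('AWS.SecretAccessKey');
  if (accessKey && secretKey) {
    return new AWS.Credentials(accessKey, secretKey, getTaskVariable?.('AWS.SessionToken'));
  }
  // 3. Standard environment variables: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_SESSION_TOKEN.
  if (process.env.AWS_ACCESS_KEY_ID && process.env.AWS_SECRET_ACCESS_KEY) {
    return new AWS.EnvironmentCredentials('AWS');
  }
  // 4. EC2 instance metadata, for agents running on EC2.
  return new AWS.EC2MetadataCredentials();
}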

Configuring an AWS Service Endpoint

To use AWS service endpoints, add the AWS subscription(s) to use by opening the Account Administration screen (gear icon at the top-right of the screen) and then clicking the Services tab. Note that each Azure DevOps project is associated with its own set of credentials; service endpoints are not shared across projects. You can associate a single service endpoint with all AWS tasks in a build, or multiple endpoints if you require.

Select the AWS endpoint type and provide the following parameters. Please refer to About Access Keys:

  • A name used to refer to the credentials when configuring the AWS tasks
  • AWS Access Key ID
  • AWS Secret Access Key

Note: We strongly suggest you use access and secret keys generated for an Identity and Access Management (IAM) user account. You can configure an IAM user account with permissions granting access to only the services and resources required to support the tasks you intend to use in your build and release definitions.

Tasks can also use assumed role credentials by adding the Amazon Resource Name (ARN) of the role to be assumed and an optional identifier when configuring the endpoint. The access and secret keys specified will then be used to generate temporary credentials for the tasks when they are executed by the build agents. Temporary credentials are valid for up to 15 minutes by default. To enable a longer validity period, you can set the 'aws.rolecredential.maxduration' variable on your build or release definition, specifying a validity period in seconds between 15 minutes (900 seconds) and 12 hours (43200 seconds).
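
As a minimal sketch of what honouring 'aws.rolecredential.maxduration' amounts to (this is not the extension's actual code), the temporary credentials come from an STS AssumeRole call whose DurationSeconds is taken from that variable; the role ARN and session name below are placeholders.

import * as AWS from 'aws-sdk';

async function assumeRoleCredentials(
  roleArn: string,      // placeholder: the ARN configured on the endpoint
  durationSeconds = 900 // 15 minutes by default; 'aws.rolecredential.maxduration' can raise this up to 43200
): Promise<AWS.Credentials> {
  const sts = new AWS.STS();
  const result = await sts.assumeRole({
    RoleArn: roleArn,
    RoleSessionName: 'aws-vsts-tools-session', // placeholder session name
    DurationSeconds: durationSeconds,
  }).promise();
  const c = result.Credentials!;
  return new AWS.Credentials(c.AccessKeyId, c.SecretAccessKey, c.SessionToken);
}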

Supported environments

  • Azure DevOps
  • Team Foundation Server 2017 Update 1 (or higher) (now called Azure DevOps Server)

License

The project is licensed under the MIT license

Contributors

We thank the following contributor(s) for this extension: Visual Studio ALM Rangers.

aws-toolkit-azure-devops's People

Contributors

ansariwn, bdenhollander, bluphy, bryceitoc9, cameronattard, cherylflowers, clakech, commanderroot, daiyyr, dependabot[bot], emfl, hieuxlu, hunterwerlla, hyandell, jadensimon, lhgomes, motisoft, normj, ostreifel, paul-b-aws, pflugs30, rbbarad, ryanrivest, sblauman, sbreakey, stevejroberts, svarcoe, vincelee888, whoiskevinrich, william-keller


aws-toolkit-azure-devops's Issues

[TFS 2015] AWS Endpoint configuration requires values in all fields.

AWS VSTS 1.0.12
TFS 2015 SP4

The endpoint configuration requires values in all fields including the optional fields. Adding a role-name then breaks using the build/release steps because a valid role must be available for the IAM user to assume:

2017-10-30T20:09:01.6962270Z Configuring task to use role-based credentials.
2017-10-30T20:09:02.2615620Z ##[error]The security token included in the request is invalid.

[screenshot: addawsconnection]

DOS file format conversion

Hello guys, it's my first issue report, so forgive me for any mistakes!

I don't know why the aws-vsts tool is converting my files to DOS format. This is an example using CodeDeploy on a Linux server:

[screenshot: DOS line endings]

The same file, but now cloned straight from the VSTS git repository:

It may be an MS issue when the files are compressed. Is anyone else having the same problem?

Thanks

Installation Issues on Agent Version

Greetings,

I encountered this issue while installing the aws-vsts-tools extension on my employer's copy of TFS 2015 Update 3. The extension would always report that it had successfully installed, however, the only tool available after every installation would be the AWS CLI tool. After scouring the extension code I took a gamble and removed the 2nd (when ordered alphabetically) item's entry in the tasks list (AWSPowerShellModuleScript) from the 'tasks' section in the make-options.json file, and the corresponding entries from the 'files' and 'contributions' sections in the vss-extension.json file. Installation after these modifications worked fine.

It appears that the issue is that this tool is the only one contained in the extension requiring a different build agent version (requiring v2.115.0 where the remainder require v1.91.0). When the installation of the tool failed due to requiring a higher level agent than our installation has, it exited the installation entirely with no indication that the rest of the extension had failed to install.

I had reported this to the Microsoft team who asked me to come here for support. Is there some way to continue if a conflicting build agent version is found? Again, this would need to occur on the tools individually as the AWSPowerShellModuleScript was the only offending outlier.

Thank you.

AWS Tools For Powershell Requires Credentials Saved On Server

So, I'm not sure if this is intended, but this doesn't match how the other AWS tools work from my experience.

For reference, I am using Version 1.0.12 of the tasks.

When I tried to run a custom powershell block with the AWS Tools For Powershell, it gave me the following error:

Unable to locate credentials. You can configure credentials by running "aws configure".

Within the task properties, I am selecting an endpoint with valid credentials that I am using for other AWS tools tasks in the same build process. For example, my previous step in this build is using the AWS Lambda .NET Core Deployment task, and it is able to properly take the credentials from the endpoint.

To get the Powershell task to run successfully, I had to run aws configure on the server that is running the build job and add the credentials directly on the server.

I know this is a slightly different task since I'm running powershell in-line, but I would expect the task to still use the credentials from the selected endpoint. Let me know if you need any additional information to look into this more!

Deploy with Codedeploy task

With the Deploy with Codedeploy task is there a way to use S3 encryption (AES256) as part of the zip file upload? Our AWS environment prevents uploading a file to S3 unless you use AES 256 which is causing the Codedeploy task to fail. As a test we can use the S3 Upload task fine as this supports AES 256 encryption. If we remove the AES 256 encryption setting from the S3 Upload task it also fails.

Many thanks

I am not able to add a session_token variable in the AWS connection feature of VSTS. Please help.

I am getting the below error. I assume it is related to my session token.

2017-12-06T14:16:02.6827252Z ##[section]Starting: S3 Upload: MyBUCKET
2017-12-06T14:16:02.6831278Z ==============================================================================
2017-12-06T14:16:02.6831563Z Task : AWS S3 Upload
2017-12-06T14:16:02.6831955Z Description : Upload file and folder content to an Amazon Simple Storage Service (S3) Bucket
2017-12-06T14:16:02.6832215Z Version : 1.0.14
2017-12-06T14:16:02.6832422Z Author : Amazon Web Services
2017-12-06T14:16:02.6832759Z Help : Please refer to Working with Amazon S3 Buckets for more information on working with Amazon S3.
2017-12-06T14:16:02.6833117Z ==============================================================================
2017-12-06T14:16:24.2184412Z 4b091740-92a4-44e7-99e7-ce1a7ab67bd5 exists true
2017-12-06T14:16:24.2191092Z Configuring task to use role-based credentials.
2017-12-06T14:16:25.6892711Z Bucket MyBUCKET does not appear to exist (or you do not have access). Attempting to create in region us-east-2.
2017-12-06T14:16:25.7230982Z Failed to create bucket { CredentialsError: Missing credentials in config
2017-12-06T14:16:25.7232455Z at Request.extractError (d:\a_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.14\S3Upload.js:5398:29)
2017-12-06T14:16:25.7232987Z at Request.callListeners (d:\a_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.14\S3Upload.js:7632:20)
2017-12-06T14:16:25.7234019Z at Request.emit (d:\a_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.14\S3Upload.js:7604:10)
2017-12-06T14:16:25.7234759Z at Request.emit (d:\a_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.14\S3Upload.js:14799:14)
2017-12-06T14:16:25.7235775Z at Request.transition (d:\a_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.14\S3Upload.js:14138:10)
2017-12-06T14:16:25.7236769Z at AcceptorStateMachine.runTo (d:\a_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.14\S3Upload.js:14941:12)
2017-12-06T14:16:25.7239183Z at d:\a_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.14\S3Upload.js:14953:10
2017-12-06T14:16:25.7240099Z at Request. (d:\a_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.14\S3Upload.js:14154:9)
2017-12-06T14:16:25.7240997Z at Request. (d:\a_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.14\S3Upload.js:14801:12)
2017-12-06T14:16:25.7241980Z at Request.callListeners (d:\a_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.14\S3Upload.js:7642:18)
2017-12-06T14:16:25.7242467Z message: 'Missing credentials in config',
2017-12-06T14:16:25.7242827Z code: 'CredentialsError',
2017-12-06T14:16:25.7243201Z time: 2017-12-06T14:16:25.716Z,
2017-12-06T14:16:25.7243563Z requestId: '0b5169cf-da90-11e7-b669-73e0107f4698',
2017-12-06T14:16:25.7243925Z statusCode: 403,
2017-12-06T14:16:25.7244357Z retryable: false,
2017-12-06T14:16:25.7244661Z retryDelay: 93.85135920368765,
2017-12-06T14:16:25.7244965Z originalError:
2017-12-06T14:16:25.7245303Z { message: 'Could not load credentials from TemporaryCredentials',
2017-12-06T14:16:25.7245687Z code: 'CredentialsError',
2017-12-06T14:16:25.7246008Z time: 2017-12-06T14:16:25.716Z,
2017-12-06T14:16:25.7246325Z requestId: '0b5169cf-da90-11e7-b669-73e0107f4698',
2017-12-06T14:16:25.7246643Z statusCode: 403,
2017-12-06T14:16:25.7246945Z retryable: false,
2017-12-06T14:16:25.7247302Z retryDelay: 93.85135920368765,
2017-12-06T14:16:25.7247599Z originalError:
2017-12-06T14:16:25.7247887Z { message: 'The security token included in the request is invalid.',
2017-12-06T14:16:25.7248227Z code: 'InvalidClientTokenId',
2017-12-06T14:16:25.7248706Z time: 2017-12-06T14:16:25.716Z,
2017-12-06T14:16:25.7249126Z requestId: '0b5169cf-da90-11e7-b669-73e0107f4698',
2017-12-06T14:16:25.7249448Z statusCode: 403,
2017-12-06T14:16:25.7250352Z retryable: false,
2017-12-06T14:16:25.7253517Z retryDelay: 93.85135920368765 } } }
2017-12-06T14:16:25.7323000Z ##[error]CredentialsError: Missing credentials in config
2017-12-06T14:16:25.7480560Z ##[section]Finishing: S3 Upload: MyBUCKET

feature request - SSM Param Store Support

Hello,

As a DevOps Engineer I would love the ability to manage SSM Param Store values as a task in the VSTS pipeline out of the box.

Each task should be able to support multiple parameters (an infinite add list), a bit like this:

Region      [_________]
Credential  [_________]
KMS KeyID   [_________]

Name [_________] Value [_________] Type [_________] [-]
Name [_________] Value [_________] Type [_________] [-]
Name [_________] Value [_________] Type [_________] [+] [-]
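
To frame the request, the API such a task would presumably wrap is SSM PutParameter, which already covers the Region/Name/Value/Type/KMS key fields sketched above; below is a minimal sketch with placeholder values, not a description of an existing task.

import * as AWS from 'aws-sdk';

const ssm = new AWS.SSM({ region: 'us-east-1' }); // placeholder region

// One parameter per call; a task could simply loop over the user-supplied list.
ssm.putParameter({
  Name: '/myapp/db/password', // placeholder name
  Value: 'example-value',     // placeholder value
  Type: 'SecureString',       // String | StringList | SecureString
  KeyId: 'alias/aws/ssm',     // optional KMS key for SecureString
  Overwrite: true,
}).promise()
  .then(() => console.log('parameter stored'))
  .catch(err => console.error(err.code, err.message));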

AWS Beanstalk Deployment task during Release fails on Hosted agent

When running the AWS Beanstalk Deployment task on a Hosted agent, the task fails every time with the following error:

##[error]System.IO.FileNotFoundException: The specified module 'AWSPowerShell' was not loaded because no valid module file was found in any module directory.

Full logs of recent run below (AWS credentials removed from log):

2017-09-11T15:25:57.2848220Z ##[section]Starting: Release new version to Test
2017-09-11T15:25:57.2848220Z ==============================================================================
2017-09-11T15:25:57.2848220Z Task : AWS Beanstalk Deployment
2017-09-11T15:25:57.2848220Z Description : Build task to upload a new application version to an Elastic Beanstalk in AWS.
2017-09-11T15:25:57.2848220Z Version : 1.0.0
2017-09-11T15:25:57.2848220Z Author : Mark Buggermann
2017-09-11T15:25:57.2848220Z Help : More Information
2017-09-11T15:25:57.2848220Z ==============================================================================
2017-09-11T15:25:57.2928210Z Preparing task execution handler.
2017-09-11T15:26:01.2627616Z Executing the powershell script: d:\a_tasks\AwsBeanstalkRelease_506e537c-a122-40ae-928d-13e78628f6ff\1.0.0\beanstalk_deployment.ps1
2017-09-11T15:26:01.5377618Z Loading PowerShell module for AWS
2017-09-11T15:26:01.5777403Z ##[error]System.IO.FileNotFoundException: The specified module 'AWSPowerShell' was not loaded because no valid module file was found in any module directory.
2017-09-11T15:26:01.5797394Z AccessKey:
2017-09-11T15:26:01.5797394Z
2017-09-11T15:26:01.5797394Z
2017-09-11T15:26:01.5807415Z SecretKey:
2017-09-11T15:26:01.5807415Z
2017-09-11T15:26:01.5807415Z
2017-09-11T15:26:01.5807415Z Version: 12
2017-09-11T15:26:01.5807415Z
2017-09-11T15:26:01.5807415Z
2017-09-11T15:26:01.5807415Z Region: us-east-1
2017-09-11T15:26:01.5807415Z
2017-09-11T15:26:01.5807415Z
2017-09-11T15:26:01.5817398Z Setting aws credentials
2017-09-11T15:26:01.5817398Z
2017-09-11T15:26:01.5817398Z
2017-09-11T15:26:05.5383829Z ##[error]Microsoft.PowerShell.Commands.WriteErrorException: System.Management.Automation.CommandNotFoundException: The term 'Set-AwsCredentials' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
2017-09-11T15:26:05.5383829Z at System.Management.Automation.ExceptionHandlingOps.CheckActionPreference(FunctionContext funcContext, Exception exception)
2017-09-11T15:26:05.5383829Z at System.Management.Automation.Interpreter.ActionCallInstruction`2.Run(InterpretedFrame frame)
2017-09-11T15:26:05.5383829Z at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
2017-09-11T15:26:05.5383829Z at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
2017-09-11T15:26:05.5532898Z ##[error]PowerShell script completed with 2 errors.
2017-09-11T15:26:05.5542913Z ##[section]Finishing: Release new version to Test

AWS Lambda .NET Core Deployment

With the AWS Lambda .NET Core Deployment step, I would like to pass precompiled code from a build, but the only option is "Path to Lambda Project". Any suggestions?

CFN Change Set Name ValidationError with variables

Trying to use an env var to set the Change Set Name and not having much luck. It looks like the input is being validated, and it fails to meet the constraint because of the $() characters.

It should do the evaluation of the string after it has been transformed, unless I'm doing something stupid (always possible).

2017-10-11T16:30:25.3834998Z message: '1 validation error detected: Value 'vsts-$(BUILD_BUILDNUMBER)' at 'changeSetName' failed to satisfy constraint: Member must satisfy regular expression pattern: [a-zA-Z][-a-zA-Z0-9]*',
2017-10-11T16:30:25.3834998Z code: 'ValidationError',
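
As a side note, the constraint in the error message is a plain regular expression, so a candidate name can be checked before the task runs. The snippet below is just a local sanity check; the example names are made up.

// Change set names must match [a-zA-Z][-a-zA-Z0-9]*: letters, digits and hyphens, starting with a letter.
const CHANGE_SET_NAME = /^[a-zA-Z][-a-zA-Z0-9]*$/;

for (const name of ['vsts-$(BUILD_BUILDNUMBER)', 'vsts-20171011-1', 'vsts-1.0.42']) {
  console.log(name, CHANGE_SET_NAME.test(name) ? 'ok' : 'rejected');
}
// The first name is rejected because the macro was not expanded and '$', '(' and ')' are not allowed;
// the last is rejected because of the dots.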

S3 Upload - Content-Type set to "application/octet-stream"

I'm using the S3 Upload task to push an Angular web site into an S3 bucket during a build. Every file that ends up in the bucket has its content type set to application/octet-stream no matter what the file type is. I'm not seeing an option to control the metadata - is there something I need to make sure I set up in my VSTS build definition?

First Deploy Works, Second Deploy Fails

Hello,

I am using the process defined in...

http://docs.aws.amazon.com/vsts/latest/userguide/vsts-ug.pdf

The first deploy works but sadly, subsequent deploys fail. :/

2017-10-10T06:41:27.4162161Z ##[section]Starting: Deploy to Elastic Beanstalk: MyAppName
2017-10-10T06:41:27.4162161Z ==============================================================================
2017-10-10T06:41:27.4162161Z Task : AWS Elastic Beanstalk Deployment
2017-10-10T06:41:27.4162161Z Description : Deploys an application to Amazon EC2 instance(s) using AWS Elastic Beanstalk
2017-10-10T06:41:27.4162161Z Version : 1.0.7
2017-10-10T06:41:27.4162161Z Author : Amazon Web Services
2017-10-10T06:41:27.4172167Z Help : Please refer to AWS Elastic Beanstalk User Guide for more details on deploying applications with AWS Elastic Beanstalk.
2017-10-10T06:41:27.4172167Z ==============================================================================
2017-10-10T06:41:27.8917556Z 282aac41-3d39-4d5f-ab77-13b1900404cb exists true
2017-10-10T06:41:27.8947554Z Application Type set to aspnet
2017-10-10T06:41:28.8586138Z Determine S3 bucket elasticbeanstalk-us-east-2-150967549400 to store application bundle
2017-10-10T06:41:28.8586138Z Uploading application bundle d:\a\1\a\WebApp.zip to object MyAppName/MyAppName-env/WebApp-v1507617688632.zip in bucket elasticbeanstalk-us-east-2-150967549400
2017-10-10T06:41:33.3000104Z Application upload completed successfully
2017-10-10T06:41:34.0177235Z Created application version: v1507617688632
2017-10-10T06:41:34.8058931Z Started updating environment to version: v1507617688632
2017-10-10T06:41:34.8058931Z Waiting for deployment to complete
2017-10-10T06:41:34.8058931Z Events from Elastic Beanstalk:
2017-10-10T06:41:40.2508963Z Tue Oct 10 2017 06:41:34 GMT+0000 (Coordinated Universal Time) INFO Environment update is starting.
2017-10-10T06:41:40.2508963Z Tue Oct 10 2017 06:41:39 GMT+0000 (Coordinated Universal Time) INFO Deploying new version to instance(s).
2017-10-10T06:42:45.4837419Z Tue Oct 10 2017 06:42:30 GMT+0000 (Coordinated Universal Time) INFO Started Application Update
2017-10-10T06:42:45.4837419Z Tue Oct 10 2017 06:42:36 GMT+0000 (Coordinated Universal Time) ERROR Deployment Failed: Unexpected Exception
2017-10-10T06:42:45.4837419Z Tue Oct 10 2017 06:42:37 GMT+0000 (Coordinated Universal Time) ERROR Error occurred during build: Command hooks failed
2017-10-10T06:42:45.4837419Z Tue Oct 10 2017 06:42:40 GMT+0000 (Coordinated Universal Time) ERROR [Instance: i-03375a1599216350b ConfigSet: Infra-WriteRuntimeConfig, Infra-WriteApplication1, Infra-WriteApplication2, Infra-EmbeddedPreBuild, Hook-PreAppDeploy, Infra-EmbeddedPostBuild, Hook-EnactAppDeploy, Hook-PostAppDeploy] Command failed on instance. Return code: 1 Output: null.
2017-10-10T06:42:45.4837419Z Tue Oct 10 2017 06:42:40 GMT+0000 (Coordinated Universal Time) INFO Command execution completed on all instances. Summary: [Successful: 0, Failed: 1].
2017-10-10T06:42:45.4847417Z Tue Oct 10 2017 06:42:40 GMT+0000 (Coordinated Universal Time) ERROR Unsuccessful command execution on instance id(s) 'i-03375a1599216350b'. Aborting the operation.
2017-10-10T06:42:45.4847417Z Tue Oct 10 2017 06:42:40 GMT+0000 (Coordinated Universal Time) ERROR Failed to deploy application.
2017-10-10T06:42:45.4847417Z Tue Oct 10 2017 06:42:40 GMT+0000 (Coordinated Universal Time) ERROR During an aborted deployment, some instances may have deployed the new application version. To ensure all instances are running the same version, re-deploy the appropriate application version.
2017-10-10T06:42:45.4847417Z ##[error]Error: Error deploy application version to Elastic Beanstalk
2017-10-10T06:42:45.4967385Z ##[section]Finishing: Deploy to Elastic Beanstalk: MyAppName

Any thoughts or experience with errors like this?

Rebuilding the AWS environment allows me to deploy again, but this obviously ruins any server state.

Regards,

Daniel

EC2 Container Registry support?

Is there a way to deploy docker images to ECR on a Linux build server?

There is the "AWS Tools for Windows PowerShell scripts" Task but PowerShell does not work on Linux. With "AWS CLI commands" it is also not possible to login docker to AWS ECR.

There is no way to run a custom shell script with AWS credentials

Motivation: I need to run a hashicorp packer command that makes use of the aws-ebs builder from a private linux agent, using an aws endpoint configured in vsts.

I've tried to use an AWS Powershell Script task, which does not work, since that task is only available for Windows agents (until Powershell.Core support is added, see #53).

I then looked at the AWS Cli task, which does not work, because that task only supports

aws <command> <subcommand>

The big gap in the AWS Tools for Team Services is (I think) a task like:

AWS Shell Script (Run a shell script using bash with aws credentials)

Which would take either a script file name or an inline script.

As a workaround for now, I could configure the aws credentials in the private linux agent statically, and mark that agent with a capability, and use the Shell Script task. But that does not feel right, as the agent might be configured by mistake to access a completely different account.

I looked at the source code and must admit I don't have enough knowledge of Node / TypeScript / VSTS extensions to contribute a PR.

Thoughts?

S3 Upload - Bucket [BUCKET_NAME] does not exist or you do not have access

That's the error I get when using the S3 Upload task to send an artifact to an already existing bucket.

I'm trying this on a Hosted Linux VSTS agent.

I'm confident that my IAM policies are fine, since I can upload the file without issues from my local box like so
aws configure set aws_access_key_id *********
aws configure set aws_secret_access_key ********
aws s3 cp build_artifact.tgz s3://[BUCKET_NAME]/build_artifact.tgz --region eu-west-1

At the moment I'm only granting s3:PutObject permission, but I've also tried with s3:* just in case read access was required.

The rest of my configuration is pretty close to the one on the screenshot in your README.

Full output below.
(Please note that I have obscured the name of the bucket to avoid disclosing it publicly)

2017-08-19T18:15:27.2379850Z ##[section]Starting: S3 Upload: [BUCKET_NAME]
2017-08-19T18:15:27.2463490Z ==============================================================================
2017-08-19T18:15:27.2476820Z Task : AWS S3 Upload
2017-08-19T18:15:27.2489660Z Description : Upload file and folder content to an Amazon Simple Storage Service (S3) Bucket
2017-08-19T18:15:27.2503150Z Version : 1.0.2
2017-08-19T18:15:27.2516010Z Author : Amazon Web Services
2017-08-19T18:15:27.2529530Z Help : Please refer to [Working with Amazon S3 Buckets](https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingBucket.html) for more information on working with Amazon S3.
2017-08-19T18:15:27.2544250Z ==============================================================================
2017-08-19T18:15:27.5998270Z 18596219-1165-4cd7-b448-95ea29bbe30d exists true
2017-08-19T18:15:27.7965780Z ##[error]Error: Bucket [BUCKET_NAME] does not exist or you do not have access. Auto-create option not set, cannot continue.
2017-08-19T18:15:27.8090710Z ##[section]Finishing: S3 Upload: [BUCKET_NAME]

Thanks for helping out
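
A note for anyone hitting this: the "does not exist or you do not have access" message comes from a bucket probe before the upload (the "Auto-create option not set" wording in the log points at an existence check), and such a probe presumably needs more than object-level s3:PutObject; an S3 HeadBucket call, for instance, requires s3:ListBucket on the bucket. A quick way to reproduce that kind of check locally with the same credentials, as a sketch only (bucket name and region are placeholders):

import * as AWS from 'aws-sdk';

const s3 = new AWS.S3({ region: 'eu-west-1' }); // placeholder region

// Probe the bucket the same way an existence check would.
s3.headBucket({ Bucket: 'my-build-artifacts' }).promise() // placeholder bucket name
  .then(() => console.log('bucket is visible to these credentials'))
  .catch(err => console.error('probe failed:', err.statusCode, err.code));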

CFN - Task fails if there are no updates to apply

When trying to update a stack in our deployment process, it fails because a stack that is processed first does not contain any changes. This means the later stacks will never get updated and your release pipeline will always fail.

For example, I have the following CFN Create/Update Stack tasks. I have made a change to stack 3, which triggers an update of the release. The process starts from stack 1 and fails because there are no changes for that stack.

Run on Agent

CFN Stack 1 - No Change - Failed !
CFN Stack 2
CFN Stack 3 - Has changes in GIT to apply !

Expected behavior for the release process would be to check whether there are changes; if so, make the change set and apply it, and if not, pass and continue on to the next one.

Help with glob minimatch patterns

The glob minimatch patterns that we are entering in the task are not correctly matching the files to be copied to S3. In the first task, we want to match all extensionless files in both the root and any subdirectories, and we have tried "!/." which appears like it should work (tested on http://www.globtester.com/ ). In the second task we are trying to match any file with a distinct list of extensions in both the root and any subdirectory. We have tried "/*.{css,js,png,ico,xml,gif}", which also should work but does not. How can we debug this, or is there something we are missing?
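
A quick way to debug patterns like these is to run them through the minimatch library locally (the matcher the issue title refers to); the file list and pattern below are made up for illustration and are not the ones quoted above.

import minimatch = require('minimatch');

// Made-up file paths, relative to the task's source folder.
const files = ['readme', 'docs/LICENSE', 'css/site.css', 'js/app.js', 'img/logo.png'];

// Illustrative patterns only.
const patterns = ['**/*.{css,js,png,ico,xml,gif}'];

for (const file of files) {
  const matched = patterns.some(p => minimatch(file, p));
  console.log(`${file}: ${matched ? 'matched' : 'skipped'}`);
}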

S3 Upload Glob pattern not finding files

I am having trouble getting the S3 upload task to find the files that reside in the following directory:

Source Folder: \\192.168.1.10\images\upgrade

Filename Patterns: *2.0.0.gz*

The files that reside in that directory have the following names:

enabler-2.0.0.gz
enabler-2.0.0.gz-sha256.txt

I have tested the glob pattern and it works fine in http://www.globtester.com

However, if you use the default filename pattern: ** it works fine and finds both files.

Any help would be appreciated in solving what's going on here.

S3Upload issue when using agent running on Windows 2008 R2

I receive a "SyntaxError: Invalid or unexpected token" error when i run the build agent on a Windows 2008 R2 server. The same build definition runs fine when it is running on a Windows 2016 server. Both are running agent version 2.117.2. Are there any known issues when running on Windows 2008 R2?

Here is the full log output for the S3 upload task:
2017-08-24T17:37:34.5009922Z ##[section]Starting: S3 Upload
2017-08-24T17:37:34.5029926Z ==============================================================================
2017-08-24T17:37:34.5029926Z Task : AWS S3 Upload
2017-08-24T17:37:34.5029926Z Description : Upload file and folder content to an Amazon Simple Storage Service (S3) Bucket
2017-08-24T17:37:34.5029926Z Version : 1.0.4
2017-08-24T17:37:34.5029926Z Author : Amazon Web Services
2017-08-24T17:37:34.5029926Z Help : Please refer to Working with Amazon S3 Buckets for more information on working with Amazon S3.
2017-08-24T17:37:34.5039928Z ==============================================================================
2017-08-24T17:37:34.7790478Z C:\agent_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.4\S3Upload.js:2177
2017-08-24T17:37:34.7790478Z /**
2017-08-24T17:37:34.7790478Z ^^^
2017-08-24T17:37:34.7790478Z SyntaxError: Invalid or unexpected token
2017-08-24T17:37:34.7810482Z at createScript (vm.js:56:10)
2017-08-24T17:37:34.7820484Z at Object.runInThisContext (vm.js:97:10)
2017-08-24T17:37:34.7820484Z at Module._compile (module.js:542:28)
2017-08-24T17:37:34.7820484Z at Object.Module._extensions..js (module.js:579:10)
2017-08-24T17:37:34.7820484Z at Module.load (module.js:487:32)
2017-08-24T17:37:34.7820484Z at tryModuleLoad (module.js:446:12)
2017-08-24T17:37:34.7820484Z at Function.Module._load (module.js:438:3)
2017-08-24T17:37:34.7820484Z at Module.runMain (module.js:604:10)
2017-08-24T17:37:34.7820484Z at run (bootstrap_node.js:390:7)
2017-08-24T17:37:34.7820484Z at startup (bootstrap_node.js:150:9)
2017-08-24T17:37:34.7920504Z ##[error]Exit code 1 returned from process: file name 'C:\agent\externals\node\bin\node.exe', arguments '"C:\agent_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.4\S3Upload.js"'.
2017-08-24T17:37:34.7970514Z ##[section]Finishing: S3 Upload

Proxy Support?

We're having issues with these tasks when executed on a VSTS Linux agent that is sitting behind a firewall (networking error: connect ENETUNREACH). We've been seeing this with the S3 upload task, but I assume it applies to all tasks.

The agent has all of the relevant env vars set appropriately and other tasks are leveraging the proxy configuration and successfully executing on the same agent.

Is it possible to run these tasks behind a proxy, and if so, how does the agent need to be configured?
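
For context, the JavaScript aws-sdk that these tasks are built on does not pick up the proxy environment variables on its own; a client has to be handed a proxy-aware HTTP agent explicitly, roughly as in the sketch below (using the third-party https-proxy-agent package; the region is a placeholder and the import shape depends on the package version). That difference may be why other tasks honour the agent's proxy configuration while these do not.

import * as AWS from 'aws-sdk';
import { HttpsProxyAgent } from 'https-proxy-agent'; // third-party package

const proxyUrl = process.env.https_proxy || process.env.HTTPS_PROXY;

const s3 = new AWS.S3({
  region: 'eu-west-1', // placeholder region
  // Without an explicit agent the SDK connects directly, which fails behind a firewall (ENETUNREACH).
  httpOptions: proxyUrl ? { agent: new HttpsProxyAgent(proxyUrl) } : undefined,
});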

AWS CloudFormation Create/Update Stack

Hi,
I'm trying to use the AWS CloudFormation Create/Update Stack task.
It says "Template File" is a mandatory field. But there is no input box for Template Files.
And I cannot save the template without entering Template file name.
[screenshot: screen shot 2017-08-24 at 10 48 28]

Please provide an AWSPowerShell.NetCore compatible version of the tasks, so that Linux agents can be used

In the Dockerfile for my private build agent, I install the AWSPowerShell.NetCore module

#
# Install AWS Tools for PowerShell Core Edition
#
ENV AWS_POWERSHELL_VERSION=3.3.215.0
RUN pwsh \
   -c 'Install-Module -Force \
   -Name AWSPowerShell.NetCore \
   -RequiredVersion ${AWS_POWERSHELL_VERSION}'

But then the DotNetFramework capability is missing. Adding it manually still causes the error

2018-01-13T06:18:13.3211110Z ##[error]A supported task execution handler was not found.
This error usually means the task does not carry an implementation that is compatible with
your current operating system. Contact the task author for more details.

Is any first-class support for AWS PowerShell scripts planned for Linux agents?

Error When using AWS Lambda .NET Core Deployment Task

I am using the .NET Core Deployment task for Visual Studio Team Services to try to deploy a .NET Core serverless application, but am getting an error when the task runs. The error says:

No executable found matching command "dotnet-lambda"

Here is the full log of the task execution:

2018-01-29T02:57:37.2978148Z ##[section]Starting: Deploy to Lambda:  SOSStack
2018-01-29T02:57:37.2982155Z ==============================================================================
2018-01-29T02:57:37.2982452Z Task         : AWS Lambda .NET Core Deployment
2018-01-29T02:57:37.2982695Z Description  : Build and deploy a Serverless .NET Core application or AWS Lambda function
2018-01-29T02:57:37.2982935Z Version      : 1.0.17
2018-01-29T02:57:37.2983127Z Author       : Amazon Web Services
2018-01-29T02:57:37.2983415Z Help         : Please refer to [AWS Lambda Developer Guide](https://docs.aws.amazon.com/lambda/latest/dg/) for more information on working with AWS Lambda.
2018-01-29T02:57:37.2983717Z ==============================================================================
2018-01-29T02:57:37.7748540Z d4194695-5ae1-4063-ab6c-c39aa4079814 exists true
2018-01-29T02:57:37.7760737Z Deploying Lambda project at d:\a\1\s
2018-01-29T02:57:37.7761349Z Beginning dotnet restore
2018-01-29T02:57:37.7762506Z Path to tool: C:\Program Files\dotnet\dotnet.exe
2018-01-29T02:57:37.7920105Z [command]"C:\Program Files\dotnet\dotnet.exe" restore
2018-01-29T02:57:39.5923976Z   Restoring packages for d:\a\1\s\SimOpSolutions.API\SimOpSolutions.API.csproj...
2018-01-29T02:57:39.6242143Z   Restore completed in 68.14 ms for d:\a\1\s\SimOpSolutions.Domain\SimOpSolutions.Domain.csproj.
2018-01-29T02:57:39.6301464Z   Restore completed in 21.09 ms for d:\a\1\s\SimOpSolutions.Infrastructure\SimOpSolutions.Infrastructure.csproj.
2018-01-29T02:57:39.6331221Z   Restore completed in 39.68 ms for d:\a\1\s\SimOpSolutions.Tasks\SimOpSolutions.Tasks.csproj.
2018-01-29T02:57:39.6331539Z   Restore completed in 22.78 ms for d:\a\1\s\SimOpSolutions.API\SimOpSolutions.API.csproj.
2018-01-29T02:57:39.6895035Z   Restore completed in 65.8 ms for d:\a\1\s\SimOpSolutions.API\SimOpSolutions.API.csproj.
2018-01-29T02:57:40.6825142Z   Restore completed in 1.19 sec for d:\a\1\s\SimOpSolutions.API\SimOpSolutions.API.csproj.
2018-01-29T02:57:40.7265754Z Beginning Serverless Deployment
2018-01-29T02:57:40.7414630Z Path to tool: C:\Program Files\dotnet\dotnet.exe
2018-01-29T02:57:40.7427608Z [command]"C:\Program Files\dotnet\dotnet.exe" lambda deploy-serverless --disable-interactive true --region us-east-1 --stack-name SOSStack --s3-bucket mysoss3bucket --s3-prefix sosprefix
2018-01-29T02:57:40.9633852Z No executable found matching command "dotnet-lambda"
2018-01-29T02:57:40.9727346Z ##[error]Error: C:\Program Files\dotnet\dotnet.exe failed with return code: 1
2018-01-29T02:57:40.9786721Z ##[section]Finishing: Deploy to Lambda:  SOSStack

And here is my project's .csproj contents:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup Label="Configuration" Condition="'$(Configuration)|$(Platform)'=='Debug|AnyCPU'">
    <OutputType>exe</OutputType>
  </PropertyGroup>
  <PropertyGroup>
    <TargetFramework>netcoreapp2.0</TargetFramework>
    <GenerateRuntimeConfigurationFiles>true</GenerateRuntimeConfigurationFiles>
  </PropertyGroup>
  <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|AnyCPU'">
    <DocumentationFile>bin\Debug\netcoreapp2.0\SimOpSolutions.API.xml</DocumentationFile>
  </PropertyGroup>
  <ItemGroup>
    <Compile Remove="BrowserJsonFormatter.cs" />
    <Compile Remove="Controllers\ExternalFormController.cs" />
    <Compile Remove="Controllers\HomeController.cs" />
    <Compile Remove="Controllers\LoggerController.cs" />
    <Compile Remove="Controllers\UserController.cs" />
    <Compile Remove="GlobalExceptionFilter.cs" />
    <Compile Remove="RepositoriesInstaller.cs" />
  </ItemGroup>
  <ItemGroup>
    <PackageReference Include="AWSSDK.CloudFormation" Version="3.3.9" />
    <PackageReference Include="AWSSDK.IdentityManagement" Version="3.3.5.3" />
    <PackageReference Include="AWSSDK.Lambda" Version="3.3.12" />
    <PackageReference Include="Microsoft.AspNetCore.All" Version="2.0.3" />
    <PackageReference Include="AWSSDK.S3" Version="3.3.16.2" />
    <PackageReference Include="AWSSDK.Extensions.NETCore.Setup" Version="3.3.4" />
    <PackageReference Include="Amazon.Lambda.AspNetCoreServer" Version="2.0.0" />
    <PackageReference Include="AWSSDK.Route53" Version="3.3.13" />
    <PackageReference Include="EnyimMemcachedCore" Version="2.1.0.2" />
    <PackageReference Include="FluentScheduler" Version="5.3.0" />
    <PackageReference Include="FluentValidation.AspNetCore" Version="7.4.0" />
    <PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="2.0.1" />
    <PackageReference Include="Microsoft.AspNetCore.Server.IISIntegration" Version="2.0.1" />
    <PackageReference Include="Microsoft.AspNetCore.Server.Kestrel" Version="2.0.1" />
    <PackageReference Include="Microsoft.ApplicationInsights.AspNetCore" Version="2.1.1" />
    <PackageReference Include="Microsoft.AspNetCore" Version="2.0.1" />
    <PackageReference Include="Microsoft.AspNetCore.Mvc" Version="2.0.2" />
    <PackageReference Include="Microsoft.AspNetCore.Routing" Version="2.0.1" />
    <PackageReference Include="Microsoft.Extensions.Logging.Debug" Version="2.0.0" />
    <PackageReference Include="Microsoft.Extensions.Configuration.EnvironmentVariables" Version="2.0.0" />
    <PackageReference Include="Microsoft.Extensions.Configuration.FileExtensions" Version="2.0.0" />
    <PackageReference Include="Microsoft.Extensions.Configuration.Json" Version="2.0.0" />
    <PackageReference Include="Microsoft.Extensions.Logging" Version="2.0.0" />
    <PackageReference Include="Microsoft.Extensions.Options.ConfigurationExtensions" Version="2.0.0" />
    <PackageReference Include="AWSSDK.S3" Version="3.3.16.2" />
    <PackageReference Include="AWSSDK.Extensions.NETCore.Setup" Version="3.3.4" />
    <PackageReference Include="Amazon.Lambda.Core" Version="1.0.0" />
    <PackageReference Include="Amazon.Lambda.Serialization.Json" Version="1.1.0" />
    <PackageReference Include="Amazon.Lambda.AspNetCoreServer" Version="2.0.0" />
    <PackageReference Include="Amazon.Lambda.Logging.AspNetCore" Version="2.0.0" />
    <PackageReference Include="Newtonsoft.Json" Version="10.0.3" />
    <PackageReference Include="ServiceStack.Aws.Core" Version="5.0.2" />
    <PackageReference Include="Swashbuckle.AspNetCore" Version="1.1.0" />
    <PackageReference Include="Swashbuckle.AspNetCore.Swagger" Version="1.1.0" />
    <PackageReference Include="Swashbuckle.AspNetCore.SwaggerGen" Version="1.1.0" />
    <PackageReference Include="Swashbuckle.AspNetCore.SwaggerUi" Version="1.1.0" />
    <PackageReference Include="System.IO.Compression.ZipFile" Version="4.3.0" />
    <PackageReference Include="System.Runtime.Loader" Version="4.3.0" />
    <PackageReference Include="YamlDotNet.Signed" Version="4.3.0" />
  </ItemGroup>
  <ItemGroup>
    <DotNetCliToolReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Tools" Version="2.0.1" />
    <DotNetCliToolReference Include="Amazon.Lambda.Tools" Version="2.0.1" />
  </ItemGroup>
  <ItemGroup>
    <ProjectReference Include="..\SimOpSolutions.Domain\SimOpSolutions.Domain.csproj" />
    <ProjectReference Include="..\SimOpSolutions.Tasks\SimOpSolutions.Tasks.csproj" />
  </ItemGroup>
  <ItemGroup>
    <Reference Include="System.Configuration">
      <HintPath>..\..\..\..\..\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.6.2\System.Configuration.dll</HintPath>
    </Reference>
  </ItemGroup>
</Project>

Anyone have any idea how to get me past this issue?

S3 Upload Task: "Bucket does not exist" masking SSL certificate validation error

New to using AWS and these tasks (thanks for creating them!). I'm having trouble using the "S3 Upload" task, yet using the CLI directly to upload is working fine.

Using the S3 Upload task, I receive the error:
[screenshot of the error]

The configuration is as follows:
[screenshot of the configuration]

I'd ideally like to use the simple S3 task as it's easier to manage, with straightforward properties specific to the task... The CLI task works, but is not as straightforward to maintain.

It's worth mentioning that I am using an account with very limited access to the bucket; however, since it works successfully using aws s3 cp, it stands to reason that the S3 Upload task should work...

Thoughts? Thanks for your time!

AWS CLI Task - error when using "s3 cp" to upload a file

After troubleshooting my report on issue #5 I tried to use the AWS CLI to accomplish the same objective.
Uploading an artifact to an S3 bucket from VSTS.

This is also on a Hosted Linux agent.
The upload works as expected when I try to upload the file by manually executing "s3 cp" in the CLI.

To be able to run the cli on a hosted agent, I installed the official ubuntu package via apt-get.
The installed version for "awscli" is 1.11.13-1ubuntu1~16.04.0

The error output leads me to believe that there might be an issue with the string concatenation for the "configure set aws_secret_access_key" command.
As you can see in the log below the "aws_access_key_id" part is repeated from the previous command, and it looks like it doesn't parse correctly.

2017-08-19T20:31:27.4213130Z ##[section]Starting: AWS CLI: copy artifact to bucket
2017-08-19T20:31:27.4285300Z ==============================================================================
2017-08-19T20:31:27.4300490Z Task : AWS CLI
2017-08-19T20:31:27.4315920Z Description : Run an AWS CLI command
2017-08-19T20:31:27.4330900Z Version : 1.0.2
2017-08-19T20:31:27.4346380Z Author : Amazon Web Services
2017-08-19T20:31:27.4365750Z Help : The AWS Command Line Interface (CLI) is a unified tool to manage your AWS services. With just one tool to download and configure, you can control multiple AWS services from the command line and automate them through scripts. You must have the AWS CLI installed to use this task. See http://docs.aws.amazon.com/cli/latest/userguide/installing.html for more details.
2017-08-19T20:31:27.4382400Z ==============================================================================
2017-08-19T20:31:27.7578150Z 4604ae4d-7b67-42ae-bb99-59837d6e43ce exists true
2017-08-19T20:31:27.7816220Z [command]/usr/bin/aws configure set aws_access_key_id ********
2017-08-19T20:31:28.6579480Z [command]/usr/bin/aws configure set aws_access_key_id ******** configure set aws_secret_access_key ********
2017-08-19T20:31:28.9931390Z
2017-08-19T20:31:28.9950290Z Unknown options: configure,set,aws_secret_access_key,********
2017-08-19T20:31:29.0566510Z ##[error]/usr/bin/aws failed with return code: 255
2017-08-19T20:31:29.1260640Z ##[section]Finishing: AWS CLI: copy artifact to bucket

Here is a screenshot of my task configuration.
Note that I've masked the name of the bucket.
[screenshot: aws_issue]

Installing AWSPowerShell module on hosted agent takes 8 minutes

Hi, in the AWSPowerShell task, the AWSPowerShell module is installed on the build agent if it's not there yet. I'm testing with a hosted agent, but it takes more than 8 minutes for the module to be installed. This is the line of code that takes so long:
Install-Module -Name AWSPowerShell -Scope CurrentUser -Verbose -Force

You might not be able to influence this, but could you please work with Microsoft to find a solution, since it's holding up our releases quite a bit now.

Support for Serverless transformation

I'm trying to run my CloudFormation template with the latest AWS Tools extension for TFS (15.117.26714.0).

Getting the following error:

##[error]ValidationError: CreateStack cannot be used with templates containing Transforms.

Looks like there's no support for templates which require serverless transform:
Transform: AWS::Serverless-2016-10-31

Any chance this could be fixed anytime soon?

Thanks

S3 Upload - Setting Cache Control

These days we are building several single-page web applications; it would be very useful if vsts-tools had an option to set cache control settings, like:
--cache-control 'max-age=2592000, public'
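
For context, the underlying S3 API already accepts this value as the CacheControl parameter on each upload, so the task would only need to pass it through; a rough sketch (bucket, key and file names are placeholders, and this is not the task's code):

import * as AWS from 'aws-sdk';
import * as fs from 'fs';

const s3 = new AWS.S3({ region: 'us-east-1' }); // placeholder region

s3.putObject({
  Bucket: 'my-spa-bucket',                   // placeholder bucket
  Key: 'app/main.js',                        // placeholder key
  Body: fs.createReadStream('dist/main.js'), // placeholder local file
  ContentType: 'application/javascript',
  CacheControl: 'max-age=2592000, public',   // the header this issue asks for
}).promise()
  .then(() => console.log('uploaded with Cache-Control'))
  .catch(err => console.error(err.code, err.message));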

EC2 Deployment keeps failing

I've verified that the artifact output of a build is clean and does not add this element to the config, as I'd expect. However, once the EC2 deployment task within a release runs, the package that is uploaded to S3 for the deployment contains a rogue rewrite rule. This gives us 500s, and after removing this section the application works fine.

<rule name="AWS_BEANSTALK" stopProcessing="false">
     <match url="^(https?://[^/]+/)MyProject.Web_deploy" ignoreCase="true" negate="true" />
          <conditions>
               <add input="{PATH_INFO}" pattern="^/MyProject.Web_deploy" negate="true" />
          </conditions>
          <action type="Rewrite" url="{R:1}MyProject.Web_deploy{PATH_INFO}" logRewrittenUrl="true" /></rule>

beanstalk deploy VSTS Application app is missing web.config file.

Hello,

I am using AWS and VSTS.

After creating a .NET Core application I was able to upload and deploy. Woohoo!

However, after a few more deployments I can no longer achieve this.

Although a bit out of date, I used this guide as a starting point.
http://docs.aws.amazon.com/vsts/latest/userguide/vsts-ug.pdf

Here is the current error.....

Mon Oct 09 2017 03:20:49 GMT+0000 (Coordinated Universal Time) INFO Environment update is starting.
Mon Oct 09 2017 03:20:57 GMT+0000 (Coordinated Universal Time) INFO Deploying new version to instance(s).
Mon Oct 09 2017 03:22:00 GMT+0000 (Coordinated Universal Time) ERROR Error during deployment: Application app is missing web.config file.

This item is not documented correctly in the PDF linked, and changing it can result in different errors.
Published Application Path

When I choose the dropdown to select a path for this variable, it shows me my repo folder structure, and I can choose the sub-directory containing the csproj file. This does not appear to be correct.

Any advice?

Elastic Beanstalk instance with IIS default config.
.NET Core 2.0 Application MVC6

Kind regards,

Daniel

How to install AWS CLI on VSTS Hosted Linux agent ?

I'm trying to run the AWS CLI on a VSTS hosted Linux agent and am following the instructions from here:

http://docs.aws.amazon.com/cli/latest/userguide/awscli-install-linux.html

  • Download PIP
  • Install PIP
  • Use PIP to install AWS CLI

Seems simple enough and using the Command Line task in my VSTS release I was able to download and install PIP, but the very next command that attempts to use it fails:

##[error]Failed which: Not found pip: null
##[error]undefined failed with error: Failed which: Not found pip: null

Full logs below. What am I doing wrong?

Thanks,

Sam

##[section]Starting: Download PIP
==============================================================================
Task         : Command Line
==============================================================================
[command]/usr/bin/curl -O https://bootstrap.pypa.io/get-pip.py
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100 1558k  100 1558k    0     0  1884k      0 --:--:-- --:--:-- --:--:-- 1886k
##[section]Finishing: Download PIP
##[section]Starting: Install PIP
==============================================================================
Task         : Command Line
==============================================================================
[command]/usr/bin/python get-pip.py --user
Collecting pip
  Downloading pip-9.0.1-py2.py3-none-any.whl (1.3MB)
Collecting setuptools
  Downloading setuptools-36.4.0-py2.py3-none-any.whl (478kB)
Collecting wheel
  Downloading wheel-0.30.0-py2.py3-none-any.whl (49kB)
Installing collected packages: pip, setuptools, wheel
Successfully installed pip-9.0.1 setuptools-36.4.0 wheel-0.30.0
##[section]Finishing: Install PIP
##[section]Starting: Install AWS CLI
==============================================================================
Task         : Command Line
==============================================================================
##[error]Failed which: Not found pip: null
##[error]undefined failed with error: Failed which: Not found pip: null
##[section]Finishing: Install AWS CLI
##[section]Finishing: Release

AWS Elastic Beanstalk Version Label

Hi!
We're using TFS to deploy a PHP app to AWS Elastic Beanstalk and everything works great but we're missing the functionality to set the Version Label.

[screenshot: tfs]

[screenshot: aws]

Thanks,
Radu

S3 Upload zip file

Hi, I'm trying to upload a zip file, but the task does not find it.
I've tried $(ZipName).zip and also *.zip in the patterns, but there is no match.

Lambda .NET Core Deployment for netcoreapp2.0

With netcoreapp2.0 functionality recently announced for AWS Lambda, do I need to do anything special in my VSTS build task to make use of it? I am receiving the following errors after changing the TargetFramework to 2.0 and attempting a build:

2018-01-22T21:39:07.1381607Z Beginning Lambda Deployment
2018-01-22T21:39:07.1651518Z Path to tool: C:\Program Files\dotnet\dotnet.exe
2018-01-22T21:39:07.1679934Z [command]"C:\Program Files\dotnet\dotnet.exe" lambda deploy-function --disable-interactive true --region us-east-1 -fn FunctionName -fh FunctionName::FunctionName.Function::FunctionHandler --function-role arn:aws:iam::ID:role/Role --function-memory-size 256 --function-timeout 300
2018-01-22T21:39:21.9510247Z Executing publish command
2018-01-22T21:39:22.0504026Z ... invoking 'dotnet publish', working folder 'd:\a\1\s\FunctionName\bin\Release\netcoreapp1.0\publish'
2018-01-22T21:39:23.7069702Z ... publish: Microsoft (R) Build Engine version 15.4.8.50001 for .NET Core
2018-01-22T21:39:23.7070345Z ... publish: Copyright (C) Microsoft Corporation. All rights reserved.
2018-01-22T21:39:24.1988596Z ... publish: C:\Program Files\dotnet\sdk\2.0.3\Sdks\Microsoft.NET.Sdk\build\Microsoft.PackageDependencyResolution.targets(165,5): error : Assets file 'd:\a\1\s\FunctionName\obj\project.assets.json' doesn't have a target for '.NETCoreApp,Version=v1.0'. Ensure that restore has run and that you have included 'netcoreapp1.0' in the TargetFrameworks for your project. [d:\a\1\s\FunctionName\FunctionName.csproj]
2018-01-22T21:39:24.3386621Z ##[error]Error: C:\Program Files\dotnet\dotnet.exe failed with return code: 4294967295

Thanks

Support CodeDeploy blue/green deployment

Hi, great work so far! The 'manually provision instances' option with blue/green deployment isn't supported in the CodeDeploy task. I think I need that in the near future. Are there any plans to support this in a future release?

S3 Upload XMLParserError

When trying to upload files into a new S3 bucket, the build fails in the S3Upload task with the following error (full log related to the task below):

2017-10-04T15:22:47.0249089Z ##[debug]Evaluating condition for step: 'S3 Upload: BasicTemplate'
2017-10-04T15:22:47.0249089Z ##[debug]Evaluating: succeeded()
2017-10-04T15:22:47.0249089Z ##[debug]Evaluating succeeded:
2017-10-04T15:22:47.0249089Z ##[debug]=> (Boolean) True
2017-10-04T15:22:47.0249089Z ##[debug]Expanded: True
2017-10-04T15:22:47.0249089Z ##[debug]Result: True
2017-10-04T15:22:47.0249089Z ##[section]Starting: S3 Upload: BasicTemplate
2017-10-04T15:22:47.0405355Z ==============================================================================
2017-10-04T15:22:47.0405355Z Task : AWS S3 Upload
2017-10-04T15:22:47.0405355Z Description : Upload file and folder content to an Amazon Simple Storage Service (S3) Bucket
2017-10-04T15:22:47.0405355Z Version : 1.0.7
2017-10-04T15:22:47.0405355Z Author : Amazon Web Services
2017-10-04T15:22:47.0405355Z Help : Please refer to Working with Amazon S3 Buckets for more information on working with Amazon S3.
2017-10-04T15:22:47.0405355Z ==============================================================================
2017-10-04T15:22:47.3217918Z ##[debug]agent.TempDirectory=T:\AWS_AgtBld001_work_temp
2017-10-04T15:22:47.3217918Z ##[debug]loading inputs and endpoints
2017-10-04T15:22:47.3217918Z ##[debug]loading ENDPOINT_AUTH_$/
2017-10-04T15:22:47.3217918Z ##[debug]loading ENDPOINT_AUTH_****************************
2017-10-04T15:22:47.3217918Z ##[debug]loading ENDPOINT_AUTH_PARAMETER_$/ACCESSTOKEN
2017-10-04T15:22:47.3217918Z ##[debug]loading ENDPOINT_AUTH_PARAMETER
PASSWORD
2017-10-04T15:22:47.3217918Z ##[debug]loading ENDPOINT_AUTH_PARAMETER
USERNAME
2017-10-04T15:22:47.3217918Z ##[debug]loading ENDPOINT_AUTH_PARAMETER_SYSTEMVSSCONNECTION_ACCESSTOKEN
2017-10-04T15:22:47.3217918Z ##[debug]loading ENDPOINT_AUTH_SCHEME
$/
2017-10-04T15:22:47.3217918Z ##[debug]loading ENDPOINT_AUTH_SCHEME_***************************************
2017-10-04T15:22:47.3217918Z ##[debug]loading ENDPOINT_AUTH_SCHEME_SYSTEMVSSCONNECTION
2017-10-04T15:22:47.3217918Z ##[debug]loading ENDPOINT_AUTH_SYSTEMVSSCONNECTION
2017-10-04T15:22:47.3217918Z ##[debug]loading INPUT_AWSCREDENTIALS
2017-10-04T15:22:47.3217918Z ##[debug]loading INPUT_BUCKETNAME
2017-10-04T15:22:47.3374175Z ##[debug]loading INPUT_CREATEBUCKET
2017-10-04T15:22:47.3374175Z ##[debug]loading INPUT_FILESACL
2017-10-04T15:22:47.3374175Z ##[debug]loading INPUT_FLATTENFOLDERS
2017-10-04T15:22:47.3374175Z ##[debug]loading INPUT_GLOBEXPRESSIONS
2017-10-04T15:22:47.3374175Z ##[debug]loading INPUT_OVERWRITE
2017-10-04T15:22:47.3374175Z ##[debug]loading INPUT_REGIONNAME
2017-10-04T15:22:47.3374175Z ##[debug]loading INPUT_SOURCEFOLDER
2017-10-04T15:22:47.3374175Z ##[debug]loaded 19
2017-10-04T15:22:47.3374175Z ##[debug]Agent.ProxyUrl=undefined
2017-10-04T15:22:47.3374175Z ##[debug]Agent.CAInfo=undefined
2017-10-04T15:22:47.3374175Z ##[debug]Agent.ClientCert=undefined
2017-10-04T15:22:47.3686684Z ##[debug]check path : T:\AWS_AgtBld001_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.7\task.json
2017-10-04T15:22:47.3686684Z ##[debug]set resource file to: T:\AWS_AgtBld001_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.7\task.json
2017-10-04T15:22:47.3686684Z ##[debug]system.culture=en-US
2017-10-04T15:22:47.3686684Z ##[debug]awsCredentials=*****************************
2017-10-04T15:22:47.3842932Z *************************** exists true
2017-10-04T15:22:47.3842932Z ##[debug]***************************** exists true
2017-10-04T15:22:47.3842932Z ##[debug]regionName=eu-west-1
2017-10-04T15:22:47.3842932Z ##[debug]bucketName=BasicTemplate
2017-10-04T15:22:47.3842932Z ##[debug]overwrite=true
2017-10-04T15:22:47.3842932Z ##[debug]flattenFolders=false
2017-10-04T15:22:47.3842932Z ##[debug]sourceFolder=T:\AWS_AgtBld001_work\5\s
2017-10-04T15:22:47.3842932Z ##[debug]check path : T:\AWS_AgtBld001_work\5\s
2017-10-04T15:22:47.3842932Z ##[debug]targetFolder=null
2017-10-04T15:22:47.3842932Z ##[debug]globExpressions=**
2017-10-04T15:22:47.3842932Z ##[debug]filesAcl=public-read-write
2017-10-04T15:22:47.3842932Z ##[debug]createBucket=true
2017-10-04T15:22:47.3842932Z ##[debug]contentType=null
2017-10-04T15:22:47.4467990Z Bucket BasicTemplate does not appear to exist (or you do not have access). Attempting to create in region eu-west-1.
2017-10-04T15:22:47.7347845Z Failed to create bucket { XMLParserError: Unexpected close tag
2017-10-04T15:22:47.7347845Z Line: 6
2017-10-04T15:22:47.7347845Z Column: 7
2017-10-04T15:22:47.7347845Z Char: >
2017-10-04T15:22:47.7347845Z at error (T:\AWS_AgtBld001_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.7\S3Upload.js:22047:10)
2017-10-04T15:22:47.7347845Z at strictFail (T:\AWS_AgtBld001_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.7\S3Upload.js:22073:7)
2017-10-04T15:22:47.7347845Z at closeTag (T:\AWS_AgtBld001_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.7\S3Upload.js:22267:9)
2017-10-04T15:22:47.7347845Z at Object.write (T:\AWS_AgtBld001_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.7\S3Upload.js:22829:13)
2017-10-04T15:22:47.7347845Z at Parser.exports.Parser.Parser.parseString (T:\AWS_AgtBld001_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.7\S3Upload.js:10388:31)
2017-10-04T15:22:47.7347845Z at Parser.parseString (T:\AWS_AgtBld001_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.7\S3Upload.js:9887:59)
2017-10-04T15:22:47.7347845Z at NodeXmlParser.parse (T:\AWS_AgtBld001_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.7\S3Upload.js:21245:10)
2017-10-04T15:22:47.7347845Z at Request.extractError (T:\AWS_AgtBld001_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.7\S3Upload.js:24616:39)
2017-10-04T15:22:47.7347845Z at Request.callListeners (T:\AWS_AgtBld001_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.7\S3Upload.js:9476:20)
2017-10-04T15:22:47.7347845Z at Request.emit (T:\AWS_AgtBld001_work_tasks\S3Upload_3a219265-181c-4ed2-9a51-75a7f308f0d5\1.0.7\S3Upload.js:9448:10)
2017-10-04T15:22:47.7347845Z message: 'Unexpected close tag\nLine: 6\nColumn: 7\nChar: >',
2017-10-04T15:22:47.7347845Z code: 'XMLParserError',
2017-10-04T15:22:47.7347845Z retryable: true,
2017-10-04T15:22:47.7347845Z time: 2017-10-04T15:22:47.734Z,
2017-10-04T15:22:47.7347845Z statusCode: 400 }
2017-10-04T15:22:47.7347845Z ##[debug]task result: Failed
2017-10-04T15:22:47.7504026Z ##[error]XMLParserError: Unexpected close tag
Line: 6
Column: 7
Char: >
2017-10-04T15:22:47.7504026Z ##[debug]Processed: ##vso[task.issue type=error;]XMLParserError: Unexpected close tag%0ALine: 6%0AColumn: 7%0AChar: >
2017-10-04T15:22:47.7504026Z ##[debug]Processed: ##vso[task.complete result=Failed;]XMLParserError: Unexpected close tag%0ALine: 6%0AColumn: 7%0AChar: >
2017-10-04T15:22:47.7504026Z ##[section]Finishing: S3 Upload: BasicTemplate
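
A likely cause here is the bucket name: BasicTemplate contains uppercase letters, and S3 bucket names must be lowercase and DNS-compliant, so the CreateBucket call in eu-west-1 is rejected (the XMLParserError appears to be the SDK tripping over the error response rather than the real problem). Below is a minimal, hypothetical sketch of the kind of pre-flight check a pipeline script could run before the upload task; it is not part of the S3Upload task itself.

```typescript
// Hypothetical pre-flight check, not part of the S3Upload task itself.
// S3 bucket names must be 3-63 characters of lowercase letters, digits,
// dots, or hyphens, and must start and end with a letter or digit.
function isValidBucketName(name: string): boolean {
    return (
        name.length >= 3 &&
        name.length <= 63 &&
        /^[a-z0-9](?:[a-z0-9.-]*[a-z0-9])?$/.test(name)
    )
}

console.log(isValidBucketName('BasicTemplate')) // false - contains uppercase letters
console.log(isValidBucketName('basic-template')) // true
```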

AWS CLI task leaves credentials on agents

I've just noticed that there doesn't seem to be any post-job cleanup, meaning the ~/.aws/credentials file remains on the agent after the job has completed. This is not ideal for use cases where several teams share the same build agent pools with different AWS IAM requirements.
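
In the meantime, one workaround (not an official feature of the toolkit) is to add a final script step to the pipeline that removes the credentials file. A minimal sketch, assuming the agent has Node.js available and the file lives in the default location:

```typescript
// Hypothetical post-job cleanup step; run it as the last step of the
// pipeline so shared agents don't retain AWS credentials between jobs.
import * as fs from 'fs'
import * as os from 'os'
import * as path from 'path'

const credentialsFile = path.join(os.homedir(), '.aws', 'credentials')

if (fs.existsSync(credentialsFile)) {
    fs.unlinkSync(credentialsFile)
    console.log(`Removed ${credentialsFile}`)
} else {
    console.log('No credentials file found, nothing to clean up')
}
```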

S3 Upload Task is missing

After installing the AWS toolkit on the TFS server (2015 Update 3), the only AWS task that shows up is the AWS CLI task. None of the other tasks are displayed as available tasks for the build.

Issue with beanstalk and cloudformation

Pardon me if I'm asking dumb questions.

So for Beanstalk, my current folder structure is like this:
(screenshot of the folder structure)
What the extension does is add an HTTPS binding and an environment variable ("Development", "Staging", etc.), and I managed to build out this structure through TeamCity. Somehow the Beanstalk step packages up the bundle and generates aws-windows-deployment-manifest.json by itself, without including my extension script, which fails my deployment:
(screenshot)
How can I fix this?

My second question is about CloudFormation. My CloudFormation templates are nested, and I reference them in my main template:
(screenshots of the main template and the nested template references)
Without uploading them to S3, how does the create/update stack task know where to find my sub-templates?
What should I put in my bases3repo variable instead of my S3 bucket URL (since it will be running on a VSTS build agent now)?

Thank you for your time.
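
Regarding the nested-template question: the CloudFormation service resolves each nested AWS::CloudFormation::Stack resource through its TemplateURL property, which has to point at an object in S3, so the sub-templates do need to be uploaded somewhere the service can read them; they cannot be picked up from the build agent's working directory. A minimal, hypothetical sketch of the underlying call (names are made up, aws-sdk v2 assumed):

```typescript
// Hypothetical illustration: the parent stack can be created from a local
// template body, but every nested AWS::CloudFormation::Stack resource inside
// it is fetched by the CloudFormation service via an S3 TemplateURL, not
// from the build agent.
import * as AWS from 'aws-sdk'
import * as fs from 'fs'

const cloudFormation = new AWS.CloudFormation({ region: 'eu-west-1' })

async function createParentStack(): Promise<void> {
    await cloudFormation
        .createStack({
            StackName: 'parent-stack', // hypothetical name
            // The parent template can come straight from the checked-out source...
            TemplateBody: fs.readFileSync('main-template.yaml', 'utf8'),
            // ...but inside it, each nested stack needs something like:
            //   TemplateURL: https://s3.amazonaws.com/<your-bucket>/sub-template.yaml
            Capabilities: ['CAPABILITY_IAM'],
        })
        .promise()
}

createParentStack().catch(err => console.error(err))
```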

Assume Role Support?

Hi,

We would like to deploy CloudFormation stacks from a central role that is able to assume a deployment role across multiple AWS accounts. Is this possible using these tools?

Thanks!
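
For reference, the underlying mechanism for this scenario is an STS AssumeRole call made with the central account's credentials. A minimal sketch using the AWS SDK for JavaScript (the role ARN, session name, and region are hypothetical placeholders):

```typescript
// Hypothetical sketch: assume a deployment role in a target account and use
// the temporary credentials for a CloudFormation deployment there.
import * as AWS from 'aws-sdk'

async function assumeDeploymentRole(roleArn: string): Promise<AWS.CloudFormation> {
    const sts = new AWS.STS()
    const response = await sts
        .assumeRole({
            RoleArn: roleArn, // e.g. arn:aws:iam::<target-account>:role/deployment-role
            RoleSessionName: 'azure-devops-deployment', // hypothetical session name
        })
        .promise()

    const credentials = response.Credentials!
    return new AWS.CloudFormation({
        region: 'us-west-2',
        accessKeyId: credentials.AccessKeyId,
        secretAccessKey: credentials.SecretAccessKey,
        sessionToken: credentials.SessionToken,
    })
}
```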

credentials file always open

I have Visual Studio working with the AWS Toolkit using AWS role credentials.

I have a batch process that grabs credentials from a machine every hour and stores them in the default location so commands can run. This allows commands etc. to always have a fresh set of credentials without creating an IAM user.

However, the AWS Toolkit throws a blocker into this because it keeps the file open for as long as Visual Studio is open. This prevents me from writing the new credentials on a regular basis.

Could you please make the toolkit reload the credentials every few minutes or so instead of keeping the file open the entire time?
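
For illustration, the requested behaviour, reading the shared credentials file on demand (or on an interval) rather than holding a handle open, might look like the following sketch; this is purely hypothetical and not how the toolkit is actually implemented:

```typescript
// Hypothetical sketch of the requested behaviour: read the shared
// credentials file fresh each time it is needed, so an external process
// can rewrite it on its own schedule without hitting a file lock.
import * as fs from 'fs'
import * as os from 'os'
import * as path from 'path'

const credentialsFile = path.join(os.homedir(), '.aws', 'credentials')

function readCredentialsFile(): string {
    // Open, read, and close on every call; nothing keeps a handle on the file.
    return fs.readFileSync(credentialsFile, 'utf8')
}

// Example: refresh a cached copy every five minutes.
let cached = readCredentialsFile()
setInterval(() => {
    cached = readCredentialsFile()
    console.log(`Reloaded credentials (${cached.length} bytes)`)
}, 5 * 60 * 1000)
```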

Beanstalk deploy fails when version label already exists

We have a CI process set up where we deploy to Beanstalk from a release definition using the Deploy to Beanstalk task. We specify the version label to be the same as our build number, which is in the format 20180122.6. We have several environments (dev, qa, staging, prod) and we would like to deploy the same binary to all of them. The problem is that the task seems to assume it should ALWAYS upload a binary, and it fails if the version already exists. As a result our "first release" (say to dev) works, but all subsequent releases (qa, staging, prod) fail because the version label already exists. Could the task be updated to skip uploading the binary if it already exists, or is there a different recommendation for handling this use case?
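
A possible workaround until the task supports this natively is to check whether the application version already exists and skip the upload/creation step when it does. A minimal sketch with the AWS SDK for JavaScript (application name, version label, and region are hypothetical):

```typescript
// Hypothetical sketch: only create a new application version if the
// version label does not already exist in Elastic Beanstalk.
import * as AWS from 'aws-sdk'

async function versionExists(applicationName: string, versionLabel: string): Promise<boolean> {
    const beanstalk = new AWS.ElasticBeanstalk({ region: 'us-east-1' })
    const response = await beanstalk
        .describeApplicationVersions({
            ApplicationName: applicationName,
            VersionLabels: [versionLabel],
        })
        .promise()

    return (response.ApplicationVersions || []).length > 0
}

// Usage: when deploying the same build to a later environment, deploy the
// existing version rather than uploading the bundle again.
versionExists('my-app', '20180122.6').then(exists => {
    if (exists) {
        console.log('Version label already exists; skip the upload and deploy the existing version.')
    }
})
```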
