Lambda Deploy - Easily Deploy Code to AWS Lambda

Note: 0.1.0 introduces a change in how Lambda Deploy operates: it no longer assumes that every directory in the current working directory is a Lambda. Instead, the current directory itself is treated as the Lambda, and you run the tool once per directory to deploy multiple Lambdas.

This tool provides an easy way to deploy your code to AWS's Lambda service. It provides a number of useful features:

  • Uses the standard boto/aws configuration options
  • Handles packaging of (pure python) dependencies
  • Allows for providing environment variables to Lambda
  • Simplifies deployment steps

It should be noted that this is alpha software: issues are expected, as are changes to the interface. Where an issue is known, I try to document it here, but if you find something, please open an issue.

Usage

At its heart, the tool takes the directory it is run from, packages it up, names the resulting Lambda after the directory, and pushes it to AWS with the options you configure (see Configuration below).

While the tool is oriented towards Python at the moment, there is no reason it could not push other types of code to AWS, and its dependency bundling could be extended to support other languages. If this interests you, please open a ticket.

A simple example usage would be the following:

$ lambda-deploy deploy

This will load all the contents of the current working directory into a Lambda and upload it to AWS.

There is one other command aside from deploy: list. You can guess what list does - it lists your current Lambdas along with some information about them.
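
For example:

$ lambda-deploy list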

At its most basic that's it. The next section will cover how to configure things.

Configuration

Configuration of the tool can be done through two primary avenues: command line arguments and environment variables. For environment variables, you can either set them yourself using traditional means, or make use of a .env file, which the tool will read in to populate the environment.

Command line arguments will override environment variables.
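
For example, a .env file in your project might look like the following (the role ARN here is just a placeholder):

LAMBDA_ROLE=arn:aws:iam::123456789012:role/my-lambda-role
LAMBDA_TIMEOUT=30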

AWS Credentials

You can configure AWS credentials in any way that boto3 supports. This tool does not touch credentials at all and relies entirely on boto3 to pick them up. There is no way to pass them via the command line.

Lambda Options

There are several options that can be passed to your Lambda jobs, as well as one required piece of information. Options that can also be configured via an environment variable have that variable listed in parentheses in the heading.

These correspond to boto3 arguments, so if something is unclear, I recommend checking the boto3 documentation.

Role (LAMBDA_ROLE)

The only required piece of information is the ARN of the role your Lambda job will operate under when communicating with AWS. This can be specified via the -r/--role option on the command line, or via the environment variable. You must configure it via one of these methods, and if you've done both, the command line takes precedence.
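
For example, on the command line (the ARN below is a placeholder for your own role's ARN):

$ lambda-deploy -r arn:aws:iam::123456789012:role/my-lambda-role deploy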

Runtime (LAMBDA_RUNTIME)

This is the runtime on AWS Lambda that your code will run under. It defaults to python2.7, and can only be changed via the environment variable.

Handler (LAMBDA_HANDLER)

This controls the entry point of your Lambda code. It defaults to lambda_function.lambda_handler, and can only be changed via the environment variable.

This means, for example, that inside your Lambda directory you would have a file lambda_function.py which contains a function lambda_handler.
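
A minimal handler matching those defaults might look like the following (the body is just a placeholder):

# lambda_function.py
def lambda_handler(event, context):
    # event carries the invocation payload; context carries runtime metadata
    return {"message": "hello from Lambda"}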

Name (LAMBDA_NAME)

This is the name attached to your Lambda job on AWS. It defaults to the directory name the code resides in. This can be specified via the -n/--name option on the command line or the environment variable.
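
For example, to override the directory-based default from the command line ("my-function" is a placeholder):

$ lambda-deploy -n my-function deploy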

Description (LAMBDA_DESCRIPTION)

This is the description attached to your Lambda job on AWS. It defaults to "Lambda code for " followed by the name of your Lambda job, and can only be changed via the environment variable.

Timeout (LAMBDA_TIMEOUT)

This is the amount of time, as an integer number of seconds, that Lambda should allow your job to run before it is killed. This value defaults to 3 seconds and can only be changed via the environment variable. The maximum that AWS allows is 300 seconds.

Memory Size (LAMBDA_MEMORY_SIZE)

The amount of memory, expressed as an integer number of megabytes, that should be allocated to your job. The default is 128, values must be multiples of 64, and this can only be changed via the environment variable.

The amount of CPU is also inferred based on this. For specifics, as well as maximums, I recommend you check the AWS documentation.
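
Since the description, timeout, and memory size are environment-only, they typically live in your .env file; a sketch with illustrative values:

LAMBDA_DESCRIPTION=nightly-report-generator
LAMBDA_TIMEOUT=60
LAMBDA_MEMORY_SIZE=256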

Tool Options

The following options can be provided to tweak how the tool runs. Options that can also be configured via an environment variable have that variable listed in parentheses in the heading.

Environment File (LAMBDA_ENV_FILE)

You can specify a different environment file (from the default .env) to populate the environment:

$ lambda-deploy -e /my/env/file deploy

Note that shell expansions haven't been tested here yet.

Note that a .env file inside your Lambda directory will not be uploaded, to protect you from accidentally uploading sensitive information. Use LAMBDA_ENV_VARS as described below.

Environment Variables (LAMBDA_ENV_VARS)

You can specify one or more environment variables to pluck out of the environment the tool is running in, which will be placed in a .env file that will be shipped with your Lambdas.

$ lambda-deploy -E MY_ENV_VAR -E MY_OTHER_ENV_VAR deploy

When setting this as an environment variable itself, you can provide multiple values in a comma-delimited fashion:

LAMBDA_ENV_VARS=MY_ENV_VAR,MY_OTHER_ENV_VAR

This is useful for keeping all your configuration inside a .env file. If, for example, MY_ENV_VAR had a value of "foo" and MY_OTHER_ENV_VAR had a value of "bar", providing the options above would result in a .env file being created in your Lambda that looks like the following:

MY_ENV_VAR=foo
MY_OTHER_ENV_VAR=bar

Lambda Directory (LAMBDA_DIRECTORY)

By default the tool uses the current working directory as its base to package, but you can change this by providing this option:

$ lambda-deploy -d /another/directory deploy

Like the environment file, support for things like shell expansions isn't really there yet.

Logging Level (LAMBDA_LOGGING_LEVEL)

In order to change the logging level, you can simply provide the -v/--verbose option to get DEBUG level logging, or you can specify what you want using the -l/--logging-level option:

$ lambda-deploy -l WARNING deploy

These correspond to standard Python logging module levels - CRITICAL, ERROR, WARNING, INFO, DEBUG or NOTSET.
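
The level can also come from the environment, for example in your .env file:

LAMBDA_LOGGING_LEVEL=WARNING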

Automatic Dependency Bundling

One of the nicest features of this tool is that you can use a requirements.txt file as you normally would, and have those dependencies bundled at the time you build your Lambda, without polluting your local development environment or even requiring a virtual environment.

Just place the requirements.txt in the root of your Lambda's folder (i.e. alongside your Lambda handler file) and we'll handle the rest.
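
For example, a requirements.txt pinning a couple of pure Python packages (the packages and versions here are only illustrative) might contain:

requests==2.9.1
six==1.10.0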

If this sounds too good to be true, it is, or at least there are some limits. Unfortunately, while this works well for pure Python modules, modules with compiled resources will not work directly.

There are some ways around this. If you build modules on an Amazon Linux x86_64 EC2 instance, then as long as the resulting code is relatively self-contained (i.e. doesn't require compiled binaries installed elsewhere on the system), you should be able to copy it off that system into your Lambda bundle.
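
A rough sketch of that workflow, run on an Amazon Linux x86_64 instance (the package name and paths are placeholders, and this isn't something the tool automates):

$ pip install --target ./vendored some-compiled-package
$ cp -r ./vendored/* /path/to/your/lambda/directory/

The copied packages then get bundled along with everything else in your Lambda directory on the next deploy.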

Additionally, if you search around you can find pip-installable versions of some packages that people have built specifically to work on Lambda.

Feel free to open an issue if you have problems getting this to work.

An Example of Deploying Multiple Lambdas

A common use case is having a stable of Lambdas that you would like deployed, perhaps as part of a CI solution. For example, you might have a single git repo containing a directory called "lambdas", which in turn contains your individual Lambda directories, like so:

$ ls lambdas/
lambdaA lambdaB

Assuming your CI solution puts the current git commit SHA1 in an environment variable, let's call it GIT_COMMIT_SHA1, you could construct a command like the following to release only the Lambdas that changed:

$ git show --pretty="format:" --name-only $GIT_COMMIT_SHA1 | grep '^lambdas' | cut -d/ -f 1-2 | uniq | xargs -I {} sh -c 'test -d "{}" && lambda-deploy -d "{}" deploy'

This correctly deals with not acting on things outside of your lambdas directory, and only uploading a Lambda if it changed. It does not remove existing Lambdas if they are removed from your git source - that's still something you'd need to do manually.

Development and Support

Pull requests and issues are welcome - join us on GitHub

lambda-deploy's Issues

Have Default Action be Single Upload of the Current Directory

As requested in issue #1, it might make more sense that the default deploy action is to deploy the current directory as a single Lambda, and only search for multiple Lambdas if the -d option is given. Some things to consider:

  • We'll want to filter out .env files then, since we don't want to upload their credentials by default
  • Is there any value in confirming that the entry point as specified exists? Probably. That could be good in either case.

Issue deploying on Windows 10

Trying to get this going on Windows, but running into an error:

2016-04-18 11:38:33:INFO:botocore.credentials: Found credentials in environment variables.
2016-04-18 11:38:33:INFO:lambda_deploy: Packaging lambda .env
Traceback (most recent call last):
  File "C:\Users\Joe\appdata\local\scoop\apps\python27\2.7.10\scripts\lambda-deploy-script.py", line 9, in <module>
    load_entry_point('lambda-deploy==0.0.1', 'console_scripts', 'lambda-deploy')()
  File "C:\Users\Joe\appdata\local\scoop\apps\python27\2.7.10\lib\site-packages\lambda_deploy\lambda_deploy.py", line 321, in main
    method()
  File "C:\Users\Joe\appdata\local\scoop\apps\python27\2.7.10\lib\site-packages\lambda_deploy\lambda_deploy.py", line 133, in deploy
    zfh = self.package(lambda_name)
  File "C:\Users\Joe\appdata\local\scoop\apps\python27\2.7.10\lib\site-packages\lambda_deploy\lambda_deploy.py", line 98, in package
    if 'requirements.txt' in os.listdir(src_dir):
WindowsError: [Error 267] The directory name is invalid: 'C:\\Dev\\python\\kodi-alexa-master\\.env/*.*'

This is my .env file:

AWS_ACCESS_KEY_ID=***********
AWS_SECRET_ACCESS_KEY=*****************
LAMBDA_ROLE=lambda_basic_execution
LAMBDA_TIMEOUT=30

It looks like it's trying to use .env as a directory that everything is in. Anything I can try?

Doesn't Recognize Region

Having issues deploying: I'm getting NoRegionError even though the region is specified.

2016-12-26 22:28:33:INFO:botocore.vendored.requests.packages.urllib3.connectionpool: Starting new HTTP connection (1): 169.254.169.254
Traceback (most recent call last):
  File "/usr/local/bin/lambda-deploy", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python2.7/dist-packages/lambda_deploy/lambda_deploy.py", line 360, in main
    name=options.name
  File "/usr/local/lib/python2.7/dist-packages/lambda_deploy/lambda_deploy.py", line 75, in __init__
    self.client = boto3.client('lambda')
  File "/usr/local/lib/python2.7/dist-packages/boto3/__init__.py", line 83, in client
    return _get_default_session().client(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/boto3/session.py", line 263, in client
    aws_session_token=aws_session_token, config=config)
  File "/usr/local/lib/python2.7/dist-packages/botocore/session.py", line 828, in create_client
    client_config=config, api_version=api_version)
  File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 69, in create_client
    verify, credentials, scoped_config, client_config, endpoint_bridge)
  File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 222, in _get_client_args
    verify, credentials, scoped_config, client_config, endpoint_bridge)
  File "/usr/local/lib/python2.7/dist-packages/botocore/args.py", line 44, in get_client_args
    endpoint_url, is_secure, scoped_config)
  File "/usr/local/lib/python2.7/dist-packages/botocore/args.py", line 101, in compute_client_args
    service_name, region_name, endpoint_url, is_secure)
  File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 295, in resolve
    service_name, region_name)
  File "/usr/local/lib/python2.7/dist-packages/botocore/regions.py", line 122, in construct_endpoint
    partition, service_name, region_name)
  File "/usr/local/lib/python2.7/dist-packages/botocore/regions.py", line 135, in _endpoint_for_partition
    raise NoRegionError()
botocore.exceptions.NoRegionError: You must specify a region.

In my .env.lambda:
AWS_DEFAULT_REGION = us-east-1

I even hard-coded the region into regions.py and it still doesn't work.
