emicklei / gmig

Google Cloud Platform migrations tool for infrastructure-as-code

License: MIT License

Go 97.50% Makefile 2.04% Dockerfile 0.46%
gcp google-cloud-platform tool golang gcloud migrations infrastructure-as-code gcloud-migrations iam-policy

gmig's Introduction

gmig - GCP migrations

pronounced as gimmick.


Manage Google Cloud Platform (GCP) infrastructure using migrations that describe incremental changes such as additions or deletions of resources. This work is inspired by MyBatis migrations for SQL database setup.

Introduction blog post

Your gmig infrastructure is basically a folder with incremental change files, each with a timestamp prefix (for sort ordering) and readable name.

/010_create_some_account.yaml
/015_add_permissions_to_some_account.yaml
/my-gcp-production-project
    gmig.yaml

Each change is a single YAML file with one or more shell commands that change infrastructure for a project.

# add loadrunner service account

do:
- gcloud iam service-accounts create loadrunner --display-name "LoadRunner"

undo:
- gcloud iam service-accounts delete loadrunner

A change must have at least a do section and may have an undo section. The do section typically lists gcloud commands that create resources, but any available tool can be used. All lines of a section are executed at once in a single temporary shell script, so you can use shell variables to simplify the section. The undo section typically lists the gcloud commands that delete the same resources (in reverse order where relevant). Each command in each section can use the environment variables $PROJECT, $REGION, $ZONE and $GMIG_CONFIG_DIR, plus any additional environment variables populated from the target configuration (see the env section in the configuration below).
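Because each section runs as one shell script, a variable set on one line is visible to the lines after it. The following hypothetical migration illustrates this; the service-account address and viewer role are only an example:

```yaml
# add loadrunner service account with viewer role (illustrative example)
do:
- SA=loadrunner@$PROJECT.iam.gserviceaccount.com
- gcloud iam service-accounts create loadrunner --display-name "LoadRunner"
- gcloud projects add-iam-policy-binding $PROJECT --member serviceAccount:$SA --role roles/viewer

undo:
- SA=loadrunner@$PROJECT.iam.gserviceaccount.com
- gcloud projects remove-iam-policy-binding $PROJECT --member serviceAccount:$SA --role roles/viewer
- gcloud iam service-accounts delete loadrunner
```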

State

Information about the last applied migration to a project is stored as a Google Storage Bucket object. Therefore, using this tool requires you to create a Bucket and set its permissions (Storage Writer) accordingly. To view the current state of your infrastructure for each migration, you can add a view section to the YAML file, such as:

# add loadrunner service account

do:
- gcloud iam service-accounts create loadrunner --display-name "LoadRunner"

undo:
- gcloud iam service-accounts delete loadrunner

view:
- gcloud iam service-accounts describe loadrunner

and use the view subcommand.

Conditional migration

Commands (do, undo, view) can be made conditional by adding an if section. Only custom environment variables and configuration parameters (PROJECT, ZONE, REGION) can be used in expressions. If the expression evaluates to true, then the do (up), undo (down) and view (view) commands are executed.

if: PROJECT == "your-project-id"
do:
- gcloud config list

or with combinations:

if: (PROJECT == "your-project-id") && (ZONE == "my-zone")
do:
- gcloud config list

For available operators, see Language-Definition.

Help

NAME:
gmig - Google Cloud Platform infrastructure migration tool

USAGE:
gmig [global options] command [command options] [arguments...]

COMMANDS:
    init     Create the initial configuration, if absent.
    new      Create a new migration file from a template using a generated timestamp and a given title.
    up       Runs the do section of all pending migrations in order, one after the other.
             If a migration file is specified then stop after applying that one.
    down     Runs the undo section of the last applied migration only.
    down-all Runs the undo section of all applied migrations.
    plan     Log commands of the do section of all pending migrations in order, one after the other.
    status   List all migrations with details compared to the current state.
    view     Runs the view section of all applied migrations to see the current state reported by your infrastructure.
    force    state | do | undo
    util     create-named-port | delete-named-port
    export   project-iam-policy | storage-iam-policy
    help, h  Shows a list of commands or help for one command

GLOBAL OPTIONS:
-q                   quiet mode, accept any prompt
-v                   verbose logging
--help, -h           show help
--print-version, -V  print only the version

Getting started

Installation

Compile and install it using the Go toolchain:

go install github.com/emicklei/gmig@latest

init <path>

Prepares your setup for working with migrations by creating a gmig.yaml file in a target folder.

gmig init my-gcp-production-project

Then your filesystem will have:

/my-gcp-production-project/
    gmig.yaml

You must change the file gmig.yaml to set the Project and Bucket name.

# gmig configuration file
#
# Google Cloud Platform migrations tool for infrastructure-as-code. See https://github.com/emicklei/gmig.

# [project] must be the Google Cloud Project ID where the infrastructure is created.
# Its value is available as $PROJECT in your migrations.
#
# Required by gmig.
project: my-project

# [region] must be a valid GCP region. See https://cloud.google.com/compute/docs/regions-zones/
# A region is a specific geographical location where you can run your resources.
# Its value is available as $REGION in your migrations.
#
# Not required by gmig but some gcloud and gsutil commands do require it.
# region: europe-west1

# [zone] must be a valid GCP zone. See https://cloud.google.com/compute/docs/regions-zones/
# Each region has one or more zones; most regions have three or more zones.
# Its value is available as $ZONE in your migrations.
#
# Not required by gmig but some gcloud and gsutil commands do require it.
# zone: europe-west1-b

# [bucket] must be a valid GCP Storage bucket.
# A Google Storage Bucket is used to store information (object) about the last applied migration.
# Bucket can contain multiple objects from multiple applications. Make sure the [state] is different for each app.
#
# Required by gmig.
bucket: my-bucket

# [state] is the name of the object that holds information about the last applied migration.
# Required by gmig.
state: myapp-gmig-last-migration

# [env] are additional environment values that are available to each section of a migration file.
# This can be used to create migrations that are independent of the target project.
# By convention, use capitalized words for keys.
# In the example, "myapp-cluster" is available as $K8S_CLUSTER in your migrations.
#
# Not required by gmig.
env:
  K8S_CLUSTER: myapp-cluster

If you decide to store state files of different projects in one Bucket, then set the state object name to reflect this, e.g. myproject-gmig-state. If you want to apply the same migrations to different regions/zones, then choose a target folder name to reflect this, e.g. my-gcp-production-project-us-east. Values for region and zone are required if you want to create Compute Engine resources. The env map can be used to parameterize commands in your migrations. In the example, all commands have access to the value of $K8S_CLUSTER.
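A migration in this setup could then use the extra variable like this (a hypothetical example; the cluster name comes from the env map of the target configuration):

```yaml
# point kubectl at the target cluster (illustrative example)
do:
- gcloud container clusters get-credentials $K8S_CLUSTER --region $REGION --project $PROJECT
```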

new <title>

Creates a new migration for you to describe a change to the current state of infrastructure.

gmig new "add storage view role to cloudbuild account"

Using a combination of the options --do, --undo and --view, you can set the commands directly for the new migration.

status <path> [--migrations folder]

List all migrations with an indicator (applied, pending) showing whether each has been applied or not.

gmig status my-gcp-production-project/

Run this command in the directory where all migrations are stored. Use --migrations for a different location.

plan <path> [stop] [--migrations folder]

Log commands of the do section of all pending migrations in order, one after the other. If stop is given, then stop after that migration file.

up <path> [stop] [--migrations folder]

Executes the do section of each pending migration compared to the last applied change to the infrastructure. If stop is given, then stop after that migration file. Upon each completed migration, the gmig-last-migration object is updated in the bucket.

gmig up my-gcp-production-project

down <path> [--migrations folder]

Executes the undo section of the last applied change to the infrastructure. If completed, the gmig-last-migration object is updated.

gmig down my-gcp-production-project

down-all <path> [--migrations folder]

Executes the undo section of all applied changes to the infrastructure. Updates the gmig-last-migration object after each successful step.

gmig down-all my-gcp-production-project

view <path> [migration file] [--migrations folder]

Executes the view section of each applied migration to the infrastructure. If a migration file is given, only that view is run.

gmig view my-gcp-production-project

template [-w] source-file

Processes the source-file as a Go template and writes the result to stdout. If -w is given, the source is rewritten with the processed content. The following functions are available:

env

This function looks up its argument in the OS environment values. Example of a configuration snippet that needs the environment-dependent value of $PROJECT:

project: {{ env "PROJECT" }}

Example:

gmig template some-config.template.yaml > some-config.yaml

Export existing infrastructure

Exporting migrations from existing infrastructure is useful when you start working with gmig but do not want to start from scratch. Several subcommands are (or will become) available to inspect a project and export migrations that reflect the current state. After marking the current state in gmig (using force state), new migrations can be added that bring your infrastructure to the next state. The generated migration can of course also be used just to copy commands into your own migration.

export project-iam-policy <path>

Generate a new migration by reading all the IAM policy bindings from the current infrastructure of the project.

gmig -v export project-iam-policy my-project/

export storage-iam-policy <path>

Generate a new migration by reading all the IAM policy bindings, per Google Storage Bucket owned by the project.

gmig -v export storage-iam-policy my-project/

Working around migrations

Sometimes you need to fix things because you made a mistake or want to reorganise your work. Use the force and confirm your action.

force state <path> <filename>

Explicitly set the state for the target to the last applied filename. This command can be useful if you need to work from existing infrastructure. Effectively, this filename is written to the bucket object. Use this command with care!

gmig force state my-gcp-production-project 010_create_some_account.yaml

force do <path> <filename>

Explicitly run the commands in the do section of a given migration filename. The gmig-last-migration object is not updated in the bucket. Use this command with care!

gmig force do my-gcp-production-project 010_create_some_account.yaml

force undo <path> <filename>

Explicitly run the commands in the undo section of a given migration filename. The gmig-last-migration object is not updated in the bucket. Use this command with care!

gmig force undo my-gcp-production-project 010_create_some_account.yaml

export-env <path>

Export all available environment variables from the configuration file, as well as $PROJECT, $REGION and $ZONE. Use this command with care!

eval $(gmig export-env my-gcp-production-project)

GCP utilities

util create-named-port <instance-group> <name:port>

The Cloud SDK has a command to set-named-ports but not a command to add or delete a single name:port mapping. To simplify the migration command for creating a name:port mapping, this gmig util command is added. First it calls get-named-ports to retrieve all existing mappings. Then it will call set-named-ports with the new mapping unless it already exists.
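The add-unless-present behaviour can be sketched in plain shell on a comma-separated mapping list (an illustration of the logic only, not gmig's actual Go implementation):

```shell
#!/bin/sh
# Append a name:port mapping to a comma-separated list, unless it is already present.
add_named_port() {
  existing=$1
  new=$2
  case ",$existing," in
    *",$new,"*) printf '%s\n' "$existing" ;;        # already mapped: keep the list as-is
    *)          printf '%s\n' "$existing,$new" ;;   # not found: append the new mapping
  esac
}

add_named_port "http:8080,metrics:9090" "grpc:8443"   # http:8080,metrics:9090,grpc:8443
add_named_port "http:8080,metrics:9090" "http:8080"   # http:8080,metrics:9090
```

gmig applies the same idea against the live mappings returned by get-named-ports before calling set-named-ports.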

util delete-named-port <instance-group> <name:port>

The Cloud SDK has a command to set-named-ports but not a command to add or delete a single name:port mapping. To simplify the migration command for deleting a name:port mapping, this gmig util command is added. First it calls get-named-ports to retrieve all existing mappings. Then it will call set-named-ports without the mapping.

util add-path-rules-to-path-matcher [config folder] -url-map [url-map-name] -service [backend-service-name] -path-matcher [path-matcher-name] -paths "/v1/path/, /v1/otherpath/"

The Cloud SDK has a command to add a path matcher with a set of paths, but not a command to update the path rules of an existing path matcher in the URL map. To write a migration that changes the set of paths (add, remove), this gmig util command is added. First it exports the URL map, updates the paths of the rules of a path matcher, then imports the changed URL map. Because this migration changes a regional resource that is typically shared by multiple services, the patching of the URL map is executed under a global lock (using the Bucket from the config).

Examples

This repository has a number of examples of migrations.

© 2022, ernestmicklei.com. MIT License. Contributions welcome.

gmig's People

Contributors

emicklei, jroosing, lewtonnn, mrlauy, nima-kramphub, ormanli, tkivisik


gmig's Issues

[IMPROVEMENT IDEA] Standardize logging

Date in ISO
Currently, status output starts with a timestamp (e.g. 2018/10/22 16:55:35). I propose the date be formatted in ISO format (i.e. 2018-10-22).

Column Separator to Tab
Currently, the logging column separator is a space ' ', which can also appear inside the columns themselves. This means cutting out columns (e.g. cut -d ' ' -f 1 gmig_log_file.txt) requires significant effort to get right. I propose the column separator be changed to '\t', which is the default separator in Bash cut and some other tools, and is unlikely to be part of a filename. Also consider sorting by a column other than the first - currently very tricky.
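With tab-delimited columns, stock tools work without extra options; for example (assuming the proposed tab-separated status output):

```shell
#!/bin/sh
# cut's default delimiter is tab, so a column extracts cleanly even when
# fields (like migration titles) contain spaces.
line=$(printf '2018-10-22\tapplied\tadd firewall rule for lb service\n' | cut -f 3)
echo "$line"   # add firewall rule for lb service
```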

Filename
Currently, the filename is written inside parentheses. I propose it be turned into a column of its own.

stdout instead of stderr
Currently, logs end up in stderr, not in stdout (where I expected them). That means piping as follows didn't work: gmig status project >> log.txt, but this worked: gmig status project 2>> log.txt

Environment vars

When i have the following config:

{
    "project": "woopwoop",
    "region": "us-central1",
    "zone": "us-central1-a",
    "bucket": "woopwoop-gmig-staging",
    "state": "woopwoop-state",
    "env": {
        "stage": "staging",
        "data_location": "US",
        "upload_bucket_prefix": "upload-"
    }
}

And the following commands

do:
- gsutil mb -p $project -c regional -l $region gs://$upload_bucket_prefix$project/
undo:
- gsutil rm -r gs://$upload_bucket_prefix$project

The error:

2018/10/18 15:32:37 executing [do] failed, error: [failed to run migration section:
gsutil mb -p $PROJECT -c regional -l $region gs://$upload_bucket_prefix$PROJECT/
CommandException: The mb command requires at least 1 argument. Usage:

  gsutil mb [-c class] [-l location] [-p proj_id] [--retention time] url...

For additional help run:
  gsutil help mb

.....

I can't get it to work. However, when I copy/paste the project and region fields into the "env" part, everything runs fine.

Is this intended behavior?

Tried versions 1.6.0/1.7.0/1.7.1

migration file in force command is not found

gmig force undo migrations/kramp-hub/ migrations/015_create_internal_waiting_pubsub_resources.yaml

results in

ERROR: no such migration (wrong project?, git pull?):/Users/emicklei/Projects/parzello-infra/migrations/migrations/015_create_internal_waiting_pubsub_resources.yaml

because it uses an incorrect path (the migrations folder is prepended twice)

Reverse scan for removals/deletions

When resources are deleted, it might be preferable to afterwards remove both the creation and deletion scripts, or perhaps flag them, so they are excluded when setting up a new environment.

[IMPROVEMENT IDEA] - short index instead of migration timestamp

Problem:

  • command line autocomplete is tedious with the current long timestamps. Especially when the migrations were created only seconds apart (with a script).
  • filenames are unnecessarily long. It means there is less meaningful content displayed whenever filenames are involved, e.g. in gmig as well as here in GitHub ( https://github.com/emicklei/gmig/tree/master/examples )
  • timestamp serves no further purpose than ordering of migrations. For example, migration A could be updated after many more migrations B C D have been added. Timestamp therefore can't be seen as 'last edit' time nor as a 'time of creation' as D could be made to run earlier than A by changing D's timestamp.

Proposed solution:

  • Replace the current long timestamp with a short index instead (e.g. 01).

Benefits

  • autocomplete is easier - one keystroke to narrow our search to 10 migrations, two keystrokes to a unique migration. The unique characters are also at the beginning of the filename, making it easier to read and find them.
  • more space is left for meaningful content of the migration.
  • until convenience functions are written to allow inserting a migration immediately after some existing migration, indexes could be added with a gap in between (01, 04, 07, 10 etc)

add describe section to each migration

A migration lists the commands to change Google infrastructure and, optionally, to revert the change.
The tool gmig can show the status with respect to what migrations have been applied.
However, it does not tell you what the current status of your infrastructure is.
For that, you need to go to the Console or use the gcloud command to describe the resources.

This issue proposes to extend migration files with a section that has the commands to do just that: describe the resources just created, changed or even deleted.

do:
- add a firewall rule

describe:
- show the created firewall rule

undo:
- delete a firewall rule

Alternative names for describe are view, list, show, inspect.
With this section, we can have a command such as

gmig view production/

which reports all the resources created by all migrations.

use verbose flag to expand variables in commands

currently, executing the commands of a section (do, undo) prints the commands as they are written in the migration; i.e. any variables are not expanded before printing.

this feature is to print commands with expanded variables only if gmig is running in verbose mode.

deprecate JSON config in favour of YAML

Motivation: yaml allows for documenting all the required and optional properties.

gmig should keep supporting the JSON format.

  1. try to read gmig.yaml or gmig.yml
  2. if not found then try to read gmig.json
  3. if gmig.json is found then print a deprecation warning
  4. if gmig.json is found then print the yaml contents (such that the gmig.yaml can be created by the user) ?

gmig all down

currently, executing gmig down <folder> will run the undo section of the last migration only.

This feature will allow you to run all undo sections of all migrations, one after the other.

improve status output

currently, one list entry produced by gmig status may look like this

2018/08/29 14:54:34 --- applied --- 2018-08-28 07:53:10 20180828t075310 add firewall rule for lb service

Notice the duplicate information about the timestamp. This should be removed.
The meaning of the migration is extracted from the filename.
In practice it is convenient to also print the actual filename of the migration.
Proposal:

2018/08/29 14:54:34 --- applied --- 2018-08-28 07:53:10 add firewall rule for lb service (20180828t075310_add_firewall_rule_for_lb_service)

Unexpected: any yaml file treated as migration

Currently, all yaml files are treated (found) as migration files.

gmig status project will list all yaml files, even when they don't follow the naming convention (currently starts with a timestamp, future starts with an index)

I propose that a check be made to verify whether a yaml file follows a migration file naming convention.

support migration from old timestamp prefix to sequence prefix

Recently, the gmig tool started using a different strategy for naming the migration files.
The old strategy used a timestamp format.
The new strategy uses a simple sequence, e.g. 010, 015 etc.
To support the conversion to the new strategy for existing projects that use the old strategy, the gmig tool could have a separate command called "upgrade" that performs this and any future tasks.
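Such an upgrade could rename files along these lines (a rough sketch only, not an existing gmig command; it uses a throwaway directory and gapped indexes to leave room for later inserts):

```shell
#!/bin/sh
# Sketch: convert timestamp-prefixed migration names to gapped sequence prefixes.
set -e
dir=$(mktemp -d)
touch "$dir/20180828t075310_add_firewall_rule.yaml" \
      "$dir/20180901t120000_add_service_account.yaml"

i=10
for f in $(ls "$dir" | sort); do       # timestamp prefixes sort chronologically
  suffix=${f#*_}                       # drop everything up to the first underscore
  mv "$dir/$f" "$dir/$(printf '%03d' "$i")_$suffix"
  i=$((i + 5))                         # leave gaps for inserting migrations later
done
ls "$dir"   # lists 010_add_firewall_rule.yaml and 015_add_service_account.yaml
```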
