
briancaffey / django-step-by-step

Stars: 168 · Watchers: 9 · Forks: 37 · Size: 22.27 MB

A Django + Vue reference project that focuses on developer tooling and CI/CD + IaC

Home Page: https://briancaffey.github.io/django-step-by-step/

Languages: Python 41.12% · TypeScript 15.14% · HTML 9.47% · Vue 8.18% · JavaScript 7.62% · HCL 5.47% · Shell 4.05% · Jupyter Notebook 3.97% · Makefile 3.93% · Dockerfile 0.72% · SCSS 0.25% · CSS 0.09%
Topics: django · docker · python · terraform · cicd · github-actions · celery · typescript · vue · quasar-framework

django-step-by-step's Issues

remove heroku support

Remove support for building with Heroku:

  • requirements.txt file in root directory
  • Procfile
  • runtime.txt
  • Documentation in VuePress

Refactor application upgrade scripts

Both the frontend and backend applications are upgraded via AWS CLI calls.

The frontend application upgrade script looks like this:

Script
TASK_FAMILY=$WORKSPACE-$TASK

# save the task definition JSON to a variable
TASK_DESCRIPTION=$(aws ecs describe-task-definition \
  --task-definition "$TASK_FAMILY" \
)

# write the existing container definitions to a file
echo "$TASK_DESCRIPTION" | jq -r \
  .taskDefinition.containerDefinitions \
  > /tmp/$TASK_FAMILY.json

# write new container definitions JSON with the updated image
echo "Writing new $TASK_FAMILY container definitions JSON..."

# replace the old image URI with the new image URI in a new container definitions JSON
cat /tmp/$TASK_FAMILY.json \
  | jq \
  --arg IMAGE "$NEW_FRONTEND_IMAGE_URI" '.[0].image |= $IMAGE' \
  > /tmp/$TASK_FAMILY-new.json

# get the existing configuration for the task definition (memory, cpu, etc.)
# from the task definition JSON saved earlier
echo "Getting existing configuration for $TASK_FAMILY..."

MEMORY=$(echo "$TASK_DESCRIPTION" | jq -r \
  .taskDefinition.memory \
)

CPU=$(echo "$TASK_DESCRIPTION" | jq -r \
  .taskDefinition.cpu \
)

ECS_EXECUTION_ROLE_ARN=$(echo "$TASK_DESCRIPTION" | jq -r \
  .taskDefinition.executionRoleArn \
)

ECS_TASK_ROLE_ARN=$(echo "$TASK_DESCRIPTION" | jq -r \
  .taskDefinition.taskRoleArn \
)

# check the content of the new container definitions JSON
cat /tmp/$TASK_FAMILY-new.json

# register a new task definition using the new container definitions
# and the values read from the existing task definition
echo "Registering new $TASK_FAMILY task definition..."

aws ecs register-task-definition \
  --family "$TASK_FAMILY" \
  --container-definitions file:///tmp/$TASK_FAMILY-new.json \
  --memory "$MEMORY" \
  --cpu "$CPU" \
  --network-mode awsvpc \
  --execution-role-arn "$ECS_EXECUTION_ROLE_ARN" \
  --task-role-arn "$ECS_TASK_ROLE_ARN" \
  --requires-compatibilities "FARGATE"

While this works, there is an easier way to write it using the --cli-input-json flag of the register-task-definition AWS CLI command.

Both the frontend and backend application update scripts should be rewritten to use this argument instead of reading out the memory, CPU, execution role ARN, and task role ARN values individually.
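
As a rough sketch (the exact set of read-only fields to strip may need adjusting), the whole task definition returned by describe-task-definition can be transformed and fed back to register-task-definition directly:

# Sketch: build the register-task-definition input from the existing task
# definition, swapping in the new image and deleting the read-only fields
# that describe-task-definition returns but register-task-definition rejects
echo "$TASK_DESCRIPTION" | jq \
  --arg IMAGE "$NEW_FRONTEND_IMAGE_URI" \
  '.taskDefinition
  | .containerDefinitions[0].image = $IMAGE
  | del(.taskDefinitionArn, .revision, .status, .requiresAttributes,
        .compatibilities, .registeredAt, .registeredBy)' \
  > /tmp/$TASK_FAMILY-register.json

# memory, cpu, network mode, and role ARNs are carried along in the JSON,
# so they no longer need to be read out and passed as individual flags
aws ecs register-task-definition \
  --cli-input-json file:///tmp/$TASK_FAMILY-register.json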

Django ALLOWED_HOSTS settings for production

First of all, thanks for this awesome project! I've learned a lot :)

I noticed that in settings/base.py, ALLOWED_HOSTS = ['*'], which is a huge security risk in production.
In production, I tried setting this to ['.domain.com']. The issue I encountered is that the Django gunicorn ECS service keeps complaining that I need to add 10.0.x.x to ALLOWED_HOSTS:
ERROR 2023-04-11 21:50:35,430 exception 20 140390254044992 Invalid HTTP_HOST header: '10.0.x.x:8000'. You may need to add '10.0.x.x' to ALLOWED_HOSTS.

Any suggestions on how to fix this issue? I tried following this post, https://stackoverflow.com/questions/49828259/when-deploying-django-into-aws-fargate-how-do-you-add-the-local-ip-into-allowed, but so far I have been unsuccessful.

Appreciate your help!
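
For anyone hitting the same thing: the approach in that Stack Overflow post amounts to looking up the task's private IP at container startup and adding it to the allowed hosts. A minimal sketch, assuming Fargate platform version 1.4+ (which sets ECS_CONTAINER_METADATA_URI_V4), jq available in the image, and a hypothetical DJANGO_ALLOWED_HOSTS variable that the settings read instead of hard-coding ALLOWED_HOSTS:

# entrypoint.sh (hypothetical): fetch this task's private IP from the ECS
# container metadata endpoint so load balancer health checks, which use the
# raw IP as the Host header, do not raise DisallowedHost
PRIVATE_IP=$(curl -s "$ECS_CONTAINER_METADATA_URI_V4" \
  | jq -r '.Networks[0].IPv4Addresses[0]')

# the Django settings would split this on commas to build ALLOWED_HOSTS
export DJANGO_ALLOWED_HOSTS=".domain.com,$PRIVATE_IP"

# module path is an assumption, not necessarily this project's layout
exec gunicorn backend.wsgi --bind 0.0.0.0:8000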

Does the ad hoc environment work?

Hi @briancaffey,

Were you able to get the ad hoc environment to work? I didn't see the ad hoc pipeline run on this project.

When I look at your GitHub Actions workflows, the ad hoc base create/update (spinning up a new bastion host and a new RDS instance) looks almost identical to the prod base create/update, which means it doesn't seem to use shared resources. Can you help me understand this?
I was reading this documentation: https://hackernoon.com/ad-hoc-environments-for-django-applications-with-ecs-terraform-and-github-actions
By the way, I loved the detailed explanations!
Looking forward to hearing from you!

Poetry - Separate test dependencies from dev dependencies

There are two parts to this issue:

Part 1

It seems the latest idiomatic way to define dev dependencies is:

[tool.poetry.group.dev.dependencies]

instead of:

[tool.poetry.dev-dependencies]

See: "A note about defining a dev dependencies group" here

Part 2

After the above change is made, separating dev and test dependencies would simply require moving some items in pyproject.toml into a new [tool.poetry.group.test.dependencies] group. I have found it helpful to separate out the test dependencies for GitHub Actions / CI purposes.
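
A sketch of what the move could look like with the Poetry CLI (package names here are just examples, not the project's actual dependency list):

# move test-only packages out of the dev group into a dedicated test group
# (requires Poetry 1.2+ for the --group flag)
poetry remove --group dev pytest
poetry add --group test pytest pytest-django

# CI can then skip dev-only tooling while keeping the test dependencies
poetry install --without dev --no-root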

Pass Github secrets to ECS environment variables

I tried to pass GitHub secrets to Terraform using TF_VAR_variable_name and then use them as environment variables for the Gunicorn task, but so far I have not been successful.
I saw that in your terraform-aws-django modules/prod/app module "api" there is a variable called extra_env_vars. What is the use case for this? Can we actually use this variable to pass secrets from GitHub?

I would appreciate any suggestions on how to pass extra environment variables to Django.

Thanks!
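
For reference, a hedged sketch of the TF_VAR_ convention (the variable name and its shape are assumptions; extra_env_vars would also need to be declared as a Terraform variable and wired into the container definition by the module):

# Terraform reads any environment variable named TF_VAR_<name> as the value
# of variable "<name>"; in a GitHub Actions step the secret can be mapped in
# via the step's env block and exported like this
export TF_VAR_extra_env_vars='[{"name": "API_KEY", "value": "'"$API_KEY"'"}]'

# assuming var.extra_env_vars is a list of {name, value} objects that the
# module appends to the container definition's environment
terraform plan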

Refactor Terraform prod GitHub Actions workflows

The GitHub Actions workflows for Terraform and other IaC tools should use the following pattern:

  • terraform init + plan
  • manual review (using environment set on the job)
  • terraform apply

We can use artifacts to pass the Terraform plan file from the init + plan job to the apply job; a sketch of the run steps appears after the notes below.

This has already been successfully implemented for the Terraform prod base create/update and prod base destroy workflows. Here's an example:

[Screenshot: a prod base workflow run showing the plan, manual review, and apply jobs]

The other Terraform workflows need to be updated to use the same pattern:

  • prod base create/update (done)
  • prod base destroy (done)
  • prod app create/update
  • prod app destroy
  • ad hoc base create/update
  • ad hoc base destroy
  • ad hoc app create/update
  • ad hoc app destroy

Some links and issues:

  • the actions/upload-artifact and actions/download-artifact actions do not preserve file permissions, so executable permissions need to be set manually using chmod
  • I set the working-directory default for run steps, but this does not apply to the artifact actions, so artifact paths need to be set relative to the root of the repo (not relative to the default working directory); see this issue for more
  • I'm following the recommendations from the Running Terraform in Automation document in the Terraform docs, which recommends artifacting not only the tfplan file but also the .terraform directory; this allows us to run terraform init only once in the workflow
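
As a rough sketch of the run steps on each side (artifact upload/download wiring and the environment-based approval are omitted; paths are assumptions):

# plan job: init once, write the plan file, then upload tfplan and the
# .terraform directory as artifacts
terraform init
terraform plan -out=tfplan

# apply job: after downloading the artifacts, restore the executable bit
# that upload/download-artifact strip from the provider binaries
chmod -R +x .terraform/providers/
terraform apply tfplan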

cc @codyfletcher

clean up makefile

The Makefile has a lot of targets that are no longer used. Remove the unused targets and improve the targets for working with virtual environments.
