
workflow's Introduction

JH Development Workflow Tool

Some Badges would be nice.

Install

Make sure you have fswatch installed. You can install via homebrew:

brew install fswatch

Run the following commands:

composer global config repositories.workflow vcs git@github.com:wearejh/workflow
composer global require wearejh/workflow:dev-master

Notes:

  • Make sure your composer global bin directory ~/.composer/vendor/bin is available in your $PATH environment variable.
  • Because packages installed globally with composer share dependencies, you may need to run composer global update if the previous command failed.

Usage

Before you create any new project, first update the tool in case there are any fixes or new features.

composer global update wearejh/workflow

Then run workflow to see the list of available commands.

Read the wiki for detailed information on each command

Troubleshooting

If you are experiencing very slow speeds (i.e. it hangs for minutes between commands), it may be due to a slow DNS lookup to localunixsocket.local. See the relevant GitHub issue. A quick fix is to add the following to your hosts file:

127.0.0.1 localunixsocket.local

workflow's People

Contributors

annybs, aydinhassan, benjaminlill, jodiwarren, maxbucknell, mikeymike, shakyshane


workflow's Issues

Refactor - use verbosity option

workflow new and workflow fit should honour the -vvv option passed to workflow, enabling variable levels of logging at each step. WIP.

Update Circle CI

Circle CI is now at 2.0 and it's much better with Docker builds!

We also need to decide whether to keep the Circle CI configuration in this repository, to be honest, given that this is an open-source project.

// or make it private 😭

Persisting the env.php file

Today @AydinHassan hit a weird bug that was potentially caused by the fact that we have a named volume for the whole app/etc directory.

Volumes will happily mount over the files in the image, so if a file is somehow removed from the volume, that file will not exist in later rebuilds until you recreate it yourself after the volume is mounted.

We have this volume purely to persist the env.php file, so we need a better approach to persisting that single configuration file, one that doesn't impact other files in the directory.

Possible solutions discussed:

  1. Single-file volume: not possible because it's a Linux-only feature 😭
  2. Copy from local filesystem.

Solution proposed...

  1. On workflow install command/s ensure the env.php file is pulled back to the local FS.
  2. Remove env related volumes (might be Selco specific until backported if not already)
  3. Subsequent builds will copy the file in on the COPY app app command

The caveat is that the user is then responsible for ensuring that file stays persisted; if they delete it, they will need to recover it, recreate it, or re-install Magento to generate it.
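Step 1 of the proposal amounts to a single docker cp once the install finishes. A minimal sketch, noting that the container name ("app") and the /var/www root are assumptions, not taken from the project's real compose configuration:

```shell
# Pull env.php back to the host so it survives the volume being removed.
# Container name "app" and the /var/www root are assumptions.
# Guarded so this is a no-op when Docker (or the container) is unavailable.
if docker inspect app >/dev/null 2>&1; then
    docker cp app:/var/www/app/etc/env.php app/etc/env.php
    status="pulled env.php back to local FS"
else
    status="container 'app' not running; nothing to pull"
fi
echo "$status"
```

Subsequent image builds would then pick the file up via the existing COPY app app step, as described in point 3.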

Change composer cache workflow

  • Remove volume for composer cache
  • Update workflow to...
    • Pull composer cache dir down from container on start
    • Pull composer cache dir down on composer commands

By doing this we no longer require a volume and can have the files persist for the next build locally by default.

Add PHP version option

Latest versions of Magento (2.2.1 and up, anecdotally) seem to require php >= 7.1 so we need to bake in an option to specify PHP version.

Add git clean step to create project

This workflow leaves the host really clean; however, the initial project creation clones the original repo and pollutes it with unneeded files.

We should add a step that runs git clean -dfX to remove them, and also review the .gitignore rules to make sure everything lines up correctly.
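To illustrate what that step would do, here is a self-contained demonstration in a throwaway repository (the build/ pattern is just an example): git clean -dfX deletes only paths matched by .gitignore and leaves everything else alone.

```shell
set -e
# Create a scratch repository with one ignored directory and one normal file.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
echo 'build/' > .gitignore
mkdir build
touch build/artifact.bin src.php
git add .gitignore src.php

# -d: recurse into directories, -f: force, -X: remove ONLY ignored files.
git clean -dfX

ls   # build/ is gone; .gitignore and src.php remain
```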

Use Composer's output object

This should provide cleaner output than scattering echo calls everywhere, and will allow for output suppression, etc.

Module enable command

Now that we keep the config.php file in the repo, we need to ensure we pull the file back from the container after enabling a module during development.

Magento Full Install Command Fails to Install Magento

When running the command workflow mfi I got the following error:

--cleanup-database: command not found

I found this happens when workflow executes the .docker/php/bin/magento-install file from within the container.

The workaround is to ssh into the container and run the commands found in that file manually, as follows:

bin/magento setup:install \
    --db-host=db \
    --db-name=$MYSQL_DATABASE \
    --db-user=$MYSQL_USER \
    --db-password=$MYSQL_PASSWORD \
    --base-url=$MAGE_HOST \
    --base-url-secure=$MAGE_HOST \
    --admin-firstname=$MAGE_ADMIN_FIRSTNAME \
    --admin-lastname=$MAGE_ADMIN_LASTNAME \
    --admin-email=$MAGE_ADMIN_EMAIL \
    --admin-user=$MAGE_ADMIN_USER \
    --admin-password=$MAGE_ADMIN_PASS \
    --backend-frontname=$MAGE_BACKEND_FRONTNAME \
    --use-secure=1 \
    --use-secure-admin=1 \
    --cleanup-database -vvv \
    || { exit 1; }

and then

bin/magento index:reindex && \
bin/magento dev:source-theme:deploy --area="adminhtml" --theme="Magento/backend" css/styles-old css/styles && \
bin/magento dev:source-theme:deploy --theme="Magento/blank" css/styles-m css/styles-l css/email css/email-inline && \
bin/magento dev:source-theme:deploy && \
bin/magento setup:static-content:deploy

Smoother Set Up

Requirements

  • As a developer, I should be able to initialise a project in a fashion that is seamless and consistent across projects, requiring only the codebase and a database seed.
  • As a developer, I should be able to install Magento from scratch by configuring local.env and running a workflow command.

Solution

Existing Projects

Right now, the workflow to set up an existing project is:

Magento 1:

$ workflow n98 local-config:generate
$ workflow sql -f dump.sql

Magento 2:

$ workflow mfi
$ workflow sql -f dump.sql

The requirements for setting up Magento 1 and Magento 2 are largely the same: we need a configuration file (app/etc/local.xml and app/etc/env.php respectively), and we need a database dump with which to seed it.

But these two processes are quite different. In the Magento 2 case, we are installing Magento from scratch, and then installing a database dump on top of that. This wastes a little bit of time, but can also cause problems for projects that don’t install cleanly.

We could unify these approaches by creating a command called configure, which takes a single argument: the path to a database dump. In each case it would provision the relevant configuration files (bifurcating between M1 and M2) and then import the database.

Clean Installations

As alluded to above, we sometimes have issues when the existing mfi command in Workflow does not complete. This is because of assumptions inherent in Magento modules' setup scripts, wherein they assume they are being installed into an already-running Magento instance. These can be tricky to locate and fix, and in some cases there are things you wish to do in an upgrade script that are simply not possible, especially selecting themes, configuring websites, and the like.

Nevertheless, it should be an aim of all JH projects that installation without a seed database yields a valid environment, and we should explore how to enforce and improve this.

But for now, we have cut the proverbial Gordian Knot by not requiring installation.

New Projects

Sometimes you want to install Magento from scratch. I don’t think anybody really wants to install Magento 1 from scratch anymore, so I’ve neglected that case and will only consider Magento 2 now.

We should maintain the mfi command, although I’d appreciate a rebranding to install (as opposed to configure) as part of simplifying and unifying the Workflow API.

There isn’t much wrong with this, though. It reads from local.env and runs a setup:install command. In doing that, it’s basically perfect.

Generating Environment Files

As part of workflow configure, we generate either local.xml, or env.php. This process should be improved.

Every project will have an etc/ directory that contains sample files, like env.php.twig. These files will be processed through Twig (populated by variables in local.env) and written to app/etc. This also allows us to remove the TemplateWriter class from Workflow, and makes the template language a little more approachable.
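For illustration, such a template might look like the fragment below; every key and variable name here is hypothetical, not taken from an actual JH project:

```twig
{# etc/env.php.twig — hypothetical sketch; variables would come from local.env #}
<?php
return [
    'db' => [
        'connection' => [
            'default' => [
                'host' => '{{ db_host }}',
                'dbname' => '{{ db_name }}',
                'username' => '{{ db_user }}',
                'password' => '{{ db_password }}',
            ],
        ],
    ],
];
```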

Only clean directories when syncing

Currently the push and pull commands wipe directories if they already exist on the target, but they do the same for files.

For files we can just push/pull without checking whether they exist, and simply overwrite the target.

Multiple Projects

Requirements

  • As a developer, I should be able to run multiple projects at the same time, so that less time is lost stopping and starting and keeping track of different environments.
  • As a developer, I should be able to address projects by a convenient name, so that TLS, cookies, sessions, and other stuff works in a reasonable way.
  • As a developer, server names should be set correctly so that editors know which Xdebug requests to intercept.

Solution

Port Mapping

Currently, Workflow binds containers to the canonical ports on the host. For example, MySQL takes over :3306, and HAProxy takes over :80 and :443. In order to run multiple projects simultaneously, this has to end.

Addressing Ports

A simple solution, as suggested by Shane, would be to let Docker assign host port bindings, run Magento on that port, and address everything as https://whatever.loc:32763. This would work, but it introduces complexities with Local Storage and such, and some developers find port numbers distasteful for anything more than kicking the tyres of a piece of software.

Nevertheless, this will be available for those who want it.

Proxy Server

It would be relatively straightforward to run a little HTTP server on the host that binds to ports :80 and :443 and handles routing towards projects.

The natural implementation language for this would be PHP, but since PHP's built-in server is single-threaded, it's disqualified. Two more suitable options are:

  • A real web server like Nginx.
  • A custom HTTP server running in NodeJS.

While Nginx is a proven platform and we have reasonable experience configuring it, managing a service like that on a Mac can be cumbersome, and automating its configuration would be a pain.

NodeJS is my preference here, because its HTTP APIs are mature and easy to use, and it would be straightforward to have it read configuration and to start and stop it.

The exact specification would be subject to design and such, but I’m imagining the following:

  • When Workflow initialises containers for a project, a call is made to register a proxy mapping, using the port associated with the HAProxy container, and a base URL, either obtained from local.env or from the project directory name. In the absence of an override, all subdomains of that URL would be mapped to allow for the configuration of multiple stores.
  • When Workflow starts containers, there is either a NodeJS server running, or not. This will be determined by the presence of a file in /var/run/pid (or equivalent). If it’s not there, the server is started by running a node command, which starts the server in the background, and creates the PID file. This will require a privilege escalation.
  • A node command is called that registers the proxy, and the server updates itself.
  • If workflow stop is run, we can garbage collect and deregister it. But garbage collection is not a huge concern. A new registration would replace an old one if the URLs collide, and routing to a dead port is not a terrible problem.
  • A status view of the server, listing all currently running proxies, will be available, and it will be possible to read the stdout of the server process to get logging on registrations and requests.
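The PID-file check described in the second bullet can be sketched in shell; the file location and the surrounding commands are assumptions, not a spec:

```shell
# Check whether a proxy process recorded in a PID file is still alive.
# mktemp -u yields a path that does not exist, so this demo always takes
# the "not running" branch; a real implementation would use a fixed path
# such as /var/run/workflow-proxy.pid (hypothetical).
PIDFILE=$(mktemp -u)

# kill -0 sends no signal; it only tests whether the process exists.
if [ -f "$PIDFILE" ] && kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
    echo "proxy already running (pid $(cat "$PIDFILE"))"
else
    echo "proxy not running; would start it in the background and write $PIDFILE"
fi
```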

License

Currently this is an open-source project and has no license... we should definitely fix that.

Add test files to new project builds

I believe I missed a few files for tests in the new project command, e.g. phpunit.xml.

Worth just going over one of the existing projects to see what is required, and double-checking.

Feature Request: setting urls + cookie domain

When importing DBs from anywhere to local, I always end up needing something like the following; can we make a helper command for it?

LOCAL_DOMAIN="daylong.m2"
LOCAL_URL="https://daylong.m2/"
workflow setup:store-config:set --base-url-secure="$LOCAL_URL"
workflow setup:store-config:set --base-url="$LOCAL_URL"
workflow sql -s "UPDATE core_config_data t SET t.value = '$LOCAL_DOMAIN' WHERE t.path = 'web/cookie/cookie_domain'"

Workflow Watch Fails with Timestamp is in the Future

The Workflow watch is randomly erroring with the following:

Fatal error: Uncaught Jh\Workflow\ProcessFailedException: tar: app/code/Neom/Bundle/etc/di.xml: time stamp 2017-11-16 09:28:58 is 6.879016097 s in the future
tar: app/code/Neom/Bundle/Pricing/Adjustment/Calculator.php: time stamp 2017-11-16 09:28:58 is 6.878885584 s in the future
 in /Users/anthony/.composer/vendor/wearejh/workflow/src/CommandLine.php:94

This appears to happen during automatic file syncing when a change has been made.

CLI PHP 7.0.22, macOS 10.12.6

Remove Xdebug warning ?

We could (on bootstrap, or before the composer run) do export COMPOSER_DISABLE_XDEBUG_WARN=1 to prevent Xdebug warning messages.

Reduce image size

Copy-pasting from the Slack conversation:

@annybs on the topic of docker chewing through disk space - i suspect one of our key 'problems' is that the official php images are ~700mb. for my own purposes on my week off, i spent a little time working on alpine-based php5/7 images, and these come out at <100mb with most every magento dependency preinstalled, plus xdebug https://github.com/annybs/php-alpine

i'm not sure these are adequate for jh purposes because the php version is pretty loose, based on what's available in alpine repos rather than compiled from a specific source - which at the moment, means that php 7.2 is not available. but perhaps curating our own base images is something for us to think about, in the interest of reducing the size of project images on dev machines - if i have 10 projects installed at ~100mb each, rather than the current ~700mb, then that's already 7gb saved and a much less bloated docker disk

n.b. i'm aware that the official php image has alpine variant tags, but for some reason php is compiled with a non-standard ini dir which means packages installed from apk won't be recognised as they're put in the wrong place https://github.com/docker-library/php/blob/master/7.2/alpine3.7/fpm/Dockerfile#L40

@maxbucknell Not all Magento versions work with Alpine. magento/magento2#8070

I think this is a worthy goal despite potential technical issues, and maybe something worth visiting in the future (which is why I'm creating an issue here, rather than leaving the conversation to scroll away into the void!)

docker-compose files

Noticed we have docker-compose files in this project and I'm not actually entirely sure why. Worth checking out and cleaning up as required.

Log all commands executed for debug purposes

We could have a --debug flag which will log all commands to a file, or even output them on the console, to help debug errors.

We might have to create our own process class wrapping the sync/async processes with a PSR-3 logger.

workflow build: COPY fails at step 20

COPY failed: stat /var/lib/docker/tmp/docker-builder693034594/.docker/composer-cache: no such file or directory

In CommandLine.php line 75:

  COPY failed: stat /var/lib/docker/tmp/docker-builder693034594/.docker/composer-cache: no such file or directory

build [-p|--prod] [--no-cache]

The `composer-cache` dir does not exist initially, and Docker `COPY` fails here until it is created manually. Would be good to fix in `fit` and `new`.
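Until fit and new handle this, a workaround is to create the directory by hand before building (the path is taken from the error above):

```shell
# Docker COPY cannot stat a missing source path, so ensure the composer
# cache directory exists; mkdir -p is a no-op if it already does.
mkdir -p .docker/composer-cache
```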

No option to run commands in a different directory

The most convenient way to run tests is to be in the test directory (dev/tests/whatever) and run ../../../vendor/bin/phpunit, so that the phpunit.xml file doesn't have to be specified.

There should be a field on workflow exec to specify a working directory.
