common-workflow-language / workflow-service

Implementation of the GA4GH Workflow Execution Service, a REST service for running workflows

License: Apache License 2.0


workflow-service's Introduction

Workflow as a Service

This is a client and server implementation of the GA4GH Workflow Execution Service 1.0.0 API.

It provides Arvados and Toil backends. It also works with any cwl-runner that supports the CWL standard command line interface: http://www.commonwl.org/v1.0/CommandLineTool.html#Executing_CWL_documents_as_scripts

Installation:

pip install wes-service

Usage

Client configuration

Each option can be set by a command line parameter or an environment variable.

--host or WES_API_HOST

The host to contact.

--proto or WES_API_PROTO

The protocol (http or https) to use.

--auth or WES_API_AUTH

Credentials. The format is 'Header: value' or just 'value'. If no header name is provided, the value is sent in the 'Authorization' header.
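For illustration only, here is a small Python sketch (not part of wes-service; the helper name is made up) of how an --auth value maps to a request header under this rule:

# Hypothetical helper illustrating the '--auth' rule above; not part of the package.
def auth_header(auth_value):
    """Return a header dict from 'Header: value' or a bare 'value'."""
    if not auth_value:
        return {}
    if ":" in auth_value:
        name, _, value = auth_value.partition(":")
        return {name.strip(): value.strip()}
    # No header name given: the value goes in the Authorization header.
    return {"Authorization": auth_value}

print(auth_header("Authorization: Bearer abc123"))  # {'Authorization': 'Bearer abc123'}
print(auth_header("abc123"))                        # {'Authorization': 'abc123'}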

Get service info

$ wes-client --info

Submit a workflow to run:

Attachments must be accessible from the filesystem. Workflow runners may also support HTTP URLs or other storage systems.

$ wes-client --attachments="testdata/dockstore-tool-md5sum.cwl,testdata/md5sum.input" testdata/md5sum.cwl testdata/md5sum.cwl.json

List workflows

$ wes-client --list

Get workflow status

$ wes-client --get <run-id>

Get stderr log from workflow:

$ wes-client --log <run-id>

Server Configuration

Run a standalone server with default cwl-runner backend:

$ wes-server

Run a standalone server with Arvados backend:

$ pip install arvados-cwl-runner
$ wes-server --backend=wes_service.arvados_wes

Run a standalone server with Toil backend:

$ pip install toil[all]
$ wes-server --backend=wes_service.toil_wes --opt extra=--clean=never

Use alternate executable with cwl-runner backend

$ pip install cwltool
$ wes-server --backend=wes_service.cwl_runner --opt runner=cwltool --opt extra=--logLevel=CRITICAL

Pass parameters to cwl-runner

Use "--opt" following by "key=value"

$ wes-server --backend=wes_service.cwl_runner --opt extra=--workDir=/tmp/work

Development

If you would like to develop against workflow-service, make sure the provided tests pass and that the code is flake8 compliant.

Install from Source

$ virtualenv venv && source venv/bin/activate && pip install toil[all] && pip install . --process-dependency-links && pip install -r dev-requirements.txt

Running Tests

From the workflow-service directory, run:

$ pytest && flake8

workflow-service's People

Contributors

achave11-ucsc, bencvdb, dailydreaming, david4096, dependabot-preview[bot], dependabot[bot], gijzelaerr, jaeddy, mr-c, snyk-bot, tetron


workflow-service's Issues

Test/demonstrate workflow_engine_params

The default workflow engine params key allows a WES to tell a client what the default flags are when running a workflow, above and beyond what appears in the workflow descriptor. In practice, the interaction is similar to consulting a CLI application's help message before invoking it.

First, one can request the default workflow engine params from a service. This returns the various flags and "other settings" an engine can take and their defaults.

The client can then construct a workflow_engine_params key in the WorkflowRequest that can override these defaults as necessary.

To close this, add code that returns the defaults from service info https://github.com/ga4gh/workflow-execution-service-schemas/blob/develop/openapi/workflow_execution_service.swagger.yaml#L384 . These should be read from the help message of the runner, or argparse if available!

Then, add the ability to read the workflow_engine_params from the request in the runner. https://github.com/ga4gh/workflow-execution-service-schemas/blob/develop/openapi/workflow_execution_service.swagger.yaml#L531

Add a test that shows the parameters being accepted in the request and passed down for execution. The workflow run itself can fail, as long as we show the parameters are modifiable and that the above interaction pattern can be carried out.
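For reference, something like the following untested sketch captures the intended client-side interaction; the field names (default_workflow_engine_parameters, workflow_engine_parameters, name, default_value) are assumed from the WES 1.0 schema and should be verified against the version in use:

import json
import requests

base = "http://localhost:8080/ga4gh/wes/v1"  # assumes a local wes-server

# 1. Read the engine defaults advertised in service-info
#    ('default_workflow_engine_parameters' in the WES 1.0 schema).
info = requests.get(base + "/service-info").json()
for param in info.get("default_workflow_engine_parameters", []):
    print(param.get("name"), "=", param.get("default_value"))

# 2. Override selected defaults in the run request by adding a
#    'workflow_engine_parameters' form field alongside the usual ones.
engine_params = json.dumps({"--workDir": "/tmp/work"})
# requests.post(base + "/runs", data={..., "workflow_engine_parameters": engine_params})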

Install using pip fails

On both python 2.7 and python 3.5, in virtualenvs, installing via pip install wes-service fails:

Collecting wes-service
  Downloading wes-service-2.1.tar.gz
    Complete output from command python setup.py egg_info:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-build-61IIFM/wes-service/setup.py", line 16, in <module>
        long_description=open(README).read(),
    IOError: [Errno 2] No such file or directory: '/tmp/pip-build-61IIFM/wes-service/README.md'

pip and setuptools were both upgraded using pip install --upgrade before running pip install wes-service

Running python setup.py install successfully installs the package.

ImportError: cannot import name 'FileStorage'

Following the docs shown on https://pypi.org/project/wes-service/, I just installed via 'pip install wes-service' into a python 3.6 virtual env. When I run 'wes-server' I get the error:

$ wes-server
Traceback (most recent call last):
  File "/Users/golharr/workspace/ga4gh/wes-env/bin/wes-server", line 7, in <module>
    from wes_service.wes_service_main import main
  File "/Users/golharr/workspace/ga4gh/wes-env/lib/python3.6/site-packages/wes_service/wes_service_main.py", line 8, in <module>
    import connexion
  File "/Users/golharr/workspace/ga4gh/wes-env/lib/python3.6/site-packages/connexion/__init__.py", line 3, in <module>
    from .apis import AbstractAPI  # NOQA
  File "/Users/golharr/workspace/ga4gh/wes-env/lib/python3.6/site-packages/connexion/apis/__init__.py", line 1, in <module>
    from .abstract import AbstractAPI  # NOQA
  File "/Users/golharr/workspace/ga4gh/wes-env/lib/python3.6/site-packages/connexion/apis/abstract.py", line 14, in <module>
    from ..operation import Operation
  File "/Users/golharr/workspace/ga4gh/wes-env/lib/python3.6/site-packages/connexion/operation.py", line 7, in <module>
    from .decorators import validation
  File "/Users/golharr/workspace/ga4gh/wes-env/lib/python3.6/site-packages/connexion/decorators/validation.py", line 9, in <module>
    from werkzeug import FileStorage
ImportError: cannot import name 'FileStorage'

Document using other backends

I was able to use the cwl-runner backend, but more documentation needs to be provided before I can tell whether or not I'm setting up the other backends properly.

Incorrect exit code / status for CWL run with Toil backend

When running the md5sum.cwl workflow using Toil, the job seems to hang indefinitely in "RUNNING" state. Looking at the run log, the workflow appears to have completed successfully, but the exit code is -1.

Note: this occurred after installing from master, not my fork/branch used in #57.

$ wes-server --backend=wes_service.toil_wes
$ wes-client --host=localhost:8080 --proto=http --attachments="testdata/dockstore-tool-md5sum.cwl,testdata/md5sum.input" testdata/md5sum.cwl testdata/md5sum.cwl.json
INFO:root:Workflow run id is 1851f4f73ffb444ba0f33eefc9145c34

(eventually had to keyboard interrupt out of this step...)

$ wes-client --host=localhost:8080 --proto=http --get 1851f4f73ffb444ba0f33eefc9145c34
{
    "run_id": "1851f4f73ffb444ba0f33eefc9145c34",
    "workflow_log": {
        "stdout": "",
        "start_time": "1536268000.05",
        "cmd": [
            "['toil-cwl-runner', '--outdir=/Users/jaeddy/workflows/1851f4f73ffb444ba0f33eefc9145c34/outdir', '--jobStore=file:/Users/jaeddy/workflows/1851f4f73ffb444ba0f33eefc9145c34/toiljobstore', '/var/folders/xt/2mpzwr2972g3_yqgjz5dbsg40000gq/T/tmpF3dDPU/wes_workflow.cwl', '/var/folders/xt/2mpzwr2972g3_yqgjz5dbsg40000gq/T/tmpF3dDPU/wes_input.json']"
        ],
        "exit_code": -1,
        "end_time": "1536268000.06",
        "stderr": "C02SW7RPG8WN 2018-09-06 14:06:41,145 MainThread INFO toil.lib.bioio: Root logger is at level 'INFO', 'toil' logger at level 'INFO'.\nC02SW7RPG8WN 2018-09-06 14:06:41,188 MainThread INFO toil.jobStores.abstractJobStore: The workflow ID is: '2a94ac86-23b0-4dc1-8645-c7cc24a2664e'\nResolved '/var/folders/xt/2mpzwr2972g3_yqgjz5dbsg40000gq/T/tmpF3dDPU/wes_workflow.cwl' to 'file:///var/folders/xt/2mpzwr2972g3_yqgjz5dbsg40000gq/T/tmpF3dDPU/wes_workflow.cwl'\nC02SW7RPG8WN 2018-09-06 14:06:41,191 MainThread INFO cwltool: Resolved '/var/folders/xt/2mpzwr2972g3_yqgjz5dbsg40000gq/T/tmpF3dDPU/wes_workflow.cwl' to 'file:///var/folders/xt/2mpzwr2972g3_yqgjz5dbsg40000gq/T/tmpF3dDPU/wes_workflow.cwl'\nC02SW7RPG8WN 2018-09-06 14:06:42,525 MainThread INFO toil.common: Using the single machine batch system\nC02SW7RPG8WN 2018-09-06 14:06:42,525 MainThread WARNING toil.batchSystems.singleMachine: Limiting maxCores to CPU count of system (8).\nC02SW7RPG8WN 2018-09-06 14:06:42,525 MainThread WARNING toil.batchSystems.singleMachine: Limiting maxMemory to physically available memory (17179869184).\nC02SW7RPG8WN 2018-09-06 14:06:42,538 MainThread INFO toil.common: Created the workflow directory at /var/folders/xt/2mpzwr2972g3_yqgjz5dbsg40000gq/T/toil-2a94ac86-23b0-4dc1-8645-c7cc24a2664e-83779605114901\nC02SW7RPG8WN 2018-09-06 14:06:42,539 MainThread WARNING toil.batchSystems.singleMachine: Limiting maxDisk to physically available disk (25154871296).\nC02SW7RPG8WN 2018-09-06 14:06:42,550 MainThread INFO toil.common: User script ModuleDescriptor(dirPath='/Users/jaeddy/anaconda/envs/synorchestrator/lib/python2.7/site-packages', name='toil.cwl.cwltoil', fromVirtualEnv=False) belongs to Toil. No need to auto-deploy it.\nC02SW7RPG8WN 2018-09-06 14:06:42,550 MainThread INFO toil.common: No user script to auto-deploy.\nC02SW7RPG8WN 2018-09-06 14:06:42,551 MainThread INFO toil.common: Written the environment for the jobs to the environment file\nC02SW7RPG8WN 2018-09-06 14:06:42,551 MainThread INFO toil.common: Caching all jobs in job store\nC02SW7RPG8WN 2018-09-06 14:06:42,551 MainThread INFO toil.common: 0 jobs downloaded.\nC02SW7RPG8WN 2018-09-06 14:06:42,573 MainThread INFO toil: Running Toil version 3.15.0-0e3a87e738f5e0e7cff64bfdad337d592bd92704.\nC02SW7RPG8WN 2018-09-06 14:06:42,573 MainThread INFO toil.realtimeLogger: Real-time logging disabled\nC02SW7RPG8WN 2018-09-06 14:06:42,590 MainThread INFO toil.toilState: (Re)building internal scheduler state\nC02SW7RPG8WN 2018-09-06 14:06:42,590 MainThread INFO toil.leader: Found 1 jobs to start and 0 jobs with successors to run\nC02SW7RPG8WN 2018-09-06 14:06:42,591 MainThread INFO toil.leader: Checked batch system has no running jobs and no updated jobs\nC02SW7RPG8WN 2018-09-06 14:06:42,591 MainThread INFO toil.leader: Starting the main loop\nC02SW7RPG8WN 2018-09-06 14:06:45,407 MainThread INFO toil.leader: Finished the main loop: no jobs left to run\nC02SW7RPG8WN 2018-09-06 14:06:45,408 MainThread INFO toil.serviceManager: Waiting for service manager thread to finish ...\nC02SW7RPG8WN 2018-09-06 14:06:45,602 MainThread INFO toil.serviceManager: ... finished shutting down the service manager. Took 0.193918943405 seconds\nC02SW7RPG8WN 2018-09-06 14:06:45,603 MainThread INFO toil.statsAndLogging: Waiting for stats and logging collator thread to finish ...\nC02SW7RPG8WN 2018-09-06 14:06:45,622 MainThread INFO toil.statsAndLogging: ... finished collating stats and logs. 
Took 0.0194990634918 seconds\nC02SW7RPG8WN 2018-09-06 14:06:45,622 MainThread INFO toil.leader: Finished toil run successfully\nC02SW7RPG8WN 2018-09-06 14:06:45,641 MainThread INFO toil.common: Successfully deleted the job store: <toil.jobStores.fileJobStore.FileJobStore object at 0x10aa34b10>\n"
    },
    "outputs": {},
    "request": {
        "workflow_url": "file:///var/folders/xt/2mpzwr2972g3_yqgjz5dbsg40000gq/T/tmpF3dDPU/md5sum.cwl",
        "workflow_attachment": "file:///var/folders/xt/2mpzwr2972g3_yqgjz5dbsg40000gq/T/tmpF3dDPU",
        "workflow_params": {
            "input_file": {
                "path": "md5sum.input",
                "class": "File"
            }
        },
        "workflow_type_version": "v1.0",
        "workflow_type": "CWL"
    },
    "state": "RUNNING",
    "task_logs": []
}

Test harness, unit tests

A test harness should be added to make it easier to write tests, along with unit tests based on the current test data.
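A minimal sketch of what such a harness could look like, assuming pytest plus requests and a wes-server started as a subprocess on its default port; the fixture name is illustrative, and the workflow_type_versions key is assumed from the WES schema:

import subprocess
import time

import pytest
import requests


@pytest.fixture(scope="module")
def wes_server():
    # Start a local server with the default cwl-runner backend.
    proc = subprocess.Popen(["wes-server"])
    time.sleep(5)  # crude wait; a real harness would poll until the port is ready
    yield "http://localhost:8080/ga4gh/wes/v1"
    proc.terminate()
    proc.wait()


def test_service_info(wes_server):
    info = requests.get(wes_server + "/service-info").json()
    assert "workflow_type_versions" in info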

Update: Install dependencies process in doc.

Update the virtualenv setup and requirements process.

virtualenv venv && source venv/bin/activate && pip install toil==3.16.0 && pip install . --process-dependency-links && pip install -r dev-requirements.txt

It is important to execute these commands inside a virtual environment.

When installing the packages directly on the host, there are issues installing a package required by one of the dependencies (specifically jsonschema); these can be avoided by running the commands inside a virtual environment.

Job fails for WDL with Toil backend

When running the md5sum.wdl workflow using Toil, the job never leaves the "QUEUED" state. After attempting to retrieve the run log, a 500 error is returned.

Note: this occurred after installing from master, not my fork/branch used in #57.

$ wes-client --host=localhost:8080 --proto=http --attachments="testdata/md5sum.input" testdata/md5sum.wdl testdata/md5sum.wdl.json
INFO:root:Workflow run id is 1939085ec06e4629b931ec76fa3f9805

At this point, the run shows up as "QUEUED" in the server console output (see below), but never changes state.

(eventually had to keyboard interrupt out of this step...)

$ wes-client --host=localhost:8080 --proto=http --get 1939085ec06e4629b931ec76fa3f9805
ERROR:root:{u'status': 500, u'type': u'about:blank', u'detail': u'The server encountered an internal error and was unable to complete your request.  Either the server is overloaded or there is an error in the application.', u'title': u'Internal Server Error'}

After attempting to view the run log, I can see that there's an error related to the WDL version of the workflow (draft-2 instead of v1.0), but I'm not sure why that would fail here and pass in integration tests.

Full console outputs from the local server:

INFO:werkzeug:127.0.0.1 - - [06/Sep/2018 14:13:10] "POST /ga4gh/wes/v1/runs HTTP/1.1" 200 -
Process Process-1:
Traceback (most recent call last):
  File "/Users/jaeddy/anaconda/envs/synorchestrator/lib/python2.7/multiprocessing/process.py", line 267, in _bootstrap
    self.run()
  File "/Users/jaeddy/anaconda/envs/synorchestrator/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/Users/jaeddy/code/github/projects/workflow-service/wes_service/toil_wes.py", line 200, in run
    '"workflow_type_version" to be "v1.0": ' + str(version))
RuntimeError: workflow_type "cwl", "wdl" requires "workflow_type_version" to be "v1.0": draft-2
INFO:root:Workflow 1939085ec06e4629b931ec76fa3f9805: QUEUED
INFO:werkzeug:127.0.0.1 - - [06/Sep/2018 14:13:10] "GET /ga4gh/wes/v1/runs/1939085ec06e4629b931ec76fa3f9805/status HTTP/1.1" 200 -
INFO:root:Workflow 1939085ec06e4629b931ec76fa3f9805: QUEUED
INFO:werkzeug:127.0.0.1 - - [06/Sep/2018 14:13:18] "GET /ga4gh/wes/v1/runs/1939085ec06e4629b931ec76fa3f9805/status HTTP/1.1" 200 -
INFO:root:Workflow 1939085ec06e4629b931ec76fa3f9805: QUEUED
INFO:werkzeug:127.0.0.1 - - [06/Sep/2018 14:13:26] "GET /ga4gh/wes/v1/runs/1939085ec06e4629b931ec76fa3f9805/status HTTP/1.1" 200 -
INFO:root:Workflow 1939085ec06e4629b931ec76fa3f9805: QUEUED
ERROR:flask.app:Exception on /ga4gh/wes/v1/runs/1939085ec06e4629b931ec76fa3f9805 [GET]
Traceback (most recent call last):
  File "/Users/jaeddy/anaconda/envs/synorchestrator/lib/python2.7/site-packages/flask/app.py", line 2292, in wsgi_app
    response = self.full_dispatch_request()
  File "/Users/jaeddy/anaconda/envs/synorchestrator/lib/python2.7/site-packages/flask/app.py", line 1815, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/Users/jaeddy/anaconda/envs/synorchestrator/lib/python2.7/site-packages/flask/app.py", line 1718, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/Users/jaeddy/anaconda/envs/synorchestrator/lib/python2.7/site-packages/flask/app.py", line 1813, in full_dispatch_request
    rv = self.dispatch_request()
  File "/Users/jaeddy/anaconda/envs/synorchestrator/lib/python2.7/site-packages/flask/app.py", line 1799, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/Users/jaeddy/anaconda/envs/synorchestrator/lib/python2.7/site-packages/connexion/decorators/decorator.py", line 73, in wrapper
    response = function(request)
  File "/Users/jaeddy/anaconda/envs/synorchestrator/lib/python2.7/site-packages/connexion/decorators/validation.py", line 293, in wrapper
    return function(request)
  File "/Users/jaeddy/anaconda/envs/synorchestrator/lib/python2.7/site-packages/connexion/decorators/decorator.py", line 44, in wrapper
    response = function(request)
  File "/Users/jaeddy/anaconda/envs/synorchestrator/lib/python2.7/site-packages/connexion/decorators/parameter.py", line 219, in wrapper
    return function(**kwargs)
  File "/Users/jaeddy/code/github/projects/workflow-service/wes_service/toil_wes.py", line 326, in GetRunLog
    return job.getlog()
  File "/Users/jaeddy/code/github/projects/workflow-service/wes_service/toil_wes.py", line 134, in getlog
    with open(self.request_json, 'r') as f:
IOError: [Errno 2] No such file or directory: u'/Users/jaeddy/workflows/1939085ec06e4629b931ec76fa3f9805/request.json'
INFO:werkzeug:127.0.0.1 - - [06/Sep/2018 14:13:41] "GET /ga4gh/wes/v1/runs/1939085ec06e4629b931ec76fa3f9805 HTTP/1.1" 500 -

Latest version of `wes-service` doesn't install

We were trying to use the WES client in Toil but couldn't get version 3.3 to install (see DataBiosphere/toil#4052 (comment)). When running pip install wes-service with Python 3.8, it goes into an infinite loop with the following error message:

Collecting ruamel.yaml<=0.15.77,>=0.12.4
  Using cached ruamel.yaml-0.15.77.tar.gz (312 kB)
    ERROR: Command errored out with exit status 1:
     command: /Users/wlgao/venv/toil_env/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/0t/322xl0gx0zj0bbqq46p3fwp40000gn/T/pip-install-5dulfm8i/ruamel-yaml_f066f798e7d540fe9a54ea93917184b0/setup.py'"'"'; __file__='"'"'/private/var/folders/0t/322xl0gx0zj0bbqq46p3fwp40000gn/T/pip-install-5dulfm8i/ruamel-yaml_f066f798e7d540fe9a54ea93917184b0/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /private/var/folders/0t/322xl0gx0zj0bbqq46p3fwp40000gn/T/pip-pip-egg-info-zn1lia8l
         cwd: /private/var/folders/0t/322xl0gx0zj0bbqq46p3fwp40000gn/T/pip-install-5dulfm8i/ruamel-yaml_f066f798e7d540fe9a54ea93917184b0/
    Complete output (11 lines):
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/private/var/folders/0t/322xl0gx0zj0bbqq46p3fwp40000gn/T/pip-install-5dulfm8i/ruamel-yaml_f066f798e7d540fe9a54ea93917184b0/setup.py", line 211, in <module>
        pkg_data = _package_data(__file__.replace('setup.py', '__init__.py'))
      File "/private/var/folders/0t/322xl0gx0zj0bbqq46p3fwp40000gn/T/pip-install-5dulfm8i/ruamel-yaml_f066f798e7d540fe9a54ea93917184b0/setup.py", line 184, in _package_data
        data = literal_eval("".join(lines))
      File "/private/var/folders/0t/322xl0gx0zj0bbqq46p3fwp40000gn/T/pip-install-5dulfm8i/ruamel-yaml_f066f798e7d540fe9a54ea93917184b0/setup.py", line 156, in literal_eval
        return _convert(node_or_string)
      File "/private/var/folders/0t/322xl0gx0zj0bbqq46p3fwp40000gn/T/pip-install-5dulfm8i/ruamel-yaml_f066f798e7d540fe9a54ea93917184b0/setup.py", line 95, in _convert
        if isinstance(node, Str):
    NameError: name 'Str' is not defined
    ----------------------------------------

ModuleNotFoundError: No module named 'jsonschema.compat'

Following the install instructions in README.md, I get this error.

python3 -m venv ga4gh-python-env
source ga4gh-python-env/bin/activate
pip install wes-service
wes-client --info
Traceback (most recent call last):
  File "/Users/golharr/workspace/ga4gh/ga4gh-python-env/bin/wes-client", line 7, in <module>
    from wes_client.wes_client_main import main
  File "/Users/golharr/workspace/ga4gh/ga4gh-python-env/lib/python3.6/site-packages/wes_client/wes_client_main.py", line 11, in <module>
    from wes_client.util import modify_jsonyaml_paths, WESClient
  File "/Users/golharr/workspace/ga4gh/ga4gh-python-env/lib/python3.6/site-packages/wes_client/util.py", line 10, in <module>
    from wes_service.util import visit
  File "/Users/golharr/workspace/ga4gh/ga4gh-python-env/lib/python3.6/site-packages/wes_service/util.py", line 7, in <module>
    import connexion
  File "/Users/golharr/workspace/ga4gh/ga4gh-python-env/lib/python3.6/site-packages/connexion/__init__.py", line 3, in <module>
    from .apis import AbstractAPI  # NOQA
  File "/Users/golharr/workspace/ga4gh/ga4gh-python-env/lib/python3.6/site-packages/connexion/apis/__init__.py", line 1, in <module>
    from .abstract import AbstractAPI  # NOQA
  File "/Users/golharr/workspace/ga4gh/ga4gh-python-env/lib/python3.6/site-packages/connexion/apis/abstract.py", line 11, in <module>
    from swagger_spec_validator.validator20 import validate_spec
  File "/Users/golharr/workspace/ga4gh/ga4gh-python-env/lib/python3.6/site-packages/swagger_spec_validator/__init__.py", line 8, in <module>
    from swagger_spec_validator.util import validate_spec_url
  File "/Users/golharr/workspace/ga4gh/ga4gh-python-env/lib/python3.6/site-packages/swagger_spec_validator/util.py", line 9, in <module>
    from swagger_spec_validator import validator12
  File "/Users/golharr/workspace/ga4gh/ga4gh-python-env/lib/python3.6/site-packages/swagger_spec_validator/validator12.py", line 29, in <module>
    from swagger_spec_validator.ref_validators import default_handlers
  File "/Users/golharr/workspace/ga4gh/ga4gh-python-env/lib/python3.6/site-packages/swagger_spec_validator/ref_validators.py", line 14, in <module>
    from jsonschema.compat import iteritems
ModuleNotFoundError: No module named 'jsonschema.compat'

Add travis release process

Add a .travis.yml that will release to PyPI on named tags.

Add a RELEASE.md describing how to maintain the process.

Running wes-server without parameters fails

Running wes-server without specifying any options results in a crash.
This occurs in both python 2.7 and python 3.5

Traceback (most recent call last):
  File "/usr/local/bin/wes-server", line 11, in <module>
    load_entry_point('wes-service==2.1', 'console_scripts', 'wes-server')()
  File "build/bdist.linux-x86_64/egg/wes_service/__init__.py", line 25, in main
  File "build/bdist.linux-x86_64/egg/wes_service/cwl_runner.py", line 173, in create_backend
  File "build/bdist.linux-x86_64/egg/wes_service/util.py", line 13, in __init__
TypeError: 'NoneType' object is not iterable

Version in service info set incorrectly

From James Eddy, the version in the ServiceInfo response does not match the schema version.

To close this, set it to a value that matches the OpenAPI description. A stretch goal is to automate gathering the version number from the schema description.

Python 3 support broken; no indication in the README that it is Python 2 only

$   .venv/bin/wes-client --proto http --host=locahost:8080 --list
Traceback (most recent call last):
  File ".venv/bin/wes-client", line 7, in <module>
    from wes_client.wes_client_main import main
  File "/home/gijs/Work/buis/.venv/lib/python3.6/site-packages/wes_client/wes_client_main.py", line 11, in <module>
    from wes_client.util import modify_jsonyaml_paths, WESClient
  File "/home/gijs/Work/buis/.venv/lib/python3.6/site-packages/wes_client/util.py", line 12, in <module>
    from urllib import urlopen
ImportError: cannot import name 'urlopen'

Test packaging

The CLI entrypoint and installation should be under test.

To close this, add tests that demonstrate that the CLI entrypoints are behaving as expected.

Workflow API usage examples

This project would benefit from having some examples in the README showing how this API is expected to be used.

This is also particularly relevant because the workflow execution request object contains many fields with generic names. It is not obvious how to specify the workflow to be executed, the required input/output objects, and any other required data.

There's a small example in the reference implementation project but it doesn't seem to be aligned with the current schema definition.
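As a starting point, here is a hedged end-to-end sketch against the REST paths that appear in the server logs elsewhere in these issues (/ga4gh/wes/v1/runs, .../runs/{id}/status, .../runs/{id}); the form field names follow the request object shown in the Toil issue above and should be checked against the current schema:

import json
import time

import requests

base = "http://localhost:8080/ga4gh/wes/v1"  # assumes a local wes-server

# Submit: the workflow descriptor and inputs are sent as multipart form data.
with open("testdata/md5sum.cwl", "rb") as wf, open("testdata/md5sum.input", "rb") as inp:
    resp = requests.post(
        base + "/runs",
        data={
            "workflow_url": "md5sum.cwl",
            "workflow_type": "CWL",
            "workflow_type_version": "v1.0",
            "workflow_params": json.dumps(
                {"input_file": {"class": "File", "path": "md5sum.input"}}
            ),
        },
        files=[
            ("workflow_attachment", ("md5sum.cwl", wf)),
            ("workflow_attachment", ("md5sum.input", inp)),
        ],
    )
run_id = resp.json()["run_id"]

# Poll until the run leaves a queued/running state, then fetch the full log.
while requests.get(base + "/runs/" + run_id + "/status").json()["state"] in (
    "QUEUED", "INITIALIZING", "RUNNING"
):
    time.sleep(5)
print(requests.get(base + "/runs/" + run_id).json()["workflow_log"])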

Make a new release

The current release is broken on installation for me. To close this, we ought to make a new release that fixes the package on PyPI.

Tests with inputs, descriptors available via HTTP

The workflow service can download workflow descriptions that are provided as HTTP links. To properly test this surface, we should host a workflow description via a SimpleHTTPServer or similar.

This could also be used to test workflows that have inputs that must be downloaded/provisioned, by hosting the inputs needed to perform the test.

To close this, add integration tests that show the different ways workflow descriptors can be provided (both HTTP and string).
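A minimal sketch of how such a test fixture could host the test data over HTTP using only the standard library (Python 3.7+; the function name is illustrative):

import functools
import http.server
import threading

def serve_testdata(directory="testdata", port=8000):
    """Serve the test data directory over HTTP in a background thread."""
    handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory=directory)
    httpd = http.server.ThreadingHTTPServer(("127.0.0.1", port), handler)
    threading.Thread(target=httpd.serve_forever, daemon=True).start()
    return httpd, "http://127.0.0.1:%d" % port

httpd, base_url = serve_testdata()
# An integration test could now submit base_url + "/md5sum.cwl" as the workflow_url.
httpd.shutdown()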

Implement/test workflow_attachments

The multipart upload feature enables uploading multiple attachments to stage a workflow as suggested in the schemas: https://github.com/ga4gh/workflow-execution-service-schemas/blob/develop/openapi/workflow_execution_service.swagger.yaml#L123

To properly demonstrate this, the workflow run endpoint should check for the presence of a workflow attachments form upload. The filename of this should be checked following the suggestion in the schemas so that malicious rewrites aren't easy. The presence of these files should be guaranteed in a new test, which uploads a local tool description in addition to the workflow descriptor.
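A hedged sketch of the server-side filename check described above, assuming a Flask/connexion request.files object and werkzeug's secure_filename; the actual handler in wes_service may differ:

import os

from werkzeug.utils import secure_filename

def stage_attachments(request_files, tempdir):
    """Write uploaded 'workflow_attachment' files into tempdir,
    sanitizing filenames so malicious paths cannot escape it."""
    staged = []
    for attachment in request_files.getlist("workflow_attachment"):
        name = secure_filename(attachment.filename)
        if not name:
            continue  # reject empty or unsafe names
        dest = os.path.join(tempdir, name)
        attachment.save(dest)
        staged.append(dest)
    return staged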

Toil tests need updating.

I've updated the versioning for both toil and connexion since https://github.com/ga4gh/cloud-interop-testing is currently using this as a dependency.

Unfortunately, the following tests are now failing for Toil:

  • test_dockstore_md5sum
  • test_local_md5sum
  • test_run_attachments

I have overridden their inheritance in the tests (#72) until they can be fixed.

No idea where in space and time I'll be able to get to this unfortunately, but assigning myself anyway.

Add paging

Currently, all workflows in the directory are returned at once; the list endpoint should support paging.
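For reference, a hedged sketch of how a paged listing could be consumed from the client side, assuming the page_size/page_token/next_page_token fields of the WES 1.0 ListRuns operation (to be confirmed against the schema):

import requests

base = "http://localhost:8080/ga4gh/wes/v1"  # assumes a local wes-server

def list_all_runs(page_size=50):
    """Follow next_page_token until the listing is exhausted."""
    token = None
    while True:
        params = {"page_size": page_size}
        if token:
            params["page_token"] = token
        page = requests.get(base + "/runs", params=params).json()
        for run in page.get("runs", []):
            yield run
        token = page.get("next_page_token")
        if not token:
            break

for run in list_all_runs():
    print(run["run_id"], run["state"])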

Anonymous usage statistics

Adding the ability to opt in to a heartbeat monitor would give us a better understanding of who is using this service and where.

wes-docker.sh build process fails

$ ./wes-docker.sh 
+ python setup.py sdist
running sdist
running egg_info
creating wes_service.egg-info
writing wes_service.egg-info/PKG-INFO
writing dependency_links to wes_service.egg-info/dependency_links.txt
writing entry points to wes_service.egg-info/entry_points.txt
writing requirements to wes_service.egg-info/requires.txt
writing top-level names to wes_service.egg-info/top_level.txt
writing manifest file 'wes_service.egg-info/SOURCES.txt'
reading manifest file 'wes_service.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'wes_service.egg-info/SOURCES.txt'
running check
creating wes-service-4.0
creating wes-service-4.0/test
creating wes-service-4.0/wes_client
creating wes-service-4.0/wes_service
creating wes-service-4.0/wes_service.egg-info
creating wes-service-4.0/wes_service/openapi
copying files to wes-service-4.0...
copying MANIFEST.in -> wes-service-4.0
copying README.md -> wes-service-4.0
copying README.pypi.rst -> wes-service-4.0
copying setup.py -> wes-service-4.0
copying test/test_client_util.py -> wes-service-4.0/test
copying test/test_integration.py -> wes-service-4.0/test
copying wes_client/__init__.py -> wes-service-4.0/wes_client
copying wes_client/util.py -> wes-service-4.0/wes_client
copying wes_client/wes_client_main.py -> wes-service-4.0/wes_client
copying wes_service/__init__.py -> wes-service-4.0/wes_service
copying wes_service/arvados_wes.py -> wes-service-4.0/wes_service
copying wes_service/cwl_runner.py -> wes-service-4.0/wes_service
copying wes_service/toil_wes.py -> wes-service-4.0/wes_service
copying wes_service/util.py -> wes-service-4.0/wes_service
copying wes_service/wes_service_main.py -> wes-service-4.0/wes_service
copying wes_service.egg-info/PKG-INFO -> wes-service-4.0/wes_service.egg-info
copying wes_service.egg-info/SOURCES.txt -> wes-service-4.0/wes_service.egg-info
copying wes_service.egg-info/dependency_links.txt -> wes-service-4.0/wes_service.egg-info
copying wes_service.egg-info/entry_points.txt -> wes-service-4.0/wes_service.egg-info
copying wes_service.egg-info/not-zip-safe -> wes-service-4.0/wes_service.egg-info
copying wes_service.egg-info/requires.txt -> wes-service-4.0/wes_service.egg-info
copying wes_service.egg-info/top_level.txt -> wes-service-4.0/wes_service.egg-info
copying wes_service/openapi/workflow_execution_service.swagger.yaml -> wes-service-4.0/wes_service/openapi
Writing wes-service-4.0/setup.cfg
creating dist
Creating tar archive
removing 'wes-service-4.0' (and everything under it)
+ docker build --build-arg version=4.0 --build-arg arvversion=2.2.1 -t commonworkflowlanguage/workflow-service .
[+] Building 0.7s (12/18)                                                                                                                                                              
 => [internal] load build definition from Dockerfile                                                                                                                              0.0s
 => => transferring dockerfile: 1.88kB                                                                                                                                            0.0s
 => [internal] load .dockerignore                                                                                                                                                 0.0s
 => => transferring context: 2B                                                                                                                                                   0.0s
 => [internal] load metadata for docker.io/library/debian:buster                                                                                                                  0.5s
 => CANCELED [ 1/14] FROM docker.io/library/debian:buster@sha256:f9182ead292f45165f4a851e5ff98ea0800e172ccedce7d17764ffaae5ed4d6e                                                 0.0s
 => => resolve docker.io/library/debian:buster@sha256:f9182ead292f45165f4a851e5ff98ea0800e172ccedce7d17764ffaae5ed4d6e                                                            0.0s
 => => sha256:f9182ead292f45165f4a851e5ff98ea0800e172ccedce7d17764ffaae5ed4d6e 1.85kB / 1.85kB                                                                                    0.0s
 => => sha256:2f4b2c4b44ec6d9f4afc9578d8475f7272dc50783268dc672d19b0db6098b89d 529B / 529B                                                                                        0.0s
 => => sha256:2b6f409b1d24285cbae8f2c6baa6969741151bed561a9b47c57b7ff1ee33829a 1.46kB / 1.46kB                                                                                    0.0s
 => [internal] load build context                                                                                                                                                 0.0s
 => => transferring context: 14.28kB                                                                                                                                              0.0s
 => CACHED [ 2/14] ADD keys/58118E89F3A912897C070ADBF76221572C52609D.asc keys/561F9B9CAC40B2F7.asc keys/docker-archive-keyring.gpg /tmp/                                          0.0s
 => CACHED [ 3/14] RUN apt-get update &&     apt-get install -y dirmngr gnupg &&     apt-key add --no-tty /tmp/561F9B9CAC40B2F7.asc &&     apt-get install -y apt-transport-http  0.0s
 => CACHED [ 4/14] RUN apt-get update &&     apt-get install -y --no-install-recommends passenger python3-setuptools build-essential python3-dev python3-pip git &&     pip3 ins  0.0s
 => CACHED [ 5/14] RUN apt-get install -y --no-install-recommends libcurl4-openssl-dev libssl-dev                                                                                 0.0s
 => CACHED [ 6/14] RUN mv /tmp/docker-archive-keyring.gpg /usr/share/keyrings/docker-archive-keyring.gpg                                                                          0.0s
 => CACHED [ 7/14] RUN mkdir -p /etc/apt/sources.list.d &&     echo         "deb [arch=amd64 signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.c  0.0s
 => ERROR [ 8/14] COPY dist/arvados-cwl-runner-2.2.1.tar.gz /root                                                                                                                 0.0s
------
 > [ 8/14] COPY dist/arvados-cwl-runner-2.2.1.tar.gz /root:
------
failed to compute cache key: "/dist/arvados-cwl-runner-2.2.1.tar.gz" not found: not found
