bids-apps / example

This is an example app that can serve as a template.

Home Page: https://doi.org/10.1371/journal.pcbi.1005209

License: Apache License 2.0

Languages: Python 69.42%, Perl 20.36%, Dockerfile 7.69%, Singularity 2.53%

Topics: bids, bids-apps

Contributors: bennet-umich, chrisgorgo, erinb90, gkiar, glatard, ntraut, oesteban, pre-commit-ci[bot], remi-gau, sappelhoff, yarikoptic


example's Issues

new image will not build as long as the base_fsl image is not functional

https://app.circleci.com/pipelines/github/bids-apps/example/12/workflows/7e344e8d-1a44-462d-9ca4-794eb7eb77f7/jobs/133

Sending build context to Docker daemon  196.6kB

Step 1/7 : FROM bids/base_fsl
latest: Pulling from bids/base_fsl
[... layer download and extraction progress omitted ...]
Digest: sha256:245acc062e088768fd5344301e13b45ed77c8596011f5b066a3b6dc4f110ceb7
Status: Downloaded newer image for bids/base_fsl:latest
 ---> ab1b9b230aba
Step 2/7 : ARG DEBIAN_FRONTEND="noninteractive"
 ---> Running in d195134dcdec
Removing intermediate container d195134dcdec
 ---> 4be71a21db21
Step 3/7 : RUN apt-get update -qq &&     apt-get install -q -y --no-install-recommends python3 python3-pip python3-numpy &&     pip3 install nibabel==2.0 &&     apt-get remove -y python3-pip &&     rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
 ---> Running in 7752a3e80a27
W: Failed to fetch https://deb.nodesource.com/node_6.x/dists/trusty/main/source/Sources  server certificate verification failed. CAfile: /etc/ssl/certs/ca-certificates.crt CRLfile: none

W: Failed to fetch https://deb.nodesource.com/node_6.x/dists/trusty/main/binary-amd64/Packages  server certificate verification failed. CAfile: /etc/ssl/certs/ca-certificates.crt CRLfile: none

E: Some index files failed to download. They have been ignored, or old ones used instead.
The command '/bin/sh -c apt-get update -qq &&     apt-get install -q -y --no-install-recommends python3 python3-pip python3-numpy &&     pip3 install nibabel==2.0 &&     apt-get remove -y python3-pip &&     rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*' returned a non-zero code: 100

Exited with code exit status 100
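The build dies at `apt-get update` because the Trusty-based base_fsl image ships a NodeSource apt source whose TLS certificate can no longer be verified. A hedged workaround sketch (the source-list file name is an assumption; list `/etc/apt/sources.list.d` in the base image to confirm) is to drop that stale source before updating:

```dockerfile
# Sketch of a workaround, assuming the stale NodeSource entry lives in
# /etc/apt/sources.list.d/nodesource.list (check the base image to confirm).
# Removing it lets apt-get update succeed with the remaining repositories.
FROM bids/base_fsl

ARG DEBIAN_FRONTEND="noninteractive"

RUN rm -f /etc/apt/sources.list.d/nodesource.list && \
    apt-get update -qq && \
    apt-get install -q -y --no-install-recommends python3 python3-pip python3-numpy && \
    pip3 install nibabel==2.0 && \
    apt-get remove -y python3-pip && \
    rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
```

The longer-term fix is of course repairing or replacing the base_fsl image itself.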

example produces no sensible output

There was a concern about BIDS Apps and git-annex's symlinks. I have tried to run BIDS-Apps/example on two sample datasets (ds000001 and ds000005 from OpenNeuro) and arrived at the same results regardless of whether the files were symlinks or not. Since ds005 was originally "demonstrated" in README.md, here is the full protocol for its run:

smaug:/mnt/datasets/datalad/tmp
$> datalad install -g -s ///openneuro/ds000005      
[INFO   ] Cloning http://datasets.datalad.org/openneuro/ds000005 [1 other candidates] into '/mnt/datasets/datalad/tmp/ds000005' 
[INFO   ] access to dataset sibling "s3-PRIVATE" not auto-enabled, enable with:                                                                                                                                         
| 		datalad siblings -d "/mnt/datasets/datalad/tmp/ds000005" enable -s s3-PRIVATE 
install(ok): /mnt/datasets/datalad/tmp/ds000005 (dataset)
action summary:                                                                                                                                                                                                         
  get (ok: 81)                                                                                                                                                                                                          
  install (ok: 1)                                                                                                                                                                                                       
datalad install -g -s ///openneuro/ds000005  85.82s user 22.10s system 365% cpu 29.496 

$> docker run -i --rm -v $PWD/ds000005:/bids_dataset:ro -v $PWD/:/outputs bids/example /bids_dataset /outputs group        
	1: This file is not part of the BIDS specification, make sure it isn't included in the dataset by accident. Data derivatives (processed data) should be placed in /derivatives folder. (code: 1 - NOT_INCLUDED)
		/.datalad/.gitattributes
			Evidence: .gitattributes
		/.datalad/config
			Evidence: config
		/.gitattributes
			Evidence: .gitattributes

	2: You should define 'SliceTiming' for this file. If you don't provide this information slice time correction will not be possible. (code: 13 - SLICE_TIMING_NOT_DEFINED)
		/sub-01/func/sub-01_task-mixedgamblestask_run-01_bold.nii.gz
		/sub-01/func/sub-01_task-mixedgamblestask_run-02_bold.nii.gz
		/sub-01/func/sub-01_task-mixedgamblestask_run-03_bold.nii.gz
		/sub-02/func/sub-02_task-mixedgamblestask_run-01_bold.nii.gz
		/sub-02/func/sub-02_task-mixedgamblestask_run-02_bold.nii.gz
		/sub-02/func/sub-02_task-mixedgamblestask_run-03_bold.nii.gz
		/sub-03/func/sub-03_task-mixedgamblestask_run-01_bold.nii.gz
		/sub-03/func/sub-03_task-mixedgamblestask_run-02_bold.nii.gz
		/sub-03/func/sub-03_task-mixedgamblestask_run-03_bold.nii.gz
		/sub-04/func/sub-04_task-mixedgamblestask_run-01_bold.nii.gz
		... and 38 more files having this issue (Use --verbose to see them all).

        Summary:                 Available Tasks:          Available Modalities: 
        498 Files, 3.54GB        mixed-gambles task        T1w                   
        16 - Subjects                                      inplaneT2             
        1 - Session                                        bold                  


/run.py:86: RuntimeWarning: Mean of empty slice.
  fp.write("Average brain size is %g voxels"%numpy.array(brain_sizes).mean())
/usr/local/lib/python3.4/dist-packages/numpy/core/_methods.py:80: RuntimeWarning: invalid value encountered in double_scalars
  ret = ret.dtype.type(ret / rcount)

$> cat avg_brain_size.txt 
Average brain size is nan voxels%  

So it would be nice for the example to produce a sensible measure in its output.
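The `nan` comes from taking a mean over an empty list: `numpy.mean()` of zero elements yields `nan` with exactly the `Mean of empty slice` RuntimeWarning shown above. A minimal sketch of the group-level step (the function name is hypothetical, not the actual code in run.py) that fails loudly instead:

```python
# Sketch: guard the group-level averaging so an empty measurement list
# produces a clear error instead of "Average brain size is nan voxels".
import numpy


def average_brain_size(brain_sizes):
    """Return the mean brain size in voxels, or raise if nothing was measured."""
    if not brain_sizes:
        raise ValueError(
            "No participant-level brain sizes found; "
            "run the participant-level analysis first."
        )
    return numpy.array(brain_sizes).mean()


print("Average brain size is %g voxels" % average_brain_size([1000, 2000, 3000]))
```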

forcing the validation of datasets prior to launch limits the use of apps on "far" datasets

Hi team!

My use case is the following: I have several datasets, all BIDS validated, stored "far away" (i.e. not where I wish to do processing). When I launch a BIDS App on them, such as this one, the validator forces a crawl of the whole dataset, which means many queries and file accesses to my far-away storage. This results in (a) a large amount of unnecessary data and metadata download (I launch a job for a single participant on isolated nodes, and each job performs a full-dataset validation), and (b) latency while I wait for that download, crawl, and validation to take place.

My proposal is that if developers of BIDS apps wish to have the validator run prior to their tool, it be added with a --validate option on the command-line, so that default behaviour is more cloud/distributed computing friendly. I think this is also a fair re-distribution of labour since it puts the onus of dataset validation on data storage platforms, rather than independent tasks (which could feasibly be running different versions of the validator, etc.).
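A sketch of what such an interface could look like (the flag name and wiring are assumptions, not the app's actual API); here validation is skipped via an opt-out flag, the inverse of the opt-in `--validate` proposed above:

```python
# Hypothetical command-line interface: let callers skip the bids-validator
# run, so per-participant jobs on remote/annexed datasets avoid a full crawl.
import argparse


def build_parser():
    parser = argparse.ArgumentParser(description="BIDS App entry point (sketch)")
    parser.add_argument("bids_dir", help="BIDS dataset directory")
    parser.add_argument("output_dir", help="output directory")
    parser.add_argument("analysis_level", choices=["participant", "group"])
    parser.add_argument(
        "--skip-bids-validation",
        action="store_true",
        help="skip the bids-validator run (useful on network/symlinked datasets)",
    )
    return parser


args = build_parser().parse_args(
    ["/bids_dataset", "/outputs", "participant", "--skip-bids-validation"]
)
if not args.skip_bids_validation:
    pass  # run the validator here before the analysis
```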

Happy to hear others' opinions, or to explain in more detail exactly how my data is stored and accessed. I omitted those details here because this issue applies, with varying impact, to other types of network or symbolic mounts; in my particular case it is prohibitive, and I cannot run apps that use the validator.

Cheers,
G

Tutorial and example app

Not sure what is wrong, so the question may be semi-coherent.

I am trying to work through the tutorial at http://bids-apps.neuroimaging.io/tutorial/

I have Docker version 1.10.3, build d381c64-unsupported as installed by yum on CentOS 7.

Our Docker is currently locked down so only root can talk to the daemon, so I have to

$ sudo docker run -i --rm \
    -v /tmp/bids/ds005_R1.1.0:/bids_dataset:ro \
    -v /tmp/bids/ds005_R1.1.0/output:/outputs \
    bids:example \
    /bids_dataset /outputs participant --participant_label 01

using the ds005 data set downloaded from the example files today. It runs

bet /bids_dataset/sub-01/anat/sub-01_T1w.nii.gz /outputs/sub-01_brain.nii.gz

and produces

$ ls -l ds005_R1.1.0/output/
total 1216
-rw-r--r-- 1 root root 1242149 Sep  4 15:47 sub-01_brain.nii.gz

I then try to convert the docker image to a Singularity image (this is on a machine in the cluster) using

sudo docker run --privileged -ti --rm \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v /tmp/singularity:/output docker2singularity \
    bids:example

after first grabbing the filos/docker2singularity Docker bundle and building a local version, setting the Docker version to match ours. That seems to go OK, and produces

$ sudo docker run --privileged -ti --rm \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v /tmp/singularity:/output docker2singularity \
    bids:example

Size: 1615 MB for the singularity container
Creating a sparse image with a maximum size of 1615MiB...
Using given image size of 1615
Formatting image (/sbin/mkfs.ext3)
Done. Image can be found at: /output/bids_example-2016-09-04-a2e66f1e787d.img

Singularity: sexec (U=0,P=144)> Command=exec, Container=/output/bids_example-2016-09-04-a2e66f1e787d.img, CWD=/, Arg1=/bin/sh
Fixing permissions.
Singularity: sexec (U=0,P=150)> Command=exec, Container=/output/bids_example-2016-09-04-a2e66f1e787d.img, CWD=/, Arg1=/bin/sh
Singularity: sexec (U=0,P=179)> Command=exec, Container=/output/bids_example-2016-09-04-a2e66f1e787d.img, CWD=/, Arg1=/bin/sh
Adding mount points
Singularity: sexec (U=0,P=8600)> Command=exec, Container=/output/bids_example-2016-09-04-a2e66f1e787d.img, CWD=/, Arg1=/bin/sh
Stopping container, please wait.
a2e66f1e787d

However, no combination that I can find produces a run of bet. When running from Docker, the path mapping from the host machine to the container has to be specified on the command line, but I am not seeing anything like that in the conversion to Singularity or in the Singularity invocation. My suspicion is that it is failing because no files are found, since /bids_dataset and /output are not getting properly mapped.

Have I done something wrong? Are the tutorial instructions incomplete?

How do we tell the Singularity application where the data and output files are?

Sorry if this is the wrong place to post; I don't see a place to post for text in the tutorial.
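For what it's worth, Singularity has a runtime analogue of Docker's `-v`: the `-B`/`--bind` option takes `host:container` pairs. A hedged sketch of the invocation, reusing the image name and paths from the transcript above (adjust to your own layout, and note that older Singularity versions also auto-bind the home and current directories):

```shell
singularity run \
    -B /tmp/bids/ds005_R1.1.0:/bids_dataset \
    -B /tmp/bids/ds005_R1.1.0/output:/outputs \
    /output/bids_example-2016-09-04-a2e66f1e787d.img \
    /bids_dataset /outputs participant --participant_label 01
```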

minimize example docker image size

ATM it is a whopping 1.13 GB, so it takes time to download. And then, together with #24, it becomes somewhat disappointing to wait just to get no sensible output.
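Most of that weight likely comes from the base image and from apt caches baked into layers. A sketch of the usual slimming tricks (this is not the repo's actual Dockerfile; base image and package list are illustrative):

```dockerfile
# Illustrative slimming: skip recommended packages and clean apt caches in
# the SAME RUN instruction, so the cache never lands in a committed layer.
# The biggest win is usually a smaller base image than a full FSL install.
FROM ubuntu:trusty

RUN apt-get update -qq && \
    apt-get install -y -q --no-install-recommends python3 python3-numpy && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
```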

Port to CircleCI 2.0

Any BIDS Apps that were initially built using this example as a template will no longer pass Continuous Integration testing:

https://circleci.com/sunset1-0/

It would likely be useful for existing and future BIDS App developers if this baseline example repo were updated from CircleCI 1.0 to 2.0, and the requisite changes encapsulated within the corresponding PR.
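A hedged sketch of what the 2.0 configuration might look like (job names, executor, and the smoke-test command are assumptions, not the repo's actual `.circleci/config.yml`); 2.0 requires an explicit `version` key, named jobs, an executor, and explicit `checkout`/`run` steps in place of the 1.0 inference:

```yaml
version: 2
jobs:
  build:
    machine: true
    steps:
      - checkout
      - run: docker build -t bids/example .
      - run: docker run --rm bids/example --version   # smoke test (assumed flag)
workflows:
  version: 2
  build:
    jobs:
      - build
```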
