Comments (4)
@prvst @sauloal @bgruening @hroest @timosachsenberg @BioDocker/contributors
Regarding these two questions, I have a major question about how we will handle containers in other repositories. A lot of users and contributors want to manage their own containers in their own repositories, making BioDocker a less centralized system. Some things that need to be clear:
- To be a BioDocker container, an image needs to be BioDocker compliant.
- Deployment should be done under the BioDocker hub account (DockerHub).
Things that need to be solved:
- How we can tell, from our web/registry/github-container repo, that those containers exist.
- How we can be sure those containers work and are ready to be used.
We need to improve our documentation to be able to handle those cases.
from specs.
I think the first two points are essential. In order to be part of the project, the images must follow most, if not all, requirements and, most importantly, they must be provided by the BioDocker hub account. By doing that we can assure a certain level of quality and, at the same time, reinforce the "brand" associated with every image. People can still use the images outside the BioDocker hub by building them themselves, but I think it is very important to distribute them to end users through the "official" hub account.
> How we can tell, from our web/registry/github-container repo, that those containers exist.

Can you explain this point better? I'm not sure I understood.
> How we can be sure those containers work and are ready to be used.

If we want to decentralize, we can still use our official hub account to distribute third-party images. Suppose someone has a new repository with a Dockerfile to be added to BioDocker: we can check whether the Dockerfile complies with the necessary requirements and then point the automated build to that repository. The owner will be responsible for maintaining the image; there is no need to clone it into our containers list, and if something goes wrong with the build we get a notification from the hub.
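The compliance check mentioned above could be automated with a small script. A minimal sketch, where the required fields (base image plus maintainer, software and version labels) are illustrative assumptions and not the official specification:

```shell
#!/bin/sh
# Minimal sketch of an automated BioDocker compliance check.
# The required fields below are illustrative assumptions, not the official spec.

check_dockerfile() {
    df="$1"
    status=0
    # Hypothetical requirements: a base image plus maintainer,
    # software-name, and version labels for the registry.
    for pattern in '^FROM ' 'LABEL .*maintainer' 'LABEL .*software' 'LABEL .*version'; do
        if ! grep -Eq "$pattern" "$df"; then
            echo "MISSING: $pattern"
            status=1
        fi
    done
    return $status
}

# Demo: a tiny Dockerfile that satisfies the hypothetical checks.
tmp=$(mktemp -d)
cat > "$tmp/Dockerfile" <<'EOF'
FROM ubuntu:16.04
LABEL maintainer="BioDocker <biodocker@example.org>"
LABEL software="comet" version="2016012"
EOF
check_dockerfile "$tmp/Dockerfile" && echo "compliant"
```

A CI job could run such a check on every third-party repository before pointing an automated build at it.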
@prvst Where are the first two points defined?
In many of your Dockerfiles I see that you change to a biodocker user or that you set the directory of a tool as the working directory, but I was wondering if those behaviours are documented somewhere, whether they are necessary requirements, etc.
Yeah, we need to be more explicit about this and fine-tune our recommendations. I don't think we enforce using the biodocker user, do we?
As a sideline to this, in one of our important use cases (running Galaxy with dockerized tools within a container orchestration environment), having the containerised tool bound to a defined working directory breaks things. I'm not saying that you're actively doing this, but if that is the reason for setting the working directory in the Dockerfile, then it would be a problem for us. This is because Galaxy needs to control working directories to detect variably named output files. The only case where we see this happening is when the main entrypoint is something like Rscript myScript.R (receiving arguments from the outside). Our current solution to this is to wrap this call in a shell file, pass arguments through to the wrapped call, leave this wrapper on the PATH and make it executable.
I guess we can add more flexibility to Galaxy, but I would actually fix the Rscript myScript.R example. We should not have this entrypoint, should we?
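The wrapper approach described above could look like the following. This is a sketch only: the wrapper name `mytool` and the path `/opt/mytool/myScript.R` are hypothetical, and a temporary directory stands in for the image's `/usr/local/bin`:

```shell
#!/bin/sh
# Sketch of the wrapper approach: instead of an ENTRYPOINT bound to a
# working directory, a small executable on the PATH forwards all arguments.
# "mytool" and /opt/mytool/myScript.R are hypothetical names.
set -eu

bindir=$(mktemp -d)   # stands in for /usr/local/bin inside the image

cat > "$bindir/mytool" <<'EOF'
#!/bin/sh
# Call the R script by absolute path so the caller keeps its own working
# directory (Galaxy relies on this to find variably named output files).
exec Rscript /opt/mytool/myScript.R "$@"
EOF
chmod +x "$bindir/mytool"
```

In the Dockerfile the wrapper would simply be copied into `/usr/local/bin` (already on the PATH), so no WORKDIR and no Rscript ENTRYPOINT would be needed.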
Another question. We are currently building Dockerfiles within our project for a number of metabolomics tools. However, for project reporting statistics and deliverables, we need to keep these under the ownership of our GitHub organization.
I know this is an entirely political decision and I hate it. It just makes our work, and everyone else's, harder. Why does it not count that someone contributed to an already existing community? Ideally you should gain more reputation and visibility if you contribute to BioDocker.
That said, if someone really needs to have their own organisation and branding, I think we can create a "BioDocker approved" sticker/badge that you can apply to your organisation. We could also set up DockerHub repositories and grant you permissions; again, just a lot of work for us that I would like to avoid if possible.
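Such a badge could be as simple as a shields.io image in the organisation's README; the label text, colour and link below are just an illustration, not an agreed design:

```markdown
[![BioDocker approved](https://img.shields.io/badge/BioDocker-approved-blue.svg)](https://github.com/BioDocker)
```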
Do you see a way in which we could host these Dockerfiles in our GitHub repo while also making them available in the BioDocker repo? Would git submodules work for this?
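For reference, a submodule could keep the Dockerfile repo under its owner's organisation while linking it into the BioDocker tree. A minimal sketch, using two local repositories as hypothetical stand-ins for the GitHub repos (names and paths are illustrative):

```shell
#!/bin/sh
# Sketch: keep a Dockerfile repo under its owner's organisation and link it
# into the BioDocker containers tree as a git submodule. The two local
# repositories are hypothetical stand-ins for the real GitHub repos.
set -eu
work=$(mktemp -d)

# Stand-in for the tool owner's repository (would live in their org).
git init -q "$work/metabolomics-tool"
git -C "$work/metabolomics-tool" -c user.name=demo -c user.email=demo@example.org \
    commit -q --allow-empty -m "add Dockerfile"

# Stand-in for the BioDocker containers repository; the submodule records a
# pointer to the external repo instead of cloning its contents.
git init -q "$work/containers"
git -C "$work/containers" -c protocol.file.allow=always \
    submodule --quiet add "$work/metabolomics-tool" tools/metabolomics-tool
```

The BioDocker repo would then only track a commit pointer in `.gitmodules`, and ownership (and GitHub statistics) would stay with the external organisation.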
I think the hard part is figuring out how we can solve the BioDocker DockerHub integration. Do we really want to have two identical images?
Would a forked BioDocker repo in your organisation, with a modified README file, maybe be enough for your funding agency?
I know you can not do much about such funding-agency requirements, sorry if I sounded too harsh ... such things just make me cry.
Things that need to be solved:
- How we can tell, from our web/registry/github-container repo, that those containers exist.
We need a web page that sits on top of DockerHub and makes our containers searchable.
- How we can be sure those containers work and are ready to be used.
We need strong CI integration, like mulled has, to build containers and test them.
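Such a CI step could, for each container, build the image and run a quick smoke test inside it. A minimal sketch: the image name, build path and test command are hypothetical, and with `DRY_RUN=1` (the default here) the docker commands are only printed, so the sketch runs without a Docker daemon:

```shell
#!/bin/sh
# Sketch of a mulled-style CI step: build the container, then smoke-test it.
# Image name, path and test command are hypothetical. With DRY_RUN=1 the
# docker commands are only printed, so this runs without a Docker daemon.
set -eu

DRY_RUN=${DRY_RUN:-1}
image="biodocker/comet:2016012"

run() {
    if [ "$DRY_RUN" = 1 ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

run docker build -t "$image" containers/comet/2016012/
run docker run --rm "$image" comet    # smoke test: the tool starts at all
```

A real CI job would set `DRY_RUN=0`, fail the build on a non-zero exit status, and only then trigger the push to the hub account.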