Wrapped applications can easily be shipped by deploying the entire assembly. If the assembly is a custom-made distribution assembly, this might already suffice (see pgit distro). Custom distributions need maintenance though, and they do nothing useful except bring together what belongs together.
This task can easily be automated, which would allow the following workflow:
- maintain one big assembly which contains all programs and applications (like bdevel)
- deploy any application configured with bprocess for use on one or more target platforms.
The created deployments are made to be relocatable, and may contain pre-made or generated readme files and license information.
Providing a cached/pickled configuration could be considered, both to prevent changes to it and to speed up loading times even further.
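A minimal sketch of such a configuration cache, assuming a simple key=value format as a stand-in for the real kvstore parser (parse_config and load_config are hypothetical names, not existing API). The pickled copy is only trusted while it is at least as new as its source file.

```python
import os
import pickle


def parse_config(path):
    """Stand-in for the real kvstore parser: read simple key=value lines."""
    config = {}
    with open(path) as fp:
        for line in fp:
            if '=' in line:
                key, value = line.split('=', 1)
                config[key.strip()] = value.strip()
    return config


def load_config(source_path, cache_path):
    """Return the configuration, preferring the pickled cache if it is
    at least as new as the source; otherwise rebuild and refresh it."""
    if (os.path.exists(cache_path)
            and os.path.getmtime(cache_path) >= os.path.getmtime(source_path)):
        with open(cache_path, 'rb') as fp:
            return pickle.load(fp)
    config = parse_config(source_path)
    with open(cache_path, 'wb') as fp:
        pickle.dump(config, fp, pickle.HIGHEST_PROTOCOL)
    return config
```

Since the pickle skips parsing entirely, repeated launches only pay the cost of one `pickle.load`.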
A vital feature for such a tool is to compress python packages (if zip-safe), which reduces loading times considerably thanks to less IO. It should also be possible to byte-compile and optimize code, to end up with deployments with the following properties:
- suitable for one or more platforms
- work with one or more python interpreters
- contain one or more executables
- ship python packages as source code or byte-compiled (with optional optimizations)
- may contain mkdocs documentation, doxygen docs, or additional files or packages
- allow the directory layout to be customized
- can be zipped
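The compression and byte-compilation step could look like this sketch, which uses the standard compileall and zipfile modules to produce an archive that zipimport can load directly (zip_package and its layout are assumptions, not existing bprocess API):

```python
import compileall
import os
import zipfile


def zip_package(package_dir, zip_path):
    """Byte-compile a zip-safe package and archive it so the zip file
    can be placed on sys.path directly."""
    # legacy=True writes foo.pyc next to foo.py instead of __pycache__,
    # which is the layout zipimport expects inside an archive.
    compileall.compile_dir(package_dir, quiet=1, legacy=True)
    base = os.path.dirname(package_dir)
    with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(package_dir):
            for name in files:
                full = os.path.join(root, name)
                # keep the package name as the top-level archive folder
                zf.write(full, os.path.relpath(full, base))
```

A real deployment tool would additionally honour the python.compile and python.packages settings to decide whether sources, bytecode, or both end up in the archive.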
Possible Applications
- Create software demos, or deploy entire pipelines with it
- Ship individual assets for review, with all applications needed to do so (similar to the blender assets for download on blender cloud, but including blender)
- generate a bootstrap script which installs all requirements from github into the lib folder. This would be another application of some sort of asset management, see further below. All this gravitates towards something like pip, which should preferably be used instead.
- Use some sort of LIBPATH to install dependencies into, allowing a multi-version repository to be built
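One possible shape for such a LIBPATH, assuming a hypothetical `<libpath>/<package>/<version>/` layout (activate and the layout itself are illustrative, not existing API):

```python
import os
import sys


def activate(libpath, requirements):
    """Put specific package versions from a multi-version repository
    onto sys.path, e.g. activate('/lib', {'toolz': '1.0'})."""
    for name, version in requirements.items():
        root = os.path.join(libpath, name, version)
        if not os.path.isdir(root):
            raise EnvironmentError(
                'missing %s %s in %s' % (name, version, libpath))
        # prepend so the requested version shadows anything else
        sys.path.insert(0, root)
```

Because each version lives in its own directory, multiple deployments can share one repository while pinning different versions.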
Proposed Solutions
As usual, the command can be configured on the commandline as well as via the kvstore, making it useful both in automated environments and for interactive use.
$ be deploy
> [lists all deployable items]
$ be deploy blender-demo
> [creates a deployment according to its configuration]
$ be deploy blender-demo -s platforms=linux python.versions=3.3 python.compile=OO python.packages=zip
> [a deployment just for linux, using python 3.3, with compiled and OO optimized bytecode, in a zip file]
Into the Future
Deployments contain plenty of dependencies, which can be reused by other deployments.
It would be nice to have a repository of various versions of packages, which can be downloaded only when needed.
That way, the binary executed to launch a program could download whichever dependencies are required, as needed. If it knows a central location, it could look there first to prevent duplicate downloads.
Definitely keep this in mind while implementing it.
Asset Management
Deploying and keeping a repository of dependencies is actually nothing else than asset management. You want to do similar things when pushing computations to the cloud, for example.
Having an integrated solution here can be vital in many ways, and it might be worthwhile to see this problem in its broader context.