Part of an internship at the Image processing for enhanced cinematography research group.
This README aims to explain how to use the code (Quickstart), then how it works.
- If you have already set up the whole Mondrians project (only available to UPF - IP4EC members), you can skip directly to the Use section.
Clone this repository and personalized_tools, preferably into the same directory:
git clone [email protected]:tourfl/mondrian_factory.git
git clone [email protected]:tourfl/personalized_tools.git
Then clone the HDR Toolbox into your local Matlab folder (assuming this is ~/Documents/MATLAB/) with the following command:
git clone https://github.com/banterle/HDR_Toolbox.git ~/Documents/MATLAB/HDR_Toolbox
Next, copy the startup.m file to your Matlab folder (:warning: it assumes you have put the two repositories in your Matlab folder). It will automatically add the required folders to your Matlab path.
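For reference, a minimal startup.m could look like the sketch below. The exact contents of the provided file may differ; the paths are assumptions based on the layout described above.

```matlab
% startup.m -- run automatically by Matlab at launch.
% Assumes both repositories and the HDR Toolbox live in ~/Documents/MATLAB/
% (adjust matlabFolder if your layout differs).
matlabFolder = fullfile(getenv('HOME'), 'Documents', 'MATLAB');
addpath(genpath(fullfile(matlabFolder, 'mondrian_factory')));
addpath(genpath(fullfile(matlabFolder, 'personalized_tools')));
addpath(genpath(fullfile(matlabFolder, 'HDR_Toolbox')));
```

Using genpath makes sure all subfolders (e.g. the HDR Toolbox's IO routines) end up on the path as well.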
It is now installed!
All you need to modify is the main.m file. The parameters are the following:
The output images are stored at ../images/ (same level as mondrian_factory/).
For more theoretical explanations, see the experimental report; this section only covers how the code works.
- personalized_tools: shared classes and superclasses (e.g. MondrianHandler)
- HDR_Toolbox: I/O routines required for working with PFM images
- data: mainly .mat files containing color matching functions, shape descriptions, Munsell color reflectances, and illuminant powers
This mainly refers to the color space. For LMS, the cone fundamentals are used; for RGB, an RGB color matching function. The data come from the Colour & Vision database at University College London. The HDR space uses the same color matching function as RGB, but the images are saved as PFM files. This file format allows working with RGB values outside the [0, 1] range; in practice it is mainly used because a (private) algorithm requires PFM input.
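As an illustration, reading and writing a PFM image with the HDR Toolbox can be done along the lines below. The read_pfm/write_pfm names are from the toolbox's IO folder; check its documentation for the exact signatures, and note the file path here is just a placeholder.

```matlab
% Read an HDR Mondrian stored as PFM (assumes the HDR Toolbox is on the path).
img = read_pfm('../images/HDR/solution1/grayexp_s1_HDR.pfm');

% Unlike 8-bit formats, PFM preserves values outside [0, 1].
fprintf('max pixel value: %f\n', max(img(:)));

% Write the image back out unchanged.
write_pfm(img, 'copy.pfm');
```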
Currently only one shape is available: the one from Land and McCann's experiment. You can build another shape (with the same number of areas) by following the model of the existing one (data/shape/Landshape.mat, modify the italic part).
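A quick way to inspect the existing shape before building your own is sketched below; the variable names it prints are whatever Landshape.mat actually contains, none are guaranteed here.

```matlab
% List the variables stored in the shape description without loading it.
whos('-file', 'data/shape/Landshape.mat');

% Load it into a struct to inspect and modify before saving a new shape.
s = load('data/shape/Landshape.mat');
disp(fieldnames(s));
```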
To emulate Land's illuminations, five options are available:
- The best RGB white
- Land's XYZ
- All of Land's illuminant powers
- One of Land's illuminant powers
- D65 values
According to Land's paper, there are 5 experiments: gray, red, green, blue, yellow. Each name corresponds to the actual color that the illumination renders as gray.
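A sketch of how the five experiments could be iterated from main.m is shown below. The identifiers experiments and run_experiment are purely illustrative, not the actual names used in the code; check main.m for the real parameters.

```matlab
% Hypothetical sketch: 'run_experiment' is an illustrative helper name,
% not part of the actual codebase.
experiments = {'gray', 'red', 'green', 'blue', 'yellow'};
for k = 1:numel(experiments)
    % Each experiment renders its named color as gray under the
    % corresponding illumination.
    run_experiment(experiments{k});
end
```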
The output images are stored in the images/ folder, at the same level as mondrian_factory/.
..
├── images
│   ├── HDR
│   │   ├── solution1
│   │   │   ├── blueexp_s1_HDR.pfm
│   │   │   ├── blueexp_s1_HDR_percepted.pfm
│   │   │   ├── grayexp_s1_HDR.pfm
│   │   │   ├── grayexp_s1_HDR_percepted.pfm
│   │   │   ├── greenexp_s1_HDR.pfm
│   │   │   ├── greenexp_s1_HDR_percepted.pfm
│   │   │   ├── redexp_s1_HDR.pfm
│   │   │   ├── redexp_s1_HDR_percepted.pfm
│   │   │   ├── yellowexp_s1_HDR.pfm
│   │   │   └── yellowexp_s1_HDR_percepted.pfm
│   │   └── solution2
│   │       └── ...
│   ├── LMS
│   └── RGB
├── mondrian_factory
└── personalized_tools
For each experiment (blue, red, ...), there is an experimental and a perceptual version, following McCann's paper. The two versions are built as follows:
- experimental: actual color labels & experimental illumination
- perceptual: perceptual color labels & white illumination
The color labels and illuminations are taken from McCann's paper.
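Conceptually, the two versions combine labels and illumination as sketched below. The helper and variable names are hypothetical, chosen only to mirror the two bullet points above.

```matlab
% Hypothetical sketch: 'render_mondrian' and the label/illumination
% variables are illustrative names, not the actual API.
experimental = render_mondrian(actual_labels,     experimental_illumination);
perceptual   = render_mondrian(perceptual_labels, white_illumination);
```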
This is Matlab code in an object-oriented fashion. See the UML class diagram below.