
mccd's Introduction


MCCD PSF Modelling

Multi-CCD Point Spread Function Modelling.


Main contributor: Tobias Liaudat
Email: [email protected]
Documentation: https://cosmostat.github.io/mccd/
Article: DOI - A&A
Current release: 17/02/2023


MCCD, for Multi-CCD, is a non-parametric Point Spread Function (PSF) modelling package written in pure Python.
It generates a PSF model from star observations in the field of view. Once trained, the MCCD PSF model can recover the PSF at any position in the field of view.

Contents

  1. Dependencies
  2. Installation
  3. Quick usage
  4. Recommendations

Dependencies

The following Python packages should be installed with their specific dependencies: numpy, scipy, astropy, matplotlib, progressbar2, ModOpt and PySAP (python-pysap).

It is essential that the PySAP package is correctly installed, as MCCD relies on the wavelet transforms it provides.
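
As a minimal sanity check (independent of the test suite described below), one can verify that PySAP imports in the same Python environment:

# Quick check that PySAP is importable; the wavelet transforms used by MCCD
# are exercised more thoroughly by `python setup.py test` (see Installation).
import pysap
print('PySAP imported successfully')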

Note: The GalSim package was removed from requirements.txt; it is expected to be installed (preferably with conda) before installing the MCCD package.
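
For example, assuming a conda environment with access to the conda-forge channel (the channel ShapePipe uses for GalSim), a typical installation would be:

conda install -c conda-forge galsim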

Installation

After installing all the dependencies one can perform the MCCD package installation:

Locally

git clone https://github.com/CosmoStat/mccd.git
cd mccd
python setup.py install

To verify that the PySAP package is correctly installed and that the MCCD package can access the wavelet transforms it needs, one can run python setup.py test and check that all the tests pass.

From PyPI

pip install mccd
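
One way to check that the installation succeeded is to print the installed package version, for example:

python -c "import mccd; print(mccd.__version__)"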

Quick usage

The easiest way to use the method is to go through the configuration file config_MCCD.ini using the helper classes found in auxiliary_fun.py (documentation). A description of the parameters can be found directly in the configuration file config_MCCD.ini. The MCCD method can handle SExtractor datasets as input catalogs, provided they follow an appropriate naming convention.

The main MCCD model parameters are:

  • LOC_MODEL: the type of local model to be used (MCCD-HYB, MCCD-RCA, or MCCD-POL).
  • N_COMP_LOC: the number of eigenPSFs to use in the local model.
  • D_COMP_GLOB: the maximum polynomial degree for the global model.

After setting up all the parameters in the configuration file, there are three main functions: one to fit the model, one to validate it, and one to fit and then validate it. The usage is as follows:

import mccd

config_file_path = 'path_to_config_file.ini'

run_mccd_instance = mccd.auxiliary_fun.RunMCCD(config_file_path,
                                               fits_table_pos=1)

run_mccd_instance.fit_MCCD_models()

For the validation one should replace the last line with:

run_mccd_instance.validate_MCCD_models()

Finally, for the fit and validation, one should change the last line to:

run_mccd_instance.run_MCCD()

All the output files will be saved in the directories specified in the configuration file.

PSF recovery

To recover PSFs from the model at specific positions test_pos on the CCD ccd_id, one can use the following example:

import numpy as np
import mccd

config_file_path = 'path_to_config_file.ini'
mccd_model_path = 'path_to_fitted_mccd_model.npy'
test_pos = np.load(..)   # positions at which to recover the PSF
ccd_id = np.load(..)     # CCD identifier corresponding to those positions
local_pos = True         # positions are given in local (per-CCD) coordinates

mccd_instance = mccd.auxiliary_fun.RunMCCD(
    config_file_path,
    fits_table_pos=1
)

rec_PSFs = mccd_instance.recover_MCCD_PSFs(
    mccd_model_path,
    positions=test_pos,
    ccd_id=ccd_id,
    local_pos=local_pos
)

See the documentation of the recover_MCCD_PSFs() function for more information.
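
As a purely illustrative follow-up, assuming recover_MCCD_PSFs() returns one PSF postage stamp per requested position (an assumption here, not a documented guarantee), the recovered PSFs can be stacked and normalised with numpy:

# Assumption for illustration: rec_PSFs behaves like a sequence of 2D PSF
# postage stamps, one per entry in test_pos.
psf_stack = np.asarray(rec_PSFs)
print(psf_stack.shape)

# Normalise each stamp to unit flux before further use (e.g. shape
# measurement), again assuming 2D stamps.
psf_norm = psf_stack / psf_stack.sum(axis=(-2, -1), keepdims=True)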

Recommendations

Some notebook examples can be found in the notebook folder of the repository.

Changelog

  • Switched from Travis deployment to GitHub Actions. Changed the GitHub Pages template; now using the pyralid-template from sfarrens.

  • Added a new module for realistic simulations, dataset_generation.py. It can generate realistic simulations of the UNIONS/CFIS survey, including realistic atmospheric contributions following a realisation of a Von Kármán model. See the above-mentioned module documentation for more information. See also testing-realistic-data.ipynb in the notebook folder for an example.

  • Added outlier rejection based on a pixel-residual criterion. The main parameters, RMSE_THRESH and CCD_STAR_THRESH, can be found in the MCCD config file. See the parameter documentation for more information.

  • For usage inside shape measurement pipelines: a new PSF interpolation function, interpolate_psf_pipeline(), was included in the MCCD model. This function allows outputting interpolated PSFs with a specific centroid.

  • New handling of position polynomials, both local and global. Improved model performance.

  • New functionality added: the maximum polynomial degree of the local hybrid model is now controlled by D_HYB_LOC, and a new parameter MIN_D_COMP_GLOB allows removing lower polynomial degrees from the global polynomial model.

  • Increased the default number of iterations for better convergence in the PSF wings.

  • Algorithm updates to increase performance: dropping the normalisation proximal operator, a harder sparsity constraint for spatial variations, forcing RBF interpolation for the global part, and skipping the last weight optimization so that the run always finishes with a features/components optimization.

  • Set the default denoising to zero, as wavelet denoising (using starlets) introduces a significant bias in the ellipticity estimates of the model.


mccd's Issues

MCCD terminal output

Issue from MK when running the MCCD algorithm on a huge amount of data.

MCCD and libraries used within that code produce a lot of terminal output, which on a computer cluster results in enormously large log files.
Examples:
WARNING: Making input data immutable.
and progress updates:
100% (30 of 30) |##########################################################################################################################| Elapsed Time: 0:01:19 Time: 0:01:19

I tried to trace some of those, without success. They might be produced in non-MCCD libraries. Is there a way to suppress those outputs? The config file VERBOSE flag does not prevent them.
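
One generic thing to try (a sketch only, not an MCCD-specific switch) is to silence Python-level warnings and redirect stdout around the fit; output written directly to the terminal by compiled extensions or child processes would not be captured this way:

import contextlib
import io
import warnings

import mccd

config_file_path = 'path_to_config_file.ini'
run_mccd_instance = mccd.auxiliary_fun.RunMCCD(config_file_path,
                                               fits_table_pos=1)

# Silence Python-level warnings and swallow anything printed to stdout.
with warnings.catch_warnings(), contextlib.redirect_stdout(io.StringIO()):
    warnings.simplefilter('ignore')
    run_mccd_instance.fit_MCCD_models()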

[Install error] cannot import name 'soft_unicode' from 'markupsafe'

On candide, I'm trying to install mccd in the shapepipe environment. I get the following output:

(shapepipe) [<username>@c03 ~/github]$ pip install mccd
Collecting mccd
  Using cached mccd-1.1.1-py3-none-any.whl (74 kB)
Collecting modopt==1.5.1
  Using cached modopt-1.5.1-py3-none-any.whl (59 kB)
Requirement already satisfied: scipy>=1.5.4 in /home/<username>/.conda/envs/shapepipe/lib/python3.9/site-packages (from mccd) (1.8.0)
Requirement already satisfied: numpy>=1.19.5 in /home/<username>/.conda/envs/shapepipe/lib/python3.9/site-packages (from mccd) (1.22.3)
Collecting python-pysap==0.0.5
  Using cached python-pySAP-0.0.5.tar.gz (656 kB)
  Preparing metadata (setup.py) ... done
Requirement already satisfied: astropy>=4.0.2 in /home/<username>/.conda/envs/shapepipe/lib/python3.9/site-packages (from mccd) (5.0)
Collecting scipy>=1.5.4
  Downloading scipy-1.5.4-cp39-cp39-manylinux1_x86_64.whl (25.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 25.8/25.8 MB 22.1 MB/s eta 0:00:00
Collecting importlib-metadata==3.7.0
  Using cached importlib_metadata-3.7.0-py3-none-any.whl (11 kB)
Collecting numpy>=1.19.5
  Downloading numpy-1.19.5-cp39-cp39-manylinux2010_x86_64.whl (14.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 14.9/14.9 MB 23.8 MB/s eta 0:00:00
Collecting progressbar2==3.53.1
  Using cached progressbar2-3.53.1-py2.py3-none-any.whl (25 kB)
Collecting matplotlib==3.3.4
  Downloading matplotlib-3.3.4-cp39-cp39-manylinux1_x86_64.whl (11.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 11.5/11.5 MB 26.7 MB/s eta 0:00:00
Collecting astropy>=4.0.2
  Downloading astropy-4.1.tar.gz (7.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.8/7.8 MB 24.3 MB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error
  
  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [28 lines of output]
      Traceback (most recent call last):
        File "/home/<username>/.conda/envs/shapepipe/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
          main()
        File "/home/<username>/.conda/envs/shapepipe/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/home/<username>/.conda/envs/shapepipe/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 130, in get_requires_for_build_wheel
          return hook(config_settings)
        File "/tmp/pip-build-env-uvwtpde6/overlay/lib/python3.9/site-packages/setuptools/build_meta.py", line 177, in get_requires_for_build_wheel
          return self._get_build_requires(
        File "/tmp/pip-build-env-uvwtpde6/overlay/lib/python3.9/site-packages/setuptools/build_meta.py", line 159, in _get_build_requires
          self.run_setup()
        File "/tmp/pip-build-env-uvwtpde6/overlay/lib/python3.9/site-packages/setuptools/build_meta.py", line 174, in run_setup
          exec(compile(code, __file__, 'exec'), locals())
        File "setup.py", line 70, in <module>
          ext_modules=get_extensions())
        File "/tmp/pip-build-env-uvwtpde6/overlay/lib/python3.9/site-packages/extension_helpers/_setup_helpers.py", line 67, in get_extensions
          ext_modules.extend(setuppkg.get_extensions())
        File "./astropy/modeling/setup_package.py", line 59, in get_extensions
          from jinja2 import Environment, FileSystemLoader
        File "/tmp/pip-build-env-uvwtpde6/overlay/lib/python3.9/site-packages/jinja2/__init__.py", line 33, in <module>
          from jinja2.environment import Environment, Template
        File "/tmp/pip-build-env-uvwtpde6/overlay/lib/python3.9/site-packages/jinja2/environment.py", line 15, in <module>
          from jinja2 import nodes
        File "/tmp/pip-build-env-uvwtpde6/overlay/lib/python3.9/site-packages/jinja2/nodes.py", line 19, in <module>
          from jinja2.utils import Markup
        File "/tmp/pip-build-env-uvwtpde6/overlay/lib/python3.9/site-packages/jinja2/utils.py", line 642, in <module>
          from markupsafe import Markup, escape, soft_unicode
      ImportError: cannot import name 'soft_unicode' from 'markupsafe' (/tmp/pip-build-env-uvwtpde6/overlay/lib/python3.9/site-packages/markupsafe/__init__.py)
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

Any suggestions?

Galsim installation issue

Hi @tobias-liaudat,

I have encountered an issue with the current Galsim installation used by MCCD.

MCCD Issue

When installing MCCD from PyPI,

pip install mccd

Galsim is installed via pip as a dependency. Then when attempting to verify the current version of MCCD,

python -c "import mccd; print(mccd.__version__)"

I get the following errors:

macOS

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/Users/sfarrens/Documents/Library/miniconda3/envs/mccd/lib/python3.7/site-packages/mccd/__init__.py", line 11, in <module>
    from .mccd import MCCD, mccd_quickload
  File "/Users/sfarrens/Documents/Library/miniconda3/envs/mccd/lib/python3.7/site-packages/mccd/mccd.py", line 22, in <module>
    import galsim as gs
  File "/Users/sfarrens/Documents/Library/miniconda3/envs/mccd/lib/python3.7/site-packages/galsim/__init__.py", line 112, in <module>
    from .position import Position, PositionI, PositionD
  File "/Users/sfarrens/Documents/Library/miniconda3/envs/mccd/lib/python3.7/site-packages/galsim/position.py", line 19, in <module>
    from . import _galsim
ImportError: dlopen(/Users/sfarrens/Documents/Library/miniconda3/envs/mccd/lib/python3.7/site-packages/galsim/_galsim.cpython-37m-darwin.so, 2): Library not loaded: @rpath/libgalsim.2.2.dylib
  Referenced from: /Users/sfarrens/Documents/Library/miniconda3/envs/mccd/lib/python3.7/site-packages/galsim/_galsim.cpython-37m-darwin.so
  Reason: image not found

CentOS Linux

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/sfarrens/.conda/envs/mccd/lib/python3.7/site-packages/mccd/__init__.py", line 11, in <module>
    from .mccd import MCCD, mccd_quickload
  File "/home/sfarrens/.conda/envs/mccd/lib/python3.7/site-packages/mccd/mccd.py", line 22, in <module>
    import galsim as gs
  File "/home/sfarrens/.conda/envs/mccd/lib/python3.7/site-packages/galsim/__init__.py", line 112, in <module>
    from .position import Position, PositionI, PositionD
  File "/home/sfarrens/.conda/envs/mccd/lib/python3.7/site-packages/galsim/position.py", line 19, in <module>
    from . import _galsim
ImportError: libgalsim.so.2.2: cannot open shared object file: No such file or directory

ShapePipe Issue

This also creates an issue for ShapePipe (in particular on Candide, which uses CentOS Linux). ShapePipe installs Galsim from conda-forge, which a) allows MCCD to work fine but b) creates a conflict between the two Galsim installations on certain systems.

Proposed Solution

I recommend removing Galsim from your requirements.txt and making a new release of MCCD. This will resolve the ShapePipe issues.

With regard to the stand-alone MCCD installation, I think you can include a note that MCCD requires Galsim, but that it would be better to install it from source or via Conda.

Propagate new parameters to config file

Two new parameters added:

  • Polynomial degree d for the local component in the hybrid model.
  • Minimum polynomial degree for the global component.

Right now these are initialised to 2 and None, respectively.

(None is equivalent to not having a minimum polynomial degree.)
