simonsobs / soliket

SO Likelihoods and Theories

Home Page: https://soliket.readthedocs.io/

License: MIT License

Jupyter Notebook 93.13% Python 4.21% HTML 2.65% Shell 0.01%

soliket's People

Contributors: cmbant, ggalloni, htjense, itrharrison, mgerbino, nbatta, pablo-lemos, paganol, qujia7, raphkou, reneehlozek, sgiardie, timothydmorton, veragluscevic, xzackli


soliket's Issues

Make tensorflow via cosmopower an optional dependency

Multiple people have experienced problems with timeouts creating the SOLikeT environment because CosmoPower has tensorflow as a dependency. The tensorflow pip package is >600MB and this causes timeouts when installing it.

  • Make CosmoPower an optional dependency (see the sketch after this list)
  • Add test environment which also tests optional modules
  • Update readme and installation instructions to reflect this
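
A minimal sketch of how the optional dependency could be declared, assuming a setuptools-based install (the extras group name "emulators" is illustrative, not settled):

# setup.py excerpt (hypothetical): move cosmopower out of install_requires
# and into an extras group, so a plain `pip install soliket` skips the
# heavy tensorflow dependency
from setuptools import setup

setup(
    name="soliket",
    extras_require={
        "emulators": ["cosmopower"],  # pulls in tensorflow transitively
    },
)

Users who want the emulator support would then opt in with pip install soliket[emulators].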

xcorr: add CLASS compatibility

I should fix the xcorr compatibility with CLASS, which I think should only be a case of translating variable and function names where necessary.

Tests fail on new camb version

Two tests against precomputed likelihood values now fail (see here), most likely due to the bump in the camb version to v1.4.

e.g.

FAILED test_clusters.py::test_clusters_loglike - assert array([False])
 +  where array([False]) = (array([-854.8981993]), -855.0)

I can either update the test against the new precomputed likelihood values, or change the tolerance on the isclose comparison.

Implement bias models as cobaya theory

After chatting with @Pablo-Lemos on Friday, we felt it could be a good idea to implement galaxy bias models at the level of a cobaya theory class, so that the parameters can be seen by multiple likelihoods (and the code only needs to be implemented or hooked in once, etc.).

I would start this at either end of the complexity scale:

  • A one-parameter, scale- and redshift-independent linear bias model
  • Using the velocileptors implementation of an LPT bias model, as was part of the original xcorr code and is the subject of #25

Does this seem sensible to others?
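
For the simple end of the scale, a rough sketch of what this could look like, following cobaya's Theory API (class, parameter and product names here are hypothetical; in practice the parameter could equally be declared in an associated .yaml):

import numpy as np
from cobaya.theory import Theory

class LinearBias(Theory):
    # Hypothetical one-parameter linear bias as a cobaya Theory, so the
    # bias parameter is visible to every likelihood that requests it.
    params = {"b_lin": None}

    def get_requirements(self):
        # Ask the Boltzmann provider for a linear matter P(k) interpolator
        return {"Pk_interpolator": {"z": np.linspace(0, 2, 21),
                                    "k_max": 10.0, "nonlinear": False}}

    def calculate(self, state, want_derived=True, **params_values_dict):
        b = params_values_dict["b_lin"]
        pk = self.provider.get_Pk_interpolator(nonlinear=False)
        # Linear bias: P_gg = b^2 P_mm and P_gm = b P_mm
        state["Pk_gg"] = lambda z, k: b**2 * pk.P(z, k)
        state["Pk_gm"] = lambda z, k: b * pk.P(z, k)

    def get_Pk_gg(self):
        return self.current_state["Pk_gg"]

    def get_Pk_gm(self):
        return self.current_state["Pk_gm"]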

Implement test coverage

We should implement code test coverage checking, to ensure that all relevant parts of SOLikeT are being adequately tested. I will have a look at both coveralls and codecov for doing this as part of the github actions.

Add SZ likelihood

We plan to add an SZ likelihood to forward-model theoretical gas density and pressure into kSZ and tSZ profiles. There is an existing repository, Mop-c-GT (Model-to-observable projection code for galaxy thermodynamics, https://github.com/timothydmorton/Mop-c-GT), used in Schaan et al. 2021 (https://ui.adsabs.harvard.edu/abs/2021PhRvD.103f3513S/abstract) and Amodeo et al. 2021 (https://ui.adsabs.harvard.edu/abs/2021PhRvD.103f3514A/abstract), which performs line-of-sight projection along with beam convolution and aperture photometry for ACT data. We propose to integrate it with the SO framework to be able to make similar predictions.

A few example figures are included below, with the theoretical curves in the left panels and the projected SZ profiles in the right panels. These are simply example gNFW curves to show the process; they do not show real data.
[Example figures: rho_gnfw and pth_gnfw profiles]

xcorr: add LPT bias model

The original xcorr code includes support for an LPT bias model via velocileptors. This seems a widely-applied and relevant LPT bias model (other opinions welcome).

soliket.xcorr should be enhanced to include the velocileptors LPT bias model.

velocileptors would then become a(n optional?) dependency, along with pyfftw.

Bring in binned cluster counts likelihood

The current clusters likelihood is for unbinned clusters. Following Planck (and others), @eunseongleee has implemented a binned cluster counts likelihood following the Cash statistic. This is now at the stage of having been well tested in situ, so that a fiducial likelihood value can be calculated.

This work is currently on the binned_cluster branch, and contains quite a lot of development cruft and overlap with other likelihoods.

We should start a new branch dev-binned_clusters and start bringing things which are strictly necessary across from the binned_clusters branch.

The first things to implement would be a dummy likelihood which is a subclass of CashLikelihood (#54) and returns the fiducial likelihood value, and a test which... tests this against the fiducial value.
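
For reference, the Cash statistic piece such a likelihood would compute is just the binned Poisson log-likelihood; a minimal sketch (function name hypothetical):

import numpy as np

def cash_loglike(n_obs, n_exp):
    # Binned Poisson (Cash) log-likelihood, dropping the data-only
    # ln(n_obs!) constant: lnL = sum_i [n_i ln(mu_i) - mu_i]
    n_obs = np.asarray(n_obs, dtype=float)
    n_exp = np.asarray(n_exp, dtype=float)
    return np.sum(n_obs * np.log(n_exp) - n_exp)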

Install soliket on macos M1

This is a compilation of instructions I followed to make SOLikeT work (and pass the tox tests) on the new macos M1.

System: Monterey 12.6.3, M1 chip, arm64

The main issue was to have tensorflow correctly installed (see here) and have cosmopower find it.

  1. Download latest miniconda installer (e.g., here: Download Conda environment) and properly rename it (e.g., -> miniconda.sh)
  2. Install miniconda and tensorflow-deps
    bash ~/miniconda.sh -b -p $HOME/miniconda
    source ~/miniconda/bin/activate
    conda install -c apple tensorflow-deps
  3. Create your virtual env
    conda env create -n my_env -f soliket-tests.yml
    conda activate my_env
  4. Install tensorflow-macos and metal with correct versioning (I had them work with 2.9 and 0.5)
    pip install tensorflow-macos==2.9
    pip install tensorflow-metal==0.5.0
  5. Download cosmopower manually
    git clone https://github.com/alessiospuriomancini/cosmopower
    cd cosmopower
  6. Open requirements.txt and change the tensorflow dep as follows:
    tensorflow>2.0 -> tensorflow-macos>2.0
  7. Install cosmopower manually
    pip install .
  8. Go back to soliket folder and install it
    cd path/to/your/soliket
    pip install -e .

The above should let you install SOLikeT.
In order to pass the tests (both with pytest and within the tox env), I needed to add camb to soliket-tests.yml in the conda requirements, to be installed via conda-forge (it might also work via pip, but I have not tried). Simply add camb before the pip requirements.

Implement halo model as a cobaya Theory

There are a couple of places already where ingredients of a halo model (such as the halo mass function) are calculated within individual likelihoods.

We should instead move these calculations to cobaya Theories, as part of a halo model calculator.

@borisbolliet already has some very good halo model code in class_sz and it would be good to reuse as much of that as possible.

Implement CCL tracer galaxy shear x CMB kappa likelihood

I will implement a simple galaxy shear x CMB kappa Cl likelihood using the CCL tracers we have access to.

This is to support some DESxACT work, but I'm sure it is worthwhile for SOLikeT too.

@borisbolliet mentioned that he has some code which calculates galaxy shear x y (in Cls and position space) which could be extended to calculate this too.
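
A rough sketch of the core calculation with pyccl's tracer interface (the cosmology and dn/dz below are toy values for illustration only):

import numpy as np
import pyccl as ccl

# Illustrative cosmology, not fiducial SO/DES values
cosmo = ccl.Cosmology(Omega_c=0.25, Omega_b=0.05, h=0.67,
                      sigma8=0.8, n_s=0.96)

# Toy source redshift distribution for the shear sample
z = np.linspace(0.0, 2.0, 200)
nz = z**2 * np.exp(-((z / 0.5) ** 1.5))

shear = ccl.WeakLensingTracer(cosmo, dndz=(z, nz))
kappa = ccl.CMBLensingTracer(cosmo, z_source=1100.0)

ell = np.arange(2, 2000)
cl_shear_kappa = ccl.angular_cl(cosmo, shear, kappa, ell)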

Add redshift dependence to NLA Intrinsic Alignments

Currently the CCL implementation of galaxy intrinsic alignments for weak lensing kernels is wrapped in such a way that only a single amplitude is allowed.

Current evidence (e.g. Table III of Secco et al.) is that such non-evolving IA models are disfavoured compared to ones with redshift evolution, even if this evolution is relatively weak.

The current implementation would be easy to extend for two evolving models:

  • The full NLA model including the eta and z0 parameters, where a redshift dependence of ((1 + z)/(1 + z0))^eta is included (see the sketch after this list).
  • No eta parameter, but a different NLA amplitude parameter per tomographic bin.
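
The first option amounts to a one-line amplitude rescaling; a minimal sketch (z0 = 0.62 follows the DES convention and is just an assumed default here):

def nla_amplitude(z, a_ia, eta, z0=0.62):
    # Redshift-dependent NLA amplitude:
    #   A_IA(z) = a_ia * ((1 + z) / (1 + z0))**eta
    # eta = 0 recovers the current single-amplitude behaviour.
    return a_ia * ((1.0 + z) / (1.0 + z0)) ** eta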

handling constants

Agree on a way to handle constants in soliket. Do we want to import them from an external module/library (e.g., pyccl)? Do we want to write a soliket library of constants? I am in favour of the second option, since it makes it easier to customize, check, and add/remove whatever is needed.
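
If we go with the second option, it could be as simple as a small module everything imports from; an illustrative sketch (module path and choice of constants to be agreed):

# soliket/constants.py (hypothetical): single source of truth for constants
C_KM_S = 299792.458   # speed of light [km/s]
T_CMB = 2.72548       # CMB temperature [K] (Fixsen 2009)
MSUN_KG = 1.98892e30  # solar mass [kg]

Likelihoods would then do from soliket.constants import T_CMB rather than each carrying its own value.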

identify sharable theory predictions

Identify theory predictions that are shared by many likelihoods (e.g., Pk interpolator needed by clusters and xcorr) and compute them once for all the likelihoods.

Tests for test_lensing[classy] and test_lensing_lite[classy] fail

After installing SOLikeT and running "pytest -v ." as specified in the documentation, the tests for test_lensing.py and test_lensing_lite.py fail due to the same failed installation of CLASS.

The error message indicates that the version for gcc is too low, needing to be newer than 6.4.
The entire error message is copied below:

'''
theory = 'classy'

    @pytest.mark.parametrize("theory", ["camb", "classy"])
    def test_lensing(theory):
>       model, test_point = get_demo_lensing_model(theory)

soliket/tests/test_lensing.py:87: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
soliket/tests/test_lensing.py:64: in get_demo_lensing_model
    install(info, path=packages_path, skip_global=True)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

infos = ({'likelihood': {'soliket.LensingLikelihood': {'stop_at_error': True}}, 'params': {'H0': {'prior': {'max': 100, 'min':...{'prior': {'max': 1.2, 'min': 0.8}}}, 'theory': {'classy': {'extra_args': {'output': 'lCl, tCl'}, 'path': 'global'}}},)
kwargs = {'path': '/var/folders/15/lt28zrgs54nbr15_scf2bjtm0000gn/T/lensing_packages', 'skip_global': True}
debug = None, path = '/var/folders/15/lt28zrgs54nbr15_scf2bjtm0000gn/T/lensing_packages'
abspath = '/var/folders/15/lt28zrgs54nbr15_scf2bjtm0000gn/T/lensing_packages'
kwargs_install = {'code': True, 'data': True, 'force': False, 'no_progress_bars': None}
what = 'data', spath = '/var/folders/15/lt28zrgs54nbr15_scf2bjtm0000gn/T/lensing_packages/data'
failed_components = ['theory:classy'], skip_keywords_arg = set(), skip_keywords_env = set()
skip_keywords = set()

    def install(*infos, **kwargs):
        debug = kwargs.get("debug")
        # noinspection PyUnresolvedReferences
        if not log.root.handlers:
            logger_setup(debug=debug)
        path = kwargs.get("path")
        if not path:
            path = resolve_packages_path(infos)
        if not path:
            raise LoggedError(
                log, "No 'path' argument given, and none could be found in input infos "
                     "(as %r), the %r env variable or the config file. "
                     "Maybe specify one via a command line argument '-%s [...]'?",
                "packages_path", packages_path_env, packages_path_arg[0])
        abspath = os.path.abspath(path)
        log.info("Installing external packages at '%s'", abspath)
        kwargs_install = {"force": kwargs.get("force", False),
                          "no_progress_bars": kwargs.get("no_progress_bars")}
        for what in (code_path, data_path):
            kwargs_install[what] = kwargs.get(what, True)
            spath = os.path.join(abspath, what)
            if kwargs_install[what] and not os.path.exists(spath):
                try:
                    os.makedirs(spath)
                except OSError:
                    raise LoggedError(
                        log, "Could not create the desired installation folder '%s'", spath)
        failed_components = []
        skip_keywords_arg = set(kwargs.get("skip", []) or [])
        # NB: if passed with quotes as `--skip "a b"`, it's interpreted as a single key
        skip_keywords_arg = set(chain(*[word.split() for word in skip_keywords_arg]))
        skip_keywords_env = set(
            os.environ.get(install_skip_env, "").replace(",", " ").lower().split())
        skip_keywords = skip_keywords_arg.union(skip_keywords_env)
        used_components, components_infos = get_used_components(*infos, return_infos=True)
        for kind, components in used_components.items():
            for component in components:
                print()
                print(create_banner(kind + ":" + component,
                                    symbol=_banner_symbol, length=_banner_length), end="")
                print()
                if _skip_helper(component.lower(), skip_keywords, skip_keywords_env, log):
                    continue
                info = components_infos[component]
                if isinstance(info, str) or "external" in info:
                    log.warning("Component '%s' is a custom function. "
                                "Nothing to do.", component)
                    continue
                try:
                    class_name = (info or {}).get("class")
                    if class_name:
                        log.info("Class to be installed for this component: %r", class_name)
                    imported_class = get_resolved_class(
                        component, kind=kind, component_path=info.pop("python_path", None),
                        class_name=class_name)
                except ImportError as excpt:
                    log.error("Component '%s' not recognized. [%s].", component, excpt)
                    failed_components += ["%s:%s" % (kind, component)]
                    continue
                else:
                    if _skip_helper(imported_class.__name__.lower(), skip_keywords,
                                    skip_keywords_env, log):
                        continue
                is_compatible = getattr(imported_class, "is_compatible", lambda: True)()
                if not is_compatible:
                    log.info(
                        "Skipping %r because it is not compatible with your OS.", component)
                    continue
                log.info("Checking if dependencies have already been installed...")
                is_installed = getattr(imported_class, "is_installed", None)
                if is_installed is None:
                    log.info("%s.%s is a fully built-in component: nothing to do.",
                             kind, imported_class.__name__)
                    continue
                install_path = abspath
                get_path = getattr(imported_class, "get_path", None)
                if get_path:
                    install_path = get_path(install_path)
                has_been_installed = False
                with NoLogging(None if debug else logging.ERROR):
                    if kwargs.get("skip_global"):
                        has_been_installed = is_installed(path="global", **kwargs_install)
                    if not has_been_installed:
                        has_been_installed = is_installed(path=install_path, **kwargs_install)
                if has_been_installed:
                    log.info("External dependencies for this component already installed.")
                    if kwargs.get("test", False):
                        continue
                    if kwargs_install["force"] and not kwargs.get("skip_global"):
                        log.info("Forcing re-installation, as requested.")
                    else:
                        log.info("Doing nothing.")
                        continue
                else:
                    log.info("Check found no existing installation")
                    if not debug:
                        log.info(
                            "(If you expected this to be already installed, re-run "
                            "`cobaya-install` with --debug to get more verbose output.)")
                    if kwargs.get("test", False):
                        continue
                    log.info("Installing...")
                try:
                    install_this = getattr(imported_class, "install", None)
                    success = install_this(path=abspath, **kwargs_install)
                except KeyboardInterrupt:
                    raise
                except:
                    traceback.print_exception(*sys.exc_info(), file=sys.stdout)
                    log.error("An unknown error occurred. Delete the external packages "
                              "folder %r and try again. "
                              "Please, notify the developers if this error persists.",
                              abspath)
                    success = False
                if success:
                    log.info("Successfully installed! Let's check it...")
                else:
                    log.error("Installation failed! Look at the error messages above. "
                              "Solve them and try again, or, if you are unable to solve, "
                              "install the packages required by this component manually.")
                    failed_components += ["%s:%s" % (kind, component)]
                    continue
                # test installation
                with NoLogging(None if debug else logging.ERROR):
                    successfully_installed = is_installed(path=install_path, check=False,
                                                          **kwargs_install)
                if not successfully_installed:
                    log.error("Installation apparently worked, "
                              "but the subsequent installation test failed! "
                              "Look at the error messages above, or re-run with --debug "
                              "for more more verbose output. "
                              "Try to solve the issues and try again, or, if you are unable "
                              "to solve them, install the packages required by this "
                              "component manually.")
                    failed_components += ["%s:%s" % (kind, component)]
                else:
                    log.info("Installation check successful.")
        print()
        print(create_banner(" * Summary * ",
                            symbol=_banner_symbol, length=_banner_length), end="")
        print()
        if failed_components:
            bullet = "\n - "
            raise LoggedError(
                log, "The installation (or installation test) of some component(s) has "
                     "failed: %s\nCheck output of the installer of each component above "
                     "for precise error info.\n",
>               bullet + bullet.join(failed_components))
E           cobaya.log.LoggedError: The installation (or installation test) of some component(s) has failed: 
E            - theory:classy
E           Check output of the installer of each component above for precise error info.

../../.local/lib/python3.6/site-packages/cobaya/install.py:196: LoggedError
--------------------------------------- Captured stdout call ----------------------------------------
 2022-03-01 15:37:29,071 [install] Installing external packages at '/var/folders/15/lt28zrgs54nbr15_scf2bjtm0000gn/T/lensing_packages'

================================================================================
theory:classy
================================================================================

 2022-03-01 15:37:29,076 [install] Checking if dependencies have already been installed...
 2022-03-01 15:37:29,078 [install] Check found no existing installation
 2022-03-01 15:37:29,078 [install] (If you expected this to be already installed, re-run `cobaya-install` with --debug to get more verbose output.)
 2022-03-01 15:37:29,078 [install] Installing...
 2022-03-01 15:37:29,078 [classy] Installing pre-requisites...
Collecting cython
  Downloading Cython-0.29.28-py2.py3-none-any.whl (983 kB)
Installing collected packages: cython
Successfully installed cython-0.29.28
 2022-03-01 15:37:34,464 [classy] Downloading classy...
 2022-03-01 15:37:37,982 [classy] Downloaded filename class_public-2.9.3.tar.gz
 2022-03-01 15:37:38,177 [classy] class_public v2.9.3 downloaded and decompressed correctly.
 2022-03-01 15:37:38,251 [classy] *ERROR* Your gcc version is too low! CLASS would probably compile, but it would leak memory when running a chain. Please use a gcc version newer than 6.4. You can still compile CLASS by hand, maybe changing the compiler in the Makefile. CLASS has been downloaded into '/private/var/folders/15/lt28zrgs54nbr15_scf2bjtm0000gn/T/lensing_packages/code/classy'
 2022-03-01 15:37:38,252 [install] *ERROR* Installation failed! Look at the error messages above. Solve them and try again, or, if you are unable to solve, install the packages required by this component manually.

================================================================================
likelihood:soliket.LensingLikelihood
================================================================================

 2022-03-01 15:37:38,253 [install] Checking if dependencies have already been installed...
 2022-03-01 15:37:38,254 [install] External dependencies for this component already installed.
 2022-03-01 15:37:38,254 [install] Doing nothing.

================================================================================
* Summary * 
================================================================================

 2022-03-01 15:37:38,254 [install] *ERROR* The installation (or installation test) of some component(s) has failed: 
 - theory:classy
Check output of the installer of each component above for precise error info.

--------------------------------------- Captured stderr call ----------------------------------------
3.31MiB [00:02, 1.16MiB/s]
----------------------------------------- Captured log call -----------------------------------------
INFO     install:install.py:64 Installing external packages at '/var/folders/15/lt28zrgs54nbr15_scf2bjtm0000gn/T/lensing_packages'
INFO     install:install.py:117 Checking if dependencies have already been installed...
INFO     install:install.py:143 Check found no existing installation
INFO     install:install.py:146 (If you expected this to be already installed, re-run `cobaya-install` with --debug to get more verbose output.)
INFO     install:install.py:150 Installing...
INFO     classy:classy.py:573 Installing pre-requisites...
INFO     classy:classy.py:578 Downloading classy...
INFO     classy:install.py:242 Downloaded filename class_public-2.9.3.tar.gz
INFO     classy:install.py:292 class_public v2.9.3 downloaded and decompressed correctly.
ERROR    classy:classy.py:595 Your gcc version is too low! CLASS would probably compile, but it would leak memory when running a chain. Please use a gcc version newer than 6.4. You can still compile CLASS by hand, maybe changing the compiler in the Makefile. CLASS has been downloaded into '/private/var/folders/15/lt28zrgs54nbr15_scf2bjtm0000gn/T/lensing_packages/code/classy'
ERROR    install:install.py:166 Installation failed! Look at the error messages above. Solve them and try again, or, if you are unable to solve, install the packages required by this component manually.
INFO     install:install.py:117 Checking if dependencies have already been installed...
INFO     install:install.py:134 External dependencies for this component already installed.
INFO     install:install.py:140 Doing nothing.
ERROR    install:log.py:35 The installation (or installation test) of some component(s) has failed: 
 - theory:classy
Check output of the installer of each component above for precise error info.
_______________________________________ test_lensing[classy] ________________________________________

theory = 'classy'

    @pytest.mark.parametrize("theory", ["camb", "classy"])
    def test_lensing(theory):
>       model = get_demo_lensing_model(theory)

soliket/tests/test_lensing_lite.py:61: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
soliket/tests/test_lensing_lite.py:55: in get_demo_lensing_model
    model = get_model(info)
../../.local/lib/python3.6/site-packages/cobaya/model.py:1189: in get_model
    stop_at_error=info.get("stop_at_error", False))
../../.local/lib/python3.6/site-packages/cobaya/model.py:150: in __init__
    timing=timing)
../../.local/lib/python3.6/site-packages/cobaya/theory.py:397: in __init__
    info, packages_path=packages_path, timing=timing, name=name))
../../.local/lib/python3.6/site-packages/cobaya/theory.py:67: in __init__
    standalone=standalone)
../../.local/lib/python3.6/site-packages/cobaya/component.py:95: in __init__
    self.initialize()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = classy

    def initialize(self):
        """Importing CLASS from the correct path, if given, and if not, globally."""
        # Allow global import if no direct path specification
        allow_global = not self.path
        if not self.path and self.packages_path:
            self.path = self.get_path(self.packages_path)
        self.classy_module = self.is_installed(path=self.path, allow_global=allow_global,
                                               check=False)
        if not self.classy_module:
            raise NotInstalledError(
>               self.log, "Could not find CLASS. Check error message above.")
E           cobaya.install.NotInstalledError: Could not find CLASS. Check error message above.

../../.local/lib/python3.6/site-packages/cobaya/theories/classy/classy.py:186: NotInstalledError
--------------------------------------- Captured stdout call ----------------------------------------
 2022-03-01 15:37:41,379 [classy] Importing *global* CLASS.
 2022-03-01 15:37:41,381 [classy] *ERROR* Could not import global CLASS installation. Specify a Cobaya or CLASS installation path, or install the CLASS Python interface globally with 'cd /path/to/class/python/ ; python setup.py install'
 2022-03-01 15:37:41,381 [classy] *ERROR* Could not find CLASS. Check error message above.
----------------------------------------- Captured log call -----------------------------------------
INFO     classy:classy.py:545 Importing *global* CLASS.
ERROR    classy:classy.py:558 Could not import global CLASS installation. Specify a Cobaya or CLASS installation path, or install the CLASS Python interface globally with 'cd /path/to/class/python/ ; python setup.py install'
ERROR    classy:log.py:35 Could not find CLASS. Check error message above.
====================================== short test summary info ======================================
FAILED soliket/tests/test_lensing.py::test_lensing[classy] - cobaya.log.LoggedError: The installat...
FAILED soliket/tests/test_lensing_lite.py::test_lensing[classy] - cobaya.install.NotInstalledError...
'''

Create unified sacc file for tests data

At the moment there are a number of tests which are run using data files from different simulations. See the list below.

Eventually these simulations should be as coherent as possible.

For now I will check which of these data files are actually used and collect them together into a single sacc file which is used by the tests (a sketch of building one follows the file list below).

The list of data files I can see are:

soliket/data/simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109/
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_binned_covmat.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_00_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_01_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_02_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_03_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_04_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_05_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_06_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_07_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_08_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_09_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_10_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_11_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_12_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_13_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_14_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_15_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_16_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_17_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_18_bandpowers.txt
	simulated_clkk_SO_Apr17_mv_nlkk_deproj0_SENS1_fsky_16000_iterOn_20191109_sim_19_bandpowers.txt

soliket/data/xcorr_simulated/
	clgg_noiseless.txt
	clkg_noiseless.txt
	dndz.txt

soliket/clusters/data/
	ACTPol_Cond_scatv5.fits
	E-D56Clusters.fits
	selFn_equD56/
		QFit.fits
		RMSMap_Arnaud_M2e14_z0p4.fits.gz
		RMSTab.fits
		areaMask.fits.gz
		fRelWeights.fits

soliket/lensing/data/
	binnedauto.txt
	binnedcov.txt
	binningmatrix.txt
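
A rough sketch of assembling such a file with the sacc library (tracer names, data types and the two-column file formats are assumptions for illustration):

import numpy as np
import sacc

s = sacc.Sacc()

# Tracers: e.g. a galaxy sample with its dn/dz
z, nz = np.loadtxt("soliket/data/xcorr_simulated/dndz.txt", unpack=True)
s.add_tracer("NZ", "gal", z, nz)
ells = np.arange(3000)
s.add_tracer("Map", "cmb_kappa", quantity="cmb_convergence", spin=0,
             ell=ells, beam=np.ones_like(ells, dtype=float))

# Data points: e.g. the simulated galaxy x kappa bandpowers
ell_eff, cl_kg = np.loadtxt("soliket/data/xcorr_simulated/clkg_noiseless.txt",
                            unpack=True)
s.add_ell_cl("cl_00", "gal", "cmb_kappa", ell_eff, cl_kg)

# Add a covariance for the full data vector, then write a single file
# s.add_covariance(cov)
s.save_fits("soliket/tests/data/unified_test_data.fits", overwrite=True)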

Implement versioned releases

We should implement a system for tagging and/or releasing versions of SOLikeT, so it can be more effectively shared and used beyond the active developers.

Create smooth fiducial data for testing

We should create a set of smooth simulated data (i.e., direct predictions from Boltzmann solvers) that share the same set of input cosmo params for testing purposes. This is mostly relevant when testing combined likelihoods.

Add comparison to pre-computed loglike to tests

To check the stability of likelihood calculations after any refactoring is done, we should add a test which compares the loglike at one or more specified points in parameter space to a pre-computed value stored in the test.
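
A minimal sketch of the pattern (the reference value and the helper that evaluates the likelihood are placeholders):

import numpy as np

# Pre-computed at the fiducial point and stored with the test (placeholder)
REF_LOGLIKE = -855.0

def test_loglike_regression():
    loglike = compute_loglike_at_fiducial_point()  # hypothetical helper
    assert np.isclose(loglike, REF_LOGLIKE, rtol=1e-5, atol=1e-8)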

Add CosmoPower P(k, z) emulation

The current CosmoPower implementation only gives access to the network for CMB C_ell. Seeing as e.g. my current ACT x DES runs are annoyingly slow (>2 days) and the time for each likelihood evaluation is dominated by CAMB:

[soliket.cross_correlation.shearkappalikelihood] Average evaluation time for soliket.cross_correlation.ShearKappaLikelihood: 0.228822 s  (9 evaluations)
[camb.transfers] Average evaluation time for camb.transfers: 0.625354 s  (9 evaluations)
[camb] Average evaluation time for camb: 0.962215 s  (9 evaluations)
[soliket.ccl.ccl] Average evaluation time for soliket.ccl.CCL: 0.0253867 s  (9 evaluations)

I will implement the P(k, z) emulation in our CosmoPower Boltzmann Theory.

Make soliket/mflike even with mflike/mflike

The current versions of mflike and theoryforge in the native mflike repo are ahead of soliket/mflike. In particular, native mflike now allows use of the compute_foreground_model function without initialization of the mflike model. We should implement the same functionality in soliket/mflike.

Update lensing likelihood

We should update the lensing likelihood:

  • move theory calculations (e.g., addition of noise contributions) into separate blocks (e.g., cobaya theory blocks)
  • use sacc format for data products, and add related routines for I/O
  • use sacc format for correction components
  • make computation of fiducial cls optional, in case we want to read the fiducial from file

Lensing likelihood installation failing

As can be seen in https://github.com/simonsobs/SOLikeT/runs/7712680362?check_suite_focus=true from #73, the tests involving the lensing likelihood now fail. I have marked them as xfail for now, but they should be fixed.

This seems to be a problem with them being an InstallableLikelihood, in particular with the specification of packages_path:

ERROR    soliket.lensing.LensingLikelihood:InstallableLikelihood.py:85 The given installation path does not exist: '/tmp/LAT_packages/data/LensingLikelihood'
ERROR    soliket.lensing.LensingLikelihood:log.py:42 The data for this likelihood has not been correctly installed. To install it, run `cobaya-install soliket.lensing.LensingLikelihood`

The packages_path is specified here:

packages_path = os.environ.get("COBAYA_PACKAGES_PATH") or os.path.join(
    tempfile.gettempdir(), "lensing_packages"
)

I tried removing the part of this before the or (packages_path is previously specified as part of the MFLike tests) and it made no difference.

CosmoPower not actually optional

Though CosmoPower is supposed to be an optional requirement, SOLikeT currently still tries to import it even when it is not installed, meaning SOLikeT can't run.

I will change this so it works more gracefully, i.e.:

try:
    import cosmopower  # noqa F401
except ImportError:
    HAS_COSMOPOWER = False
else:
    HAS_COSMOPOWER = True
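
Code paths that need the emulator can then check the flag and fail with a clear message, e.g.:

if not HAS_COSMOPOWER:
    raise ImportError("CosmoPower is not installed; install it "
                      "(e.g. pip install cosmopower) to use this theory.")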

get clusters likelihood to work

@nbatta let's keep this conversation alive here asynchronously as necessary.

Currently, you should be able to go to the solike/tests directory, run pytest test_cluster.py, and get the following output:

---------------------------------------- Captured stdout call ----------------------------------------
[prior] *WARNING* No sampled parameters requested! This will fail for non-mock samplers.
[camb] Importing *global* CAMB.
mock catalog
[solike.clusterlikelihood] *ERROR* Error at evaluation: TypeError("Prob_per_cluster() missing 1 required positional argument: 'tsz_signal_err'")
----------------------------------------- Captured log call ------------------------------------------
WARNING  prior:log.py:153 No sampled parameters requested! This will fail for non-mock samplers.
INFO     camb:camb.py:231 Importing *global* CAMB.
ERROR    solike.clusterlikelihood:log.py:34 Error at evaluation: TypeError("Prob_per_cluster() missing 1 required positional argument: 'tsz_signal_err'")

We haven't finished implementing this of course, which is why this is happening, but that is where you can pick up if you are able to before I am. If you're looking at this and are on Slack, feel free to ping me.

Opening a new cluster branch

This includes both the unbinned and binned cluster likelihoods. I am starting to combine the two likelihoods in a compact way by removing overlapping parts. Specifics related to the mass function or conversion are not decided yet.

Fix tests in dev-instrLAT branch

Tests must be updated to reflect the new nature of theoryforge as a cobaya theory component (and the subsequent restructuring of the soliket/mflike package).

Add Hartlap and post-Hartlap corrections as option in likelihoods

Where covariance matrices are estimated from simulations, the Hartlap correction is used to account for the fact that the resulting estimate of the inverse covariance matrix is biased.

Computing it requires knowledge of the number of simulations N and the number of rows in the data vector p.

Since the original Hartlap paper, people have proposed additional corrections. Sellentin and Heavens show that not only the inverse covariance but also the likelihood itself should be altered.

We should include an ability to apply a Hartlap correction and the Sellentin and Heavens correction.

I would suggest two things:

  • Add the option to include a Hartlap correction in the GaussianLikelihood
  • Add a new SellentinHeavensLikelihood or StudentstLikelihood which is a subclass of GaussianLikelihood and includes the alteration as described on Alan Heavens' website:

If you have been using the Hartlap correction, you need to change one line of your code. With the Hartlap correction a = (n-p-2)/(n-1), and χ² = (x-μ)^T (Σ')^(-1) (x-μ),

REPLACE

lnL = constant - a χ²/2

WITH

lnL = constant - n ln[1 + χ²/(n-1)]/2
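
Putting both options together, a sketch of what the correction logic could look like inside a Gaussian likelihood (function and argument names are hypothetical):

import numpy as np

def corrected_loglike(resid, cov_inv, n_sims, sellentin_heavens=False):
    # resid = data - theory; cov_inv estimated from n_sims simulations
    chi2 = resid @ cov_inv @ resid
    p = resid.size
    if sellentin_heavens:
        # Sellentin & Heavens: modify the shape of the likelihood itself
        return -0.5 * n_sims * np.log(1.0 + chi2 / (n_sims - 1.0))
    # Hartlap: debias the inverse covariance with a = (n - p - 2)/(n - 1)
    a = (n_sims - p - 2.0) / (n_sims - 1.0)
    return -0.5 * a * chi2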

Remove inactive branches from repo

We should remove inactive branches from the repository, and make clear the development path (open Issues and/or PRs) for branches which are still active.

PyPI packaging of SOLikeT

We should make SOLikeT installable via pip when we have a versioned release. This Issue will attempt to keep track of things which need to happen for that.

SOLikeT unable to install mflike when installing on NERSC

Hi,

I'm trying to install SOLikeT on NERSC, following the directions on the readme, but when I get to the pip install -e . step, it fails with the following error message:

`ERROR: Command errored out with exit status 1:
command: /global/common/cori/software/python/3.6-anaconda-5.2/bin/python -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-fnwdimo9/mflike_3088dc3315ff48079bb30c55e1b06eb9/setup.py'"'"'; file='"'"'/tmp/pip-install-fnwdimo9/mflike_3088dc3315ff48079bb30c55e1b06eb9/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(file) if os.path.exists(file) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-pj858uq6
cwd: /tmp/pip-install-fnwdimo9/mflike_3088dc3315ff48079bb30c55e1b06eb9/
Complete output (7 lines):
Traceback (most recent call last):
File "", line 1, in
File "/tmp/pip-install-fnwdimo9/mflike_3088dc3315ff48079bb30c55e1b06eb9/setup.py", line 6, in
readme = readme_file.read()
File "/global/common/cori/software/python/3.6-anaconda-5.2/lib/python3.6/encodings/ascii.py", line 26, in decode
return codecs.ascii_decode(input, self.errors)[0]
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc2 in position 2057: ordinal not in range(128)

WARNING: Discarding git+https://github.com/simonsobs/lat_mflike. Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.`

Then, at the end of the output, it says:
ERROR: Could not find a version that satisfies the requirement mflike (unavailable) (from soliket) (from versions: none)
ERROR: No matching distribution found for mflike (unavailable)

It seems like SOLikeT is unable to find and install the necessary mflike module, which is supposed to be installed automatically. Do you have any suggestions for how to fix this problem?

Thank you so much,
Harrison Winch

Identify adequate stability of likelihood values

In e.g. #112 and #97 we have seen precomputed likelihood values change at the O(0.01%) level depending on the CAMB version. At the moment likelihood value stability is checked using np.isclose, which has default accuracy settings of rtol=1e-05, atol=1e-08.

We should identify what rtol and atol values are actually appropriate for the SO likelihood codes.
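
For scale, the cluster loglike numbers quoted earlier are an example of this situation: with numpy's defaults the comparison fails, while an explicitly loosened rtol would pass (values illustrative):

import numpy as np

# An O(0.01%) shift between CAMB versions
np.isclose(-854.8981993, -855.0)             # False with default rtol=1e-05
np.isclose(-854.8981993, -855.0, rtol=2e-4)  # True with a looser rtol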
