

desimodel


Introduction

This product contains information about DESI hardware designs in machine-readable formats to be used by simulations. It is intended to be correct but is not authoritative: if there is any question, the DocDB version of a design parameter takes precedence. This package reflects the same information, organized and formatted more conveniently for simulations.

PLEASE KEEP THIS FILE IN SYNC WITH THE EQUIVALENT FILE IN SVN.

Desimodel Code

The svn product described below contains only the data associated with desimodel. The code is on GitHub: https://github.com/desihub/desimodel.

Desimodel Data

Getting the Data

The data that accompanies the desimodel code is not stored with the code. Due to its size, it is kept in the DESI svn repository. The public, read-only URL for svn access is https://desi.lbl.gov/svn/code/desimodel, with the usual trunk/, tags/ and branches/ directories.

Once you have installed this package, using either pip or desiInstall, there are two ways to install the accompanying data. In almost every case, you should install the svn tag that corresponds to the git tag of the code you installed.

There are two methods to install the data, "by hand" and "scripted."

For "by hand" installs:

  1. Find the tag you are interested in:

    svn ls https://desi.lbl.gov/svn/code/desimodel/tags

    We'll use 0.10.3 in the examples below.

  2. Define the environment variable DESIMODEL:

    export DESIMODEL=/Users/desicollaborator/Data/desimodel/0.10.3

    Note how the tag name is included.

  3. Create the directory and switch to it:

    mkdir -p $DESIMODEL
    cd $DESIMODEL
  4. Export:

    svn export https://desi.lbl.gov/svn/code/desimodel/tags/0.10.3/data

    Note how the tag name is the same as in the DESIMODEL variable.

  5. You may now want to add DESIMODEL to your shell startup scripts.

For "scripted" installs:

  • Installing this package will create the command-line script install_desimodel_data. It should appear in your PATH after a successful install. install_desimodel_data --help will show you how to use this script; it is essentially a wrapper around the "by hand" method described above.
  • You can also call the function desimodel.install.install() from inside other Python code.

Regardless of which method you choose, you should set the DESIMODEL environment variable to point to the directory containing the data/ directory. The only real difference among all these methods is exactly when you define the DESIMODEL variable.
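
For example, a minimal Python sketch of the scripted route might look like the following. It assumes that desimodel.install.install() honors a pre-set DESIMODEL and fetches the svn data matching the installed code version; check install_desimodel_data --help for the authoritative options.

    import os
    import desimodel.install

    # Point DESIMODEL at the directory that will contain data/ (note the tag name).
    os.environ['DESIMODEL'] = '/Users/desicollaborator/Data/desimodel/0.10.3'

    # Fetch the svn data corresponding to the installed code.
    desimodel.install.install()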

Data Files

Files in data/inputs/ are copied from DocDB and are not intended to be used directly. They are parsed and reformatted to produce the files in the other data/ directories, which are intended for direct use.

data/desi.yaml

Basic scalar parameters, organized in a nested tree.

data/focalplane/

Information about positioner locations and platescale.

data/footprint/

The areas of the sky that will be observed by DESI, with RA, Dec of tiles.

data/sky/

Sample sky spectra.

data/specpsf/

Spectrograph point-spread-function (PSF) for specter CCD pixel-level simulations.

data/spectra/

Example benchmark spectra.

data/targets/

Expected n(z) information per target class for cosmology projections.

data/throughput/

Throughput versus wavelength (also contained in specpsf).

data/weather/

Historical weather data.

Branches

There are several permanent branches that were used for testing alternative designs. Some of these will never be merged into trunk but we will keep them around for the record:

altccd

500 micron versus 250 micron thick CCDs.

bb

Recreating assumptions used during early BigBOSS projections.

newtiles

An improved tiling dither pattern from Eddie Schlafly, intended to be merged prior to the start of the DESI survey.

update_inputs

This branch might be present during updates to the inputs to the desimodel data files. See the updating desimodel inputs document for further details.

In addition to these historical branches, there is a set of test-* branches that contain smaller versions of the desimodel files. These branches are intended for use in desimodel unit tests. See the desimodel testing document for further details.

Tagging

If either the data or the code changes, a new tag should be created in both git and svn.

Full Documentation

Please visit desimodel on Read the Docs

License

desimodel is free software licensed under a 3-clause BSD-style license. For details see the LICENSE.rst file.

desimodel's People

Contributors

andreufont, apcooper, belaa, dkirkby, dstndstn, geordie666, julienguy, michaeljwilson, moustakas, profxj, rainwoodman, sbailey, schlafly, tskisner, weaverba137, woodywang153


desimodel's Issues

Wavelength limits of throughput and resolution

The files data/specpsf/psf-quicksim.fits and data/throughput/thru-*.fits have different wavelength coverage:

   Type     b(min)   b(max)   r(min)   r(max)   z(min)   z(max) 
           Angstrom Angstrom Angstrom Angstrom Angstrom Angstrom
---------- -------- -------- -------- -------- -------- --------
       PSF 3569.000 5949.000 5625.000 7741.000 7435.000 9834.000
Throughput 3533.000 5997.900 5564.000 7805.000 7360.000 9913.000

Questions:

  • Do the PSF wavelength ranges effectively define the limit of active pixels on the CCD?
  • How should the ~50A regions of non-zero throughput extending beyond the PSF limits be interpreted?
  • Do photons beyond the PSF limits actually enter the camera, potentially dispersing into an active pixel, or are they masked / collimated somewhere upstream?
  • If dispersion of wavelengths just beyond the PSF limits into the spectrum is possible, how should the dispersion FWHM be extrapolated to model this? I guess a constant extrapolation of the edge FWHM is ok?

This is relevant to getting the edge effects right in specsim (desihub/specsim#17 and desihub/specsim#18).
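
For reference, a constant extrapolation of the edge FWHM is what numpy.interp already does outside the tabulated range; the wavelength grid and FWHM values below are made up purely for illustration.

import numpy as np

# Hypothetical tabulated FWHM for the b camera over its PSF range (Angstrom).
wave = np.linspace(3569.0, 5949.0, 200)
fwhm = np.full_like(wave, 1.8)

# Evaluating ~50A beyond the PSF limits: np.interp holds the edge values constant.
wave_ext = np.array([3533.0, 3569.0, 5949.0, 5997.9])
fwhm_ext = np.interp(wave_ext, wave, fwhm)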

Improve description of optical distortion

The current fiberloss calculation in bin/fiberloss.py reads data/inputs/throughput/DESI-0347-throughput.txt from this package, and uses the blur values plotted below:
[figure: blur values from DESI-0347-throughput.txt]
This file also includes a column of lateral offset values (also in the plot), but the code uses a fixed value of 0.2" instead.

This issue is to provide a more detailed tabulation of the optical distortion that matches this graph based on ray tracing spots from the "geometric blur" tab of the excel spreadsheet in desi-doc-347-v10:
[figure: ray-traced geometric blur from the DESI-0347 v10 spreadsheet]

The main difference is that distortion is tabulated simultaneously in field angle and wavelength, and so captures significant correlations that are missing from the existing platescale.txt table.

I also want to understand how the values in DESI-0347-throughput.txt were calculated from the ray tracing spots.

Readout noise units

Read noise values in desi.yaml are currently given in units of electrons:

        readnoise: 3.0      # electrons RMS
        darkcurrent: 3.0    # electrons/hour/pixel

Shouldn't that be electrons / pixel ?

Get desimodel ready for Python 3

In order to get desimodel ready for Python 3, we need to

  • Move code that is currently in the bin directory into the Python package.
  • Increase unit test coverage (and resolve #5).
  • Resolve #15.
  • Actually do the Python 3 conversion.

This would also be a good opportunity to resolve other desimodel issues, of which there are several, but these are not directly related to Python 3 compatibility.

Feature request: add capability to dither

Hi there,

Last week @sbailey and I discussed adding a dithering feature to the specsim code. Currently the code (through desimodel) allows for specification of the position of the spot on the focal plane, but as far as I can tell it doesn't allow dithering around a source. So I'd like to add this, but before making any changes, a few questions:

  1. I think desimodel is the place to make this addition. Am I right?
  2. I do not see hooks in the existing code for dithering. Did I miss anything?
  3. I'm willing to add the necessary hooks but don't want to collide with changes planned by @dkirkby; I'm told that some refactoring of the project is in the works.

So... if desimodel is the right place and someone else isn't planning to add something similar, I'll add the dithering feature to a branch and submit a pull request.

Duplicate KPNO extinction tables

Why do we have two versions of the KPNO zenith extinction coefficients:

  1. data/spectra/ZenithExtinction-KPNO.dat covers 3500-10,000A in 0.1A steps with 7 digits of precision.
  2. data/sky/kpnoextinct_lunarmodel.dat covers 3200-10,600A in 1.0A steps with 6 digits of precision.

Both tables have identical values (except for the extra digit of precision) for the wavelengths they have in common.

@crockosi Is there a reason to prefer the lower-resolution, wider coverage file for your lunarmodel.pro code? I propose to use the higher-resolution file for desihub/specsim#9.

need MWS and BGS numbers in targets.dat

We need updated canonical BGS and MWS numbers in targets.dat (in svn, not git). This is waiting on a DocDB bright-time survey planning document that can be used as a reference for the real numbers. In the meantime, there are a few placeholder numbers just so that we can proceed with something for bright-time sims:

#- BGS numbers from DESI-1377v1 (which are probably out of date)
area_BGS: 14000  # BG survey sky area in sq deg (note that this is not used in
                 # cosmology projections, i.e., don't change and expect those 
                 # to change (true also of main DESI area))
ntarget_BG: 800  # number of targeting BGS r<19.5
nobs_BG: 760     # nobs = ntarget * 0.95 fiber assignment efficiency
success_BG: 0.95 # fraction of BG observed targets with successful redshift

#- PLACEHOLDER MWS numbers; not documented or defined anywhere yet
ntarget_MWS: 736   # number of targeted MWS stars
nobs_MWS: 700      # nobs = ntarget * 0.95 fiber assignment efficiency 
success_MWS: 0.99  # fraction of MWS targets with successful redshift

Meaning of gain parameters in desi.yaml

The gain parameters in desi.yaml are all given as:

gain: 1.0 # electrons per ADU

Since the value is 1.0, this question is mostly academic but I would like to be sure I am using this constant correctly in specsim in case someone ever runs with a different value.

The units indicate that I should divide the simulated electrons per bin by the gain. Is that correct?
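
If that reading is correct, the conversion in specsim would be the usual one; a sketch with a made-up electron count:

gain = 1.0           # electrons per ADU, from desi.yaml
electrons = 1234.5   # simulated electrons in one bin (made-up value)
adu = electrons / gain   # what the readout would record, in ADU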

Please help describe the data/spectra files

There appear to be only two types of files in the data/spectra directory: spec- and sn-spec-. Is it safe to assume that all files of a given type have the same format? Is there a document (e.g. in DocDB) that describes all these files?

This directory also contains one of the duplicate KPNO files (#8) and a couple of README files (#47) that should be moved to more high-level documentation (such as the data model).

sign flip in desimodel.focalplane.radec2xy and xy2radec

desimodel.focalplane.radec2xy has a sign flip in the transformation from (RA,dec) -> (x,y).

The transforms should be:

  • Larger RA -> smaller (more negative) X (fiberassign has the opposite)
  • Larger DEC -> larger (more positive) Y (fiberassign is correct here)

See DESI-0481 and DESI-0742 for background material on DESI coordinates.

Short (though perhaps not illuminating) explanation: when facing south and looking up from the primary mirror towards the focal plane and the sky, RA increases to the left (east) and dec increases up (north). In this orientation the focal plane is "upside-down" with X increasing to the left (east) and Y increasing down (south). BUT: the light bounces off a mirror on its way from the sky to the focal plane, thus flipping signs: eastern targets with larger RA toward the left of the sky end up on the right hand side of the focal plane, i.e. smaller (more negative) X. And more northern targets (larger DEC) end up at the lower (larger Y) portion of the focal plane. It helps to draw this on a piece of paper and hold it above your head while looking up at it...

[figures: radec2xy sign convention and the looking-south orientation]

Also see desihub/fiberassign#118 which has the same sign flip.
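
A minimal check of the intended convention is sketched below. The signature radec2xy(telra, teldec, ra, dec) -> (x, y) and the tile center used here are assumptions, not taken from the code.

import numpy as np
from desimodel.focalplane import radec2xy  # signature assumed: (telra, teldec, ra, dec) -> (x, y)

telra, teldec = 10.0, 20.0
# Targets east/west and north/south of the tile center.
x, y = radec2xy(telra, teldec,
                np.array([telra + 0.5, telra - 0.5, telra, telra]),
                np.array([teldec, teldec, teldec + 0.5, teldec - 0.5]))

# Expected: larger RA -> smaller (more negative) x; larger Dec -> larger y.
assert x[0] < x[1], 'larger RA should map to smaller x'
assert y[2] > y[3], 'larger Dec should map to larger y'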

Fix angular size of random offset maps

Segev BenZvi just noticed that the random offsets generated and saved by doc/nb/DESI-0347_Throughput.ipynb only cover a square region that is 1.6deg on a side (due to my confusion between field-of-view and cone opening angle). This issue is to update the notebook and re-make the 3 sample realizations (which live in the svn repo for this package).

Update to current DESI-doc versions

v13 of DESI-347 was released 26-Jun-2018 with the following release notes:

Corrected error in Cell B18, Throughput. Added 'Estimated QSO SNR' tab to address L2.5.4 in DESI-318. Updated coating performance for C2. Coating performance adjusted per DESI-3158 (scratches on C3). Corrected formula in row 26, Throughput, to correct for 1-d sigma (divide by 2). Modified calculation of defocus, using measured values for fiber axial position variation, positioner axial position variation, GFA mount plate variation, GFA CCD axial position tolerance, see rows 32-35, Throughput. Updated CCD noise and dark current, Noise tab. Added trending plots to throughput and SNR tabs. Updated 'Estimated ELG SNR' tab, per J. Guy email, see notes on that tab.

Currently (12-July-2018), desimodel is based on v12 of DESI-347. The git package is at tag 0.9.5 and the most recent revision 118070 to the svn data/ was 4 weeks ago.

This issue is to propagate changes from v12 to v13 to the desimodel svn and git repos. The next step will be to propagate these changes through some benchmark simulations to see the effects on SNR, etc.

Update BGS n(z)

The existing BGS n(z) is out of date. This issue is to propose and finalize an update (however, since these files live in SVN, the update itself will not involve a PR).

Alex Smith has provided separate files nz_bgs_bright.dat (r < 19.5) and nz_bgs_faint.dat (19.5<r<20.0) from v0.0.4 of MXXL HOD mock in a suitable format. Are we happy with these file names? (The existing file is called nz_BG.dat but all other target classes use lower case, e.g. nz_elg.dat).

The plot and table below summarize the statistics for the old (BG) and new (bgs_bright, bgs_faint) files. Note that the integrals do not exactly match the target densities (800/deg^2 and 600/deg^2) recently circulated by @akremin, so perhaps these should be rescaled slightly?

BGS Class    N/deg^2   <z>
BG             583.0   0.187
bgs_bright     775.9   0.209
bgs_faint      631.2   0.291

[figure: old vs. new BGS n(z)]

In_DESI code.

Write a piece of code to return whether or not a set of RA/DEC lies within the DESI footprint using the "official" DESI tiles.
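
A rough sketch of one way to do this against the official tiles, using nothing more than an angular-separation cut to the IN_DESI tile centers; the 1.6 deg tile radius is a placeholder, not an official number.

import numpy as np
import desimodel.io

def in_desi_footprint(ra, dec, tile_radius=1.6):
    """Return True for each (ra, dec) in degrees that lies within
    tile_radius degrees of any IN_DESI tile center."""
    tiles = desimodel.io.load_tiles(onlydesi=True)
    ra, dec = np.radians(np.atleast_1d(ra)), np.radians(np.atleast_1d(dec))
    tra, tdec = np.radians(tiles['RA']), np.radians(tiles['DEC'])
    cos_sep = (np.sin(dec)[:, None] * np.sin(tdec)[None, :] +
               np.cos(dec)[:, None] * np.cos(tdec)[None, :] * np.cos(ra[:, None] - tra[None, :]))
    return (cos_sep >= np.cos(np.radians(tile_radius))).any(axis=1)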

how do we generate non-standard tiles / pointing centers?

Specifically in the context of commissioning and survey validation, how should non-standard tiling centers be generated and stored?

The nominal tiles are stored in $DESIMODEL/data/footprint as desi-tiles[.ecsv][.par][.fits] but it's likely these were generated using a combination of IDL code and "by hand" (although perhaps I'm wrong!).

How should we make non-standard pointings?

Incomprehensible help message in install_desimodel_data

Really tiny, but ....

"""
...
is, the directory specified by the environment variable. 2. The value set with
the -d option on the command line. 3. A directory relative to the file
containing this script. This directory is currently <function
default_install_dir at 0x2aaaac07b398>. If the data directory already exists,
this script will not do anything.
...
"""

Trimming target file using desimodel.footprint.find_points_in_tiles() for fiber assignment

@sbailey: this is more of an inquiry than a ticket, so please close it if irrelevant.

Using desimodel.footprint.find_points_in_tiles() to trim the input targets for fiber assignment can lead to unassigned fibers at the edges of tiles in low-density fields. In other words, passing a target file that is not cut exactly to the footprint improves the assignment efficiency at tile edges.

The plots below show the same fiber (3180) that is left unassigned when I pass only the "in-foot" targets returned by desimodel.footprint.find_points_in_tiles(), but gets assigned when I pass a looser region that encompasses the tile. The fiber appears able to reach the target, so should the maximum patrol radius be added to the tile radius so that such targets are considered "in-foot" by the find_points_in_tiles() algorithm, or is there a reason the tile radius is determined the way it is? There were three unassigned "edge" fibers like this in this tile (among POS fibers, so I'm not talking about the ETC fibers). This is from the ALL_SKY assignment, but I imagine it can happen elsewhere as well.

[screenshots: the unassigned edge fiber with and without the looser target cut]
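
If padding the selection is the right answer, a sketch of the idea follows. It assumes find_points_in_tiles() accepts an optional radius argument in degrees, and both numbers below are placeholders rather than official values.

import numpy as np
import desimodel.io
import desimodel.footprint

tiles = desimodel.io.load_tiles(onlydesi=True)
ra = np.array([10.0, 150.0])      # example target coordinates
dec = np.array([2.0, 30.0])

tile_radius_deg = 1.6             # nominal tile radius (placeholder)
patrol_pad_deg = 2.0 / 60.0       # extra margin for positioner patrol reach (placeholder)

# The 'radius' keyword is assumed; check the desimodel.footprint docs for the real API.
points = desimodel.footprint.find_points_in_tiles(
    tiles, ra, dec, radius=tile_radius_deg + patrol_pad_deg)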

Eliminate verbose output from desimodel_data.sh

The script that fetches the extra data associated with desimodel, etc/desimodel_data.sh, is run with #!/bin/bash -x. The extra debugging output may be interpreted by desiInstall as an error message, and prevent normal installation.

Create new test branch with data/throughput/galsim-fiber-acceptance.fits

Update desimodel.trim_data(indir, outdir) to include data/throughput/galsim-fiber-acceptance.fits in the output dataset, and make a new test-0.9.2 branch to use for travis testing. In the meantime travis tests can use the 0.9.2 svn tag, but that is much larger to download and slower to use.

When loading a tile file, check existence before looking in the data folder

Currently loading a tile file in the current working directory fails, because it treats all bare filenames as relative to the package data directory:

>>> import desimodel.io
>>> tiles_data = desimodel.io.load_tiles(tilesfile='tiles_skysub.fits', cache=False)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/kisner/software/desi/desimodel/master/lib/python3.6/site-packages/desimodel-0.9.9.dev464-py3.6.egg/desimodel/io.py", line 180, in load_tiles
    with fits.open(tilesfile, memmap=False) as hdulist:
  File "/home/kisner/software/desibase/lib/python3.6/site-packages/astropy/io/fits/hdu/hdulist.py", line 151, in fitsopen
    lazy_load_hdus, **kwargs)
  File "/home/kisner/software/desibase/lib/python3.6/site-packages/astropy/io/fits/hdu/hdulist.py", line 390, in fromfile
    lazy_load_hdus=lazy_load_hdus, **kwargs)
  File "/home/kisner/software/desibase/lib/python3.6/site-packages/astropy/io/fits/hdu/hdulist.py", line 1039, in _readfrom
    fileobj = _File(fileobj, mode=mode, memmap=memmap, cache=cache)
  File "/home/kisner/software/desibase/lib/python3.6/site-packages/astropy/utils/decorators.py", line 503, in wrapper
    return function(*args, **kwargs)
  File "/home/kisner/software/desibase/lib/python3.6/site-packages/astropy/io/fits/file.py", line 178, in __init__
    self._open_filename(fileobj, mode, overwrite)
  File "/home/kisner/software/desibase/lib/python3.6/site-packages/astropy/io/fits/file.py", line 555, in _open_filename
    self._file = fileobj_open(self.name, IO_FITS_MODES[mode])
  File "/home/kisner/software/desibase/lib/python3.6/site-packages/astropy/io/fits/util.py", line 388, in fileobj_open
    return open(filename, mode, buffering=0)
FileNotFoundError: [Errno 2] No such file or directory: '/home/kisner/software/desi/desimodel/master/data/footprint/tiles_skysub.fits'

Instead it should check if the path exists first. I have already fixed this in a branch. PR coming.
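
A sketch of the intended behavior (not necessarily the actual patch in the branch): check the given path first and only fall back to the package data directory if it does not exist.

import os

def resolve_tilesfile(tilesfile):
    """Prefer an existing path (absolute, or relative to the cwd); otherwise
    treat the bare name as living under $DESIMODEL/data/footprint/."""
    if os.path.isabs(tilesfile) or os.path.exists(tilesfile):
        return tilesfile
    return os.path.join(os.environ['DESIMODEL'], 'data', 'footprint', tilesfile)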

fix code and tests for new desi-tiles.fits

The new desi-tiles.fits file broke some code and tests due to multi-dimensional columns and shuffling which layers are which program. Self-assigning to fix at high priority, while filing this ticket for tracking.

Tiles file does not assign units to RA, Dec.

The desi-tiles.fits file does not assign units to RA, Dec. These units should be degrees. Also, STAR_DENSITY sounds like it should also have units of some kind (arcmin^-2 ?).

Update to latest DESI-doc revisions of data files

The following files from DESI-0347 are currently based on v5 and need to be updated to v10:

  • desi.yaml
  • inputs/throughput/DESI-0347-throughput.txt

There are also files under inputs/throughput/DESI-0334-spectro from DESI-0334 v3, but that is still the latest version in DocDB.

The file focalplane/platescale.txt claims to be based on DESI-0329 v14, according to this header:

# Echo 22 focal plane parameters from Figure 7 of DESI-0329v14

However, in the more recent v15 this file has the header:

# Echo 22 focal plane parameters from Figure 7 of DESI-0329v3.

Yet all other lines look identical, so there is something odd in the provenance.

Please update this issue with any other out of date files you are aware of.

Why do some spectra files need to be in a subdirectory (LyaSNR)?

The data/spectra directory contains a subdirectory, LyaSNR, that contains files which appear to have the identical format to data files in the directory above. There is no possibility of name collision, so why is this directory necessary? This makes describing the data model for these files more complicated.

Document file search procedure

#97 set up a new procedure for identifying the specific tile file to be read on disk. However, the PR did not appear to contain any new documentation describing the procedure in detail. Add this documentation, and describe why this is necessary only for tile files and not other files in $DESIMODEL. Alternatively, apply this procedure to all files accessed via desimodel.io.

desi-tiles.fits updates

Several things to fix about desi-tiles.fits and how we use it:

  • Column PROGRAM ended up with trailing spaces, e.g. 'DARK ' instead of 'DARK'. Either remove from the original file, or add filter to desimodel.io.load_tiles to strip trailing space.
  • Updated desi-tiles.fits includes extra passes with PROGRAM='EXTRA' and IN_DESI=True. These are currently returned by desimodel.io.load_tiles(onlydesi=True), but we probably want to exclude them by default.

I originally thought that the OBSCONDITIONS column was f8 instead of an integer type, but that seems to be OK now.
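
For the first bullet, the load-time filter could be as simple as the following sketch (not the actual change; depending on how the FITS column is read, the dtype may be bytes rather than str).

import numpy as np
import desimodel.io

tiles = desimodel.io.load_tiles(onlydesi=True)

# Strip trailing blanks so 'DARK ' compares equal to 'DARK'.
program = np.char.strip(tiles['PROGRAM'])
is_dark = program == 'DARK'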

Units problems in psf-quicksim.fits

(I know this file is actually in SVN rather than this repo, but thought an issue here might be more useful than a trac ticket).

astropy.io.fits is flagging some units problems when reading desimodel/data/specpsf/psf-quicksim.fits:

WARNING: UnitsWarning: The unit 'Angstrom' has been deprecated in the FITS standard. Suggested: 10**-1 nm. [astropy.units.format.utils]
WARNING: UnitsWarning: 'Angstrom/row' did not parse as fits unit: At col 9, Unit 'row' not supported by the FITS standard.  [astropy.units.core]

These are harmless for now, but should be easy to fix the next time someone updates this file.

Python random module is not consistent from 2 to 3.

This could impact, e.g. randomize_fibers.py.

Python 2:

Python 2.7.12 |Continuum Analytics, Inc.| (default, Jul  2 2016, 17:43:17) 
[GCC 4.2.1 (Based on Apple Inc. build 5658) (LLVM build 2336.11.00)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Please check out: http://continuum.io/thanks and https://anaconda.org
>>> import random
>>> random.seed(2)
>>> random.sample(range(100),20)
[95, 94, 5, 8, 83, 73, 66, 30, 60, 58, 15, 43, 39, 72, 99, 54, 44, 26, 3, 2]
>>> 

Python 3:

Python 3.5.2 |Continuum Analytics, Inc.| (default, Jul  2 2016, 17:52:12) 
[GCC 4.2.1 Compatible Apple LLVM 4.2 (clang-425.0.28)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import random
>>> random.seed(2)
>>> random.sample(range(100),20)
[7, 11, 10, 46, 21, 94, 85, 39, 32, 77, 27, 4, 74, 87, 20, 55, 81, 50, 92, 65]
>>> 

It's possible that numpy.random is consistent, but I haven't checked yet.
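
For what it's worth, numpy's legacy RandomState stream is specified independently of the Python version, so an equivalent based on it should be reproducible across 2 and 3 (still worth verifying before relying on it).

import numpy as np

rng = np.random.RandomState(2)
# Analogue of random.sample(range(100), 20): 20 distinct values from 0-99.
picks = rng.choice(100, size=20, replace=False)
print(picks)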

ReadTheDocs builds time out

After the most recent merge, the automated documentation build on ReadTheDocs failed with a time out. This may be due to the addition of healpy as a dependency. It looks like we will need to separate the requirements into those required to run unit tests versus those required to build the documentation.

Add tests for desimodel.trim

desimodel.trim includes functions for trimming down a full svn desimodel/data directory into a lightweight one for testing. This includes some potentially fragile assumptions about input data formats and dimensionality and thus should be included in the unit tests.

However, as currently written, the trim code can't run on an already-trimmed data directory, so such tests wouldn't work with travis. But they would still be useful to have, even if we can only run them on our laptops, as an automated check that data updates don't break the trimming code (or, more likely, alert us that the trimming code needs to be updated too).

Use skipIf(data/inputs directory is missing) logic to not run these tests on travis.
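
A sketch of that guard; the exact path checked is an assumption about where the untrimmed inputs live.

import os
import unittest

desimodel_inputs = os.path.join(os.getenv('DESIMODEL', ''), 'data', 'inputs')

@unittest.skipIf(not os.path.isdir(desimodel_inputs),
                 'full (untrimmed) desimodel data/inputs not available')
class TestTrim(unittest.TestCase):

    def test_trim(self):
        # Placeholder: run the trim code into a temporary directory and
        # check that the expected trimmed files were produced.
        pass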

Update Release Notes

The release notes (doc/release_notes.rst) did not get updated for tag 0.4.3. It's too late to get this into the actual tag, but it would be nice to document the changes for the future.

Update code to define and apply the footprint

We are using some new code based on healpix for area calculations and inside/outside tests to generate the end-to-end mocks. The code is currently in a jupyter notebook, so this issue is to clean it up, add docs, and put it into this package.
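
As a very rough sketch of the healpix approach (this is not the notebook code; the resolution and tile radius are placeholder choices):

import numpy as np
import healpy as hp
import desimodel.io

nside = 256                      # placeholder resolution
tile_radius = np.radians(1.6)    # placeholder tile radius

tiles = desimodel.io.load_tiles(onlydesi=True)
pixels = set()
for ra, dec in zip(tiles['RA'], tiles['DEC']):
    vec = hp.ang2vec(ra, dec, lonlat=True)
    pixels.update(hp.query_disc(nside, vec, tile_radius, inclusive=True))

# Footprint area and an inside/outside test both fall out of the pixel set.
area = len(pixels) * hp.nside2pixarea(nside, degrees=True)
print('approximate footprint area: {0:.1f} deg^2'.format(area))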

Update solar spectrum?

Carlos Allende Prieto has suggested that we adopt the HST CalSpec solar spectrum:

http://www.stsci.edu/instruments/observatory/cdbs/calspec.html

to replace the current data/sky/solarspec.txt. This came up in the context of the specsim scattered moon and twilight sky emission models. Quoting from Carlos:

The solar spectrum looks good, but I have compared it to the one adopted for reference in the
HST calspec collection and the latter shows less wiggles in the near infrared side (at ~ 700 nm).

I have not had a chance to look into this yet (and it does not seem urgent) but chime in if you have opinions or are interested in following up on this.

Add option to override tiles filename

The function desimodel.io.load_tiles() provides a helpful single entry point for loading tile centers and ensures consistency across packages. However, there are use cases (e.g., small regression tests) where we would like to define a different small subset of tiles that are used consistently by all code using this entry point. This issue is to support this use case by adding an additional optional arg to load_tiles:

def load_tiles(onlydesi=True, extra=False, tilespath='footprint/desi-tiles.fits'):
   """Return DESI tiles structure.
   ...
   tilespath : str
      Name of the tile definition file to read.  A relative path refers to the data directory of the
      currently installed desimodel package, not the current working directory.  An absolute
      path can also be specified.
   """

Fix desi_quicklya.py

This short code in desimodel/bin/desi_quicklya.py is used to generate Lya SNR files that are then used in Fisher forecasts.

I added this code two years ago, heavily based on quicksim.py by David Kirkby. However, it doesn't work now, since it is trying to load an old module (desimodel.simulate), and I believe the original quicksim.py code no longer exists.

It is not a priority, but it would be nice to fix at some point.

Reading the README.rst file throws encoding errors on some systems.

I'm trying to test installs of desimodel on the new DESI+Anaconda infrastructure.

I'm using the desi-conda/3.5-20160913 module on both edison and datatran. I'm also using the current desi-conda branch of desiutil. On datatran, everything installs fine, but on edison, when the setup.py file tries to read the README.rst file, it complains:

Error during installation: Traceback (most recent call last):
  File "setup.py", line 38, in <module>
    setup_keywords['long_description'] = readme.read()
  File "/global/common/edison/contrib/desi/conda/conda_3.5-20160913/lib/python3.5/encodings/ascii.py", line 26, in decode
    return codecs.ascii_decode(input, self.errors)[0]
UnicodeDecodeError: 'ascii' codec can't decode byte 0xce in position 3401: ordinal not in range(128)

In fact, if you look closely at the README.rst file for desimodel, it is not ASCII, it contains µ characters.

So which is more mysterious: that the install succeeded on datatran or that it failed on edison?
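
One conventional fix (a sketch; the real setup.py may differ) is to open the file with an explicit encoding instead of relying on the locale default, which apparently differs between edison and datatran.

import io

# Explicit UTF-8 avoids locale-dependent ASCII decoding of the µ characters.
with io.open('README.rst', encoding='utf-8') as readme:
    long_description = readme.read()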

Error near the end of tests

When I run setup.py test on my own computer (Python 2.7) I see this error at the end:

....
test_load_tiles (desimodel.test.test_io.TestIO)
Test loading of tile files. ... ok
test_spatial_real_tiles (desimodel.test.test_io.TestIO) ... [701   3   3   3   1   5   0   8  85  97  69  15   9   1]
ok
test_tiles_consistency (desimodel.test.test_io.TestIO)
Test consistency of tile files. ... ok

----------------------------------------------------------------------
Ran 21 tests in 0.992s

OK (skipped=2)
Exception TypeError: "'NoneType' object is not callable" in <object repr() failed> ignored

desimodel has a hidden dependency on xlrd

desimodel.inputs.docdb uses the package xlrd. It's OK that the dependency is optional, but it needs to be documented, and declared in .travis.yml so that tests can be run on desimodel.inputs.docdb.

Is xlrd available as an Anaconda package?

targets.dat updates

Some updates for targets.dat:

  • Review numbers in there and update as needed, including DocDB references
  • Consider renaming to targets.yaml since it is a yaml file
  • Add a desitarget.io.read_target_info I/O wrapper so that people aren't identifying the location and reading that file by hand (see the sketch below).
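
A sketch of what that wrapper might look like; the function does not exist yet, and the path under $DESIMODEL is an assumption.

import os
import yaml

def read_target_info(filename=None):
    """Proposed wrapper: return the contents of targets.dat (a yaml file) as a dict."""
    if filename is None:
        # Assumed location of targets.dat within the desimodel data tree.
        filename = os.path.join(os.environ['DESIMODEL'], 'data', 'targets', 'targets.dat')
    with open(filename) as fx:
        return yaml.safe_load(fx)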

spectra/README file

The data/spectra/README file contains mostly code. Is the code actually checked in anywhere? In this repository, for example? If not, it should be checked in somewhere, and any comments in the file included as documentation of the code.

Host authoritative specsim default configuration file

The default DESI configuration used by specsim is currently specsim/data/config/desi.yaml. This issue is to host the authoritative version of this file in this package, and have the desisim quickgen and quickbrick scripts use this authoritative version by default.

In order to make this file authoritative, we also need to decide how to handle the 14 constants listed here:

http://specsim.readthedocs.org/en/stable/config.html#desi-configuration

These values are given in the specsim config file but are also under change control in DESI-doc-347.

An obvious question is why specsim doesn't just read desi.yaml from desimodel. The answer is that this file is incomplete (it does not have machine readable units and does not specify the additional files needed for simulation with authoritative throughputs, etc) and is not generalizable to other instruments. Specsim is designed to simulate any fiber spectrograph so needs a flexible, complete, and self-contained configuration mechanism.

A strawman solution is to provide a function get_specsim_config() in this package that returns a YAML file in the format that specsim expects and also validates that it is consistent with all quantities under change control specified by desi.yaml. The quickgen and quickbrick scripts would then use this function to retrieve their default specsim config. Comments?
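
To make the strawman a bit more concrete, a very rough sketch of what get_specsim_config() could do; every name and key below is illustrative only.

import os
import yaml

def get_specsim_config(template_path):
    """Strawman: load a specsim-format config template, then check that the
    change-controlled constants agree with desimodel's desi.yaml."""
    with open(os.path.join(os.environ['DESIMODEL'], 'data', 'desi.yaml')) as fx:
        desi = yaml.safe_load(fx)
    with open(template_path) as fx:
        config = yaml.safe_load(fx)

    # Illustrative cross-check; the real list would be the 14 constants under change control.
    for key in ('exptime',):
        if key in desi and key in config and config[key] != desi[key]:
            raise ValueError('specsim config {0} disagrees with desi.yaml'.format(key))

    return config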
