gregreen / dustmaps

A uniform interface for a number of 2D and 3D maps of interstellar dust reddening/extinction.

License: GNU General Public License v2.0

Python 98.59% TeX 1.41%

dustmaps's Introduction


dustmaps

The dustmaps package provides a uniform interface for dealing with a number of 2D and 3D maps of interstellar dust reddening/extinction.

Supported Dust Maps

The currently supported dust maps are:

  1. Burstein & Heiles (1982; BH'82)
  2. Chen et al. (2014)
  3. Green, Schlafly, Finkbeiner et al. (2015, 2018, 2019; Bayestar)
  4. Marshall et al. (2006)
  5. Planck Collaboration (2013)
  6. Planck Collaboration (2016; GNILC)
  7. Sale et al. (2014; IPHAS)
  8. Schlegel, Finkbeiner & Davis (1998; SFD'98)
  9. Lenz, Hensley & Doré (2017)
  10. Peek & Graves (2010)
  11. Leike & Enßlin (2019)
  12. Leike, Glatzle & Enßlin (2020)
  13. Edenhofer et al. (2023)
  14. Chiang (2023; CSFD)

To request addition of another dust map in this package, file an issue on GitHub, or submit a pull request.

Installation

Download the repository from GitHub and then run:

python setup.py install --large-data-dir=/path/where/you/want/large/data/files/stored

Alternatively, you can use the Python package manager pip:

pip install dustmaps

Getting the Data

To fetch the data for the SFD dust map, run:

python setup.py fetch --map-name=sfd

You can download the other dust maps by changing "sfd" to "csfd", "planck", "planckGNILC", "bayestar", "iphas", "marshall", "chen2014", "lenz2017", "pg2010", "leikeensslin2019", "leike2020", "edenhofer2023" or "bh".

Alternatively, if you have used pip to install dustmaps, you can configure the data directory and download the data by opening a Python interpreter and running:

>>> from dustmaps.config import config
>>> config['data_dir'] = '/path/where/you/want/large/data/files/stored'
>>>
>>> import dustmaps.sfd
>>> dustmaps.sfd.fetch()
>>>
>>> import dustmaps.csfd
>>> dustmaps.csfd.fetch()
>>>
>>> import dustmaps.planck
>>> dustmaps.planck.fetch()
>>>
>>> import dustmaps.planck
>>> dustmaps.planck.fetch(which='GNILC')
>>>
>>> import dustmaps.bayestar
>>> dustmaps.bayestar.fetch()
>>>
>>> import dustmaps.iphas
>>> dustmaps.iphas.fetch()
>>>
>>> import dustmaps.marshall
>>> dustmaps.marshall.fetch()
>>>
>>> import dustmaps.chen2014
>>> dustmaps.chen2014.fetch()
>>>
>>> import dustmaps.lenz2017
>>> dustmaps.lenz2017.fetch()
>>>
>>> import dustmaps.pg2010
>>> dustmaps.pg2010.fetch()
>>>
>>> import dustmaps.leike_ensslin_2019
>>> dustmaps.leike_ensslin_2019.fetch()
>>>
>>> import dustmaps.leike2020
>>> dustmaps.leike2020.fetch()
>>>
>>> import dustmaps.edenhofer2023
>>> dustmaps.edenhofer2023.fetch()

Querying the Maps

Maps are queried using astropy.coordinates.SkyCoord objects. This means that any coordinate system supported by astropy can be used as input. For example, we can query SFD'98 as follows:

>>> from dustmaps.sfd import SFDQuery
>>> from astropy.coordinates import SkyCoord
>>>
>>> sfd = SFDQuery()
>>>
>>> c = SkyCoord(
        '05h00m00.00000s',
        '+30d00m00.0000s',
        frame='icrs')
>>> print(sfd(c))
0.483961

Above, we have used the ICRS coordinate system (the inputs are RA and Dec). We can use other coordinate systems, such as Galactic coordinates, and we can provide coordinate arrays. The following example uses both features:

>>> c = SkyCoord(
        [75.00000000, 130.00000000],
        [-89.00000000, 10.00000000],
        frame='galactic',
        unit='deg')
>>> print(sfd(c))
[ 0.0146584   0.97695869]

Documentation

Read the full documentation at http://dustmaps.readthedocs.io/en/latest/.

Citation

If you make use of this software in a publication, please cite Green (2018) in The Journal of Open Source Software:

@ARTICLE{2018JOSS....3..695G,
       author = {{Green}, {Gregory M.}},
        title = "{dustmaps: A Python interface for maps of interstellar dust}",
      journal = {The Journal of Open Source Software},
         year = "2018",
        month = "Jun",
       volume = {3},
       number = {26},
        pages = {695},
          doi = {10.21105/joss.00695},
       adsurl = {https://ui.adsabs.harvard.edu/abs/2018JOSS....3..695G},
      adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}

Development

Development of dustmaps takes place on GitHub, at https://github.com/gregreen/dustmaps. Any bugs, feature requests, pull requests, or other issues can be filed there. Contributions to the software are welcome.

dustmaps's People

Contributors

arfon, conornally, daniellenz, edenhofer, gregreen, hombit, simonkrughoff


dustmaps's Issues

getting marshall map

This happens whether the map is already installed or not.

import dustmaps.marshall
dustmaps.marshall.fetch()
Checking existing file to see if MD5 sum matches ...
File exists. Not overwriting.
Checking existing file to see if MD5 sum matches ...
File exists. Not overwriting.
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/opt/rh/rh-python35/root/usr/lib/python3.5/site-packages/dustmaps/marshall.py", line 375, in fetch
dat2hdf5(table_dir)
File "/opt/rh/rh-python35/root/usr/lib/python3.5/site-packages/dustmaps/marshall.py", line 237, in dat2hdf5
f.write(txt)
TypeError: write() argument must be str, not bytes
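A likely fix, sketched below under the assumption that the downloaded table arrives as bytes under Python 3 (the variable `txt` and the filename are stand-ins, not the package's actual code): either decode the bytes before writing to a text-mode handle, or open the file in binary mode.

```python
# Sketch, not the package's actual code: `txt` stands in for the bytes
# read from the downloaded Marshall et al. table.
txt = b"l b A0 sigma_A0\n"

# Option 1: decode the bytes for a text-mode file handle
with open('marshall_table.dat', 'w') as f:
    f.write(txt.decode('utf-8'))

# Option 2: keep the bytes and open the file in binary mode
with open('marshall_table.dat', 'wb') as f:
    f.write(txt)
```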

Results (uncertainties of reddening) from Web query form different from API query?

Hi,
When I use the web query form (at http://argonaut.skymaps.info/query), it gives an uncertainty on the extinction of ±0.02 for distances less than 1 kpc (take the equatorial coordinate (19.486306, 40.59179), for example: E(B-V) = 0.02 ± 0.02 at 0.31 kpc).
And when I use the API to query and specify

bayestar = BayestarWebQuery(version='bayestar2017')
reddening = bayestar(coords, mode='percentile', pct=[16., 50., 84.])

it gives E(B-V) = [0.012, 0.017, 0.019].
So from that, it seems the uncertainty in E(B-V) at that coordinate is -0.005/+0.002?

My question is: do the 16th and 84th percentiles give something equivalent to the uncertainty in E(B-V) from the web query form? If not, which argument do I call to obtain the uncertainty in reddening at a given coordinate using the API? I tried to find it in the documentation, but maybe I missed it?
Thank you very much for your time.
Best
Cicero
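For reference, the asymmetric uncertainty implied by those percentiles can be computed directly; a minimal sketch using only the values quoted above (plain numpy, not the package's API):

```python
import numpy as np

# Percentile output quoted above, from pct=[16., 50., 84.]
reddening = np.array([0.012, 0.017, 0.019])
p16, p50, p84 = reddening

err_lower = p50 - p16  # downward uncertainty (median minus 16th percentile)
err_upper = p84 - p50  # upward uncertainty (84th percentile minus median)
print(f"E(B-V) = {p50:.3f} -{err_lower:.3f}/+{err_upper:.3f}")
```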

Individual errors on reddening values

Hi,

I'm putting this here because I don't know where else to ask.

I'm using the Bayestar19 dustmap to retrieve reddening values. On the web tool http://argonaut.skymaps.info/ you can easily retrieve the reddening at a given coordinate WITH its error. But when using the dustmaps package, I cannot find anywhere in the documentation how to get the errors associated with the reddening at a given coordinate.

I assume this information must exist, please advise.

Cheers, James

Holes in Bayestar

Hi,

I was wondering if there are any descriptions of the holes in Bayestar?

After filtering out data above -30 dec, I was encountering NaNs in my code and eventually narrowed it down to the map itself. One coordinate above -30 dec was still returning NaNs from the Bayestar query. I looked further into this, and indeed Bayestar has one tiny streak of incomplete data above -30 dec (the yellow):


(y-axis is declination, x-axis is right ascension (ICRS), and the yellow shows queries which return NaNs)

I have also found another smaller hole higher up in declination. Therefore I was wondering if there are any descriptions of holes in Bayestar that I can use to filter my data?

Thanks!
Miles
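A common workaround, sketched here with hypothetical query output (not actual map values), is to mask out non-finite results before further processing:

```python
import numpy as np

# Hypothetical reddening values returned by a Bayestar query;
# NaN marks coordinates that fall in a hole of the map.
reddening = np.array([0.12, np.nan, 0.34, np.nan, 0.05])

mask = np.isfinite(reddening)  # True where the map has data
clean = reddening[mask]        # keep only coordinates with valid data
print(mask, clean)
```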

Radius of regions?

Hi,
Thanks for providing such a great package!

When we use PlanckQuery or SFDQuery, we give RA and Dec as inputs to obtain E(B-V) values, but I have a question.

What is the size of my region? Is it a circle or a square? What is its area? Imagine the situation for a single RA and Dec, not an array.

When we compare values from the Planck map and SFD'98, are they over the same area?

It would be great if we could assign a radius around each position, but I didn't find any proper documentation...

Thanks Gregory.

Error when downloading bayestar catalog

Hi!
I tried to download the bayestar catalog using dustmaps.bayestar.fetch() and I am getting the following error:

<ipython-input-4-42bcd98cf0e6> in <module>()
----> 1 dustmaps.bayestar.fetch()

~/anaconda3/lib/python3.6/site-packages/dustmaps/bayestar.py in fetch(version)
    596         doi,
    597         local_fname,
--> 598         file_requirements=requirements)
    599 
    600 

~/anaconda3/lib/python3.6/site-packages/dustmaps/fetch_utils.py in dataverse_download_doi(doi, local_fname, file_requirements, clobber)
    393 
    394             dataverse_download_id(file_id, md5sum,
--> 395                                   fname=local_fname, clobber=False)
    396 
    397             return

~/anaconda3/lib/python3.6/site-packages/dustmaps/fetch_utils.py in dataverse_download_id(file_id, md5sum, **kwargs)
    338 def dataverse_download_id(file_id, md5sum, **kwargs):
    339     url = '{}/api/access/datafile/{}'.format(dataverse, file_id)
--> 340     download_and_verify(url, md5sum, **kwargs)
    341 
    342 

~/anaconda3/lib/python3.6/site-packages/dustmaps/fetch_utils.py in download_and_verify(url, md5sum, fname, chunk_size, clobber, verbose)
    237                 bar = FileTransferProgressBar(content_length)
    238 
--> 239                 for k,chunk in enumerate(r.iter_content(chunk_size=chunk_size)):
    240                     f.write(chunk)
    241                     sig.update(chunk)

~/anaconda3/lib/python3.6/site-packages/requests/models.py in generate()
    743             if hasattr(self.raw, 'stream'):
    744                 try:
--> 745                     for chunk in self.raw.stream(chunk_size, decode_content=True):
    746                         yield chunk
    747                 except ProtocolError as e:

~/anaconda3/lib/python3.6/site-packages/urllib3/response.py in stream(self, amt, decode_content)
    434         else:
    435             while not is_fp_closed(self._fp):
--> 436                 data = self.read(amt=amt, decode_content=decode_content)
    437 
    438                 if data:

~/anaconda3/lib/python3.6/site-packages/urllib3/response.py in read(self, amt, decode_content, cache_content)
    382             else:
    383                 cache_content = False
--> 384                 data = self._fp.read(amt)
    385                 if amt != 0 and not data:  # Platform-specific: Buggy versions of Python.
    386                     # Close the connection when no data is returned

~/anaconda3/lib/python3.6/http/client.py in read(self, amt)
    447             # Amount is given, implement using readinto
    448             b = bytearray(amt)
--> 449             n = self.readinto(b)
    450             return memoryview(b)[:n].tobytes()
    451         else:

~/anaconda3/lib/python3.6/http/client.py in readinto(self, b)
    491         # connection, and the user is reading more bytes than will be provided
    492         # (for example, reading in 1k chunks)
--> 493         n = self.fp.readinto(b)
    494         if not n and b:
    495             # Ideally, we would raise IncompleteRead if the content-length

~/anaconda3/lib/python3.6/socket.py in readinto(self, b)
    584         while True:
    585             try:
--> 586                 return self._sock.recv_into(b)
    587             except timeout:
    588                 self._timeout_occurred = True

~/anaconda3/lib/python3.6/site-packages/urllib3/contrib/pyopenssl.py in recv_into(self, *args, **kwargs)
    278     def recv_into(self, *args, **kwargs):
    279         try:
--> 280             return self.connection.recv_into(*args, **kwargs)
    281         except OpenSSL.SSL.SysCallError as e:
    282             if self.suppress_ragged_eofs and e.args == (-1, 'Unexpected EOF'):

~/anaconda3/lib/python3.6/site-packages/OpenSSL/SSL.py in recv_into(self, buffer, nbytes, flags)
   1713         else:
   1714             result = _lib.SSL_read(self._ssl, buf, nbytes)
-> 1715         self._raise_ssl_error(self._ssl, result)
   1716 
   1717         # This strange line is all to avoid a memory copy. The buffer protocol

~/anaconda3/lib/python3.6/site-packages/OpenSSL/SSL.py in _raise_ssl_error(self, ssl, result)
   1544             pass
   1545         else:
-> 1546             _raise_current_error()
   1547 
   1548     def get_context(self):

~/anaconda3/lib/python3.6/site-packages/OpenSSL/_util.py in exception_from_error_queue(exception_type)
     52             text(lib.ERR_reason_error_string(error))))
     53 
---> 54     raise exception_type(errors)
     55 
     56 

Error: [('SSL routines', 'SSL3_GET_RECORD', 'decryption failed or bad record mac')]

This is not urgent. I successfully downloaded the map using python setup.py fetch --map-name=bayestar

Gradients

Hi @gregreen, great project - it is very helpful.

I need to read more, but I was wondering if it is possible to compute the gradient of dust with respect to position with this package (using Bayestar 2017 map)? I could estimate it numerically but I was wondering if there is a more clever way using the actual model used.

Thanks,
Miles

Cannot install on Windows

Installation fails due to the dependency of healpy on cfitsio. There's an issue on the healpy github about this that has been open since 2012, so a fix seems unlikely to be imminent.

If astropy_healpix gives you the functionality you need, using that instead could circumvent this issue.

UnicodeDecodeError during installation

During installation (both 'pip' or 'python setup.py'), I faced the error:

Traceback (most recent call last):
File "setup.py", line 135, in <module>
long_description=readme(),
File "setup.py", line 128, in readme
return f.read()
[...]
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 714: ordinal not in range(128)

The setup.py can't read the README.md file. Replacing in the latter file 'Doré' with 'Dore' solved the problem.
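The underlying issue is that readme() opens README.md with the platform's default (ASCII) codec. A sketch of a more robust version, assuming the function simply returns the file's contents:

```python
import io

# Sketch: read README.md explicitly as UTF-8 so that non-ASCII characters
# such as "Doré" no longer break installation on ASCII-default platforms.
def readme(fname='README.md'):
    with io.open(fname, encoding='utf-8') as f:
        return f.read()
```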

setup.py

On the develop branch, trying python setup.py install throws this error; I think the line needs a : instead of a ,

File "setup.py", line 100
'chen2014', fetch_chen2014}
^

WebQuery not working anymore (June 2024)

I'm building a Python package which includes BayestarWebQuery to compute extinction based on dustmaps.
A few days ago it stopped working; I suspect a failure in the web service.
The code I'm using is:

from astropy.coordinates import SkyCoord
import astropy.units as u
from astropy.table import Table
from dustmaps.bayestar import BayestarWebQuery
import numpy as np

def extinction(ra, dec, d):
    '''
    Function to compute extinction following the 
    3D model of Bayestar 2019

    Reference: dustmaps package (Green 2018. DOI: 10.21105/joss.00695)

    Arguments:
    ra, dec [deg]   Coordinates J2000
    d       [kpc]   Distance
    '''
    bayestar = BayestarWebQuery(version='bayestar2019')
    coords = SkyCoord(ra, dec, distance=d, frame='icrs')
    extinction = bayestar(coords, mode='median')
    return extinction

It was working perfectly until a few days ago.
Thank you!

1.0.7 release with most recent changes

@gregreen there was a little discussion of this on a PR, but I'm moving to an issue for better visibility. I'm requesting a v1.0.7 release which will contain the most recent merged PR. This change set is needed for me to make this package a requirement of our packages. Let me know if there is anything I can do to help that along.

Thanks very much for maintaining this very useful package!

Internal Server Error when using BayestarWebQuery

When I run the example for how to use BayestarWebQuery from http://argonaut.skymaps.info/usage, it crashes with 500 Internal Server Error.

Here is what I did:

$ python3 -m venv myvenv
$ source myvenv/bin/activate
(myvenv) $ pip install dustmaps
Collecting dustmaps
Using cached dustmaps-1.0.10-py3-none-any.whl (450 kB)
...
Successfully installed PyYAML-6.0 astropy-5.2.2 certifi-2022.12.7 charset-normalizer-3.1.0 contourpy-1.0.7 cycler-0.11.0 dustmaps-1.0.10 fonttools-4.39.3 h5py-3.8.0 healpy-1.16.2 idna-3.4 kiwisolver-1.4.4 matplotlib-3.7.1 numpy-1.24.2 packaging-23.0 pillow-9.5.0 progressbar2-4.2.0 pyerfa-2.0.0.3 pyparsing-3.0.9 python-dateutil-2.8.2 python-utils-3.5.2 requests-2.28.2 scipy-1.10.1 six-1.16.0 urllib3-1.26.15
(myvenv) $ python3 dustmapsexample.py
Response received from server:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<title>500 Internal Server Error</title>
<h1>Internal Server Error</h1>
<p>The server encountered an internal error and was unable to complete your request.  Either the server is overloaded or there is an error in the application.</p>

Traceback (most recent call last):
File "dustmapsexample.py", line 9, in <module>
    reddening = bayestar(coords, mode='random_sample')
  File "myvenv/lib/python3.10/site-packages/dustmaps/map_base.py", line 112, in _wrapper_func
    return f(self, coords, **kwargs)
  File "myvenv/lib/python3.10/site-packages/dustmaps/map_base.py", line 512, in __call__
    return self.query(coords, **kwargs)
  File "myvenv/lib/python3.10/site-packages/dustmaps/map_base.py", line 112, in _wrapper_func
    return f(self, coords, **kwargs)
  File "myvenv/lib/python3.10/site-packages/dustmaps/map_base.py", line 359, in api_wrapper
    raise err
  File "myvenv/lib/python3.10/site-packages/dustmaps/map_base.py", line 355, in api_wrapper
    r.raise_for_status()
  File "myvenv/lib/python3.10/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 500 Server Error: INTERNAL SERVER ERROR for url: http://argonaut.skymaps.info/api/v2/bayestar2017/query
(myvenv) $ cat dustmapsexample.py
from astropy.coordinates import SkyCoord
import astropy.units as units
from dustmaps.bayestar import BayestarWebQuery

bayestar = BayestarWebQuery(version='bayestar2017')
coords = SkyCoord(90.*units.deg, 30.*units.deg,
                  distance=100.*units.pc, frame='galactic')

reddening = bayestar(coords, mode='random_sample')
print(reddening)

Getting an (Nx120) matrix rather than an (Nx1) matrix

The Jupyter notebook cellblock is:

bayestar = BayestarWebQuery(version='bayestar2019')
c = SkyCoord(ra=RAlist, dec=DEClist,
             distance=Distances*u.pc, frame='icrs')
c2 = SkyCoord(c.galactic.l, c.galactic.b,
              distance=Distances*u.pc, frame='galactic')
reddening = bayestar(c2, mode='percentile', pct=50.)
print(type(RAlist),type(DEClist),type(Distances))
print(np.shape(RAlist),np.shape(DEClist),np.shape(Distances))
print(np.shape(c2),np.shape(reddening))
print(np.mean(reddening),np.median(reddening),np.std(reddening))

The output is
<class 'astropy.table.column.Column'> <class 'astropy.table.column.Column'> <class 'numpy.ndarray'>
(1805,) (1805,) (1805,)
(1805,) (1805,)
nan nan nan

Also, for information's sake:
print(RAlist[0:9], DEClist[0:9], Distances[0:9])
yields:

ra (deg): [85.71952670160155, 331.5950430106236, 353.9369156021425,
 9.97346964628328, 93.41934818819222, 210.71561378901362,
 87.73003127400648, 10.016275437083367, 13.227971468286738]
dec (deg): [3.565786366255105, -48.483665728252454, 32.16632093528816,
 72.07734177175224, -35.718843946170665, -2.375407984123281,
 10.609960476038683, -13.884890712331707, -43.910909132674455]
Distances: [333.30949824, 333.27503203, 333.21547772, 333.20185198,
 333.16732067, 333.11270716, 333.03118245, 332.99673381, 332.96857719]

Signal-to-noise

Hi @gregreen,

I was wondering if there was a way to get signal-to-noise for a particular dust measurement?

My current strategy is to use random_sample to get a standard deviation, and divide the mean dust by that value. I then multiply this by flags['converged'] * flags['reliable_dist'] to return a signal-to-noise for the dust measurement.

What do you think?
Cheers,
Miles
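The strategy described above can be sketched as follows; `samples` and `flags` are hypothetical stand-ins for the output of a Bayestar query (reddening samples plus the flags returned with return_flags=True), not actual query results:

```python
import numpy as np

# Hypothetical stand-ins for query output: one row of reddening samples
# per coordinate, plus the convergence/reliability flags.
samples = np.array([[0.10, 0.12, 0.11, 0.13],
                    [0.50, 0.20, 0.90, 0.40]])
flags = {'converged':     np.array([True, True]),
         'reliable_dist': np.array([True, False])}

# Signal-to-noise: mean over the samples divided by their standard
# deviation, zeroed out wherever the fit did not converge or the
# distance is unreliable.
snr = samples.mean(axis=1) / samples.std(axis=1)
snr *= flags['converged'] * flags['reliable_dist']
print(snr)
```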

Progress Bar Clash

When I install dustmaps and try to run

python setup.py fetch --map-name=leikeensslin2019

There is an import error. I currently have progressbar installed, but dustmaps requires progressbar2. These both get imported with the same name, so when I try to run the fetch script I get an import error:

from progressbar.widgets import DataSize, AdaptiveTransferSpeed, Bar,...
ImportError: cannot import name 'DataSize'

DataSize is not in progressbar, only progressbar2. This clash can currently be solved by installing progressbar2. It might be worth adding progressbar2 to the dustmaps requirements?

Trying to fetch a dust map raises a JSONDecodeError

I am trying to download one of the dust maps and encounter the following error, regardless of whether I try to download the map via pip [ eg. dustmaps.bayestar.fetch() ] or manually via setup.py [ eg. python setup.py fetch --map-name=bayestar ]:

File "/anaconda3/lib/python3.11/site-packages/dustmaps/bayestar.py", line 628, in fetch
    fetch_utils.dataverse_download_doi(
  File "/anaconda3/lib/python3.11/site-packages/dustmaps/fetch_utils.py", line 400, in dataverse_download_doi
    metadata = dataverse_search_doi(doi)
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/anaconda3/lib/python3.11/site-packages/dustmaps/fetch_utils.py", line 367, in dataverse_search_doi
    return json.loads(r.text)
           ^^^^^^^^^^^^^^^^^^
  File "/anaconda3/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/anaconda3/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/anaconda3/lib/python3.11/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

This error is actually raised regardless of which dust map I try to download. Any help in resolving this would be much appreciated, since I cannot currently access any of the dust maps from this package.

Gaia G,BP,BP A_X values

How do I convert Bayestar17 A values to Gaia A_G, A_BP, and A_RP? I couldn't find the conversions in Schlafly & Finkbeiner (2011). Thanks!!

return_flags keyword arg TypeError

Hi, I am a Python 3 user. Upon calling bayestar(coords, return_flags=True), I got the following error:

<ipython-input-34-205de48190a6> in <module>()
     11 
     12 # ebv_sample()
---> 13 bayestar(coords,return_flags= True)

/anaconda/lib/python3.6/site-packages/dustmaps/map_base.py in _wrapper_func(self, coords, **kwargs)
    108         if not isinstance(coords, coordinates.SkyCoord):
    109             raise TypeError('`coords` must be an astropy.coordinates.SkyCoord object.')
--> 110         return f(self, coords, **kwargs)
    111     return _wrapper_func
    112 

/anaconda/lib/python3.6/site-packages/dustmaps/map_base.py in __call__(self, coords, **kwargs)
    433         An alias for ``WebDustMap.query``.
    434         """
--> 435         return self.query(coords, **kwargs)
    436 
    437     @ensure_coord_type

/anaconda/lib/python3.6/site-packages/dustmaps/map_base.py in _wrapper_func(self, coords, **kwargs)
    108         if not isinstance(coords, coordinates.SkyCoord):
    109             raise TypeError('`coords` must be an astropy.coordinates.SkyCoord object.')
--> 110         return f(self, coords, **kwargs)
    111     return _wrapper_func
    112 

/anaconda/lib/python3.6/site-packages/dustmaps/map_base.py in api_wrapper(self, *args, **kwargs)
    283 
    284             # Deserialize the response
--> 285             return json.loads(r.text, cls=decoder)
    286         return api_wrapper
    287     return decorator

/anaconda/lib/python3.6/json/__init__.py in loads(s, encoding, cls, object_hook, parse_float, parse_int, parse_constant, object_pairs_hook, **kw)
    365     if parse_constant is not None:
    366         kw['parse_constant'] = parse_constant
--> 367     return cls(**kw).decode(s)

/anaconda/lib/python3.6/json/decoder.py in decode(self, s, _w)
    337 
    338         """
--> 339         obj, end = self.raw_decode(s, idx=_w(s, 0).end())
    340         end = _w(s, end).end()
    341         if end != len(s):

/anaconda/lib/python3.6/json/decoder.py in raw_decode(self, s, idx)
    353         """
    354         try:
--> 355             obj, end = self.scan_once(s, idx)
    356         except StopIteration as err:
    357             raise JSONDecodeError("Expecting value", s, err.value) from None

/anaconda/lib/python3.6/site-packages/dustmaps/json_serializers.py in object_hook(self, d)
    404                     return deserialize_ndarray(d)
    405                 elif d['_type'] == 'np.dtype':
--> 406                     return deserialize_dtype(d)
    407                 elif d['_type'] == 'tuple':
    408                     return deserialize_tuple(d)

/anaconda/lib/python3.6/site-packages/dustmaps/json_serializers.py in deserialize_dtype(d)
     98                 col_descr.append(tuple(c))
     99         descr.append(tuple(col_descr))
--> 100     return np.dtype(descr)
    101 
    102 

TypeError: data type not understood

Thank you.

planck map implementation broken

Both Python 2.7.x and 3.x do not work. Python 3.x additionally needs a brush-up of the string handling.

[ planck]# python
Python 2.7.5 (default, Nov  6 2016, 00:28:07)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-11)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from __future__ import print_function
>>> from astropy.coordinates import SkyCoord
>>> from dustmaps.planck import PlanckQuery
>>>
>>> coords = SkyCoord('12h30m25.3s', '15d15m58.1s', frame='icrs')
>>> planck = PlanckQuery()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/site-packages/dustmaps/planck.py", line 94, in __init__
    field=field)
  File "/usr/lib/python2.7/site-packages/dustmaps/healpix_map.py", line 108, in __init__
    if close_file:
UnboundLocalError: local variable 'close_file' referenced before assignment
>>> ebv = planck(coords)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'planck' is not defined

Astropy: deprecated properties

When serializing an astropy SkyCoord, recent versions of astropy warn that SkyCoord.representation is deprecated:

WARNING: AstropyDeprecationWarning: The `representation` keyword/property name is deprecated
in favor of `representation_type` [astropy.coordinates.baseframe]

This is due to the line

representation = o.representation.get_name()

in serialize_skycoord, in json_serializers.py.

How to deal with this? Check for representation_type first, and then fall back to representation if needed? The JSON decoder would probably have to be updated as well.
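One possible shim along the lines suggested above (a sketch, not the package's actual code; `frame` and `get_representation_name` are stand-ins for the object and logic handled in serialize_skycoord):

```python
# Sketch of a compatibility shim: prefer the new `representation_type`
# attribute, falling back to the deprecated `representation` on older
# astropy versions. `frame` is a stand-in for the object that
# serialize_skycoord handles.
def get_representation_name(frame):
    rep = getattr(frame, 'representation_type', None)
    if rep is None:
        rep = frame.representation  # pre-3.0 astropy
    return rep.get_name()
```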

NaNs in Bayestar reliable distance flags

The presence of NaNs in the reliable distance flags can cause numpy to raise a RuntimeWarning (invalid value encountered in less_equal). This is due to the comparisons here.

The meaning of NaN in either DM_reliable_min or DM_reliable_max is that there were not enough main-sequence stars in the given pixel to come up with an estimate of the minimum/maximum reliable distance modulus. The correct behavior in this case is to label all distances in this range unreliable. One way to achieve this, while avoiding the numpy warning, is to change the NaNs in DM_reliable_min to +infinity, and in DM_reliable_max to -infinity.

Thanks to @paegert for pointing this bug out to me.
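The proposed substitution can be sketched as follows, with hypothetical flag arrays standing in for the map's actual data:

```python
import numpy as np

# Hypothetical reliable-distance bounds; NaN means the pixel had too few
# main-sequence stars to constrain the bound.
DM_reliable_min = np.array([4.0, np.nan, 6.0])
DM_reliable_max = np.array([15.0, np.nan, np.nan])

# Substitute +inf/-inf so that every distance in an under-constrained
# pixel is flagged unreliable, without numpy warning about NaN comparisons.
DM_reliable_min = np.where(np.isnan(DM_reliable_min), np.inf, DM_reliable_min)
DM_reliable_max = np.where(np.isnan(DM_reliable_max), -np.inf, DM_reliable_max)

dm = 10.0  # distance modulus being queried
reliable = (dm >= DM_reliable_min) & (dm <= DM_reliable_max)
print(reliable)
```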

Removing progressbar2 as dependency

I thought I would bring this up since you mentioned you like cutting down on the number of dependencies; What do you think about replacing the current dependency on progressbar2 with a simple DIY progressbar. I have a very crude implementation of a tqdm-like default progress bar lying around that is only a couple hundred lines of code. One could verbatim copy this implementation to dustmaps and adjust it to be a progressbar for file downloads.

Allow override of default config location

I have put together a PR (#31) to allow for an environment variable to override the default location of the configuration overrides for dustmaps. There is a bit more explanation in the PR itself.

problem with fetching the Planck map

When following the installation procedure I encounter KeyError: 'content-length' while trying to fetch the Planck data. I'm using Python 3.6.6 Anaconda on Ubuntu 18.04.1 LTS 64-bit and I installed the package using pip.

I tried briefly tweaking the source code based on some suggestions but I did not manage to get it to work.

Please find details below.
Thanks,
Arash

Python 3.6.6 |Anaconda custom (64-bit)| (default, Oct  9 2018, 12:34:16) 
[GCC 7.3.0] on linux
>>> from dustmaps.config import config
>>> config['data_dir'] = './'
>>> import dustmaps.planck
>>> dustmaps.planck.fetch()
Downloading http://pla.esac.esa.int/pla/aio/product-action?MAP.MAP_ID=HFI_CompMap_ThermalDustModel_2048_R1.20.fits ...
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/.../dustmaps/planck.py", line 132, in fetch
    fetch_utils.download_and_verify(url, md5, fname=fname)
  File "/.../dustmaps/fetch_utils.py", line 236, in download_and_verify
    content_length = int(r.headers['content-length'])
  File "/.../requests/structures.py", line 52, in __getitem__
    return self._store[key.lower()][1]
KeyError: 'content-length'
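A defensive workaround, sketched here (get_content_length is a hypothetical helper, not part of dustmaps): look the header up with .get() so that servers omitting Content-Length, as the ESA archive does here, return None instead of raising KeyError.

```python
# Hypothetical helper: some servers omit the Content-Length header, so
# return None instead of raising KeyError when it is missing.
def get_content_length(headers):
    value = headers.get('content-length')
    return int(value) if value is not None else None
```

The download routine could then fall back to an indeterminate progress display whenever this returns None.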

How to get the full Heapix Gaia Extinction Map?

Hi,

I've been trying to use your library to obtain the full Gaia TGE map in healpix format but finding it a bit tricky to understand how.

Trying to access the map by reproducing what is done internally in the code led me to a map that doesn't really match Fig. 24 in the original publication:

(screenshot: the map I obtained)

As opposed to:

(screenshot: Fig. 24 of the original publication)

Is it possible to access the full healpix map from this library?

Many thanks!

Double licensing

Some institutions, including the LINCC Frameworks team that I'm working with, have strict policies about the licenses of the dependency packages in use. I kindly ask that the project applies double-licensing with a more permissive license, like BSD or MIT.

If @gregreen agrees with this, I can reach out to other contributors to ask for re-licensing to happen.

Internal Server Error

Hi, I am getting a similar error to #42 while using dustmaps. Thanks for the solution using "BayestarQuery".

Response received from server:

<title>500 Internal Server Error</title>

Internal Server Error

The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application.

Downloading IPHAS data

I'm trying to download the IPHAS dust map using

import dustmaps.iphas
dustmaps.iphas.fetch()

and at the end of the download I get the following error:

Downloading http://www.iphas.org/data/extinction/A_samp_030.tar.gz
Repacking files...
Progress: .---------------------------------------------------------------------------
NameError Traceback (most recent call last)
in ()
1 import dustmaps.iphas
----> 2 dustmaps.iphas.fetch()

//anaconda/lib/python2.7/site-packages/dustmaps/iphas.pyc in fetch(clobber)
379 # Convert from ASCII to HDF5 format
380 print('Repacking files...')
--> 381 ascii2h5(dest_dir, os.path.join(dest_dir, 'iphas.h5'))
382
383 # Cleanup

//anaconda/lib/python2.7/site-packages/dustmaps/iphas.pyc in ascii2h5(dirname, output_fname)
306
307 tar_fname_list = glob(os.path.join(dirname, 'A_samp_*.tar.gz'))
--> 308 d = np.hstack([process_tarball(fn) for fn in tar_fname_list])
309
310 print('+', end='')

//anaconda/lib/python2.7/site-packages/dustmaps/iphas.pyc in process_tarball(tarball_fname)
267 sys.stdout.flush()
268
--> 269 with closing(tarfile.open(tarball_fname, mode='r:gz')) as f_tar:
270 fnames = f_tar.getnames()
271

NameError: global name 'closing' is not defined

and my iphas directory looks like:

/Users/landerson/dustMaps/iphas
A_samp_030.tar.gz A_samp_070.tar.gz A_samp_110.tar.gz A_samp_150.tar.gz A_samp_190.tar.gz
A_samp_040.tar.gz A_samp_080.tar.gz A_samp_120.tar.gz A_samp_160.tar.gz A_samp_200.tar.gz
A_samp_050.tar.gz A_samp_090.tar.gz A_samp_130.tar.gz A_samp_170.tar.gz A_samp_210.tar.gz
A_samp_060.tar.gz A_samp_100.tar.gz A_samp_140.tar.gz A_samp_180.tar.gz
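The NameError suggests that iphas.py uses `contextlib.closing` without importing it. A minimal sketch of the failing call with the missing import added (the function name mirrors the traceback, but this is an assumption about the fix, not the shipped code):

```python
import tarfile
from contextlib import closing  # the import the NameError says is missing

def list_tarball_members(tarball_fname):
    """Open a gzipped tarball and return its member names, mirroring the
    call that fails in dustmaps.iphas.process_tarball."""
    with closing(tarfile.open(tarball_fname, mode='r:gz')) as f_tar:
        return f_tar.getnames()
```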

Add Lenz, Hensley, Doré map?

Hi Greg, thanks for making this package available! Would you mind including the E(B-V) map that I published a few months ago with Brandon Hensley and Olivier Doré (paper, data)?
I could also take a look at the code and make a PR.
Thanks,
Daniel

Bug or version issue in json_serializers.py?

When I try to run BayestarWebQuery, I get a TypeError that seems connected to the astropy.coordinates.SkyCoord object, but it traces back to dustmaps/json_serializers.py.

  • Running Python 3.6, Astropy 3.0.2
  • Error reproduced on two computers
  • Works in Python 2.7, Astropy 2.0.6
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord
from dustmaps.bayestar import BayestarWebQuery

DIST = np.arange(0.1, 2.501, 0.1) * u.kpc
V404 = SkyCoord(ra='20h24m03.820s', dec='+33d52m01.90s', distance=DIST)
bayestar = BayestarWebQuery(version='bayestar2017')
reddening = bayestar(V404, mode='median')

TypeError Traceback (most recent call last)
in ()
1 bayestar = BayestarWebQuery(version='bayestar2017')
----> 2 reddening = bayestar(V404, mode='median')

/anaconda3/lib/python3.6/site-packages/dustmaps/map_base.py in _wrapper_func(self, coords, **kwargs)
108 if not isinstance(coords, coordinates.SkyCoord):
109 raise TypeError('coords must be an astropy.coordinates.SkyCoord object.')
--> 110 return f(self, coords, **kwargs)
111 return _wrapper_func
112

/anaconda3/lib/python3.6/site-packages/dustmaps/map_base.py in __call__(self, coords, **kwargs)
433 An alias for WebDustMap.query.
434 """
--> 435 return self.query(coords, **kwargs)
436
437 @ensure_coord_type

/anaconda3/lib/python3.6/site-packages/dustmaps/map_base.py in _wrapper_func(self, coords, **kwargs)
108 if not isinstance(coords, coordinates.SkyCoord):
109 raise TypeError('coords must be an astropy.coordinates.SkyCoord object.')
--> 110 return f(self, coords, **kwargs)
111 return _wrapper_func
112

/anaconda3/lib/python3.6/site-packages/dustmaps/map_base.py in api_wrapper(self, *args, **kwargs)
267
268 # Serialize the arguments
--> 269 data = json.dumps(data, cls=encoder)
270
271 # POST request to server

/anaconda3/lib/python3.6/json/__init__.py in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)
236 check_circular=check_circular, allow_nan=allow_nan, indent=indent,
237 separators=separators, default=default, sort_keys=sort_keys,
--> 238 **kw).encode(obj)
239
240

/anaconda3/lib/python3.6/json/encoder.py in encode(self, o)
197 # exceptions aren't as detailed. The list call should be roughly
198 # equivalent to the PySequence_Fast that ''.join() would do.
--> 199 chunks = self.iterencode(o, _one_shot=True)
200 if not isinstance(chunks, (list, tuple)):
201 chunks = list(chunks)

/anaconda3/lib/python3.6/json/encoder.py in iterencode(self, o, _one_shot)
255 self.key_separator, self.item_separator, self.sort_keys,
256 self.skipkeys, _one_shot)
--> 257 return _iterencode(o, 0)
258
259 def _make_iterencode(markers, _default, _encoder, _indent, _floatstr,

/anaconda3/lib/python3.6/site-packages/dustmaps/json_serializers.py in default(self, o)
373 else:
374 return o
--> 375 return json.JSONEncoder.default(self, o)
376
377 return MultiJSONEncoder

/anaconda3/lib/python3.6/json/encoder.py in default(self, o)
178 """
179 raise TypeError("Object of type '%s' is not JSON serializable" %
--> 180 o.__class__.__name__)
181
182 def encode(self, o):

TypeError: Object of type 'bytes' is not JSON serializable
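The final TypeError says a `bytes` object reached the JSON encoder; under Python 3, values that were `str` under Python 2 can arrive as `bytes`. A sketch of the kind of fix this suggests (an assumption, not the actual json_serializers.py patch): decode bytes in the encoder's `default` method:

```python
import json

class BytesFriendlyEncoder(json.JSONEncoder):
    """Decode bytes to str before falling back to the default encoder."""
    def default(self, o):
        if isinstance(o, bytes):
            return o.decode('utf-8')
        return json.JSONEncoder.default(self, o)

print(json.dumps({'mode': b'median'}, cls=BytesFriendlyEncoder))
# {"mode": "median"}
```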

M Green?

Your bibtex cite block makes it look like your last name is "M. Green". Please advise.

planck map

I was trying to fix this with a "brute force" solution (oxplot/fysom#1), then ran into the same bug as with Python 2.7.5.
Thanks for fixing that one so quickly.

Python 3.5.1 (default, Oct 21 2016, 21:37:19)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-4)] on linux
Type "help", "copyright", "credits" or "license" for more information.

from __future__ import print_function
from astropy.coordinates import SkyCoord
from dustmaps.planck import PlanckQuery

coords = SkyCoord('12h30m25.3s', '15d15m58.1s', frame='icrs')
planck = PlanckQuery()
Traceback (most recent call last):
File "", line 1, in
File "/opt/rh/rh-python35/root/usr/lib/python3.5/site-packages/dustmaps/planck.py", line 95, in __init__
field=field)
File "/opt/rh/rh-python35/root/usr/lib/python3.5/site-packages/dustmaps/healpix_map.py", line 89, in __init__
if isinstance(fname, basestring):
NameError: name 'basestring' is not defined
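`basestring` exists only in Python 2; under Python 3 the check needs `str` (the usual 2/3-compatibility fix at the time, e.g. via `six.string_types`). A stdlib-only sketch:

```python
try:
    string_types = basestring  # Python 2: covers both str and unicode
except NameError:
    string_types = str         # Python 3: basestring no longer exists

def is_string(obj):
    """Version-agnostic replacement for isinstance(fname, basestring)."""
    return isinstance(obj, string_types)

print(is_string('planck.fits'), is_string(42))  # True False
```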

extinction density unit of Leike2020

The documentation gives two different units for the returned density. The original passages are quoted below.

Returns the extinction density (in e-foldings / kpc, in Gaia G-band) at the given coordinates.

and

The extinction density, in units of e-foldings / pc, as either a numpy array or float, with the same shape as the input coords.
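Whichever length unit is correct (the two quotes differ by the pc/kpc factor of 1000), converting e-foldings of extinction to magnitudes uses the standard factor 2.5/ln(10) ≈ 1.086. A small sketch (function name hypothetical):

```python
import math

EFOLD_TO_MAG = 2.5 / math.log(10)  # ≈ 1.0857 mag per e-folding

def efoldings_to_mag(density, per_kpc=False):
    """Convert an extinction density from e-foldings per pc (or, with
    per_kpc=True, per kpc) into magnitudes per pc."""
    mag_per_length = density * EFOLD_TO_MAG
    return mag_per_length / 1000.0 if per_kpc else mag_per_length

print(round(efoldings_to_mag(1.0), 4))  # 1.0857
```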

Encountered an error while using dustmaps.chen2014.Chen2014Query

Hello, everyone,

I want to use dustmaps.chen2014 to estimate the reddening of a star.

The star's info:
RA = 05 35 22.900
DEC = -05 24 57.79
distance = 332.793579

Computer information:
Ubuntu 20.04
Python: 3.8.10
dustmaps: 1.0.10

My code is as follows:
from __future__ import print_function
from astropy.coordinates import SkyCoord
import astropy.units as u
from dustmaps.chen2014 import Chen2014Query

coordsi = SkyCoord("05 35 22.900", "-05 24 57.79", unit=(u.hourangle, u.deg), distance=332.793579*u.pc, frame="icrs")
ebv = Chen2014Query(coordsi)

Then I encountered an error like this:

TypeError Traceback (most recent call last)
Input In [29], in <cell line: 1>()
----> 1 ebv = Chen2014Query(coordsi)

File ~/.local/lib/python3.8/site-packages/dustmaps/chen2014.py:55, in Chen2014Query.__init__(self, map_fname)
52 if map_fname is None:
53 map_fname = os.path.join(data_dir(), 'chen2014', 'chen2014.h5')
---> 55 with h5py.File(map_fname, 'r') as f:
56 self._dists = f['dists'][:]
57 self._lb = f['pix_lb'][:]

File ~/.local/lib/python3.8/site-packages/h5py/_hl/files.py:509, in File.__init__(self, name, mode, driver, libver, userblock_size, swmr, rdcc_nslots, rdcc_nbytes, rdcc_w0, track_order, fs_strategy, fs_persist, fs_threshold, fs_page_size, page_buf_size, min_meta_keep, min_raw_keep, locking, alignment_threshold, alignment_interval, **kwds)
507 name = repr(name).encode('ASCII', 'replace')
508 else:
--> 509 name = filename_encode(name)
511 if track_order is None:
512 track_order = h5.get_config().track_order

File ~/.local/lib/python3.8/site-packages/h5py/_hl/compat.py:19, in filename_encode(filename)
11 def filename_encode(filename):
12 """
13 Encode filename for use in the HDF5 library.
14
(...)
17 filenames in h5py for more information.
18 """
---> 19 filename = fspath(filename)
20 if sys.platform == "win32":
21 if isinstance(filename, str):

TypeError: expected str, bytes or os.PathLike object, not SkyCoord
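The traceback shows the SkyCoord reaching h5py as a filename: `Chen2014Query(coordsi)` passes the coordinates to the constructor, whose only argument is an optional map filename. The dustmaps pattern is to build the query object first and then call it (`chen = Chen2014Query(); ebv = chen(coordsi)`). A stdlib-only sketch of the two-step pattern (class and names hypothetical, standing in for the real query class):

```python
class DummyDustQuery:
    """Stand-in for a dustmaps query class: the constructor takes an
    optional map filename; the coordinates go into __call__."""

    def __init__(self, map_fname=None):
        # Passing a SkyCoord here is what produces the h5py TypeError:
        # the constructor tries to open its argument as a file path.
        self.map_fname = map_fname or 'default_map.h5'

    def __call__(self, coords):
        # Query the (pretend) map at the given coordinates.
        return {'coords': coords, 'map': self.map_fname}

query = DummyDustQuery()            # step 1: construct (loads the map)
result = query('l=209.0, b=-19.4')  # step 2: call with the coordinates
print(result['map'])  # default_map.h5
```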
