spacetelescope / pyfits

Git repository for the PyFITS project; PyFITS is deprecated, use Astropy (https://github.com/astropy/astropy)

Home Page: http://www.stsci.edu/resources/software_hardware/pyfits

License: Other

Languages: Python 96.81%, C 3.19%

Topics: astronomy, python, fits

pyfits's Introduction

---------------------------
Important notice for PyFITS
---------------------------

**PyFITS is deprecated and no longer supported! This repo is
archived.**

All of the functionality of PyFITS is available in `Astropy
<http://www.astropy.org>`_ as the `astropy.io.fits
<http://docs.astropy.org/en/stable/io/fits/index.html>`_ package.
We will NOT be releasing new versions of PyFITS as
a stand-alone product.

--------------------
ARCHIVAL INFORMATION
--------------------

Documentation
=============
See the Users Guide and API documentation hosted at
http://pythonhosted.org/pyfits.

Development
===========
PyFITS is now on GitHub at:
https://github.com/spacetelescope/PyFITS

To report an issue in PyFITS, please create an account on GitHub and submit
the issue there, or send an e-mail to [email protected].  Before submitting an
issue please search the existing issues for similar problems.  Before asking
for help, please check the PyFITS FAQ for answers to your questions:
http://pythonhosted.org/pyfits/appendix/faq.html

The latest source code can be checked out from git with::

  git clone https://github.com/spacetelescope/PyFITS.git

An SVN mirror is still maintained as well::

  svn checkout https://aeon.stsci.edu/ssb/svn/pyfits/trunk

For Packagers
=============
As of version 3.2.0 PyFITS supports use of the standard CFITSIO library for
compression support.  A minimal copy of CFITSIO is included in the PyFITS
source under cextern/cfitsio.  Packagers wishing to link with an existing
system CFITSIO should remove this directory and modify setup.cfg as instructed
by the comments in that file.  CFITSIO support has been tested for versions
3.08 through 3.30.  The earliest known fully working version is 3.09.  Version
3.08 mostly works, except for a bug in CFITSIO itself when decompressing some
images with BITPIX=-64.  Earlier versions *may* work, but YMMV.  Please send in
any results of experimentation with other CFITSIO versions.

pyfits's People

Contributors

anntzer, astrofrog, barentsen, bsipocz, chanley, dougburke, dpshelio, embray, ericdepagne, hamogu, jhunkeler, joseph-long, justincely, jwoillez, kgullikson88, larrybradley, mdboom, migueldvb, mwcraig, nparley, phirstgemini, pllim, saogaz, sargas, simongibbons, stsci-sienkiew


pyfits's Issues

Loading image changes reported BITPIX value

Also reported to astropy: astropy/astropy#1502

With current pyfits (3.1.2) or astropy (0.2.4), the act of reading an image into a numpy array appears to force the reported value of BITPIX to -32. If the image is actually integer (e.g., BITPIX=16), pyfits/astropy.io.fits reports an incorrect BITPIX value when checked after loading pixel data.

This affects normal and Rice-compressed images. Floating-point (BITPIX -32) images are "unaffected." I suspect this is not the intended behavior.

Test image available here:
http://astro.phy.vanderbilt.edu/~siverdrj/survey11n.e.20100613.061.fits.fz

Steps to reproduce:

import pyfits as pf
hdulist = pf.open('survey11n.e.20100613.061.fits.fz')
hkeys1 = hdulist[1].header.copy()[:5]       # copy first, BITPIX = 16
scidata = hdulist[1].data                    # this changes the BITPIX
hkeys2 = hdulist[1].header.copy()[:5]       # this one has BITPIX -32 !

print "BITPIX (before): %3d" % hkeys1['BITPIX']
print "BITPIX (after):  %3d" % hkeys2['BITPIX']
print "Comparison: %s" % (hkeys1 == hkeys2)

Output:
BITPIX (before): 16
BITPIX (after): -32
Comparison: False

Normalization of append mode

A use case was pointed out to me on the astropy mailing list that used to work in PyFITS 2.4 but no longer works. This involved opening a gzip file in append mode and trying to write to it. For example, the simplest way to reproduce the problem is like so:

>>> fileobj = gzip.GzipFile('tmp.gz', 'ab+')
>>> h = pyfits.PrimaryHDU()
>>> h.writeto(fileobj)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/internal/1/root/src/PyFITS/git/lib/pyfits/hdu/base.py", line 442, in writeto
    checksum=checksum)
  File "/internal/1/root/src/PyFITS/git/lib/pyfits/hdu/hdulist.py", line 669, in writeto
    hdulist = fitsopen(fileobj, mode=mode)
  File "/internal/1/root/src/PyFITS/git/lib/pyfits/hdu/hdulist.py", line 117, in fitsopen
    return HDUList.fromfile(name, mode, memmap, save_backup, **kwargs)
  File "/internal/1/root/src/PyFITS/git/lib/pyfits/hdu/hdulist.py", line 249, in fromfile
    save_backup=save_backup, **kwargs)
  File "/internal/1/root/src/PyFITS/git/lib/pyfits/hdu/hdulist.py", line 804, in _readfrom
    hdu = _BaseHDU.readfrom(ffo, **kwargs)
  File "/internal/1/root/src/PyFITS/git/lib/pyfits/hdu/base.py", line 401, in readfrom
    hdr = Header.fromfile(fileobj, endcard=not ignore_missing_end)
  File "/internal/1/root/src/PyFITS/git/lib/pyfits/header.py", line 430, in fromfile
    block = fileobj.read(actual_block_size)
  File "/internal/1/root/src/PyFITS/git/lib/pyfits/file.py", line 150, in read
    return self.__file.read(size)
  File "/internal/1/root/usr/local/lib/python2.7/gzip.py", line 240, in read
    raise IOError(errno.EBADF, "read() on write-only GzipFile object")
IOError: [Errno 9] read() on write-only GzipFile object

Indeed, on some level this makes sense--GzipFile doesn't actually support an ab+ mode per se (where writes are appended but it's still possible to do random-access reads). This just gets translated to ab mode and does not allow reads.

However, PyFITS does not pick up the fact that it's in a write-only mode and that it should treat the file as if it's in "append" mode. The reason this fails is actually a bit complicated, and is related to a workaround I added for Python 3 support back in ea6df40, though my understanding of the issue at the time was incomplete. The problem has to do with how the io.FileIO object (the default file object in Python 3) handles append mode. I tried to work around it by just treating files opened in append mode as if they were opened in rb+ mode, but that obviously doesn't work in some cases. I described the problem in Python itself in more detail here: http://bugs.python.org/issue18876

What has to be done on the PyFITS end is to detect when files are opened in append mode and treat them accordingly. With GzipFile objects it's easy, because they at least keep the 'a' in the mode string. FileIO objects, on the other hand, are trickier.

On POSIX-compliant systems we can find out if a file is in append mode easily:

fcntl.fcntl(fileobj.fileno(), fcntl.F_GETFL) & os.O_APPEND

but on Windows I am still at a loss. Windows does seem to support append mode but I have no idea how to find out if a file has actually been opened with it.
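
For illustration, a minimal POSIX-only sketch of such a check (the helper name is hypothetical, not pyfits API):

import fcntl
import os

def is_append_mode(fileobj):
    # Hypothetical helper: query the descriptor's status flags and test
    # for O_APPEND; this only works on POSIX-compliant systems.
    flags = fcntl.fcntl(fileobj.fileno(), fcntl.F_GETFL)
    return bool(flags & os.O_APPEND)

with open('tmp.fits', 'ab') as f:
    assert is_append_mode(f)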

pyfits: issue with long headers when new value is unicode

Copied from: astropy/astropy#468 (comment)

Assigning a long (>80 char) string to a header keyword works for str values, using the CONTINUE keyword with &:

newh = pyfits.PrimaryHDU()
newh.header['TEST'] = 'abcdefg' * 30

However, a warning is raised and the string is truncated if the string is unicode:

>>> newh = pyfits.PrimaryHDU()
>>> newh.header['TEST'] = u'abcdefg' * 30
WARNING: VerifyWarning: Card is too long, comment will be truncated. [pyfits.core]

This behavior is inconsistent; I believe the former is preferable.
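
Until this is fixed, one possible Python 2 workaround (an assumption, and only valid when the value is plain ASCII) is to coerce the unicode value to a native byte string before assignment:

newh = pyfits.PrimaryHDU()
# Coercing to str avoids the unicode code path that truncates the card.
newh.header['TEST'] = str(u'abcdefg' * 30)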

inconsistent bytes/unicode data for string table data in python3

Current git head (66a7b61) and 3.1.1: if you run this in Python 3 you get a unicode array for the dictionary-like access and a char array for property access.
I can imagine the unicode variant could cause issues when trying to save again, but I didn't test whether that is handled.

import numpy as np
import pyfits
names = np.array(['NGC1', 'NGC2', 'NGC3', 'NGC4'])
c1 = pyfits.Column(name='target', format='10A', array=names)
tbhdu = pyfits.new_table([c1])

print(repr(tbhdu.data['target']))
print(repr(tbhdu.data.target))

$ python3 test.py
chararray(['NGC1', 'NGC2', 'NGC3', 'NGC4'], 
      dtype='<U10')
chararray([b'NGC1', b'NGC2', b'NGC3', b'NGC4'], 
      dtype='|S10')
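
Pending a real fix, callers can normalize the two access styles themselves. A sketch, assuming the ASCII column data from the example above:

import numpy as np

target = tbhdu.data.target
if target.dtype.kind == 'S':
    # Property access returned raw bytes; decode to match the unicode
    # result of the dictionary-style access.
    target = np.char.decode(target, 'ascii')
print(repr(target))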

DeprecationWarnings, ResourceWarnings with Python >=3.2

When warnings are enabled (using e.g. the PYTHONWARNINGS="d" environment variable), importing pyfits with Python >= 3.2 triggers some DeprecationWarnings / ResourceWarnings. I use pyfits-3.2.

$ PYTHONWARNINGS="d" python3.3 -c 'import pyfits'
/usr/lib64/python3.3/site-packages/pyfits/version.py:80: ResourceWarning: unclosed file <_io.BufferedReader name=3>
  update_svn_info()
/usr/lib64/python3.3/site-packages/pyfits/version.py:80: ResourceWarning: unclosed file <_io.BufferedReader name=5>
  update_svn_info()
/usr/lib64/python3.3/site-packages/pyfits/core.py:124: DeprecationWarning: ErrorURLopener style of invoking requests is deprecated. Use newer urlopen functions/methods
  urllib.request._urlopener = ErrorURLopener()  # Assign the locally subclassed opener

(The DeprecationWarning appears with Python >= 3.3.)

error python version 3.3 is needed, which was not found in registry

Hi All,

When I try to install PyFITS I get the following error

"python version 3.3 is needed, which was not found in registry"

But I do have Python 3.3 installed, as well as NumPy and SciPy. I would appreciate any help sorting this out. Thanks

OS---Windows 7 Pro 64bit

Python----3.3.2

fitsdiff float comparison bug

Sometimes fitsdiff can produce output that looks like

   Data contains differences:
     Data differs at [3972, 1585]:
           1.28719
     Data differs at [3972, 1596]:
           -10.0794
     Data differs at [3972, 1604]:
           29.4032 

where the obvious question is: "1.28719" differs from what? This bug can occur when the data contains two very slightly different float values such that repr(a) == repr(b). The fact that their reprs compare equal results in this bug.

Float values that differ (but within the specified tolerance) should display their values at least up to the first differing decimal place if not more.
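
A sketch of what such a formatter could look like (diff_repr is a hypothetical helper, not part of fitsdiff): keep increasing the precision until the two representations actually differ.

def diff_repr(a, b, max_digits=17):
    # Hypothetical: format both floats with growing precision until their
    # string forms diverge, so the report can show both values.
    for digits in range(7, max_digits + 1):
        ra, rb = '%.*g' % (digits, a), '%.*g' % (digits, b)
        if ra != rb:
            return ra, rb
    return repr(a), repr(b)

print(diff_repr(1.0000000001, 1.0000000002))
# -> ('1.0000000001', '1.0000000002')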

HDUList.writeto loses its mind after write to sys.stdout

This is an odd one and will probably have to be addressed after the 3.2 release if I want to get it out today.

It seems to be related to some internal state that's being handled incorrectly. If I open an existing FITS file, say,

>>> hdul = pyfits.open('lib/pyfits/tests/data/test0.fits')

then I write it out to a new file (or even a file-like object like a BytesIO) that works fine:

>>> hdul.writeto('test.fits')

And if I subsequently write the same HDUList to a different file the results are, as one should expect, identical:

>>> hdul.writeto('test2.fits')
>>> open('test.fits', 'rb').read() == open('test2.fits', 'rb').read()
True

Where the weirdness begins is if I write to sys.stdout:

>>> hdul.writeto(sys.stdout)

The first time works correctly--it outputs the headers in the correct order to the screen and also writes the raw bytes of the data, the display of which depends entirely on how your terminal is configured. (On mine it just displays undecodable byte placeholders, but they can be presumed to be the correct bytes.)

But if I make a subsequent write immediately after that, either to sys.stdout or even a plain file, I get really odd behavior. It writes the primary HDU header, but then, while it writes the correct number of bytes, it just writes the primary header again over and over again for each HDU.

A subsequent write to a file on disk returns writeto to the correct behavior, though multiple repeated writes to sys.stdout display the faulty behavior. Very, very strange.

checksum not added with writeto(..., checksum=True) on unmodified hdulists

reproduced with svn 2168 and 3.0.8

The manual chapter 1.6.4, Verification using FITS Checksum, indicates that adding checksum=True to the .writeto call will add checksums to the output file.

This only works if the HDUList being written has been modified, is newly created, or already has a checksum. If one just opens a file without checksums and writes it out again with writeto(checksum=True) (as a new file or the same one), no checksum is added. One must explicitly modify the HDUList, or run add_checksum/add_datasum on all HDUs manually, for it to take effect.

e.g. the example from the manual:

import pyfits
pyfits.PrimaryHDU().writeto("file.fits")
# file.fits has no checksums
hdul = pyfits.open("file.fits", checksum=True) # update mode does not help either
hdul.writeto("newfile.fits", checksum=True)

newfile.fits has no checksums
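
Until this is fixed, a workaround consistent with the note above is to add the checksums explicitly before writing; a sketch continuing the example:

hdul = pyfits.open("file.fits")
# Force checksum computation on every HDU so writeto(checksum=True)
# has nothing left to skip.
for hdu in hdul:
    hdu.add_checksum()
hdul.writeto("newfile.fits", checksum=True)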

writeto(clobber=True) writes a message to stderr

When explicitly forcing overwrite in a writeto method or function with the clobber=True option, a warning message is still issued (by default to stderr), while there's IMO no error and no need for a warning: everything is working as expected. Further handling of warnings and stderr is therefore complicated.
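
If the message is emitted through the warnings machinery (this varies across pyfits versions; in some it is printed straight to stderr instead), a caller can at least silence it like so:

import warnings

with warnings.catch_warnings():
    warnings.simplefilter('ignore')
    hdu.writeto('out.fits', clobber=True)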

Add Header event notification system

It would be endlessly useful if headers had an event notification subscription system. Any change on a header such as an addition of a keyword, a removal of a keyword, a change in value, a change in comment, etc. should publish an event to any and all subscribed listeners. Listeners should probably be stored on the Header instance as weakrefs to avoid reference cycles and various sorts of zombism.

Such a system would have a great many uses and would be very helpful for maintaining internal consistency of FITS files.
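
A rough sketch of what such a system could look like (none of this is existing pyfits API); listeners are held weakly, as proposed above:

import weakref

class ObservableHeader(object):
    # Illustrative only: a dict-backed stand-in for Header that publishes
    # add/change/remove events to weakly-held listener objects.
    def __init__(self):
        self._cards = {}
        self._listeners = weakref.WeakSet()

    def subscribe(self, listener):
        self._listeners.add(listener)

    def _publish(self, event, keyword, value=None):
        for listener in self._listeners:
            listener.notify(event, keyword, value)

    def __setitem__(self, keyword, value):
        event = 'change' if keyword in self._cards else 'add'
        self._cards[keyword] = value
        self._publish(event, keyword, value)

    def __delitem__(self, keyword):
        del self._cards[keyword]
        self._publish('remove', keyword)

class PrintListener(object):
    def notify(self, event, keyword, value):
        print('%s %s=%r' % (event, keyword, value))

hdr = ObservableHeader()
listener = PrintListener()
hdr.subscribe(listener)
hdr['NAXIS'] = 2  # prints: add NAXIS=2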

distribute_setup.py needed for installation

I'm running Mac OS 10.6.8 with Python 2.7.3. I just tried to install pyfits from the source tarball, and I got the following error:

% python setup.py install
Traceback (most recent call last):
  File "setup.py", line 6, in <module>
    from distribute_setup import use_setuptools
ImportError: No module named distribute_setup

I downloaded the source file for distribute_setup.py and dropped it in my unpacked pyfits-3.1.2 directory, and it installed successfully. I don't know if this is a machine/OS-specific issue, but I thought I'd post the issue just in case someone else has it.

Installation to non-system path triggers SandboxViolation

I am trying install pyfits 3.1.2 for python 2.7.2 into a non-system directory, but this is failing with a SandboxViolation:

$ python setup.py install --prefix="~/local"
Traceback (most recent call last):
  File "setup.py", line 15, in <module>
    zip_safe=False
  File "/data/apps/python/2.7.2/lib/python2.7/distutils/core.py", line 112, in setup
    _setup_distribution = dist = klass(attrs)
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/dist.py", line 260, in __init__
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/dist.py", line 284, in fetch_build_eggs
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/pkg_resources.py", line 563, in resolve
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/pkg_resources.py", line 799, in best_match
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/pkg_resources.py", line 811, in obtain
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/dist.py", line 327, in fetch_build_egg
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/command/easy_install.py", line 446, in easy_install
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/command/easy_install.py", line 476, in install_item
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/command/easy_install.py", line 655, in install_eggs
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/command/easy_install.py", line 930, in build_and_install
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/command/easy_install.py", line 919, in run_setup
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/sandbox.py", line 62, in run_setup
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/sandbox.py", line 105, in run
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/sandbox.py", line 64, in <lambda>
  File "setup.py", line 83, in <module>

  File "/data/apps/python/2.7.2/lib/python2.7/distutils/core.py", line 112, in setup
    _setup_distribution = dist = klass(attrs)
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/dist.py", line 260, in __init__
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/dist.py", line 284, in fetch_build_eggs
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/pkg_resources.py", line 563, in resolve
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/pkg_resources.py", line 799, in best_match
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/pkg_resources.py", line 811, in obtain
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/dist.py", line 325, in fetch_build_egg
  File "/data/apps/python/2.7.2/lib/python2.7/distutils/cmd.py", line 109, in ensure_finalized
    self.finalize_options()
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/command/easy_install.py", line 158, in finalize_options
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/command/easy_install.py", line 264, in check_site_dir
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/sandbox.py", line 203, in _open
  File "/data/apps/python/2.7.2/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/sandbox.py", line 199, in _violation
setuptools.sandbox.SandboxViolation: SandboxViolation: open('./test-easy-install-35687.write-test', 'w') {}

The package setup script has attempted to modify files on your system
that are not within the EasyInstall build area, and has been aborted.

This package cannot be safely installed by EasyInstall, and may not
support alternate installation locations even if you run its setup
script by hand.  Please inform the package's author and the EasyInstall
maintainers to find out if a fix or workaround is available.

can't assign NaN value to header keyword

I'm trying to assign a python native 'NaN' value (not string) to a header keyword with:

fits[0].header[KEYWORD] = float('nan')

or

fits[0].header.update( { KEYWORD: float('nan') } )

Then, when I try to save changes it throws an error:

VerifyError:
HDU 0:
Card 112:
Card KEYWORD is not FITS standard (invalid value string: NAN.0).

If I check the header before saving it, the keyword I've tried to update stores an invalid FITS value

KEYWORD = NAN.0

instead of

KEYWORD = NAN

with the right padding, as for floats or booleans.

Allow at least partial reads of corrupt data...?

When reading data from a corrupt file, in particular where the file is truncated, mmapping the file can fail because we try to map an address space larger than the size of the file, and most implementations won't allow this (and as far as I can tell there aren't any option flags that would make it behave otherwise).

That said, we could detect when the file is truncated and map as much of it as we can. The data would still be corrupt and we would have to modify the shape of the data to fill as much as we can. Apparently some readers like whatever ds9 uses can do this sort of thing--somewhere it was mentioned once that it was able to even partially decompress a corrupted compressed image.

I'm not against implementing this sort of functionality, but it has to be made very clear to the user that they're looking at a subset of the data and that the file is corrupt.

updating integer header key to double does not work

Updating a header key from integer to double format fails with pyfits 3.2 and current git head:

import pyfits
print pyfits.__version__
f = pyfits.PrimaryHDU()
f.header['TEST'] = 1
h = pyfits.HDUList()
h.append(f)
h.writeto('test.fits', clobber=True)
del h
del f

f = pyfits.open('test.fits', mode='update')
print type(f[0].header['test'])
print 'assign a float'
f[0].header['test'] = 1.0
print 'new type', type(f[0].header['test'])

output:

3.2.dev
Overwriting existing file 'test.fits'.
<type 'int'>
assign a float
new type <type 'int'>

The type remains integer even though I want a float. A workaround is to set it to e.g. 1.0000001, which will then trigger the conversion.
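
An alternative workaround (an untested assumption here, but consistent with how new cards pick up their type) might be to delete the card and reassign it, so a fresh card is created with the float value:

f = pyfits.open('test.fits', mode='update')
# Hypothetical: drop the existing integer card so the float assignment
# creates a new card rather than updating the old one in place.
del f[0].header['test']
f[0].header['test'] = 1.0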

Newlines silently allowed in card values

I got an issue report involving header cards that happened to contain newlines in some of the COMMENT and HISTORY cards. PyFITS was able to read the files and display the headers fine, but did not raise any warnings about the illegal values in those cards.

When reading such a file, warnings should be raised, and the offending newline characters should probably be either stripped or replaced with spaces. Upon assigning such a value to a commentary keyword, it might be a nice feature if the value were split into multiple commentary cards at the newlines. Otherwise it should be treated as an invalid value.
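
The splitting behavior suggested above might look roughly like this (a sketch of proposed behavior, not what pyfits currently does), for a commentary keyword such as HISTORY:

value = 'first line\nsecond line'
# Split at newlines and emit one commentary card per line instead of
# letting the illegal character through.
for line in value.splitlines():
    header.add_history(line)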

Table Checklist

This is an overview of things that need to be done and issues that need to be fixed in the process of overhauling the Table interface.

(the text of this issue is a WIP)

  • The Astropy table sub-package needs to be backported into PyFITS to start with. This will probably drop support for a few features specific to Astropy, such as Units and color printing. Later, when this work is merged into Astropy, it will be natural to add Units support back on.
  • FITS tables should be able to handle duplicate column names (though it should warn against them). See https://trac.assembla.com/pyfits/ticket/191 (this could be handled in a manner similar to duplicate keywords in headers)
  • When creating a new Column from an existing array or list, make sure the shape of the input data is sensible for the shape of the column (taking into account the repeat count of the data format and the TDIM keyword). Make sure also that the data type either matches up or can be converted to the correct type. (See https://trac.assembla.com/pyfits/ticket/212). I think the Astropy Column class will already mostly handle this, though some FITS-specific extensions may be required.
  • It needs to be possible to read data in a memory-efficient manner of large columns that require the data to be converted from the on-disk format to some other representation (see for example https://trac.assembla.com/pyfits/ticket/33, though it would be nice to do this without a separate "sections" interface). This might be very tricky, however.
  • Users should create new table data through the Table class and Column class (or some FITS-specific subclass thereof). They can then pass that data in to the BinTableHDU and TableHDU constructors to create the appropriate HDUs. Those constructors would be responsible for informing the user whether the table data can actually be serialized in that format. This is especially relevant when creating an ASCII table (TableHDU) as it supports a much more limited range of data types. (see https://trac.assembla.com/pyfits/ticket/60)
  • Table columns should support optional recognition of unsigned integers in addition to the usual tscal/tzero support (see https://trac.assembla.com/pyfits/ticket/192)--preliminary support for this is implemented in astropy/astropy#362, but it still needs to integrate better with other table data transformation code, it needs to work with variable length arrays, and should be possible to enable/disable on a per-column basis. Most of this work is progressing in the table-scaling branch but won't be ready for the v3.2 release.
  • NULL/BLANK values in both binary and ASCII tables should be handled sensibly (see https://trac.assembla.com/pyfits/ticket/207)
  • Zero-width (empty) columns should be supported (see https://trac.assembla.com/pyfits/ticket/42)

Deleting columns through ColDefs.del_col does not delete all column keywords from header

In the course of working on #41 I discovered an issue that also exists in Astropy, but was being obscured because Astropy's tests are for some reason hiding the warning that occurs. When I run the test pyfits.tests.test_core.TestCore.test_add_del_columns2, a warning is displayed that looks like:

Column null option (TNULLn) is invalid for binary table columns of type '3A' (got -2147483647).

This is because the original test file contains a TNULL keyword for an integer-formatted column that is deleted later in the test. But deleting that column does not result in the TNULL1 keyword being deleted. The new first column, however, is a character column, for which the TNULL1 keyword is invalid (and even if it weren't, it should have been deleted).

For the 3.2.0 release I will leave this bug in so that it is at least consistent with Astropy. It should be fixed in the first bug-fix release, however. I've confirmed this is pretty badly broken in older versions of PyFITS as well.
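
Until the fix lands, a manual cleanup sketch (assuming tbhdu is the table HDU in question, and ignoring the renumbering of keywords for the remaining columns, which a real fix must also handle):

n = 1  # 1-based index of the column that was deleted
for base in ('TTYPE', 'TFORM', 'TUNIT', 'TNULL', 'TSCAL', 'TZERO'):
    keyword = '%s%d' % (base, n)
    if keyword in tbhdu.header:
        del tbhdu.header[keyword]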

TDIMn

I am running into an issue with creating a multidimensional string array. Originally I discovered this issue when I loaded a FITS file, but the code below should recreate the issue. I think this is the same issue as http://trac.assembla.com/pyfits/ticket/201, but I cannot seem to log in there.

import numpy as np
import pyfits
output = [pyfits.Column(name='number',format='1A',array=np.array(['a','b'])),
      pyfits.Column(name='array',format='4A',dim='(2,2)',
                    array=np.array([['a','bc'],
                                    ['cd','e']])),
      ]

hdu = pyfits.new_table(pyfits.ColDefs(output))
hdu.writeto('test.fits', clobber=True)

The error given by Python 2.7 running '3.2.dev' (installed from the GitHub repo):

ERROR: ValueError: could not broadcast input array from shape (2,2) into shape (2) [pyfits.hdu.table]
Traceback (most recent call last):
  File "run.py", line 145, in <module>
    hdu = pyfits.new_table(pyfits.ColDefs(output))
  File "/Users/ajmendez/usr/local/lib/python2.7/site-packages/pyfits-3.2.dev2086-py2.7-macosx-10.5-i386.egg/pyfits/hdu/table.py", line 1229, in new_table
    field[:n] = arr[:n]
ValueError: could not broadcast input array from shape (2,2) into shape (2)

Add missing after=True argument to Header.insert

Phil pointed out to me that some of the API docs for the Header class mention that Header.insert takes an after=True to insert after the selected index instead of before.

The more I look at it, the clearer it is that I intended to include this option but managed to forget. I think this can be treated as a "bug of omission" and go into the next bug-fix release, so long as it is added as an optional keyword argument, so it won't interfere with any existing code.

The docs also suggest that Header.insert should accept a keyword name as its first argument. I think that was also intended but omitted.
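
For reference, the documented-but-unimplemented behavior would presumably look like this once both omissions are fixed:

header = pyfits.Header()
header['KEY1'] = 1
header['KEY3'] = 3
# Insert relative to an existing keyword, after it rather than before.
header.insert('KEY1', ('KEY2', 2), after=True)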

Creating a Variable Length Array Table - CFITSIO syntax error

Dear PyFITS developers,

I have a problem with creating a FITS file from a variable length array table. In particular, the output FITS file is corrupted.

I'm using PyFITS 3.1.2. The code is the documentation example:

>>> import pyfits
>>> import numpy as np
>>> c1 = pyfits.Column(name='var', format='PJ()', array=np.array([[45., 56], [11, 12, 13]], dtype=np.object))
>>> c2 = pyfits.Column(name='xyz', format='2I', array=[[11, 3], [12, 4]])
# the rest is the same as a regular table.
# Create the table HDU
>>> tbhdu = pyfits.new_table([c1, c2])
>>> print tbhdu.data
FITS_rec([(array([45, 56]), array([11, 3], dtype=int16)),
(array([11, 12, 13]), array([12, 4], dtype=int16))],
dtype=[('var', '<i4', 2), ('xyz', '<i2', 2)])
# write to a FITS file
>>> tbhdu.writeto('var_table.fits')

When I try to open the FITS file, I get the following error:

syntax error in expression "-": premature end of expression
syntax error in expression "-": premature end of expression
while executing
"expr -$val"
(object "::FitsExtension::fitsTable0" method "::Table::readInTable" body line 148)
invoked from within
"readInTable"
(object "::FitsExtension::fitsTable0" method "::Table::makeTable" body line 27)
invoked from within
"makeTable $extNum
"
(object "::FitsExtension::fitsTable0" method "::FitsTable::dispTable" body line 20)
invoked from within
"$fT dispTable $extNum
$content_"
(object "::FitsFile::fitsExtension0" method "::FitsExtension::dispTable" body line 46)
invoked from within
"$fE dispTable $extNum_ $coor_ $newTable_"
(object "::fitsFile0" method "::FitsFile::openTable" body line 13)
invoked from within
"::fitsFile0 openTable 2"
(in namespace inscope "::FitsFile" script line 1)
invoked from within
"namespace inscope ::FitsFile {::fitsFile0 openTable 2}"
invoked from within
"._fhl_fitsFile0.contentframe.lwchildsite.clipper.canvas.sfchildsite.tab2 invoke"
("uplevel" body line 1)
invoked from within
"uplevel #0 [list $w invoke]"
(procedure "tk::ButtonUp" line 22)
invoked from within
"tk::ButtonUp ._fhl_fitsFile0.contentframe.lwchildsite.clipper.canvas.sfchildsite.tab2"
(command bound to event)

I think it is related to the fact that the TFORM in the header is PJ(), i.e. it does not write the maximum length.

Thank you in advance for any reply,
Valentina Fioretti
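
For what it's worth, a possible workaround consistent with the reporter's diagnosis is to spell out the maximum element count in the variable-length format so that TFORMn carries it, e.g.:

# Assumption: giving the maximum length explicitly makes TFORMn 'PJ(3)'.
c1 = pyfits.Column(name='var', format='PJ(3)',
                   array=np.array([[45, 56], [11, 12, 13]], dtype=np.object))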

Header modifications of compressed HDU are not written to file after flush or close

When opening a FITS file in update mode and modifying the header of an
hdu.compressed.CompImageHDU, the modifications are not written to the FITS file when closing the file.

Modifications to the _header attribute, on the other hand, are written. But the documentation says:
"The contents of the corresponding binary table HDU may be accessed using the hidden ._header attribute. However, all user interface with the HDU header should be accomplished through the image header (the .header attribute)"

So in my understanding, modifications to the .header attribute should be written to the file.

My version of pyfits is 3.1.2

Here is an example demonstrating the odd behavior, where /tmp/test.fits contains a compressed image.

import pyfits
# Open fits file in update mode
hdulist = pyfits.open("/tmp/test.fits", mode = 'update')

# Print the type of the second hdu
print type(hdulist[1])
>>> <class 'pyfits.hdu.compressed.CompImageHDU'>

# Assign a test value to a test keyword of the .header
hdulist[1].header['test'] = "test"
print hdulist[1].header['test']
>>> test
# Assign a test value to a test keyword of the ._header
hdulist[1]._header['test2'] = "test2"
print hdulist[1]._header['test2']
>>> test2

# Close the file. I expect that modification to the header would be written to the fits file
hdulist.close()

# Verify if it has been really updated
hdulist2 = pyfits.open("/tmp/test.fits")
print 'test' in hdulist2[1].header
>>> False
print 'test2' in hdulist2[1].header
>>> True

variable-length arrays are corrupted when copying from one file to another, and when accessed without using the field() method

To see what I mean:

In [1]: import pyfits
In [2]: pyfits.__version__                                                     
Out[2]: '3.1.2'

In [3]: f = pyfits.open("glg_cspec_n4_bn080916009_v07.rsp")                    
In [4]: data = f['SPECRESP MATRIX']                                            
In [5]: dataCopy = data.copy()                                                 
In [6]: print(len(data.data.field("MATRIX")[5]))                               
128

In [7]: print(len(dataCopy.data.field("MATRIX")[5]))                           
2

The FITS file contains a variable-length array in the column MATRIX, which has 128 values maximum. As you can see, in the original file the 6th row has indeed 128 values defined. When I do the copy, the new matrix instead has 2 values in the 6th row. Actually, it has 2 values in all rows (and is essentially garbage).

I obtain exactly the same behavior if I open using memmap=False.

I don't know if this is related, but if I use the name of the column directly, instead of using field([columnName]), I obtain a malformed answer also in the original file:

In [12]: len(data.data.field("MATRIX")[5])                                     
Out[12]: 128

In [13]: len(data.data.MATRIX[5])                                              
Out[13]: 2

Poor handling of illegal characters in Header

Per an e-mail to the Astropy mailing list:

Hi,

I want to use the "ñ" character in a FITS header. The verify in astropy is giving me an error.

WARNING: VerifyWarning:         Card 'OBSERVER' is not FITS standard (invalid value string: 'Maria Rosa Mu�oz ').  Fixed 'OBSERVER' card to meet the FITS standard. [astropy.io.fits.verify]
ERROR: UnicodeDecodeError: 'ascii' codec can't decode byte 0xf1 in position 1946: ordinal not in range(128) [astropy.io.fits.header]
Traceback (most recent call last):
  File "./fixfitheader.py", line 30, in <module>
    hdulist.writeto(sys.argv[2]+'/'+filename, output_verify='fix')
  File "/usr/local/lib/python2.6/dist-packages/astropy/io/fits/hdu/hdulist.py", line 675, in writeto
    hdu._writeto(hdulist.__file)
  File "/usr/local/lib/python2.6/dist-packages/astropy/io/fits/hdu/base.py", line 497, in _writeto
    self._writeheader(fileobj)
  File "/usr/local/lib/python2.6/dist-packages/astropy/io/fits/hdu/base.py", line 428, in _writeheader
    self._header.tofile(fileobj)
  File "/usr/local/lib/python2.6/dist-packages/astropy/io/fits/header.py", line 625, in tofile
    fileobj.write(blocks.encode('ascii'))
UnicodeDecodeError: 'ascii' codec can't decode byte 0xf1 in position 1946: ordinal not in range(128)

Is there a solution for this issue?

Obviously PyFITS is correct to raise an exception here, but the specific exception given is not terribly useful. First of all, the VerifyWarning states "Fixed 'OBSERVER' card to meet the FITS standard.", which it clearly did not, because there's still a crash later on when trying to write out the header.

Need to decide what the correct behavior should be, but it should either raise an UnfixableError exception immediately upon validation, or it should have a default fix of removing (or possibly replacing) the offending character. There are a number of possible approaches for replacement--there do exist transliteration tables. But it's probably not worth the effort in an initial fix.
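
The replacement option mentioned above could be as simple as this sketch (lossy, and only illustrative; assumes header is the header being fixed):

value = u'Maria Rosa Mu\xf1oz'
# Replace any non-ASCII characters before the card is serialized.
safe = value.encode('ascii', 'replace').decode('ascii')
header['OBSERVER'] = safe  # 'Maria Rosa Mu?oz'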

Table dump format doesn't handle columns with array values

Kevin sent me a file containing table columns that have arrays in some of the columns (not VLAs but just fixed-size arrays). When calling tabledump with one of these files the array value is just output to the data dump as "". This results in a crash when trying to load the table back in to a new BinTableHDU:

tbhdu = BinTableHDU.load(datafile, cdfile=cdfile, hfile=scihdr)
  File "/usr/stsci/pyssgx/2.7.3.stsci_python/lib/python/pyfits-3.2.dev2044-py2.7-linux-x86_64.egg/pyfits/hdu/table.py", line 813, in load
    data = cls._load_data(datafile, coldefs)
  File "/usr/stsci/pyssgx/2.7.3.stsci_python/lib/python/pyfits-3.2.dev2044-py2.7-linux-x86_64.egg/pyfits/hdu/table.py", line 1032, in _load_data
    data[row][col] = val
  File "/usr/stsci/pyssgx/2.7.3.stsci_python/lib/python/pyfits-3.2.dev2044-py2.7-linux-x86_64.egg/pyfits/fitsrec.py", line 97, in __setitem__
    self.array.field(indx)[self.row] = value
ValueError: could not convert string to float:

I believe this issue also applies to 3.0.x, but I need to double-check that.

Document FAQ on installer issues on Windows

There was a user who was having difficulty with the binary installer on Windows 7; while not a PyFITS-specific issue it is annoying. Apparently the Windows installer for Python itself only makes registry entries for HKEY_LOCAL_MACHINE and not under HKEY_CURRENT_USER. But depending on an individual's User Access Control settings the package installer may only be able to look for registry entries in HKEY_CURRENT_USER and not HKLM. Yay Windows registry!

The stupid thing about the Windows binary installers for Python packages is that if the installer can't find the right Python version in the registry there's no way for it to proceed--there's no way to force it to use a specific Python installation. So the user is completely blocked from installation.

I gave one user a script, attached below, that added the necessary registry entries to HKCU and he reported success with that, so we should mention this issue in the FAQ and give a link for users to download the script:

#
# script to register Python 2.0 or later for use with win32all
# and other extensions that require Python registry settings
#
# written by Joakim Loew for Secret Labs AB / PythonWare
#
# source:
# http://www.pythonware.com/products/works/articles/regpy20.htm
#
# modified by Valentine Gogichashvili as described in http://www.mail-archive.com/[email protected]/msg10512.html
# modified for Python 3 support by Erik Bray <[email protected]>

from __future__ import print_function


import sys

from winreg import *

# tweak as necessary
version = sys.version[:3]
installpath = sys.prefix

regpath = "SOFTWARE\\Python\\Pythoncore\\{0}\\".format(version)
installkey = "InstallPath"
pythonkey = "PythonPath"
pythonpath = "{0};{1}\\Lib\\;{2}\\DLLs\\".format(
    installpath, installpath, installpath)


def RegisterPy():
    try:
        reg = OpenKey(HKEY_CURRENT_USER, regpath)
    except EnvironmentError as e:
        try:
            reg = CreateKey(HKEY_CURRENT_USER, regpath)
            SetValue(reg, installkey, REG_SZ, installpath)
            SetValue(reg, pythonkey, REG_SZ, pythonpath)
            CloseKey(reg)
        except:
            print("*** Unable to register!")
            return
        print("--- Python", version, "is now registered!")
        return
    if (QueryValue(reg, installkey) == installpath and
        QueryValue(reg, pythonkey) == pythonpath):
        CloseKey(reg)
        print("=== Python", version, "is already registered!")
        return
    CloseKey(reg)
    print("*** Unable to register!")
    print("*** You probably have another Python installation!")

if __name__ == "__main__":
    RegisterPy()

PyFITS does not interact well with multiprocess and/or matplotlib

I have been having trouble getting matplotlib figures to save within a process generated by multiprocessing. Today I discovered that this problem only occurs if I have attempted to import pyfits. Here is a piece of test code that behaves as expected and produces a file called testfig.png:

    from multiprocessing import Process
    # import pyfits

    def makeplot(a,b):
        import matplotlib.pyplot as plt
        fig = plt.figure()
        ax = fig.add_subplot(111)
        ax.plot(a,b,'o')
        fig.savefig('testfig.png')
        plt.close()
        return None

    p = Process(target = makeplot, args = ([1,2,3,4], [5,6,7,8]))
    p.daemon = True
    p.start()
    p.join()

If I uncomment the import of pyfits, however, a figure is no longer produced and saved by this same piece of test code. Are there known reasons why pyfits changes the expected behavior of multiprocessing and/or matplotlib?

Bug assigning a mmap'd array to the data of another file

This is a corner case I found with the help of Sara Ogaz.

If I have two FITS files A and B opened with memmap=True (the default), and A open in update mode, then doing something like A[0].data = B[0].data followed by A.flush() should replace the data for A's first HDU with the data from B's first HDU.

But in fact that doesn't happen. Somewhere along the line A loses track of the fact that its data was replaced with a completely different array. When it gets to writing the data it actually just calls .flush() on B's data. Since B is open read-only this actually silently does nothing, and doesn't modify A at all.

This is a simple test that reproduces the bug:

import pyfits
import numpy as np

arr1 = np.arange(10)
arr2 = arr1 * 2

hdu1 = pyfits.PrimaryHDU(data=arr1)
hdu1.writeto('test1.fits', clobber=True)
hdu2 = pyfits.PrimaryHDU(data=arr2)
hdu2.writeto('test2.fits', clobber=True)

hdul1 = pyfits.open('test1.fits', mode='update', memmap=True)
hdul2 = pyfits.open('test2.fits', memmap=True)
hdul1[0].data = hdul2[0].data
hdul1.close()
hdul2.close()

hdul1 = pyfits.open('test1.fits')

assert np.all(hdul1[0].data == arr2)

This is actually a pretty serious bug because not only does it fail; it fails silently. So it should be fixed ASAP.
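
Until it is fixed, a workaround (an assumption, but consistent with the diagnosis above) is to copy the source array into memory so the assignment is not a view onto B's read-only mmap:

# Continuing the test above: .copy() detaches the array from hdul2's mmap.
hdul1[0].data = hdul2[0].data.copy()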

Error setting None to existing image data

This is a regression caused by the fix to #25. Previously, setting .data = None on an image HDU and then saving would save that HDU as an empty, dimensionless image (NAXIS = 0).

Now trying to set .data = None on an image with existing data results in an exception:

>>> h[1].data = None
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/internal/1/root/src/PyFITS/git/lib/pyfits/util.py", line 114, in __set__
    ret = self._fset(obj, val)
  File "/internal/1/root/src/PyFITS/git/lib/pyfits/hdu/image.py", line 219, in data
    self._bitpix = _ImageBaseHDU.ImgCode[data.dtype.name]
KeyError: 'object'

We should also make sure that setting table data to None is handled sensibly. Table data is a little more complicated because it must have NAXIS=2. But it may have NAXIS2 = 0 (indicating zero rows). So setting .data = None on a table should result in a rowless table (unless there are no columns defined yet either).

ZDITHER0 keyword not handled

I've been dealing off and on now for a while with a rather puzzling support request:

The images to run an example you can get with:
wget www.usm.uni-muenchen.de/~mkuemmel/DECam_00162313_47.fits.fz
wget www.usm.uni-muenchen.de/~mkuemmel/cDECam_00163235_55Sci.fits

When running the example script below, I get the error:
----------------------------------------------------------------------
mkuemmel@chatwin:~/masking/data/cosmicfits/problem/ttt$ python test.py
Traceback (most recent call last):
  File "test.py", line 11, in <module>
    inOutImg.flush()
  File "/usr/local/lib/python2.7/dist-packages/pyfits-3.2.dev-py2.7-linux-x86_64.egg/pyfits/util.py", line 291, in wrapped
    func(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/pyfits-3.2.dev-py2.7-linux-x86_64.egg/pyfits/hdu/hdulist.py", line 580, in flush
    self._flush_update()
  File "/usr/local/lib/python2.7/dist-packages/pyfits-3.2.dev-py2.7-linux-x86_64.egg/pyfits/hdu/hdulist.py", line 901, in _flush_update
    hdu._prewriteto(checksum=hdu._output_checksum, inplace=True)
  File "/usr/local/lib/python2.7/dist-packages/pyfits-3.2.dev-py2.7-linux-x86_64.egg/pyfits/hdu/compressed.py", line 1503, in _prewriteto
    self._update_compressed_data()
  File "/usr/local/lib/python2.7/dist-packages/pyfits-3.2.dev-py2.7-linux-x86_64.egg/pyfits/hdu/compressed.py", line 1292, in _update_compressed_data
    heapsize, self.compressed_data = compression.compress_hdu(self)
RuntimeError: Error reading elements 1 thru 538976312 from column 1

I did find a temporary workaround for this issue that works for now. The test code used to reproduce the bug, with the workaround included (but commented out) is as follows:

import glob, shutil
import pyfits
import numpy

print pyfits.__version__

for fname in glob.glob('orig/*.fits'):
    shutil.copy(fname, '.')

#inOutImg = pyfits.open('DECam_00162313_47.fits.fz', uint=True)
#raw_image = inOutImg[1].data.copy()
#inOutImg.close()

inOutImg = pyfits.open('DECam_00162313_47.fits.fz', 'update', uint=True)
inImgII  = pyfits.open('cDECam_00163235_55Sci.fits')


# Comment out this line and uncomment lines 10-12 to enable the workaround
raw_image = inOutImg[1].data

idArray = numpy.asarray(
        numpy.where((inImgII[0].data == 0.0) &
                    (raw_image - inImgII[0].data != 0.0), 1, 0),
        dtype=numpy.int16)

inOutImg[2].data += 16 * idArray

written_data = inOutImg[2].data.copy()

inOutImg.close()
inImgII.close()

h = pyfits.open('DECam_00162313_47.fits.fz', uint=True)

assert numpy.all(h[2].data == written_data)

print 'File updated correctly'

The reason the workaround works is this: Without the workaround the compressed data in HDU3 is read (and decompressed) while the file is open in update mode. This means that when we flush changes to the file we have to allow for the possibility that the data has been updated and needs to be recompressed [As an aside: In this case the data has not been changed and the recompression is superfluous; in the future we really ought to track this more carefully, at least for compressed HDUs. A good checksum would do. While expensive to compute it would still, in most cases, be less expensive than unnecessarily recompressing the entire image.].

The recompression during writing is where this bug occurs (I'll explain why further down). In the workaround we read the data from HDU3 into memory in read-only mode first. Then, when we reopen the file in update mode, we do not touch HDU3 again--so long as the data hasn't been decompressed we can rest assured it hasn't been modified, so it gets left alone upon writing out.

Now, the bug is that I wasn't reading the ZDITHER0 keyword into the dither_seed member of the fitsfile struct in CFITSIO. Without a requested dither_seed, the software proceeds to compute its own and tries to write it to the file. This causes all sorts of havoc, as the way PyFITS accesses CFITSIO is such that we can't allow any attempts to write to the header (CFITSIO should only handle the data).

Therefore PyFITS has to handle applying existing and generating new dithering seeds and manually pass those on to CFITSIO. We can easily do this directly within the compression module using the same randomization algorithm as CFITSIO.

.fz format file: ValueError: no compressed or uncompressed data for tile.

With the stable 3.1.2 version I am getting the error above. The file is read OK by ds9, so I assume pyfits is not recognizing some keywords, possibly due to their order. The fz FITS file is a MEF created by SWARP running on NOAO Blanco data from the DECam instrument.

The problem seems to be fixed in 3.2.dev and also in astropy.io.fits based on a laptop installation.

Is there a timescale for a fixed version of 3.1.n or 3.2, since we prefer to install stable releases on our science cluster?

Thanks

HDU Improvement Checklist

This is a broad overall checklist of things that need to be fixed and cleaned up a bit in the pyfits HDU API. Some of these reference existing issues, and some of them are new topics for which issues should be created. I will update this checklist as I think of more items.

  • Split verification into input verification and output verification phases; include support for full verification upon opening a file (see https://trac.assembla.com/pyfits/ticket/182, astropy/astropy#873, https://trac.assembla.com/pyfits/ticket/205)
  • Work on a specialized ndarray subclass for FITS data that has built-in support for features like scaling and other types of transformations required for FITS data (such as 'T'/'F' -> True/False on boolean columns). This could be used for both table columns and image arrays.
  • Perhaps as sort of a specialized subclass of the above-mentioned FITS data array type, support a way to decompress tile-compressed images one tile at a time. This would support something like https://trac.assembla.com/pyfits/ticket/100 and might support or supersede https://trac.assembla.com/pyfits/ticket/187
  • Add a .ver attribute in addition to .name, and keep .name and .header['EXTNAME'] as well as .ver and .header['EXTVER'] fully synchronized (see https://trac.assembla.com/pyfits/ticket/185). Should also add support for the EXTLEVEL keyword and a .level attribute.
  • Relatedly, detect when HDUs in the same HDUList have an identical EXTNAME/EXTVER/EXTLEVEL combination. The FITS standard does not forbid this, but it does, for obvious reasons, recommend that these be unique. So when they're not unique this should at least produce a warning. This will be easier to tackle when #14 is implemented.
  • Streamline handling of mandatory/reserved keywords on standard HDU types. Each HDU type should have, perhaps as one or more class-level attributes, a set of all its reserved keywords, and specify the constraints on those keywords. This would be similar to what is already specified in each HDU type's ._verify, but made more declarative.
  • It might be nice if all valid HDUs had a .dtype attribute returning a Numpy dtype object specifying what the type of the data array would be, even if the data hasn't been read (this might also allow an iterable of dtypes to support tables, though it's not clear yet whether or not multi-field dtypes will be sufficient for this purpose)

pip install https://svn6.assembla.com/svn/pyfits/trunk error

When I try this as suggested on:

http://www.stsci.edu/institute/software_hardware/pyfits/Download

I get this error:

pip install https://svn6.assembla.com/svn/pyfits/trunk

Downloading/unpacking https://svn6.assembla.com/svn/pyfits/trunk
Downloading trunk
Cannot unpack file /tmp/pip-R8chbe-unpack/trunk (downloaded from /tmp/pip-mE3Jqc-build, content-type: text/html; charset=UTF-8); cannot detect archive format
Cannot determine archive format of /tmp/pip-mE3Jqc-build
Storing complete log in /home/rgm/.pip/pip.log

I think that this is a known issue with pip and svn repos.

On the download page the pip method appears above the section on Development, which has the solution that works, so maybe the pip info should be moved down the page anyway. The following works fine for me:

svn checkout https://svn6.assembla.com/svn/pyfits/trunk
cd trunk
python setup.py install

Assignment of iterable to FITS_rec column

When accessing a column in a Numpy recarray like recarr['colname'], the column is returned as an array. So normally one would think that, in order to update the values of a column from an iterable, one would have to do recarr['colname'][:] = vals. But it turns out recarrays support doing that by default with __setitem__ assignment: something like recarr['colname'] = vals just works. So it should work for FITS_rec arrays too, but currently it does not.
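
To make the expectation concrete, a sketch with a plain recarray (the FITS_rec lines show the spelling that works today versus the one that should also work):

import numpy as np

recarr = np.rec.array([(1.0,), (2.0,)], dtype=[('colname', 'f8')])
recarr['colname'] = [10.0, 20.0]   # plain recarrays accept an iterable
# fits_rec['colname'][:] = vals    # works for FITS_rec today
# fits_rec['colname'] = vals       # should work too, but currently doesn't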

Replacing a table's data makes it unsaveable

I don't know if this is a bug or error on my part.

In jwstlib.models we need to occasionally replace an entire Table's data (and resize it in the process).

In [1]: import pyfits

In [2]: hdulist = pyfits.open("test.fits")

In [3]: hdulist
Out[3]: 
[<pyfits.hdu.image.PrimaryHDU at 0x1547d50>,
 <pyfits.hdu.table.BinTableHDU at 0x154b190>,
 <pyfits.hdu.image.ImageHDU at 0x154b3d0>]

In [4]: import numpy as np

In [5]: table = hdulist[1]

In [6]: table.data = np.array(table.data)

In [7]: hdulist.writeto("foo.fits", clobber=True)
Overwriting existing file 'foo.fits'.
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-7-51e3d321d70f> in <module>()
----> 1 hdulist.writeto("foo.fits", clobber=True)

/home/mdboom/python/lib/python2.7/site-packages/pyfits-3.2.dev2070-py2.7-linux-x86_64.egg/pyfits/hdu/hdulist.pyc in writeto(self, fileobj, output_verify, clobber, checksum)
    671 
    672         for hdu in self:
--> 673             hdu._prewriteto(checksum=checksum)
    674             try:
    675                 hdu._writeto(hdulist.__file)

/home/mdboom/python/lib/python2.7/site-packages/pyfits-3.2.dev2070-py2.7-linux-x86_64.egg/pyfits/hdu/table.pyc in _prewriteto(self, checksum, inplace)
    279     def _prewriteto(self, checksum=False, inplace=False):
    280         if self._data_loaded and self.data is not None:
--> 281             self.data._scale_back()
    282             # check TFIELDS and NAXIS2
    283             self._header['TFIELDS'] = len(self.data._coldefs)

AttributeError: 'numpy.ndarray' object has no attribute '_scale_back'

The following works, however:

In [8]: table.data = np.array(table.data).view(table._data_type)

In [9]: hdulist.writeto("foo.fits", clobber=True)
Overwriting existing file 'foo.fits'.

Would it make sense to add a setter for BinTableHDU.data that applies the view?

Backport fixes from Astropy

astropy/astropy#968 This included a fix for an obscure issue on Hurd-based systems and possibly others that don't fully support mmap.flush(). Might as well backport it so that we're in sync.

astropy/astropy#996 Fixed an issue with updating table column properties via the ColDefs object. Unfortunately this fix is incompatible with some divergent changes that have occurred in PyFITS trunk; this will need a closer look.

(Possibly backport to 3.1.x and 3.0.x)

Invalid verify for HIERARCH keywords

The following test case leads to an invalid verify exception:
Card keyword 'key.META_0' is not upper case

import astropy.io.fits as pyfits
cc = pyfits.create_card
header = pyfits.Header([
    cc('simple', True), 
    cc('BITPIX', 32), 
    cc('NAXIS', 0), 
    cc('EXTEND', True, 'May contain datasets'), 
    cc('HIERARCH key.META_0', 'detRow'), 
])
hdu = pyfits.PrimaryHDU(None, header)
fits = pyfits.HDUList()
fits.append(hdu)
fits.writeto('test.py', clobber=True)

What's happening in Card._verify, where self is the card for 'HIERARCH key.META_0', is that:

  • self._image is None, so the test checking for a HIERARCH keyword fails
  • if I force self._image to be populated before the test, simply by computing str(self), the exception is not raised (as expected).
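
Based on the second observation, a possible caller-side workaround (a sketch, untested) is to force each card's image to be computed before writing:

for card in header.cards:
    str(card)  # populates the card image, sidestepping the spurious verify error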

Weirdness when using `np.asanyarray` on a `FITS_rec`

I don't know if this counts as a bug or is easily fixable, but thought I'd bring it to your attention.

When you modify the datatype of the columns in a FITS_rec using np.asanyarray, the FITS keywords describing the data do not update, and, well, misreading of data ensues!

import pyfits
import numpy as np

narrow_datatype = [('TYPE', 'S16'),
                   ('T_OFFSET', np.float32),
                   ('DECAY_PEAK', np.float32),
                   ('DECAY_FREQ', np.float32),
                   ('TAU', np.float32)]

wide_datatype = [('TYPE', 'S16'),
                   ('T_OFFSET', np.float64),
                   ('DECAY_PEAK', np.float64),
                   ('DECAY_FREQ', np.float64),
                   ('TAU', np.float64)]

# Create an array with 32-bit floats, and save it to a file
arr = np.array([("string", 1., 2., 3., 4.)], dtype=narrow_datatype)
hl = pyfits.HDUList()
hl.append(pyfits.BinTableHDU(arr, name='SCI'))
hl.writeto("test.fits", clobber = True)

# We can read it in and all is well
hl = pyfits.open("test.fits")
print "Expect ('string', 1, 2, 3, 4) E"
print hl[1].data, hl[1].header['TFORM2']

# Updating the data with the result of `np.asanyarray` creates something
# where the Numpy array has changed, but the FITS_rec header keywords
# have not
hl[1].data = np.asanyarray(hl[1].data, dtype=wide_datatype)

# It sort of feels like it works, because the data member gives the correct
# results
print "Expect ('string', 1, 2, 3, 4) D"
print hl[1].data, hl[1].header['TFORM2']

# However, when you save and load it again, you end up reading the 64-bit
# floats as 32-bits.
hl.writeto("test2.fits", clobber=True)

hl = pyfits.open("test2.fits")
print "Expect ('string', 1, 2, 3, 4) D"
print hl[1].data, hl[1].header['TFORM2']

Output:

Overwriting existing file 'test.fits'.
Expect ('string', 1, 2, 3, 4) E
[('string', 1.0, 2.0, 3.0, 4.0)] E
Expect ('string', 1, 2, 3, 4) D
[('string', 1.0, 2.0, 3.0, 4.0)] E
Overwriting existing file 'test2.fits'.
Expect ('string', 1, 2, 3, 4) D
[('string', 1.875, 0.0, 2.0, 0.0)] E

Using np.asarray instead of np.asanyarray is a reasonable workaround (though it drops the FITS_rec interface of course...)
