Images

Inheritance diagram of Section, ImageHDU

ImageHDU

class pyfits.core.ImageHDU(data=None, header=None, name=None, do_not_scale_image_data=False, uint=False)

Bases: pyfits.hdu.image._ImageBaseHDU, pyfits.hdu.base.ExtensionHDU

FITS image extension HDU class.

Construct an image HDU.

Parameters :
 

data : array

The data in the HDU.

header : Header instance

The header to be used (as a template). If header is None, a minimal header will be provided.

name : str, optional

The name of the HDU; it will be the value of the keyword EXTNAME.

do_not_scale_image_data : bool, optional

If True, image data is not scaled using BSCALE/BZERO values when read.

uint : bool, optional

Interpret signed integer data where BZERO is the central value and BSCALE == 1 as unsigned integer data. For example, int16 data with BZERO = 32768 and BSCALE = 1 would be treated as uint16 data.
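
For example, a minimal sketch of building an image extension from a numpy array (the array contents and EXTNAME value are illustrative):

    import numpy as np
    import pyfits

    # Build a 10 x 10 float32 image extension named 'SCI' (EXTNAME = 'SCI').
    data = np.arange(100, dtype=np.float32).reshape(10, 10)
    hdu = pyfits.ImageHDU(data=data, name='SCI')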

add_checksum(when=None, override_datasum=False, blocking='standard')

Add the CHECKSUM and DATASUM cards to this HDU with the values set to the checksum calculated for the HDU and the data respectively. The addition of the DATASUM card may be overridden.

Parameters :
 

when : str, optional

comment string for the cards; by default the comments will represent the time when the checksum was calculated

override_datasum : bool, optional

When True, add only the CHECKSUM card; do not add or update the DATASUM card.

blocking : str, optional

Either “standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes (one FITS block) at a time.

Notes

For testing purposes, first call add_datasum with a when argument, then call add_checksum with a when argument and override_datasum set to True. This will provide consistent comments for both cards and enable the generation of a CHECKSUM card with a consistent value.
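
A sketch of this testing pattern (the when string and array contents are illustrative):

    import numpy as np
    import pyfits

    hdu = pyfits.ImageHDU(data=np.zeros((4, 4), dtype=np.int16))

    # A fixed 'when' comment keeps both card comments identical between
    # runs, so the resulting CHECKSUM value is reproducible.
    when = 'computed for regression test'
    hdu.add_datasum(when=when)
    hdu.add_checksum(when=when, override_datasum=True)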

add_datasum(when=None, blocking='standard')

Add the DATASUM card to this HDU with the value set to the checksum calculated for the data.

Parameters :
 

when : str, optional

Comment string for the card that by default represents the time when the checksum was calculated

blocking : str, optional

Either “standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes (one FITS block) at a time.

Returns :
 

checksum : int

The calculated datasum

Notes

For testing purposes, provide a when argument to enable the comment value in the card to remain consistent. This will enable the generation of a CHECKSUM card with a consistent value.

copy()

Make a copy of the HDU; both header and data are copied.

filebytes()

Calculates and returns the number of bytes that this HDU will write to a file.

Parameters :

None

Returns :

Number of bytes

fileinfo()

Returns a dictionary detailing information about the locations of this HDU within any associated file. The values are only valid after a read or write of the associated file with no intervening changes to the HDUList.

Parameters :
 

None

Returns :
 

dictionary or None

The dictionary details information about the locations of this HDU within an associated file. Returns None when the HDU is not associated with a file.

Dictionary contents:

Key       Value
--------  ------------------------------------------------------------
file      File object associated with the HDU
filemode  Mode in which the file was opened (readonly, copyonwrite,
          update, append, ostream)
hdrLoc    Starting byte location of header in file
datLoc    Starting byte location of data block in file
datSpan   Data size including padding
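
A sketch of reading these values back for an HDU that came from a file ('example.fits' is a hypothetical input with at least one extension):

    import pyfits

    hdulist = pyfits.open('example.fits')  # hypothetical input file
    info = hdulist[1].fileinfo()
    if info is not None:
        # Header offset, data offset, and padded data size, all in bytes.
        print(info['hdrLoc'], info['datLoc'], info['datSpan'])
    hdulist.close()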

classmethod fromstring(data, fileobj=None, offset=0, checksum=False, ignore_missing_end=False, **kwargs)

Creates a new HDU object of the appropriate type from a string containing the HDU’s entire header and, optionally, its data.

Parameters :
 

data : str

A byte string containing the HDU’s header and, optionally, its data. If fileobj is not specified and the length of data extends beyond the header, the trailing data is taken to be the HDU’s data. If fileobj is specified, the trailing data is ignored.

fileobj : file (optional)

The file-like object that this HDU was read from.

offset : int (optional)

If fileobj is specified, the offset into the file-like object at which this HDU begins.

checksum : bool (optional)

Check the HDU’s checksum and/or datasum.

ignore_missing_end : bool (optional)

Ignore a missing end card in the header data. Note that without the end card the end of the header can’t be found, so the entire data is just assumed to be the header.

kwargs : (optional)

May contain additional keyword arguments specific to an HDU type. Any unrecognized kwargs are simply ignored.
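
A sketch, assuming 'simple.fits' is a hypothetical file containing only a single HDU, so that its raw bytes are exactly one header plus data block:

    import pyfits

    with open('simple.fits', 'rb') as f:  # hypothetical single-HDU file
        raw = f.read()

    # fromstring returns an HDU of the appropriate type for the header
    # it finds in the byte string.
    hdu = pyfits.ImageHDU.fromstring(raw)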

classmethod match_header(header)
classmethod readfrom(fileobj, checksum=False, ignore_missing_end=False, **kwargs)

Read the HDU from a file. Normally an HDU should be opened with fitsopen() which reads the entire HDU list in a FITS file. But this method is still provided for symmetry with writeto().

Parameters :
 

fileobj : file object or file-like object

Input FITS file. The file’s seek pointer is assumed to be at the beginning of the HDU.

checksum : bool

If True, verifies that both DATASUM and CHECKSUM card values (when present in the HDU header) match the header and data of all HDUs in the file.

ignore_missing_end : bool

Do not issue an exception when opening a file that is missing an END card in the last header.

classmethod register_hdu(hducls)
req_cards(keyword, pos, test, fix_value, option, errlist)

Check the existence, location, and value of a required Card.

TODO: Write about parameters

If pos = None, the card can be anywhere in the header. If the card does not exist, the new card will have fix_value as its value when created. The card’s value is also checked using the test argument.

run_option(option='warn', err_text='', fix_text='Fixed.', fix=None, fixable=True)

Execute the verification with the selected option.

scale(type=None, option='old', bscale=1, bzero=0)

Scale image data by using BSCALE/BZERO.

Calling this method will scale the data and update the BSCALE and BZERO keywords in _header. This method should only be used right before writing to the output file, as the data will be scaled and is therefore not very usable after the call.

Parameters :
 

type : str, optional

Destination data type, given as a string representing a numpy dtype name (e.g. 'uint8', 'int16', 'float32'). If None, use the current data type.

option : str

How to scale the data: if "old", use the original BSCALE and BZERO values from when the data was read or created. If "minmax", use the minimum and maximum of the data to scale. This option is overridden by any user-specified bscale/bzero values.

bscale, bzero : int, optional

User-specified BSCALE and BZERO values.
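
A sketch of the intended usage, scaling immediately before output (the file name and data are illustrative):

    import numpy as np
    import pyfits

    hdu = pyfits.ImageHDU(data=np.random.random((10, 10)).astype(np.float32))

    # Pack the float data into int16, deriving BSCALE/BZERO from the
    # data's minimum and maximum, then write immediately; the in-memory
    # data is now scaled and no longer convenient to work with.
    hdu.scale('int16', option='minmax')
    hdu.writeto('scaled.fits')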

classmethod unregister_hdu(hducls)
update_ext_name(value, comment=None, before=None, after=None, savecomment=False)

Update the extension name associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If neither before nor after is specified, it will be appended at the end.

Parameters :
 

value : str

value to be used for the new extension name

comment : str, optional

Comment to be used for updating; default is None.

before : str or int, optional

Name of the keyword, or index of the Card, before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

name of the keyword, or index of the Card after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified, the current comment will automatically be preserved.
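
For example, a minimal sketch (the name and comment are illustrative):

    import numpy as np
    import pyfits

    hdu = pyfits.ImageHDU(data=np.zeros((2, 2)))

    # Sets (or updates) the EXTNAME card in the HDU's header.
    hdu.update_ext_name('SCI', comment='science image')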

update_ext_version(value, comment=None, before=None, after=None, savecomment=False)

Update the extension version associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If neither before nor after is specified, it will be appended at the end.

Parameters :
 

value : str

value to be used for the new extension version

comment : str, optional

Comment to be used for updating; default is None.

before : str or int, optional

Name of the keyword, or index of the Card, before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

name of the keyword, or index of the Card after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified, the current comment will automatically be preserved.

update_header()

Update the header keywords to agree with the data.

verify(option='warn')

Verify all values in the instance.

Parameters :
 

option : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". See Verification options for more info.

verify_checksum(blocking='standard')

Verify that the value in the CHECKSUM keyword matches the value calculated for the current HDU CHECKSUM.

Parameters :

blocking : str, optional

Either “standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes (one FITS block) at a time.

Returns :
 

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no CHECKSUM keyword present
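
A sketch of interpreting the return value ('example.fits' is a hypothetical input file):

    import pyfits

    hdulist = pyfits.open('example.fits')  # hypothetical input file
    valid = hdulist[1].verify_checksum()
    if valid == 0:
        print('CHECKSUM verification failed')
    elif valid == 1:
        print('CHECKSUM is valid')
    else:  # valid == 2
        print('no CHECKSUM keyword present')
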
verify_datasum(blocking='standard')

Verify that the value in the DATASUM keyword matches the value calculated for the DATASUM of the current HDU data.

Parameters :

blocking : str, optional

Either “standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes (one FITS block) at a time.

Returns :
 

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no DATASUM keyword present
writeto(*args, **kwargs)

Works similarly to the normal writeto(), but prepends a default PrimaryHDU, as required by extension HDUs (which cannot stand on their own).
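
A sketch; the written file starts with a minimal PrimaryHDU followed by this extension ('ext.fits' is illustrative):

    import numpy as np
    import pyfits

    hdu = pyfits.ImageHDU(data=np.ones((8, 8), dtype=np.int32), name='SCI')

    # Produces a two-HDU file: a default PrimaryHDU, then this ImageHDU.
    hdu.writeto('ext.fits')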

ImgCode = {'uint64': 64, 'uint16': 16, 'int16': 16, 'int64': 64, 'int32': 32, 'float64': -64, 'uint8': 8, 'float32': -32, 'uint32': 32}
NumCode = {-64: 'float64', -32: 'float32', 32: 'int32', 8: 'uint8', 64: 'int64', 16: 'int16'}
data
header
is_image
section
size

Size (in bytes) of the data portion of the HDU.

CompImageHDU

class pyfits.core.CompImageHDU(data=None, header=None, name=None, compressionType='RICE_1', tileSize=None, hcompScale=0, hcompSmooth=0, quantizeLevel=16.0)

Bases: pyfits.hdu.table.BinTableHDU

Compressed Image HDU class.

Parameters :
 

data : array, optional

data of the image

header : Header instance, optional

Header to be associated with the image; when reading the HDU from a file (data=DELAYED), this is the header read from the file.

name : str, optional

the EXTNAME value; if this value is None, then the name from the input image header will be used; if there is no name in the input image header then the default name COMPRESSED_IMAGE is used.

compressionType : str, optional

Compression algorithm: one of ‘RICE_1’, ‘PLIO_1’, ‘GZIP_1’, or ‘HCOMPRESS_1’.

tileSize : int, optional

Compression tile sizes. The default treats each row of the image as a tile.

hcompScale : float, optional

HCOMPRESS scale parameter

hcompSmooth : float, optional

HCOMPRESS smooth parameter

quantizeLevel : float, optional

floating point quantization level; see note below

Notes

The pyfits module supports two methods of image compression:

  1. The entire FITS file may be externally compressed with the gzip or pkzip utility programs, producing a *.gz or *.zip file, respectively. When reading compressed files of this type, pyfits first uncompresses the entire file into a temporary file before performing the requested read operations. The pyfits module does not support writing to these types of compressed files. This type of compression is supported in the _File class, not in the CompImageHDU class. The file compression type is recognized by the .gz or .zip file name extension.
  2. The CompImageHDU class supports the FITS tiled image compression convention in which the image is subdivided into a grid of rectangular tiles, and each tile of pixels is individually compressed. The details of this FITS compression convention are described at the FITS Support Office web site. Basically, the compressed image tiles are stored in rows of a variable length array column in a FITS binary table. The pyfits module recognizes that this binary table extension contains an image and treats it as if it were an image extension. Under this tile-compression format, FITS header keywords remain uncompressed. At this time, pyfits does not support the ability to extract and uncompress sections of the image without having to uncompress the entire image.

The pyfits module supports three general-purpose compression algorithms plus one other special-purpose compression technique that is designed for data masks with positive integer pixel values. The three general-purpose algorithms are GZIP, Rice, and HCOMPRESS, and the special-purpose technique is the IRAF pixel list compression technique (PLIO). The compressionType parameter defines the compression algorithm to be used.

The FITS image can be subdivided into any desired rectangular grid of compression tiles. With the GZIP, Rice, and PLIO algorithms, the default is to take each row of the image as a tile. The HCOMPRESS algorithm is inherently 2-dimensional in nature, so the default in this case is to take 16 rows of the image per tile. In most cases, it makes little difference what tiling pattern is used, so the default tiles are usually adequate. In the case of very small images, it could be more efficient to compress the whole image as a single tile. Note that the image dimensions are not required to be an integer multiple of the tile dimensions; if not, then the tiles at the edges of the image will be smaller than the other tiles. The tileSize parameter may be provided as a list of tile sizes, one for each dimension in the image. For example, a tileSize value of [100, 100] would divide a 300 x 300 image into nine 100 x 100 tiles.

The four supported image compression algorithms are all ‘loss-less’ when applied to integer FITS images; the pixel values are preserved exactly with no loss of information during the compression and uncompression process. In addition, the HCOMPRESS algorithm supports a ‘lossy’ compression mode that will produce a greater amount of image compression. This is achieved by specifying a non-zero value for the hcompScale parameter. Since the amount of compression that is achieved depends directly on the RMS noise in the image, it is usually more convenient to specify the hcompScale factor relative to the RMS noise. Setting hcompScale = 2.5 means use a scale factor that is 2.5 times the calculated RMS noise in the image tile. In some cases it may be desirable to specify the exact scaling to be used, instead of specifying it relative to the calculated noise value. This may be done by specifying the negative of the desired scale value (typically in the range -2 to -100).

Very high compression factors (of 100 or more) can be achieved by using large hcompScale values; however, this can produce undesirable ‘blocky’ artifacts in the compressed image. A variation of the HCOMPRESS algorithm (called HSCOMPRESS) can be used in this case to apply a small amount of smoothing to the image when it is uncompressed to help cover up these artifacts. This smoothing is purely cosmetic and does not cause any significant change to the image pixel values. Setting the hcompSmooth parameter to 1 will engage the smoothing algorithm.

Floating point FITS images (which have BITPIX = -32 or -64) usually contain too much ‘noise’ in the least significant bits of the mantissa of the pixel values to be effectively compressed with any lossless algorithm. Consequently, floating point images are first quantized into scaled integer pixel values (thus throwing away much of the noise) before being compressed with the specified algorithm (either GZIP, RICE, or HCOMPRESS). This technique produces much higher compression factors than simply using the GZIP utility to externally compress the whole FITS file, but it also means that the original floating point pixel values are not exactly preserved. When done properly, this integer scaling technique will only discard the insignificant noise while still preserving all the real information in the image. The amount of precision that is retained in the pixel values is controlled by the quantizeLevel parameter. Larger values will result in compressed images whose pixels more closely match the floating point pixel values, but at the same time the amount of compression that is achieved will be reduced. Users should experiment with different values for this parameter to determine the optimal value that preserves all the useful information in the image, without needlessly preserving all the ‘noise’ which will hurt the compression efficiency.

The default value for the quantizeLevel scale factor is 16, which means that scaled integer pixel values will be quantized such that the difference between adjacent integer values will be 1/16th of the noise level in the image background. An optimized algorithm is used to accurately estimate the noise in the image. As an example, if the RMS noise in the background pixels of an image = 32.0, then the spacing between adjacent scaled integer pixel values will equal 2.0 by default. Note that the RMS noise is independently calculated for each tile of the image, so the resulting integer scaling factor may fluctuate slightly for each tile. In some cases, it may be desirable to specify the exact quantization level to be used, instead of specifying it relative to the calculated noise value. This may be done by specifying the negative of the desired quantization level for the value of quantizeLevel. In the previous example, one could specify quantizeLevel = -2.0 so that the quantized integer levels differ by 2.0. Larger negative values for quantizeLevel mean that the levels are more coarsely spaced and will produce higher compression factors.
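
Pulling these notes together, a sketch of creating a tiled, Rice-compressed HDU from a float image (the array contents and file name are illustrative):

    import numpy as np
    import pyfits

    # Background RMS of ~32, as in the example above.
    data = np.random.normal(1000.0, 32.0, (300, 300)).astype(np.float32)

    # 100 x 100 tiles give nine tiles for a 300 x 300 image; the default
    # quantizeLevel=16.0 spaces the scaled integer levels at 1/16 of the
    # per-tile RMS noise (a spacing of about 2.0 here).
    chdu = pyfits.CompImageHDU(data=data, compressionType='RICE_1',
                               tileSize=[100, 100], quantizeLevel=16.0)
    chdu.writeto('compressed.fits')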

add_checksum(when=None, override_datasum=False, blocking='standard')

Add the CHECKSUM and DATASUM cards to this HDU with the values set to the checksum calculated for the HDU and the data respectively. The addition of the DATASUM card may be overridden.

Parameters :
 

when : str, optional

comment string for the cards; by default the comments will represent the time when the checksum was calculated

override_datasum : bool, optional

When True, add only the CHECKSUM card; do not add or update the DATASUM card.

blocking : str, optional

Either “standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes (one FITS block) at a time.

Notes

For testing purposes, first call add_datasum with a when argument, then call add_checksum with a when argument and override_datasum set to True. This will provide consistent comments for both cards and enable the generation of a CHECKSUM card with a consistent value.

add_datasum(when=None, blocking='standard')

Add the DATASUM card to this HDU with the value set to the checksum calculated for the data.

Parameters :
 

when : str, optional

Comment string for the card that by default represents the time when the checksum was calculated

blocking : str, optional

Either “standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes (one FITS block) at a time.

Returns :
 

checksum : int

The calculated datasum

Notes

For testing purposes, provide a when argument to enable the comment value in the card to remain consistent. This will enable the generation of a CHECKSUM card with a consistent value.

copy()

Make a copy of the table HDU; both header and data are copied.

dump(datafile=None, cdfile=None, hfile=None, clobber=False)

Dump the table HDU to a file in ASCII format. The table may be dumped in three separate files, one containing column definitions, one containing header parameters, and one for table data.

Parameters :
 

datafile : file path, file object or file-like object, optional

Output data file. The default is the root name of the FITS file associated with this HDU, appended with the extension .txt.

cdfile : file path, file object or file-like object, optional

Output column definitions file. The default is None, in which case no column definitions output is produced.

hfile : file path, file object or file-like object, optional

Output header parameters file. The default is None, in which case no header parameters output is produced.

clobber : bool

Overwrite the output files if they exist.

Notes

The primary use for the dump method is to allow viewing and editing the table data and parameters in a standard text editor. The load method can be used to create a new table from the three plain text (ASCII) files.

  • datafile: Each line of the data file represents one row of table data. The data is output one column at a time in column order. If a column contains an array, each element of the column array in the current row is output before moving on to the next column. Each row ends with a new line.

    Integer data is output right-justified in a 21-character field followed by a blank. Floating point data is output right justified using ‘g’ format in a 21-character field with 15 digits of precision, followed by a blank. String data that does not contain whitespace is output left-justified in a field whose width matches the width specified in the TFORM header parameter for the column, followed by a blank. When the string data contains whitespace characters, the string is enclosed in quotation marks (""). For the last data element in a row, the trailing blank in the field is replaced by a new line character.

    For column data containing variable length arrays (‘P’ format), the array data is preceded by the string 'VLA_Length= ' and the integer length of the array for that row, left-justified in a 21-character field, followed by a blank.

    For column data representing a bit field (‘X’ format), each bit value in the field is output right-justified in a 21-character field as 1 (for true) or 0 (for false).

  • cdfile: Each line of the column definitions file provides the definitions for one column in the table. The line is broken up into eight 16-character fields. The first field provides the column name (TTYPEn). The second field provides the column format (TFORMn). The third field provides the display format (TDISPn). The fourth field provides the physical units (TUNITn). The fifth field provides the dimensions for a multidimensional array (TDIMn). The sixth field provides the value that signifies an undefined value (TNULLn). The seventh field provides the scale factor (TSCALn). The eighth field provides the offset value (TZEROn). A field value of "" is used to represent the case where no value is provided.

  • hfile: Each line of the header parameters file provides the definition of a single HDU header card as represented by the card image.
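
A sketch of dumping a binary table HDU to editable ASCII files (dump is inherited from BinTableHDU; the file names are illustrative):

    import pyfits

    hdulist = pyfits.open('table.fits')  # hypothetical file with a binary table
    tbhdu = hdulist[1]

    # Three ASCII files: row data, column definitions, and header cards.
    tbhdu.dump(datafile='table.txt', cdfile='coldefs.txt', hfile='header.txt')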

filebytes()

Calculates and returns the number of bytes that this HDU will write to a file.

Parameters :

None

Returns :

Number of bytes

fileinfo()

Returns a dictionary detailing information about the locations of this HDU within any associated file. The values are only valid after a read or write of the associated file with no intervening changes to the HDUList.

Parameters :
 

None

Returns :
 

dictionary or None

The dictionary details information about the locations of this HDU within an associated file. Returns None when the HDU is not associated with a file.

Dictionary contents:

Key       Value
--------  ------------------------------------------------------------
file      File object associated with the HDU
filemode  Mode in which the file was opened (readonly, copyonwrite,
          update, append, ostream)
hdrLoc    Starting byte location of header in file
datLoc    Starting byte location of data block in file
datSpan   Data size including padding

classmethod fromstring(data, fileobj=None, offset=0, checksum=False, ignore_missing_end=False, **kwargs)

Creates a new HDU object of the appropriate type from a string containing the HDU’s entire header and, optionally, its data.

Parameters :
 

data : str

A byte string containing the HDU’s header and, optionally, its data. If fileobj is not specified and the length of data extends beyond the header, the trailing data is taken to be the HDU’s data. If fileobj is specified, the trailing data is ignored.

fileobj : file (optional)

The file-like object that this HDU was read from.

offset : int (optional)

If fileobj is specified, the offset into the file-like object at which this HDU begins.

checksum : bool (optional)

Check the HDU’s checksum and/or datasum.

ignore_missing_end : bool (optional)

Ignore a missing end card in the header data. Note that without the end card the end of the header can’t be found, so the entire data is just assumed to be the header.

kwargs : (optional)

May contain additional keyword arguments specific to an HDU type. Any unrecognized kwargs are simply ignored.

get_coldefs(*args, **kwargs)

Deprecated since version 3.0: Use the columns attribute instead.

Returns the table’s column definitions.

classmethod load(datafile, cdfile=None, hfile=None, replace=False, header=None)

Create a table from the input ASCII files. The input is from up to three separate files, one containing column definitions, one containing header parameters, and one containing column data.

The column definition and header parameters files are not required. When absent, the column definitions and/or header parameters are taken from the header object given in the header argument; otherwise sensible defaults are inferred (though this mode is not recommended).

Parameters :
 

datafile : file path, file object or file-like object

Input data file containing the table data in ASCII format.

cdfile : file path, file object, file-like object, optional

Input column definition file containing the names, formats, display formats, physical units, multidimensional array dimensions, undefined values, scale factors, and offsets associated with the columns in the table. If None, the column definitions are taken from the current values in this object.

hfile : file path, file object, file-like object, optional

Input parameter definition file containing the header parameter definitions to be associated with the table. If None, the header parameter definitions are taken from the current values in this object’s header.

replace : bool

When True, indicates that the entire header should be replaced with the contents of the ASCII file instead of just updating the current header.

header : Header object

When the cdfile and hfile are missing, use this Header object in the creation of the new table and HDU. Otherwise this Header supersedes the keywords from hfile, which is only used to update values not present in this Header, unless replace=True, in which case this Header’s values are completely replaced by the values from hfile.

Notes

The primary use for the load method is to allow the input of table data and parameters from ASCII files that were edited in a standard text editor. The dump method can be used to create the initial ASCII files.

  • datafile: Each line of the data file represents one row of table data. The data is output one column at a time in column order. If a column contains an array, each element of the column array in the current row is output before moving on to the next column. Each row ends with a new line.

    Integer data is output right-justified in a 21-character field followed by a blank. Floating point data is output right justified using ‘g’ format in a 21-character field with 15 digits of precision, followed by a blank. String data that does not contain whitespace is output left-justified in a field whose width matches the width specified in the TFORM header parameter for the column, followed by a blank. When the string data contains whitespace characters, the string is enclosed in quotation marks (""). For the last data element in a row, the trailing blank in the field is replaced by a new line character.

    For column data containing variable length arrays (‘P’ format), the array data is preceded by the string 'VLA_Length= ' and the integer length of the array for that row, left-justified in a 21-character field, followed by a blank.

    For column data representing a bit field (‘X’ format), each bit value in the field is output right-justified in a 21-character field as 1 (for true) or 0 (for false).

  • cdfile: Each line of the column definitions file provides the definitions for one column in the table. The line is broken up into eight 16-character fields. The first field provides the column name (TTYPEn). The second field provides the column format (TFORMn). The third field provides the display format (TDISPn). The fourth field provides the physical units (TUNITn). The fifth field provides the dimensions for a multidimensional array (TDIMn). The sixth field provides the value that signifies an undefined value (TNULLn). The seventh field provides the scale factor (TSCALn). The eighth field provides the offset value (TZEROn). A field value of "" is used to represent the case where no value is provided.

  • hfile: Each line of the header parameters file provides the definition of a single HDU header card as represented by the card image.
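
A sketch of the reverse operation, rebuilding a table HDU from the files written by dump (load is inherited from BinTableHDU; file names are illustrative):

    import pyfits

    # Recreate the table HDU from the three ASCII files produced by dump().
    new_hdu = pyfits.BinTableHDU.load('table.txt', cdfile='coldefs.txt',
                                      hfile='header.txt')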

classmethod match_header(header)
classmethod readfrom(fileobj, checksum=False, ignore_missing_end=False, **kwargs)

Read the HDU from a file. Normally an HDU should be opened with fitsopen() which reads the entire HDU list in a FITS file. But this method is still provided for symmetry with writeto().

Parameters :
 

fileobj : file object or file-like object

Input FITS file. The file’s seek pointer is assumed to be at the beginning of the HDU.

checksum : bool

If True, verifies that both DATASUM and CHECKSUM card values (when present in the HDU header) match the header and data of all HDUs in the file.

ignore_missing_end : bool

Do not issue an exception when opening a file that is missing an END card in the last header.

classmethod register_hdu(hducls)
req_cards(keyword, pos, test, fix_value, option, errlist)

Check the existence, location, and value of a required Card.

TODO: Write about parameters

If pos = None, the card can be anywhere in the header. If the card does not exist, the new card will have fix_value as its value when created. The card’s value is also checked using the test argument.

run_option(option='warn', err_text='', fix_text='Fixed.', fix=None, fixable=True)

Execute the verification with the selected option.

scale(type=None, option='old', bscale=1, bzero=0)

Scale image data by using BSCALE and BZERO.

Calling this method will scale self.data and update the keywords of BSCALE and BZERO in self._header and self._image_header. This method should only be used right before writing to the output file, as the data will be scaled and is therefore not very usable after the call.

Parameters :
 

type : str, optional

Destination data type, given as a string representing a numpy dtype name (e.g. 'uint8', 'int16', 'float32'). If None, use the current data type.

option : str, optional

How to scale the data: if "old", use the original BSCALE and BZERO values from when the data was read or created. If "minmax", use the minimum and maximum of the data to scale. This option is overridden by any user-specified bscale/bzero values.

bscale, bzero : int, optional

User-specified BSCALE and BZERO values.

classmethod tcreate(*args, **kwargs)

Deprecated since version 3.1: Use load() instead.

tdump(*args, **kwargs)

Deprecated since version 3.1: Use dump() instead.

classmethod unregister_hdu(hducls)
update()

Update the header keywords to reflect recent changes to the columns.

updateCompressedData()

Compress the image data so that it may be written to a file.

updateHeader()

Update the table header cards to match the compressed data.

updateHeaderData(image_header, name=None, compressionType=None, tileSize=None, hcompScale=None, hcompSmooth=None, quantizeLevel=None)

Update the table header (_header) to the compressed image format and to match the input data (if any). Create the image header (_image_header) from the input image header (if any) and ensure it matches the input data. Create the initially-empty table data array to hold the compressed data.

This method is mainly called internally, but a user may wish to call this method after assigning new data to the CompImageHDU object that is of a different type.

Parameters :
 

image_header : Header instance

header to be associated with the image

name : str, optional

the EXTNAME value; if this value is None, then the name from the input image header will be used; if there is no name in the input image header then the default name ‘COMPRESSED_IMAGE’ is used

compressionType : str, optional

Compression algorithm: one of ‘RICE_1’, ‘PLIO_1’, ‘GZIP_1’, or ‘HCOMPRESS_1’; if this value is None, use the value already in the header; if no value is in the header, use ‘RICE_1’.

tileSize : sequence of int, optional

Compression tile sizes as a list; if this value is None, use the value already in the header; if no value is in the header, treat each row of the image as a tile.

hcompScale : float, optional

HCOMPRESS scale parameter; if this value is None, use the value already in the header; if no value already in the header, use 1

hcompSmooth : float, optional

HCOMPRESS smooth parameter; if this value is None, use the value already in the header; if no value already in the header, use 0

quantizeLevel : float, optional

Floating point quantization level; if this value is None, use the value already in the header; if no value is in the header, use 16.
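
A sketch of the user-facing case mentioned above, assuming chdu is an existing CompImageHDU; passing the HDU's own image header and switching to GZIP are illustrative choices:

    import numpy as np

    # After assigning data of a different type, rebuild the table header
    # and image header so they match the new data.
    chdu.data = np.zeros((64, 64), dtype=np.int32)
    chdu.updateHeaderData(chdu.header, compressionType='GZIP_1')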

update_ext_name(value, comment=None, before=None, after=None, savecomment=False)

Update the extension name associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If neither before nor after is specified, it will be appended at the end.

Parameters :
 

value : str

value to be used for the new extension name

comment : str, optional

Comment to be used for updating; default is None.

before : str or int, optional

Name of the keyword, or index of the Card, before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

name of the keyword, or index of the Card after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified, the current comment will automatically be preserved.

update_ext_version(value, comment=None, before=None, after=None, savecomment=False)

Update the extension version associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If neither before nor after is specified, it will be appended at the end.

Parameters :
 

value : str

value to be used for the new extension version

comment : str, optional

Comment to be used for updating; default is None.

before : str or int, optional

Name of the keyword, or index of the Card, before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

name of the keyword, or index of the Card after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified, the current comment will automatically be preserved.

verify(option='warn')

Verify all values in the instance.

Parameters :
 

option : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". See Verification options for more info.

verify_checksum(blocking='standard')

Verify that the value in the CHECKSUM keyword matches the value calculated for the current HDU CHECKSUM.

Parameters :

blocking : str, optional

Either “standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes (one FITS block) at a time.

Returns :
 

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no CHECKSUM keyword present
verify_datasum(blocking='standard')

Verify that the value in the DATASUM keyword matches the value calculated for the DATASUM of the current HDU data.

Parameters :

blocking : str, optional

Either “standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes (one FITS block) at a time.

Returns :
 

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no DATASUM keyword present
writeto(*args, **kwargs)

Works similarly to the normal writeto(), but prepends a default PrimaryHDU, as required by extension HDUs (which cannot stand on their own).

columns
compData
data
header
is_image
size

Size (in bytes) of the data portion of the HDU.

Section

class pyfits.core.Section(hdu)

Bases: object

Image section.

TODO: elaborate