Operations image product data set descriptions
mission specific
The following data set descriptions are for MER Mission Reduced Data Record (RDR) products, generated by the Multi-Mission Image Processing Facility (MIPL) at JPL using the Experiment Data Records (EDRs), or raw data, collected by the various instruments aboard the MER1 and MER2 rovers. The RDR products were made for use during mission operations.
Anaglyph
Data Set Overview
A stereo anaglyph is a method of displaying stereo imagery quickly and conveniently using conventional display technology (no special hardware) and red/blue glasses. This is done by displaying the left eye of the stereo pair in the red channel, and displaying the right eye in the green and blue channels. An anaglyph data product simply captures that into a single 3-band color image, which can be displayed using any standard image display program with no knowledge that it is a stereo image. The red (first) band contains the left eye image, while the green and blue (second and third) bands each contain the right eye image (so the right image is duplicated in the file).
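The band assembly described above can be sketched in a few lines of NumPy. This is an illustrative sketch only: the actual products are dual PDS/VICAR files, and the function name and array layout here are assumptions.

```python
import numpy as np

def make_anaglyph(left, right):
    """Stack a stereo pair into a 3-band anaglyph: band 1 (red) holds the
    left eye image, and bands 2 and 3 (green, blue) both hold the right
    eye image, so the right image is duplicated in the file."""
    left = np.asarray(left)
    right = np.asarray(right)
    if left.shape != right.shape:
        raise ValueError("stereo pair must share one image geometry")
    # Band Sequential order: axis 0 is the band, then (line, sample).
    return np.stack([left, right, right])

# Toy 2x2 stereo pair; the right image appears twice in the output.
left = np.array([[10, 20], [30, 40]], dtype=np.int16)
right = np.array([[11, 21], [31, 41]], dtype=np.int16)
ana = make_anaglyph(left, right)
print(ana.shape)  # (3, 2, 2)
```

Any standard display program that maps bands 1, 2, 3 to red, green, blue will then show the stereo effect through red/blue glasses.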
The Anaglyph method can also apply to multi-frame mosaic products. MIPL-generated mosaic Anaglyphs occasionally required some subtle pixel-shifting of the right eye mosaic data to improve the stereo effect. Mosaic Anaglyph products are identified by the Mosaic RDR filename convention.
Processing
Anaglyphs are created manually from CAHV-linearized Full Frame (FFL) stereo pair EDRs or mosaics. Often, the images are stretched prior to creating the anaglyph. After stretching, the images are converted to a VICAR cube, which creates a single multi-band image. The final step involves adding the PDS label.
Data
The RDR data product is comprised of radiometrically decalibrated and/or camera model corrected and/or geometrically altered versions of the raw camera data, in both single and multi-frame (mosaic) form. Most RDR data products will have PDS labels, or if generated by MIPL (OPGS), dual PDS/VICAR labels. Non-labeled RDRs include the Terrain products (Mesh and Wedge).
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
Disparity
Data Set Overview
A Disparity file contains 2 bands of 32-bit floating point numbers in the Band Sequential order (line, sample). Alternatively, line and sample may be stored in separate single-band files. The parallax, or difference measured in pixels, between an object location in two individual images (typically the left and right images of a stereo pair) is also called the disparity. Disparity files contain these disparity values in both the line and sample dimension for each pixel in the reference image. This reference image is traditionally the left image of a stereo pair, but could be the right image for special products. The geometry of the Disparity image is the same as the geometry of the reference image. This means that for any pixel in the reference image the disparity of the viewed point can be obtained from the same pixel location in the Disparity image.
The values in a Disparity image are the 1-based coordinates of the corresponding point in the non-reference image. Thus, the coordinates in the reference image are the same as the coordinates in the Disparity image, and the matching coordinates in the stereo partner image are the values in the Disparity image. Disparity values of 0.0 indicate that no valid disparity exists, for example due to lack of overlap or correlation failure. This value is reflected in the MISSING_CONSTANT keyword.
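A small sketch of the lookup convention, assuming 0-based array indexing and the 0.0 missing constant described above (the function name and band variables are illustrative):

```python
import numpy as np

MISSING = 0.0  # MISSING_CONSTANT: no valid disparity at this pixel

def matching_coords(disp_line, disp_samp, line, samp):
    """Given the line and sample disparity bands and a 0-based pixel in
    the reference (typically left) image, return the 0-based (line,
    sample) of the matching point in the stereo partner image, or None
    where no valid disparity exists."""
    dl = disp_line[line, samp]
    ds = disp_samp[line, samp]
    if dl == MISSING and ds == MISSING:
        return None
    # Stored values are 1-based coordinates in the non-reference image.
    return (dl - 1.0, ds - 1.0)

disp_line = np.array([[5.0, 0.0]], dtype=np.float32)
disp_samp = np.array([[7.5, 0.0]], dtype=np.float32)
print(matching_coords(disp_line, disp_samp, 0, 0))  # (4.0, 6.5)
print(matching_coords(disp_line, disp_samp, 0, 1))  # None
```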
Processing
This Operations RDR is produced by OPGS/MIPL using the Mars Suite of VICAR image processing software.
Data
2 bands, Float data type, dual PDS/VICAR (OPGS) binary file.
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
Inverse Look-up Table
Data Set Overview
This RDR is produced by OPGS to provide bit scaling that is optimized for the tools of, and completed within the time requirements imposed by, the rover planners. It is generated in the same fashion as the Science EDR produced by SOAS. If the Operations EDR is in 8-bit format as a result of onboard 12-to-8-bit scaling using a Lookup Table (LUT), then an Inverse LUT is used to rescale the 8 lowest bits to the 12 lowest bits of the 16-bit signed integer.
Processing
This Operations RDR is produced by OPGS/MIPL using the Mars Suite of VICAR image processing software.
If the input, the Operations EDR, is in 8-bit format as a result of onboard 12 to 8-bit scaling using a Lookup Table (LUT), then an Inverse LUT is used to rescale the 8 lowest bits to the 12 lowest bits in the 16-bit signed integer.
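The rescaling step amounts to a 256-entry table lookup per pixel. The sketch below is illustrative only: the LUT values here are hypothetical, not the actual flight tables, and the function name is an assumption.

```python
import numpy as np

def apply_inverse_lut(edr_8bit, inverse_lut):
    """Rescale an 8-bit EDR back toward its 12-bit DN range using an
    inverse lookup table: a 256-entry array mapping each 8-bit code to a
    12-bit value stored in the low bits of a 16-bit signed integer."""
    inverse_lut = np.asarray(inverse_lut, dtype=np.int16)
    if inverse_lut.shape != (256,):
        raise ValueError("inverse LUT must have 256 entries")
    return inverse_lut[np.asarray(edr_8bit, dtype=np.uint8)]

# Hypothetical square-law inverse table: code k -> k*k // 16 (fits in 12 bits).
ilut = np.array([(k * k) // 16 for k in range(256)], dtype=np.int16)
img = np.array([[0, 128], [255, 64]], dtype=np.uint8)
print(apply_inverse_lut(img, ilut).dtype)  # int16
```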
Data
1 band, 16-bit signed integer, Dual PDS/VICAR (OPGS) binary file.
The RDR data product is comprised of radiometrically decalibrated and/or camera model corrected and/or geometrically altered versions of the raw camera data, in both single and multi-frame (mosaic) form. Most RDR data products will have PDS labels, or if generated by MIPL (OPGS), dual PDS/VICAR labels. Non-labeled RDRs include the Terrain products (Mesh and Wedge).
The following is a list of the types of ILUT files along with the Product Type Identifier, which is an element in the formal RDR file name:
Data Product | Linearized | Non-Linearized |
---|---|---|
Inverse LUT RDR | ILF | ILL |
Inverse LUT RDR (downsampled) | IDN | IDL |
Inverse LUT RDR (sub-frame) | ISF | ISL |
Inverse LUT RDR (thumbnail) | ITH | IHN |
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
Linearized
Data Set Overview
EDRs and single-frame RDRs are described by a camera model. This model, represented by a set of vectors and numbers, permits a point in space to be traced into the image plane, and vice versa.
EDR camera models are derived by acquiring images of a calibration target with known geometry at a fixed azimuth/elevation. The vectors representing the model are derived from analysis of this imagery. These vectors are then translated and rotated based on the actual pointing of the camera to represent the conditions of each specific image. The results are the camera model for the EDR.
The Navcam and Pancam use a CAHVOR model, while the Hazcams use a more general CAHVORE model. Neither model is linear, and both involve complex calculations to transform line/sample points in the image plane to XYZ positions in the scene. To simplify this, the images are warped, or reprojected, so that they can be described by a linear CAHV model. This linearization process has several benefits:
- It removes geometric distortions inherent in the camera instruments, with the result that straight lines in the scene are straight in the image.
- It aligns the images for stereo viewing. Matching points are on the same image line in both left and right images, and both left and right models point in the same direction.
- It facilitates correlation, allowing the use of 1-D correlators.
- It simplifies the math involved in using the camera model.
However, it also introduces some artifacts in terms of scale change and/or omitted data (see the references). The linearized CAHV camera model is derived from the EDR's camera model by considering both the left and right eye models and constructing a pair of matched linear CAHV models that conform to the above criteria.
The image is then projected, or warped, from the CAHVOR/CAHVORE model to the CAHV model. This involves projecting each pixel through the EDR camera model into space, intersecting it with a surface (which matters only for Hazcams and is a sphere centered on the camera), and projecting the pixel back through the CAHV model into the output image.
The four components of the linear CAHV model are:
C - The 3D position of the entrance pupil.
A - A unit vector normal to the image plane, pointing outward from C toward the scene.
H - A vector pointing roughly rightward in the image; it is a composite of the orientation of the CCD rows, the horizontal scale, the horizontal center, and A.
V - A vector pointing roughly downward in the image; it is a composite of the orientation of the CCD columns, the vertical scale, the vertical center, and A.
If P is a point in the scene then the corresponding image locations x and y can be computed from:
x = ((P - C) · H) / ((P - C) · A)
y = ((P - C) · V) / ((P - C) · A)
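These two projection equations translate directly into dot products. The sketch below uses a toy CAHV model (camera at the origin looking down +Z, with an assumed focal scale and image center) purely to illustrate the arithmetic:

```python
import numpy as np

def cahv_project(P, C, A, H, V):
    """Project a 3-D scene point P to image coordinates (x, y) =
    (sample, line) with the linear CAHV model:
    x = (P-C).H / (P-C).A,  y = (P-C).V / (P-C).A."""
    d = np.asarray(P, float) - np.asarray(C, float)
    denom = d @ np.asarray(A, float)
    return (d @ np.asarray(H, float)) / denom, (d @ np.asarray(V, float)) / denom

# Toy model: focal scale f and image center (cx, cy) folded into H and V.
f, cx, cy = 1000.0, 512.0, 512.0
C = [0.0, 0.0, 0.0]
A = [0.0, 0.0, 1.0]      # optical axis along +Z
H = [f, 0.0, cx]         # composite of row direction, scale, center, and A
V = [0.0, f, cy]         # composite of column direction, scale, center, and A
print(cahv_project([1.0, 2.0, 10.0], C, A, H, V))  # (612.0, 712.0)
```

Because the model is linear, the same dot products run in reverse to trace an image location back out into a ray in space, which is what makes 1-D correlation and simple triangulation possible on linearized products.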
Processing
This Operations RDR is produced by OPGS/MIPL using the Mars Suite of VICAR image processing software.
Single-frame RDRs are described by a camera model. This model, represented by a set of vectors and numbers, permits a point in space to be traced into the image plane, and vice versa.
Data
1 band, 16-bit signed integer, dual PDS/VICAR (OPGS) binary file. The RDR data product is comprised of radiometrically decalibrated and/or camera model corrected and/or geometrically altered versions of the raw camera data, in both single and multi-frame (mosaic) form. Most RDR data products will have PDS labels, or if generated by MIPL (OPGS), dual PDS/VICAR labels. Non-labeled RDRs include the Terrain products (Mesh and Wedge).
The following is a list of the types of Linearized files along with the Product Type Identifier, which is an element in the formal RDR file name:
Data Product | Linearized |
---|---|
Full Frame EDR | FFL |
Sub-frame EDR | SFL |
Downsampled EDR | DNL |
Thumbnail EDR | THN |
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
Mosaic Images
Data Set Overview
This data set contains images where multiple frames are mosaicked into a single RDR product. The methods for this process are applied by MIPL under OPGS, associating projections with the mosaicking process. It should be noted that these processes can be independent, and that governing methods and software can differ between OPGS and the Athena Pancam and Microscopic Imager Teams under SOAS. For instance, it is possible that OPGS and SOAS software will transform individual images to one of the projections discussed below, without involving any mosaicking. Detailed mathematical descriptions of the mosaic projections and algorithms will be available in a separate paper, Mars Mosaic Projection Algorithms.
Processing
The following is a description of each mosaic:
- Cylindrical Projection: the image is overlaid onto azimuth and elevation grid lines, with individual frame boundaries superimposed and annotated by number. In this case each pixel represents a fixed angle in azimuth and elevation. Rows are of constant elevation in Mars coordinates. The horizon is level, and columns begin clockwise from Mars north.
- Camera Point Perspective: a perspective projection with horizontal epipolar lines. The image view is as if the camera had a much larger field of view.
- Cylindrical-Perspective Projection: a 360 degree view projection similar to the Point Perspective mosaic except that this is like a pinhole camera which follows the mosaic in azimuth. The horizon is not level in order to preserve epipolar viewing.
- Polar Projection: concentric circles represent constant projected elevation. Mars nadir is at the convergent center and the horizon is corrected for lander tilt. North is up.
- Vertical Projection: the creation of this type of mosaic assumes that the field is a plane tangent to the Martian surface with up pointing north. This is not an orthorectified rendering, but was found to be useful for rapid initial orientation.
- Orthographic Projection: this type of mosaic is a generalization of the vertical projection intended primarily for use with Microscopic Imager data. It differs in that an arbitrary axis of projection (as well as X- and Y-axes in the plane of projection) can be specified.
- XYZ: this mosaic contains XYZ values for each pixel in the mosaic rather than intensity values. The inputs to the mosaic program are XYZ files (or individual X, Y, or Z components), and the pixels are interpreted in the same way - as the coordinate of the corresponding pixel in Cartesian space. Like XYZ images, they may consist of a single 3-band file with X, Y, and Z components, or separate 1-band files for each component. XYZ mosaics can be produced in any of the mosaic projections. Note: a single, consistent coordinate system is applied to each input image; the output mosaic has only one coordinate system in which the XYZ values are defined.
- Surface Normal (UVW): similar in concept to XYZ mosaics, a UVW mosaic is simply a mosaic created from UVW (surface normal) input images. The pixels represent the surface normals at each point. Like Surface Normal (UVW) images, they can be single 3-band files or separate 1-band files for each component. As with XYZ mosaics, any projection may be used, and all output values must be defined in the same coordinate system.
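For the cylindrical case, the fixed angle-per-pixel mapping described above can be sketched directly. The parameter names (starting azimuth, top-row elevation, angular scale) are illustrative stand-ins for the corresponding label keywords, not the actual keyword names:

```python
def cylindrical_pixel_to_azel(line, sample, az_first, el_top, scale_deg):
    """Map a 0-based cylindrical-mosaic pixel to (azimuth, elevation) in
    degrees: each pixel spans a fixed angle, rows are constant elevation,
    and columns increase clockwise from the starting azimuth."""
    az = (az_first + sample * scale_deg) % 360.0
    el = el_top - line * scale_deg
    return az, el

# Hypothetical mosaic: 0.1 deg/pixel, azimuth starting at 0,
# top row at +30 deg elevation.
print(cylindrical_pixel_to_azel(100, 900, 0.0, 30.0, 0.1))  # ~(90.0, 20.0)
```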
Data
1 or 3 bands, 16-bit signed integer or Float data type, PDS (SOAS) or dual PDS/VICAR (OPGS) binary file.
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
Normal Images
Data Set Overview
A Surface Normal (UVW) file contains 3 bands of 32-bit floating point numbers in Band Sequential order. Alternatively, U, V and W may be stored in separate single-band files as a U Component RDR, V Component RDR and W Component RDR, respectively. Taken together, the single-component RDRs are equivalent to the UVW file.
Processing
The pixels in a UVW image correspond to the pixels in an XYZ file, with the same image geometry. However, the pixels are interpreted as a unit vector representing the normal to the surface at the point represented by the pixel. U contains the X component of the vector, V the Y component, and W the Z component. The vector is defined to point out of the surface (e.g. upwards for a flat ground). The unit vector can be referenced to any of the MER coordinate systems (specified by the DERIVED_IMAGE_PARAMS Group in the PDS label).
Most UVW images will contain holes, or pixels for which no UVW value exists. These are caused by many factors such as differences in overlap, correlation failures, and insufficient neighbors to compute a surface normal. Holes are indicated by U, V, and W all having the same specific value. Unlike XYZ, (0,0,0) is an invalid value for a UVW file, since they are defined to be unit vectors. Thus there is no issue with the MISSING_CONSTANT as there is with XYZ, where (0.0,0.0,0.0) is valid.
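A hole mask follows directly from the rule that all three components share one specific value. The sketch below assumes (0, 0, 0) as that value for illustration; in practice the value comes from the label, and (0, 0, 0) is safe to use only because a valid normal is always a unit vector:

```python
import numpy as np

MISSING_UVW = (0.0, 0.0, 0.0)  # assumed missing value; never a unit vector

def uvw_holes(u, v, w, missing=MISSING_UVW):
    """Boolean mask of pixels with no surface normal: all three bands
    equal the missing value. Valid pixels hold unit vectors."""
    mu, mv, mw = missing
    return (u == mu) & (v == mv) & (w == mw)

u = np.array([[0.0, 0.0]], dtype=np.float32)
v = np.array([[0.0, 0.0]], dtype=np.float32)
w = np.array([[0.0, 1.0]], dtype=np.float32)  # 2nd pixel: normal along +Z (illustrative frame)
print(uvw_holes(u, v, w))  # [[ True False]]
```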
Data
Surface Normal (UVW) RDR: 3 bands, Float data type, dual PDS/VICAR (OPGS) binary file.
Surface Normal U-component RDR: 1 band, Float data type, PDS (SOAS) or dual PDS/VICAR (OPGS) binary file.
Surface Normal V-component RDR: 1 band, Float data type, PDS (SOAS) or dual PDS/VICAR (OPGS) binary file.
Surface Normal W-component RDR: 1 band, Float data type, PDS (SOAS) or dual PDS/VICAR (OPGS) binary file.
The RDR data product is comprised of radiometrically decalibrated and/or camera model corrected and/or geometrically altered versions of the raw camera data, in both single and multi-frame (mosaic) form. Most RDR data products will have PDS labels, or if generated by MIPL (OPGS), dual PDS/VICAR labels. Non-labeled RDRs include the Terrain products (Mesh and Wedge).
The following is a list of the types of Surface Normal RDRs along with the Product Type Identifier, which is an element in the formal RDR file name:
Data Product | Linearized | Non-Linearized |
---|---|---|
UVW (XYZ) Surface Normal RDR | UVW | UVL |
UVW (XYZ) Surface Normal RDR (thumbnail) | UVT | UVN |
U (X) Surface Normal RDR | UUU | UUL |
U (X) Surface Normal RDR (thumbnail) | UUT | UUN |
V (Y) Surface Normal RDR | VVV | VVL |
V (Y) Surface Normal RDR (thumbnail) | VVT | VVN |
W (Z) Surface Normal RDR | WWW | WWL |
W (Z) Surface Normal RDR (thumbnail) | WWT | WWN |
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
Radiometric Corrections
Data Set Overview
The MIPLRAD method refers to radiometric correction systematically performed by MIPL (OPGS at JPL) to meet tactical time constraints imposed by rover planners, since these RDRs are integrated into terrain mesh products used for traverse planning. The method can apply to any of the camera instruments.
In the operations environment for the Prime and Extended Missions, MIPL’s radiometrically-corrected RDR filename carries the product type designator of RAD for the non-linearized case and RAL for the linearized case. However, in the PDS archive volume, the MIPL radiometrically-corrected RDR carries the product type MRD for the non-linearized case and MRL for the linearized case. Though there is no difference in image content between the operational and archived versions of MIPL’s radiometrically-corrected RDR, the distinction in their filenames is made to identify all RAD and RAL product types in the PDS archive volumes as being unique to the Athena Pancam team’s radiance correction process.
As a special note, two bugs pertaining to MIPL’s radiometric correction process were discovered during the preparation of data for PDS archival, which was after the MER Prime Mission and well into the Extended Mission. The problems involved on-board flat-field removal and temperature determination. For the purposes of this discussion, the term MIPLRAD refers to the original implementation used during Prime Mission and approximately through the first two months of Extended Mission, while MIPLRAD2 represents the corrected implementation used thereafter. Both MIPLRAD and MIPLRAD2 are valid values for RADIOMETRIC_CORRECTION_TYPE. The differences are described below. Note that all RAD/RAL/RSD/RSL types of RDRs have been reprocessed with MIPLRAD2, limiting the number of MIPLRAD-processed products in the PDS archive volume to a subset of mosaics which were generated before MIPLRAD2 was implemented. Note also that no mosaics were generated from on-board flat-field images using MIPLRAD, so the only difference in the archive data is the temperature issue.
MIPLRAD is a first-order correction only and should be considered approximate. MIPLRAD first backs out any onboard flat field that was performed. It then applies the following corrections: flat field, exposure time, temperature-compensated responsivity. The result is calibrated to physical units for MER of W/m^2/nm/sr. The actual algorithm and equations used for MIPLRAD are shown below. Each correction is applied in sequence, to every pixel:
- If on-board flat-fielding has been applied, it is backed out according to the parameters in FLAT_FIELD_CORRECTION_PARM, which defines ff(x,y). MIPLRAD incorrectly multiplied by ff(x,y) rather than divided, causing the on-board flat field to be doubled rather than removed. MIPLRAD2 correctly divides by ff(x,y) as follows:
  output(x,y) = input(x,y) / ff(x,y)
- For the flat-field adjustment, the x and y coordinates are adjusted based on downsampling and subframing to find the corresponding pixel in the flat field, then the DN is divided by the flat field value:
  output(x,y) = input(x,y) / flat_field(x',y')
- Exposure time is then removed. Exposure time comes from EXPOSURE_DURATION, converted to seconds:
  output(x,y) = input(x,y) / exposure_time
- The temperature responsivity is removed next. The temperature comes from the first element of INSTRUMENT_TEMPERATURE and the parameters R0, R1, and R2 come from the flat field parameter file, and are different per instrument. The actual temperature formula is as follows:
  output(x,y) = input(x,y) * (R0 + R1*temp + R2*temp*temp)
For MIPLRAD, the temperature is simply the first element of INSTRUMENT_TEMPERATURE. For MIPLRAD2, the temperature is dependent on the instrument. The temperature used for each instrument is determined using the following general rules (from the MER thermal team):
a. Use the CCD temp of said camera, if it exists.
b. Use the CCD temp of neighboring camera (left/right partner), if available.
c. Use the CCD temp of similar camera (i.e., Navcam/Pancam).
d. Use CCD temperature from any camera.
e. Use the electronics temperature of said camera.
f. Use the electronics temperature of similar camera.
Rules e and f are a last resort because MER operates warmup heaters inside the electronics (during nighttime and early morning) that raise camera electronics temperatures above CCD temperatures; thus any CCD temperature takes priority over any electronics temperature measurement. The most significant consequence of this is that the MI CCD is the best available proxy for all four Hazcam CCDs. A value of 0.0 is ignored as a no-reading value, and a value greater than or equal to 50.0 (degrees C) is interpreted as a broken sensor. Either value causes that temperature to be ignored and the next one on the list tested. If none of the values is valid, a default of 0.0 degrees C is used.
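The correction sequence above can be sketched end to end. This is a first-order illustration of the MIPLRAD2 ordering only; the numbers below are toy values, not real MER calibration constants, and the function name is an assumption:

```python
import numpy as np

def miplrad2(dn, exposure_s, temp_c, R0, R1, R2, flat_field, onboard_ff=None):
    """First-order radiometric correction sketch (MIPLRAD2 ordering):
    back out any on-board flat field by dividing (the MIPLRAD bug was a
    multiply here), divide by the calibration flat field and the exposure
    time in seconds, then apply the temperature-compensated responsivity."""
    img = np.asarray(dn, dtype=np.float64)
    if onboard_ff is not None:
        img = img / onboard_ff          # back out on-board flat-fielding
    img = img / flat_field              # flat-field correction
    img = img / exposure_s              # remove exposure time
    resp = R0 + R1 * temp_c + R2 * temp_c * temp_c
    return img * resp                   # calibrated to W/m^2/nm/sr

# Toy numbers only (illustrative, not real calibration values):
dn = np.array([[100.0, 200.0]])
out = miplrad2(dn, exposure_s=0.5, temp_c=-20.0,
               R0=1.0, R1=0.0, R2=0.0, flat_field=np.array([[1.0, 2.0]]))
print(out)  # [[200. 200.]]
```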
Processing
MER Camera Payload RDRs are considered Level 3 (Calibrated Data equivalent to NASA Level 1-A), Level 4 (Resampled Data equivalent to NASA Level 1-B), or Level 5 (Derived Data equivalent to NASA Level 1-C, 2 or 3). The RDRs are to be reconstructed from Level 2 edited data, and are to be assembled into complete images that may include radiometric and/or geometric correction.
MER Camera Payload EDRs and RDRs will be generated by JPL’s Multimission Image Processing Laboratory (MIPL) under the OPGS subsystem of the MER GDS. RDRs will also be generated by the Athena Pancam Science and Microscopic Imager Science Teams under the SOAS subsystem of the GDS.
RDR data products will be generated by, but not limited to, MIPL using the Mars Suite of VICAR image processing software at JPL, the Athena Pancam Science Team using IDL software at Cornell University and JPL, and the Microscopic Imager Science Team using ISIS software at USGS (Flagstaff) and JPL. The RDRs produced will be processed data. The input will be one or more Camera EDR or RDR data products and the output will be formatted according to this SIS. Additional meta-data may be added by the software to the PDS label.
Data
RDR products generated by MIPL will have a VICAR label wrapped by a PDS label, and their structure can include the optional EOL label after the binary data. RDR products not generated by MIPL may contain only a PDS label. Alternatively, RDR products conforming to a standard other than PDS, such as JPEG-compressed or certain Terrain products, are acceptable without a PDS header during mission operations, but may not be archivable.
The RDR data product is comprised of radiometrically decalibrated and/or camera model corrected and/or geometrically altered versions of the raw camera data, in both single and multi-frame (mosaic) form. Most RDR data products will have PDS labels, or if generated by MIPL (OPGS), dual PDS/VICAR labels. Non-labeled RDRs include JPEG compressed products and the Terrain products.
The MIPLRAD method is a radiometric correction performed by MIPL (OPGS) at JPL. It can apply to any of the camera instruments, but only the RAD (and RAL) type is generated. MIPLRAD first backs out any onboard flat field that was performed. It then applies the following corrections: flat field, exposure time, temperature-compensated responsivity. The result is calibrated to physical units for MER of W/m^2/nm/sr. MIPLRAD is a first-order correction only and should be considered approximate.
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
Range Images
Data Set Overview
A Range (distance) file contains 1 band of 32-bit floating point numbers. The pixels in a Range image represent Cartesian distances from a reference point (defined by the RANGE_ORIGIN_VECTOR keyword in the PDS label) to the XYZ position of each pixel. This reference point is normally the camera position as defined by the C point of the camera model. A Range image is derived from an XYZ image and shares the same pixel geometry and XYZ coordinate system. As with XYZ images, range images can contain holes, defined by MISSING_CONSTANT. For MER, this value is 0.0.
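Since a Range image is just the per-pixel distance from the reference point, the derivation from XYZ bands is a one-liner plus hole propagation. A sketch, assuming separate X, Y, Z arrays and the MER missing value of 0.0:

```python
import numpy as np

MISSING = 0.0  # MISSING_CONSTANT for MER range products

def range_from_xyz(x, y, z, origin):
    """Compute a Range image from XYZ bands: the Cartesian distance from
    the reference point (RANGE_ORIGIN_VECTOR, normally the camera model's
    C point) to each pixel's XYZ position. XYZ holes (0,0,0) propagate
    as the missing value."""
    ox, oy, oz = origin
    rng = np.sqrt((x - ox)**2 + (y - oy)**2 + (z - oz)**2)
    holes = (x == 0.0) & (y == 0.0) & (z == 0.0)
    rng[holes] = MISSING
    return rng

x = np.array([[3.0, 0.0]]); y = np.array([[4.0, 0.0]]); z = np.array([[0.0, 0.0]])
print(range_from_xyz(x, y, z, origin=(0.0, 0.0, 0.0)))  # [[5. 0.]]
```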
Processing
MER Camera Payload RDRs are considered Level 3 (Calibrated Data equivalent to NASA Level 1-A), Level 4 (Resampled Data equivalent to NASA Level 1-B), or Level 5 (Derived Data equivalent to NASA Level 1-C, 2 or 3). The RDRs are to be reconstructed from Level 2 edited data, and are to be assembled into complete images that may include radiometric and/or geometric correction.
MER Camera Payload EDRs and RDRs will be generated by JPL’s Multimission Image Processing Laboratory (MIPL) under the OPGS subsystem of the MER GDS. RDRs will also be generated by the Athena Pancam Science and Microscopic Imager Science Teams under the SOAS subsystem of the GDS.
RDR data products will be generated by, but not limited to, MIPL using the Mars Suite of VICAR image processing software at JPL, the Athena Pancam Science Team using IDL software at Cornell University and JPL, and the Microscopic Imager Science Team using ISIS software at USGS (Flagstaff) and JPL. The RDRs produced will be processed data. The input will be one or more Camera EDR or RDR data products and the output will be formatted according to this SIS. Additional meta-data may be added by the software to the PDS label.
Data
RDR products generated by MIPL will have a VICAR label wrapped by a PDS label, and their structure can include the optional EOL label after the binary data. RDR products not generated by MIPL may contain only a PDS label. Alternatively, RDR products conforming to a standard other than PDS, such as JPEG-compressed or certain Terrain products, are acceptable without a PDS header during mission operations, but may not be archivable.
The RDR data product is comprised of radiometrically decalibrated and/or camera model corrected and/or geometrically altered versions of the raw camera data, in both single and multi-frame (mosaic) form. Most RDR data products will have PDS labels, or if generated by MIPL (OPGS), dual PDS/VICAR labels. Non-labeled RDRs include JPEG compressed products and the Terrain products.
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
Reachability Maps
Data Set Overview
An IDD Reachability map contains information about whether or not the instruments on the IDD can reach (contact or image) the object or location represented by each pixel in the scene. It is derived from the XYZ and Surface Normal (UVW) products.
The geometry of the reachability map matches the linearized reference, XYZ, and Surface Normal (UVW) images, in that each pixel in the file directly corresponds to the pixel at the same location in the other products.
The reachability map is a 16-band byte image in standard Band Sequential order. Thus for each pixel there are 16 values. These values represent reachability for each of the 4 IDD instruments in each of its 4 configurations. The mapping between band number and instrument/configuration is given by the INSTRUMENT_BAND_ID and CONFIGURATION_BAND_ID labels.
The value of the pixel is interpreted according to the instrument. For RAT, 0 means the pixel is not reachable in that configuration, while any other number represents the maximum preload in integer Newtons that can be applied at that point. For all other instruments, 0 means the pixel is not reachable by that instrument in that configuration, while 255 means that the pixel is reachable.
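Decoding one byte of a reachability band therefore depends only on which instrument the band represents. A minimal sketch (the instrument names follow the MER IDD payload; the function and return strings are illustrative):

```python
def interpret_reachability(value, instrument):
    """Decode one byte of a reachability map band. For the RAT, nonzero
    values give the maximum preload in integer Newtons; for the other
    IDD instruments only 0 (unreachable) and 255 (reachable) are used."""
    if value == 0:
        return "not reachable"
    if instrument == "RAT":
        return f"reachable, max preload {value} N"
    return "reachable" if value == 255 else f"unexpected value {value}"

print(interpret_reachability(0, "MI"))      # not reachable
print(interpret_reachability(40, "RAT"))    # reachable, max preload 40 N
print(interpret_reachability(255, "APXS"))  # reachable
```

In practice the band-to-instrument mapping comes from the INSTRUMENT_BAND_ID and CONFIGURATION_BAND_ID labels rather than being hard-coded.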
Processing
MER Camera Payload RDRs are considered Level 3 (Calibrated Data equivalent to NASA Level 1-A), Level 4 (Resampled Data equivalent to NASA Level 1-B), or Level 5 (Derived Data equivalent to NASA Level 1-C, 2 or 3). The RDRs are to be reconstructed from Level 2 edited data, and are to be assembled into complete images that may include radiometric and/or geometric correction.
MER Camera Payload EDRs and RDRs will be generated by JPL’s Multimission Image Processing Laboratory (MIPL) under the OPGS subsystem of the MER GDS. RDRs will also be generated by the Athena Pancam Science and Microscopic Imager Science Teams under the SOAS subsystem of the GDS.
RDR data products will be generated by, but not limited to, MIPL using the Mars Suite of VICAR image processing software at JPL, the Athena Pancam Science Team using IDL software at Cornell University and JPL, and the Microscopic Imager Science Team using ISIS software at USGS (Flagstaff) and JPL. The RDRs produced will be processed data. The input will be one or more Camera EDR or RDR data products and the output will be formatted according to this SIS. Additional meta-data may be added by the software to the PDS label.
Data
RDR products generated by MIPL will have a VICAR label wrapped by a PDS label, and their structure can include the optional EOL label after the binary data. RDR products not generated by MIPL may contain only a PDS label. Alternatively, RDR products conforming to a standard other than PDS, such as JPEG-compressed or certain Terrain products, are acceptable without a PDS header during mission operations, but may not be archivable.
The RDR data product is comprised of radiometrically decalibrated and/or camera model corrected and/or geometrically altered versions of the raw camera data, in both single and multi-frame (mosaic) form. Most RDR data products will have PDS labels, or if generated by MIPL (OPGS), dual PDS/VICAR labels. Non-labeled RDRs include JPEG compressed products and the Terrain products.
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
Roughness Maps
Data Set Overview
The roughness map contains surface roughness estimates at each pixel in an XYZ image. The roughness is computed as the maximum peak-to-peak deviation from the local plane. Units are meters; that is, a pixel value of 0.05 means that the local surface about that pixel has a maximum peak-to-peak deviation along the surface normal of 0.05 m (5 cm). Roughness values above some useful threshold (the maximum roughness) are clipped to that threshold. If a roughness could not be computed for a pixel (e.g., because of a lack of range data, or too much noise in the range data), then the roughness value at that pixel is set to the bad roughness value (which must be greater than the maximum roughness).
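The per-pixel computation can be sketched for a single local patch: fit a plane, take the peak-to-peak deviation along its normal, clip to the maximum, and fall back to the bad value when no fit is possible. This is an assumed total-least-squares formulation for illustration, not the actual MIPL algorithm:

```python
import numpy as np

def patch_roughness(points, max_roughness, bad_value):
    """Roughness of one local patch of XYZ points: fit a plane by total
    least squares (SVD), take the maximum peak-to-peak deviation along
    the plane normal, and clip to max_roughness. Returns bad_value
    (which must exceed max_roughness) if the fit is not possible."""
    pts = np.asarray(points, dtype=np.float64)
    if len(pts) < 3:
        return bad_value
    centered = pts - pts.mean(axis=0)
    # Plane normal = right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    dev = centered @ normal             # signed distance of each point to the plane
    return min(dev.max() - dev.min(), max_roughness)

# A flat 3-point patch has (near-)zero peak-to-peak deviation.
flat = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
print(patch_roughness(flat, max_roughness=0.30, bad_value=1.0))
```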
Processing
MER Camera Payload RDRs are considered Level 3 (Calibrated Data equivalent to NASA Level 1-A), Level 4 (Resampled Data equivalent to NASA Level 1-B), or Level 5 (Derived Data equivalent to NASA Level 1-C, 2 or 3). The RDRs are to be reconstructed from Level 2 edited data, and are to be assembled into complete images that may include radiometric and/or geometric correction.
MER Camera Payload EDRs and RDRs will be generated by JPL’s Multimission Image Processing Laboratory (MIPL) under the OPGS subsystem of the MER GDS. RDRs will also be generated by the Athena Pancam Science and Microscopic Imager Science Teams under the SOAS subsystem of the GDS.
RDR data products will be generated by, but not limited to, MIPL using the Mars Suite of VICAR image processing software at JPL, the Athena Pancam Science Team using IDL software at Cornell University and JPL, and the Microscopic Imager Science Team using ISIS software at USGS (Flagstaff) and JPL. The RDRs produced will be processed data. The input will be one or more Camera EDR or RDR data products and the output will be formatted according to this SIS. Additional meta-data may be added by the software to the PDS label.
Data
RDR products generated by MIPL will have a VICAR label wrapped by a PDS label, and their structure can include the optional EOL label after the binary data. RDR products not generated by MIPL may contain only a PDS label. RDR products conforming to a standard other than PDS, such as JPEG-compressed or certain Terrain products, are acceptable without a PDS header during mission operations, but may not be archivable.
The RDR data product comprises radiometrically decalibrated and/or camera-model corrected and/or geometrically altered versions of the raw camera data, in both single-frame and multi-frame (mosaic) form. Most RDR data products will have PDS labels, or, if generated by MIPL (OPGS), dual PDS/VICAR labels. Non-labeled RDRs include JPEG-compressed products and the Terrain products.
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
Slope Maps
Data Set Overview
The Slope Map RDR represents the predicted slope of the terrain as determined by Pancam and Navcam stereo imaging. The Slope Map is derived from the Surface Normal (UVW) product by fitting a plane over a rover-sized patch in physical space for every image plane pixel in the Pancam and Navcam stereo images. The surface normal is then computed as the normal to the plane fit. Finally, the elevation of the surface normal vector with respect to the (X,Y) site frame is determined to be the predicted terrain slope in units of degrees.
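That derivation can be sketched with a simple SVD plane fit. This is a hedged illustration only: the operational product's patch size, coordinate-frame conventions, and sign conventions are defined in the product's PDS label, and here the slope is taken as the tilt of the fitted plane's normal away from the site-frame vertical axis, so flat terrain yields 0 degrees.

```python
import math
import numpy as np

def terrain_slope_deg(points):
    """Slope (degrees) of the plane fit to a rover-sized patch of 3-D
    points, measured as the tilt of the fitted plane's normal away from
    the vertical (Z) axis. Sketch only; the operational product defines
    the exact patch size and conventions in its label."""
    centered = points - points.mean(axis=0)
    # Normal of the least-squares plane = right singular vector of the
    # smallest singular value of the centered point cloud.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    cos_tilt = min(1.0, abs(normal[2]))   # |cos| of angle from vertical
    return math.degrees(math.acos(cos_tilt))
```

A patch of points lying exactly on a plane inclined at 10 degrees returns a slope of 10 degrees.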
Processing
MER Camera Payload RDRs are considered Level 3 (Calibrated Data equivalent to NASA Level 1-A), Level 4 (Resampled Data equivalent to NASA Level 1-B), or Level 5 (Derived Data equivalent to NASA Level 1-C, 2 or 3). The RDRs are to be reconstructed from Level 2 edited data, and are to be assembled into complete images that may include radiometric and/or geometric correction.
MER Camera Payload EDRs and RDRs will be generated by JPL’s Multimission Image Processing Laboratory (MIPL) under the OPGS subsystem of the MER GDS. RDRs will also be generated by the Athena Pancam Science and Microscopic Imager Science Teams under the SOAS subsystem of the GDS.
RDR data products will be generated by, but not limited to, MIPL using the Mars Suite of VICAR image processing software at JPL, the Athena Pancam Science Team using IDL software at Cornell University and JPL, and the Microscopic Imager Science Team using ISIS software at USGS (Flagstaff) and JPL. The RDRs produced will be processed data. The input will be one or more Camera EDR or RDR data products and the output will be formatted according to this SIS. Additional meta-data may be added by the software to the PDS label.
Data
RDR products generated by MIPL will have a VICAR label wrapped by a PDS label, and their structure can include the optional EOL label after the binary data. RDR products not generated by MIPL may contain only a PDS label. RDR products conforming to a standard other than PDS, such as JPEG-compressed or certain Terrain products, are acceptable without a PDS header during mission operations, but may not be archivable.
The RDR data product comprises radiometrically decalibrated and/or camera-model corrected and/or geometrically altered versions of the raw camera data, in both single-frame and multi-frame (mosaic) form. Most RDR data products will have PDS labels, or, if generated by MIPL (OPGS), dual PDS/VICAR labels. Non-labeled RDRs include JPEG-compressed products and the Terrain products.
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
Terrain Wedge
Data Set Overview
Terrain models are high-level products derived from the XYZ files and the corresponding image files. The terrain models are generated by meshing or triangulating the XYZ data, based either on the connectivity implied by the pixel ordering or on a volume-based surface extraction. The XYZ files can be viewed as a collection of point data, while the terrain models connect this point data into a polygonal surface representation. The original image is referenced by the terrain models as a texture map, which is used to modulate the surface color of the mesh. In this way the terrain models can be viewed as a surface reconstruction of the ground near the instrument, with the mesh data capturing the shape of the surface and the original image, applied as a texture map, capturing its brightness variations. Specific terrain model formats such as VST, PFB and DEM are analogous to GIF, TIFF or VICAR in image space, in that each represents the data somewhat differently for slightly different purposes.
The ViSTa (VST) format consists of one terrain model for each wedge (stereo image pair), in a JPL-defined binary format suitable for display by SAP. Each file contains meshes at multiple levels of detail.
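The pixel-connectivity triangulation described above can be sketched as follows. This shows only the basic meshing step under simplified assumptions: real terrain products also filter triangles with overly long edges and build multiple levels of detail, and the function and quad-splitting choice here are illustrative.

```python
import numpy as np

def mesh_from_xyz(xyz):
    """Triangulate an (H, W, 3) XYZ image using the connectivity implied
    by pixel ordering: every 2x2 quad of valid pixels yields two
    triangles (vertex indices into the row-major flattened image).
    (0,0,0) marks a hole, so quads touching a hole are skipped."""
    rows, cols, _ = xyz.shape
    valid = np.any(xyz != 0.0, axis=2)
    verts = xyz.reshape(-1, 3)
    tris = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            if not (valid[r, c] and valid[r, c + 1]
                    and valid[r + 1, c] and valid[r + 1, c + 1]):
                continue
            ul, ur = r * cols + c, r * cols + c + 1
            ll, lr = (r + 1) * cols + c, (r + 1) * cols + c + 1
            tris.append((ul, ur, ll))            # upper-left triangle of the quad
            tris.append((ur, lr, ll))            # lower-right triangle of the quad
    return verts, tris
```

A fully valid 3x3 XYZ image has four 2x2 quads and therefore yields eight triangles; a single hole at the center pixel invalidates all four quads.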
Processing
MER Camera Payload RDRs are considered Level 3 (Calibrated Data equivalent to NASA Level 1-A), Level 4 (Resampled Data equivalent to NASA Level 1-B), or Level 5 (Derived Data equivalent to NASA Level 1-C, 2 or 3). The RDRs are to be reconstructed from Level 2 edited data, and are to be assembled into complete images that may include radiometric and/or geometric correction.
MER Camera Payload EDRs and RDRs will be generated by JPL’s Multimission Image Processing Laboratory (MIPL) under the OPGS subsystem of the MER GDS. RDRs will also be generated by the Athena Pancam Science and Microscopic Imager Science Teams under the SOAS subsystem of the GDS.
RDR data products will be generated by, but not limited to, MIPL using the Mars Suite of VICAR image processing software at JPL, the Athena Pancam Science Team using IDL software at Cornell University and JPL, and the Microscopic Imager Science Team using ISIS software at USGS (Flagstaff) and JPL. The RDRs produced will be processed data. The input will be one or more Camera EDR or RDR data products and the output will be formatted according to this SIS. Additional meta-data may be added by the software to the PDS label.
Data
RDR products generated by MIPL will have a VICAR label wrapped by a PDS label, and their structure can include the optional EOL label after the binary data. RDR products not generated by MIPL may contain only a PDS label. RDR products conforming to a standard other than PDS, such as JPEG-compressed or certain Terrain products, are acceptable without a PDS header during mission operations, but may not be archivable.
The RDR data product comprises radiometrically decalibrated and/or camera-model corrected and/or geometrically altered versions of the raw camera data, in both single-frame and multi-frame (mosaic) form. Most RDR data products will have PDS labels, or, if generated by MIPL (OPGS), dual PDS/VICAR labels. Non-labeled RDRs include JPEG-compressed products and the Terrain products.
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
XYZ Images
Data Set Overview
An XYZ file contains 3 bands of 32-bit floating-point numbers in Band Sequential order. Alternatively, X, Y and Z may be stored in separate single-band files as an X Component RDR, a Y Component RDR and a Z Component RDR, respectively. The single-component RDRs contain the same data as the combined XYZ file, which is described below. XYZ locations in all coordinate frames for MER are expressed in meters unless otherwise noted.
The pixels in an XYZ image are coordinates in 3-D space of the corresponding pixel in the reference image. This reference image is traditionally the left image of a stereo pair, but could be the right image for special products. The geometry of the XYZ image is the same as the geometry of the reference image. This means that for any pixel in the reference image the 3-D position of the viewed point can be obtained from the same pixel location in the XYZ image. The 3-D points can be referenced to any of the MER coordinate systems (specified by DERIVED_IMAGE_PARAMS Group in the PDS label).
Most XYZ images will contain holes, or pixels for which no XYZ value exists. These are caused by many factors such as differences in overlap and correlation failures. Holes are indicated by X, Y, and Z all having the same specific value. This value is defined by the MISSING_CONSTANT keyword in the IMAGE object. For the XYZ RDR, this value is (0.0,0.0,0.0), meaning that all three bands must be zero (if only one or two bands are zero, that does not indicate missing data).
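A minimal reader illustrating the band layout and the hole test might look like the following. This is a sketch under stated assumptions, not a real product reader: the dimensions, data type, byte order and MISSING_CONSTANT actually come from the PDS/VICAR label, and big-endian 32-bit floats are assumed here purely for illustration.

```python
import numpy as np

def load_xyz(raw, rows, cols):
    """Interpret a raw Band Sequential buffer as an (rows, cols, 3) XYZ
    image and flag holes. Sketch only: a real reader parses dimensions,
    data type, byte order and MISSING_CONSTANT from the product label;
    big-endian 32-bit floats are assumed here."""
    bands = np.frombuffer(raw, dtype=">f4").reshape(3, rows, cols)
    xyz = np.ascontiguousarray(np.transpose(bands, (1, 2, 0)))
    # A pixel is a hole only if X, Y AND Z all equal the missing value
    # (0.0); one or two zero bands still indicate valid data.
    holes = np.all(xyz == 0.0, axis=2)
    return xyz, holes
```

Note the hole test uses `np.all` across the three bands, matching the rule above that a single zero band does not mark missing data.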
Processing
MER Camera Payload RDRs are considered Level 3 (Calibrated Data equivalent to NASA Level 1-A), Level 4 (Resampled Data equivalent to NASA Level 1-B), or Level 5 (Derived Data equivalent to NASA Level 1-C, 2 or 3). The RDRs are to be reconstructed from Level 2 edited data, and are to be assembled into complete images that may include radiometric and/or geometric correction.
MER Camera Payload EDRs and RDRs will be generated by JPL’s Multimission Image Processing Laboratory (MIPL) under the OPGS subsystem of the MER GDS. RDRs will also be generated by the Athena Pancam Science and Microscopic Imager Science Teams under the SOAS subsystem of the GDS.
RDR data products will be generated by, but not limited to, MIPL using the Mars Suite of VICAR image processing software at JPL, the Athena Pancam Science Team using IDL software at Cornell University and JPL, and the Microscopic Imager Science Team using ISIS software at USGS (Flagstaff) and JPL. The RDRs produced will be processed data. The input will be one or more Camera EDR or RDR data products and the output will be formatted according to this SIS. Additional meta-data may be added by the software to the PDS label.
Data
RDR products generated by MIPL will have a VICAR label wrapped by a PDS label, and their structure can include the optional EOL label after the binary data. RDR products not generated by MIPL may contain only a PDS label. RDR products conforming to a standard other than PDS, such as JPEG-compressed or certain Terrain products, are acceptable without a PDS header during mission operations, but may not be archivable.
The RDR data product comprises radiometrically decalibrated and/or camera-model corrected and/or geometrically altered versions of the raw camera data, in both single-frame and multi-frame (mosaic) form. Most RDR data products will have PDS labels, or, if generated by MIPL (OPGS), dual PDS/VICAR labels. Non-labeled RDRs include JPEG-compressed products and the Terrain products.
Software
The MIPL Mars Program Suite was used to generate these RDRs.
Media/Format
The data set will initially be delivered and kept online. Upon Mission completion, the Operations RDRs will be delivered to PDS on DVD.
See Also