2006
DOI: 10.1029/2005je002462
Processing of Mars Exploration Rover imagery for science and operations planning

Abstract: [1] The twin Mars Exploration Rovers (MER) delivered an unprecedented array of image sensors to the Mars surface. These cameras were essential for operations, science, and public engagement. The Multimission Image Processing Laboratory (MIPL) at the Jet Propulsion Laboratory was responsible for the first-order processing of all of the images returned by these cameras. This processing included reconstruction of the original images, systematic and ad hoc generation of a wide variety of products derived from thos…

Cited by 112 publications (38 citation statements)
References 19 publications
“…Should that AUV become entrapped, or suffer a mechanical failure under the one hundred meter thick ice tongue, the environmental data it collected would likely be irrecoverable. This is in stark contrast with other field exploration robots, such as the Mars rovers [15,2], which have returned incredibly valuable scientific data despite every vehicle remaining behind on the Martian surface.…”
Section: Autonomous Underwater Exploration
confidence: 82%
“…and/or geomorphologic endmember rocks and soils, (4.2) Creation of ''Photometry QUBs'', i.e., image cubes derived from Pancam stereo modeling that contained three-dimensional maps in a rover-based coordinate system of disparity, range, surface normals, incidence, emission, and phase angles, plus radiance factor data from left-eye and geometrically warped right-eye data, as shown in Fig. 2 (Soderblom et al., 2004, 2008; Alexander et al., 2006); (4.3) Application of the sky radiance model of Lemmon et al. (2004) to compensate for the effects of reddened diffuse skylight for a given sol, local time, and ROI radiance value; and (4.4) Derivation of Hapke scattering parameters for a set of rock or soil reflectance observations (corrected for diffuse skylight and incorporating local facet tilts for ROIs) at a given wavelength.…”
Section: Methods
confidence: 99%
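The per-facet quantities named in the excerpt above (incidence, emission, and phase angles) follow from simple vector geometry: each is the angle between two unit vectors (surface normal, direction to the Sun, direction to the camera). The sketch below is a minimal illustration of that geometry, not the MIPL or Pancam pipeline code; the function name and vector conventions are assumptions.

```python
import math

def _unit(v):
    """Normalize a 3-vector given as a tuple."""
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def _angle_deg(a, b):
    """Angle in degrees between two unit 3-vectors, clamped for safety."""
    d = sum(x * y for x, y in zip(a, b))
    return math.degrees(math.acos(max(-1.0, min(1.0, d))))

def photometric_angles(normal, to_sun, to_camera):
    """Incidence, emission, and phase angles (degrees) for one surface
    facet, given vectors in a common (e.g., rover-based) frame:
      incidence = angle(normal, to_sun)
      emission  = angle(normal, to_camera)
      phase     = angle(to_sun, to_camera)
    """
    n, s, c = _unit(normal), _unit(to_sun), _unit(to_camera)
    return _angle_deg(n, s), _angle_deg(n, c), _angle_deg(s, c)

# Flat, level facet; Sun 30 degrees off zenith; camera at zenith:
i, e, g = photometric_angles(
    (0.0, 0.0, 1.0),
    (math.sin(math.radians(30.0)), 0.0, math.cos(math.radians(30.0))),
    (0.0, 0.0, 1.0),
)
# i = 30.0, e = 0.0, g = 30.0
```

In a Photometry QUB these angles are stored per pixel; the same computation applies with the normal taken from the stereo-derived surface model at each pixel.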
“…The Navcam is a panchromatic stereo pair of engineering cameras with 20 cm baseline separation (Alexander et al., 2006). The Pancam is a multispectral stereo pair of science cameras with 30 cm baseline separation.…”
Section: Data
confidence: 99%
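The baseline separations quoted above set the ranging geometry of each stereo pair: for a rectified pair, range is baseline times focal length (in pixels) divided by disparity (in pixels). A minimal sketch of that relation; the focal-length and disparity values are illustrative, not actual Navcam parameters:

```python
def stereo_range(baseline_m, focal_px, disparity_px):
    """Range (meters) to a point seen by a rectified stereo pair:
    range = baseline * focal_length / disparity.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px

# Navcam-like 0.20 m baseline, illustrative 1400 px focal length,
# 7 px measured disparity:
r = stereo_range(0.20, 1400.0, 7.0)
# r = 40.0 (meters)
```

The relation also shows why the wider 30 cm Pancam baseline gives better range precision at a given disparity error than the 20 cm Navcam baseline.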
“…Both Navcam and Pancam are mounted on the camera bar, which can rotate ±90 degrees in elevation and 360 degrees in azimuth, enabling acquisition of panoramic images. The rover images have been pre-processed by the Jet Propulsion Laboratory (JPL) Multi-mission Image Processing Laboratory (MIPL), and products such as mosaics, linearized (epipolar) images, 3D coordinate data, and range maps are also provided (Alexander et al., 2006). The derived 3D data sets, e.g., 3D coordinate data (XYL files) and range maps (RNL files), are stored with the index of the left images. HiRISE is a pushbroom sensor on board the Mars Reconnaissance Orbiter.…”
Section: Data
confidence: 99%