The 2D/3D physical functions used to perform the geometric correction differ from one another, depending on the sensor, platform, and the sensor's image acquisition geometry (Figure 8.8):
Figure 8.8 Image acquisition geometry of different satellite sensors: camera (instantaneous acquisition), whiskbroom (mechanical sweeping acquisition), pushbroom (CCD line acquisition), and radar (side-looking acquisition). (Courtesy and copyright Serge Riazanoff, VisioTerra, 2009.)
• Array camera systems (for instantaneous acquisition), such as photogrammetric cameras, metric cameras (MC), or large format cameras (LFC)
• Mechanical rotating or sweeping mirrors, such as Landsat-MSS, TM, or ETM+
• Pushbroom scanners (for line acquisition), such as MERIS, SPOT-HRV/high resolution in geometry (HRG), and IRS-1C/D
• Agile scanners (for line acquisition), such as IKONOS, QuickBird, WorldView
• SAR sensors (for side-looking acquisition), such as Environmental Satellite (ENVISAT), Radarsat-1/2, COSMO-SkyMed, TerraSAR-X
Although each sensor has its own unique characteristics, one can draw generalities for the development of 2D/3D physical models in order to fully correct all distortions described in Section 8.2. The physical model should mathematically model all distortions of the platform (position, velocity, and attitude for VIR sensors), sensor (lens, viewing/look angles, and panoramic effect), Earth (ellipsoid and relief for 3D), and cartographic projection. The geometric correction process can address each distortion one-by-one, either step-by-step or simultaneously. In fact, it is better to consider the total geometry of viewing (platform + sensor + Earth + map), because some of the distortions are correlated and have the same type of impact on the ground. It is theoretically more precise to compute only one combined parameter rather than each component of this combined parameter separately; this also avoids overparameterization and correlation between terms.
Some examples of combined parameters include the following:
• The "orientation" of the image is a combination of the platform heading due to orbital inclination, the yaw of the platform, and the convergence of the meridian.
• The "scale factor" in the along-track direction is a combination of the velocity, altitude, and pitch of the platform, the detection signal time of the sensor, and a component of the Earth's rotation in the along-track direction.
• The "leveling angle" in the across-track direction is a combination of platform roll, the viewing angle, the orientation of the sensor, the Earth's curvature, etc.
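The overparameterization point can be illustrated with a tiny least-squares sketch (not from the source; all values are hypothetical): terms that have the same type of impact on the ground contribute identical columns to the adjustment's design matrix, so only their sum — the combined parameter — is observable:

```python
import numpy as np

# Sketch: three angular terms (e.g., platform heading, platform yaw, meridian
# convergence -- hypothetical here) rotate the image the same way on the
# ground, so each contributes the SAME column to the design matrix.
rng = np.random.default_rng(0)
t = rng.uniform(-1.0, 1.0, 20)          # per-point observable (illustrative)

# Estimating the three angles separately: identical columns make the
# least-squares problem rank-deficient (rank 1, not 3).
A_separate = np.column_stack([t, t, t])
print(np.linalg.matrix_rank(A_separate))   # 1

# Estimating one combined "orientation" parameter instead is well posed.
A_combined = t.reshape(-1, 1)
print(np.linalg.matrix_rank(A_combined))   # 1 parameter, rank 1
```

With the separate parameterization, the normal equations are singular and any split of the total rotation among the three angles fits equally well; the combined parameter removes that ambiguity.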
Considerable research has been carried out to develop robust and rigorous 3D physical models that describe the acquisition geometry related to different types of images (VIR and SAR images; low-, medium-, and high-resolution images) and of platforms (spaceborne and airborne). The 2D physical model was developed for ERTS imagery (Kratky 1971), and 3D physical models were developed for the following:
• Low-/medium-resolution VIR satellite images (Bähr 1976; Masson d'Autume 1979; Konecny 1979; Sawada et al. 1981; Khizhnichenko 1982; Friedmann et al. 1983; Guichard 1983; Toutin 1983; Salamonowicz 1986; Konecny, Kruck, and Lohmann 1986; Gugan 1987; Konecny et al. 1987; Kratky 1987; Shu 1987; Paderes, Mikhail, and Fagerman 1989; Westin 1990; Novak 1992; Robertson et al. 1992; Ackermann et al. 1995; Sylvander et al. 2000; Westin 2000)
• High-resolution VIR satellite images (Gopala Krishna et al. 1996; Jacobsen 1997; Cheng and Toutin 1998; Toutin and Cheng 2000; Bouillon et al. 2002, 2006; Chen and Teo 2002; Hargreaves and Robertson 2001; Toutin 2003a; Westin and Forsgren 2002)
• SAR satellite images (Rosenfield 1968; Gracie et al. 1970; Leberl 1978; Wong, Orth, and Friedmann 1981; Curlander 1982; Naraghi, Stromberg, and Daily 1983; Guindon and Adair 1992; Toutin and Carbonneau 1992; Tannous and Pikeroen 1994; Toutin and Chénier 2009)
• Airborne VIR images (Derenyi and Konecny 1966; Konecny 1976; Gibson 1984; Ebner and Muller 1986; Hoffman and Muller 1988)
• Airborne SLAR/SAR images (La Prade 1963; Rosenfield 1968; Gracie et al. 1970; Derenyi 1970; Konecny 1970; Leberl 1972; Hoogeboom, Binnenkade, and Veugen 1984; Toutin, Carbonneau, and St-Laurent 1992)
The 2D physical model for ERTS by Kratky (1971) took into consideration and mathematically modeled, in a step-by-step manner, the effects of scanner geometry, panoramic effect, Earth's rotation, satellite circular orbit and attitude, nonuniform scan rate, and map projection, finalizing with two simple equations (one for each axis) that mathematically integrated all the previous error equations.
The starting points of other research studies in deriving the mathematical functions of the 3D physical model are generally as follows:
• The well-known collinearity condition and equations (Bonneval 1972; Wong 1980) for VIR images are given by

$$x = -f\,\frac{m_{11}(X - X_0) + m_{12}(Y - Y_0) + m_{13}(Z - Z_0)}{m_{31}(X - X_0) + m_{32}(Y - Y_0) + m_{33}(Z - Z_0)} \qquad (8.4)$$

$$y = -f\,\frac{m_{21}(X - X_0) + m_{22}(Y - Y_0) + m_{23}(Z - Z_0)}{m_{31}(X - X_0) + m_{32}(Y - Y_0) + m_{33}(Z - Z_0)} \qquad (8.5)$$

where (x, y) are the image coordinates; (X, Y, Z) are the map coordinates; (X0, Y0, Z0) are the projection-center coordinates; −f is the focal length of the VIR sensor; and [mij] are the nine elements of the orthogonal 3-rotation matrix.
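As an illustrative sketch only (the rotation order and all numerical values below are assumptions, not from the source), Equations 8.4 and 8.5 can be evaluated directly once the rotation matrix [mij] is built from three rotation angles:

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Orthogonal 3-rotation matrix [m_ij] from three angles (radians).

    Rotation order omega -> phi -> kappa is an assumption for this sketch;
    conventions vary between photogrammetric formulations.
    """
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, so], [0, -so, co]])
    Ry = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
    Rz = np.array([[ck, sk, 0], [-sk, ck, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def collinearity_xy(X, Y, Z, X0, Y0, Z0, f, m):
    """Image coordinates (x, y) of ground point (X, Y, Z) per Eqs. 8.4-8.5."""
    d = np.array([X - X0, Y - Y0, Z - Z0])
    u, v, w = m @ d          # rotated ground-to-center vector
    return -f * u / w, -f * v / w

# Hypothetical near-nadir frame-camera geometry (all values illustrative):
m = rotation_matrix(0.01, -0.005, 0.002)
x, y = collinearity_xy(500.0, 300.0, 50.0, 480.0, 290.0, 850.0, 0.153, m)
print(x, y)
```

For a ground point directly below the projection center and zero rotations, both image coordinates reduce to zero, which is a quick sanity check on the implementation.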
• The Doppler and range equations for radar images are as follows:

$$f = \frac{-2\,(V_S - V_P)\cdot(S - P)}{\lambda\,|S - P|} \qquad (8.6)$$

$$r = |S - P| \qquad (8.7)$$

where f is the Doppler value, r is the range distance, S and VS are the sensor position and velocity, P and VP are the target point position and velocity on the ground, and λ is the radar wavelength.
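Equations 8.6 and 8.7 translate directly into code. The state vectors below are hypothetical and chosen so the geometry is exactly broadside (sensor velocity perpendicular to the line of sight), where the Doppler value should vanish:

```python
import numpy as np

def doppler_and_range(S, V_S, P, V_P, wavelength):
    """Doppler value f (Eq. 8.6) and slant range r (Eq. 8.7)."""
    los = S - P                          # sensor-to-target line-of-sight vector
    r = np.linalg.norm(los)              # range distance (8.7)
    f = -2.0 * np.dot(V_S - V_P, los) / (wavelength * r)   # Doppler (8.6)
    return f, r

# Hypothetical broadside geometry (all values illustrative):
S = np.array([7.0e6, 0.0, 0.0])          # sensor position (m)
V_S = np.array([0.0, 7.5e3, 0.0])        # sensor velocity (m/s), across LOS
P = np.array([6.4e6, 0.0, 0.0])          # ground target (non-rotating Earth)
V_P = np.zeros(3)                        # target velocity
f, r = doppler_and_range(S, V_S, P, V_P, wavelength=0.056)  # ~C-band
print(f, r)   # zero Doppler at broadside; r = 600 km
```

In a full model, VP would carry the Earth-rotation component of the target velocity, shifting the zero-Doppler position away from broadside.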
It should be noted that collinearity equations were adapted as radargrammetric equations to process radar images (Leberl 1972, 1990) and later as an integrated and unified mathematical equation to process multisensor (VIR or radar) images (Toutin 1995b).
The collinearity equations are valid for an instantaneous image or scan-line acquisition, such as photogrammetric cameras (LFC, MC) and VIR scanner sensors (used aboard SPOT, Landsat), and the Doppler-range equations are valid for a SAR scan line. However, since the parameters of neighboring scan lines of scanners are highly correlated, it is possible to link the exposure centers and rotation angles of different scan lines to integrate supplemental information, such as either of the following:
• The ephemeris and attitude data, using the laws of celestial mechanics (Figures 8.3 and 8.4), for satellite images
• The global positioning system (GPS) and inertial navigation system (INS) data for airborne images
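One common way to exploit that correlation, sketched here with purely hypothetical ephemeris values, is to constrain all exposure centers to a single low-order polynomial of time fitted to the ephemeris, rather than estimating one free position per scan line:

```python
import numpy as np

# Hypothetical ephemeris samples: along-track position (m) at a few times (s),
# roughly a 6.75 km/s orbital ground speed. Values are illustrative only.
t_eph = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
x_eph = np.array([0.0, 13500.0, 27010.0, 40490.0, 54000.0])

# Constrain all scan-line exposure centers to one smooth second-order model
# of time (3 unknowns total) instead of one free position per line.
coeffs = np.polyfit(t_eph, x_eph, deg=2)

# The exposure center of any scan line is then evaluated at its line time.
line_time = 3.3
x_line = np.polyval(coeffs, line_time)
print(x_line)
```

The same idea applies to the attitude angles, which is what makes the scan-line model solvable with only a few ground control points instead of one orientation per line.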
The integration of different distortions and the mathematical derivation of equations for different sensors are outside the scope of this chapter. They are described for photogrammetric cameras in the works of Bonneval (1972) or Wong (1980); for scanner images in those of Leberl (1972), Konecny (1976), or de Masson d'Autume (1979); for ERTS/Landsat images in the works of Kratky (1971), Bähr (1976), or Salamonowicz (1986); for pushbroom scanners, such as SPOT, in those of Guichard (1983), Toutin (1983), Konecny, Kruck, and Lohmann (1986), and Konecny et al. (1987); and for SAR data in the works of Leberl (1978) or Curlander (1982).