
Image Databases: Search and Retrieval of Digital Imagery
Edited by Vittorio Castelli and Lawrence D. Bergman
Copyright © 2002 John Wiley & Sons, Inc.
ISBNs: 0-471-32116-8 (Hardback); 0-471-22463-4 (Electronic)

SATELLITE IMAGERY IN EARTH SCIENCE APPLICATIONS

However, images tend to dominate the archives of remotely sensed data both in volume and in variety. The applications of remote sensing are numerous in both civilian and military arenas. Examples of civilian applications include daily weather forecasting, long-term climate studies, monitoring atmospheric ozone, forecasting crops, and helping farmers with precision agriculture.

A variety of data collection systems exist today to obtain image and nonimage data using remote sensing. Instruments that obtain image data are in general referred to as imagers. Measurements such as surface height, obtained by altimeters, and reflected radiance from the Earth at various wavelengths, obtained by radiometers, spectrometers, or spectroradiometers, are represented as images. The number of wavelength ranges (spectral bands) used by imaging instruments can vary from one (panchromatic) to hundreds (hyperspectral). A variety of mechanisms are used in sensing, including across-track and along-track scanning (Section 3.4.2), resulting in the need for algorithms modeling the sensing geometry to ensure that the images are properly mapped (registered) to a ground coordinate system. Usually, several detectors are used to obtain images from a given instrument, resulting in the need for proper cross-calibration to ensure that the numbers obtained from the different detectors have the same physical meaning. Images are obtained at various resolutions (or pixel sizes), ranging from one meter to a few kilometers.

Generally, low-resolution images are used for frequent and global coverage of the Earth, whereas high-resolution images are used for occasional and detailed coverage of selected areas.

Remotely sensed images must be radiometrically and geometrically corrected to ensure that they provide accurate information. The corresponding processing usually includes corrections and derivation of "higher levels" of information (beyond gray levels and colors of pixels) such as chlorophyll concentration, sea surface temperature, cloud heights, ice motion, and land cover classes. In designing a database for remotely sensed image data, it is important to understand the variety of data collection systems, the types of processing performed on the data, and the various ways in which the data and the derived information are accessed. The goal of this chapter is to discuss the issues associated with managing remotely sensed image data, with a particular focus on the problems unique to such data, and to provide a few examples of how some existing systems handle those problems.

Section 3.2 gives a brief history of remote sensing. Section 3.3 provides a discussion of applications with two examples of assessing human impact on the Earth's environment. Section 3.4 covers data collection systems briefly, with emphasis on defining terminology relevant to obtaining and managing image data. Section 3.5 is a discussion of the types of errors and artifacts in remotely sensed images and the corrections required before proceeding with higher levels of information extraction. Section 3.6 outlines processing steps to derive useful information from remotely sensed images. Section 3.7 addresses the implications of the variety of collection systems, processing, and usage patterns on the storage and access of remotely sensed image data. Section 3.8 gives a few examples of systems managing such data and how they address the issues identified in Section 3.7. Section 3.9 concludes the chapter with a summary of the key points.

3.2 HISTORICAL BACKGROUND AND REMOTE SENSING MISSIONS

A brief history of remote sensing, with a focus on land remote sensing, can be found in Ref. [1]. An extensive survey of airborne and space-borne missions and sensors for observing the Earth is given in Ref. [2]. The latter reference describes more than 125 space-borne missions, several of which are series of multiple satellites, and more than 190 airborne sensors. Some of the interesting facts from these references are summarized in the following text to illustrate the variety of applications and the breadth of international interest in remote sensing.

Remote sensing of Earth is said to have started with the first photograph from a balloon over Paris, taken by Gaspard Felix Tournachon in 1858 (see http://observe.ivv.nasa.gov/nasa/exhibits/history/). By 1909, aerial photographs of the Earth were being taken for various applications. The first military Earth-observing satellite, called Discoverer, was launched in August 1960. The data were initially meant to support biomedical research and Earth observations. However, a few months after launch, the data were classified secret and became unavailable for civilian purposes. The first satellite for civilian applications was launched by the National Aeronautics and Space Administration (NASA) and the Department of Defense (DOD) in April 1960. It was the first experimental weather satellite and was called the Television Infrared Observation Satellite (TIROS-1). This led to a series of satellites that became operational in 1966 as TIROS Operational Satellites (TOS), to be renamed the National Oceanic and Atmospheric Administration (NOAA) Polar Orbiting Environmental Satellites (POES) in 1970. This was followed in 1978 with a series of NOAA weather satellites carrying the Advanced Very High-Resolution Radiometer (AVHRR) that have been used for both meteorologic applications and studies of vegetation and land use. NASA launched the first of the series of satellites dedicated to land remote sensing in July 1972. This was initially called the Earth Resources Technology Satellite (ERTS-1) and was later renamed Landsat-1. Since then, there have been several Landsat satellites. The latest in the series, Landsat-7, was launched in April 1999. While Landsat provides relatively high-resolution data (30 m), AVHRR provides coarser resolution (1.1 km) data, but more frequent observations of a given location on Earth. The series of NOAA Geostationary Operational Environmental Satellites (GOES) provides continuous observations of the Earth at an even coarser resolution. These are principally used for continuous monitoring of atmospheric phenomena and for supporting weather forecasts. The satellite series called SPOT, designed by the Centre National d'Etudes Spatiales (CNES) in France, has been providing remotely sensed land images since 1986. The latest in the series, SPOT 4, was launched in March 1998. In September 1999, Space Imaging, Inc., a private company in the United States, launched IKONOS, a satellite for high-resolution (1-m panchromatic and 4-m multispectral) imaging. Since the launch of IRS-1A in March 1988, India has had a series of remote sensing satellites, called IRS. There have been other satellites such as the Nimbus series (Nimbus-1, launched in August 1964, to Nimbus-7, launched in October 1978), SeaSat (launched in June 1978), and SeaStar (with the SeaWiFS instrument, launched in August 1997) that have been used mainly for ocean, coastal zone, and fishery applications.

NASA initiated a program called the Mission to Planet Earth (MTPE) in the 1980s. This is a part of the broader U.S. Global Change Research Program (USGCRP). The MTPE has since been renamed the Earth Science Enterprise. There are several partners from other agencies (e.g., NOAA, U.S. Geological Survey, Department of Energy) and countries (e.g., Australia, Brazil, Canada, Finland, France, Japan, Netherlands) in this program. The purpose of this comprehensive program is to study the Earth as an integrated and coupled system, consisting of the atmosphere, oceans, and landmasses interacting with each other over a range of spatial and temporal scales [3–6]. Phase 1 of this program consisted of many spacecraft, several of which are still operational, including the NASA missions Earth Radiation Budget Satellite (ERBS), Upper Atmospheric Research Satellite (UARS), Topography Experiment for Ocean Circulation (TOPEX/Poseidon, joint with France), and Tropical Rainfall Measuring Mission (TRMM, joint with Japan). Several non-NASA missions that are considered part of Phase 1 of MTPE include NOAA-12 and NOAA-14, the European Remote Sensing Satellites (ERS-1 and -2), the Japanese Earth Resources Satellite (JERS-1), and Radarsat (Canada). Phase 2 of this program consists of the Earth Observing System (EOS). EOS was considered the centerpiece of the MTPE and continues to be the biggest part of NASA's Earth Science Program. EOS consists of a series of satellites and a variety of instruments that will monitor the Earth from space. The primary purpose of EOS is to establish a long-term basis for determining the extent, causes, and consequences of global climate change. The first two EOS instruments were launched on TRMM in November 1997. The first major EOS satellite, Terra (formerly known as EOS-AM), was launched in December 1999. This is to be followed by Aqua (formerly known as EOS-PM) in 2001 and Aura (formerly known as EOS CHEM) in 2003. Complete details of all satellites and instruments constituting the EOS Program can be found in the Earth Science Enterprise/EOS Reference Handbook [6].

3.3 APPLICATIONS OF REMOTE SENSING

Remotely sensed data have many different civilian and military applications in diverse disciplines. A few examples of civilian applications are as follows: characterizing and studying variations in atmospheric ozone, quantifying and identifying causes and effects of pollution, forecasting weather, monitoring volcanic eruptions, forest fires, floods, and other natural hazards, tracking sea-ice motion, mapping and studying temporal changes in global biomass productivity, studying deforestation and desertification, topographical mapping, forecasting crops, supporting precision agriculture, monitoring urban change, land use planning, and studying long-term climate change. These applications range from providing detailed information covering small areas to individuals and small organizations for commercial purposes to generating global-scale information that is relevant in formulating international policies. Examples of international policies in which remotely sensed information has played a significant role include the 1987 Montreal Protocol for eliminating chlorofluorocarbons (CFCs) [7] and the emission-reducing accords from the Kyoto Climate Change Conference in December 1997 [8]. Although accuracy requirements may vary from application to application, a common theme is that decisions with individual, regional, national, or international impact are made using information derived from remote sensing data. Especially in cases in which the information is used in making a public policy that affects the lives of millions of people, it is extremely important that the quality of the data and the scientific basis of the algorithms that are used to derive conclusions from the data are well understood, documented, and preserved for posterity. Preserving all the data and related information and making them conveniently accessible to users are as important as collecting the data using remote sensing systems.

Many interesting applications, images, and animations can be found at the URLs listed at the end of this chapter. We will consider two examples of applications of remote sensing. The first provides an illustration of a time series of data from NASA's Total Ozone Measuring System (TOMS) instruments used to observe the progression of the Antarctic ozone concentration over the past several years. The second provides an example of observing deforestation over time using Landsat data.

Ozone (O3) in the Earth's atmosphere is critical to life on the surface of the Earth [9]. Although it is a lethal pollutant at lower altitudes, at high altitudes it screens the ultraviolet (UV) radiation that destroys cells in plants, animals, and humans. Extreme exposure to UV radiation causes skin cancer. Ground-based instruments and those flown aboard balloons, aircraft, and spacecraft have been used extensively for measuring ozone concentration. The ozone concentration is measured in parts per million (the number of O3 molecules per million molecules of air) and is typically only a few parts per million. Measurements show that about 90 percent of the ozone in the atmosphere is in the stratosphere (altitudes between 10 and 50 km). Therefore, the ozone layer is generally referred to as the stratospheric ozone layer. The "thickness" of the ozone layer is measured in "Dobson Units (DU)." To define DU, imagine that all the ozone contained in a vertical column of atmosphere above a given ground location is brought to sea level at room temperature. The average thickness of such a layer of ozone over the globe is 3 mm (about the thickness of two stacked pennies). This is designated as 300 DU. Despite its very low concentration, ozone is critical to the survival of life on Earth.
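As a small worked check of this definition (the symbol t for the compressed-layer thickness is introduced here only for convenience): since a 3-mm layer is designated 300 DU, one Dobson Unit corresponds to a 0.01-mm layer of pure ozone at standard conditions,

```latex
% 1 DU = 0.01 mm of pure ozone brought to sea level, so
N_{\mathrm{DU}} \;=\; \frac{t}{0.01\,\mathrm{mm}}
\;=\; \frac{3\,\mathrm{mm}}{0.01\,\mathrm{mm}} \;=\; 300\,\mathrm{DU}.
```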

In the 1920s, CFCs, a family of chemicals, were developed as a safe substitute for flammable substances such as ammonia for use in refrigerators and spray cans. Over subsequent decades, there was an enormous growth in the use of CFCs. Although the amount of chlorine occurring naturally in the atmosphere is very low, CFCs introduced a significant amount of chlorine into the ozone layer. Under certain conditions, chlorine has the potential for destroying large amounts of ozone. This effect of reducing ozone concentration has, in fact, been observed, especially over Antarctica.

In 1985, Farman and coworkers [10] showed that ozone was disappearing over Antarctica and that the measured amounts were much less than the natural low values. This led to an intensive measurement campaign and analyses that have yielded a nearly complete characterization of the physical and chemical processes controlling Antarctic ozone. Concerns over the health of the ozone layer led, in 1987, to the international agreement called the Montreal Protocol that restricted and ultimately phased out the production of CFCs [45]. There is a time lag of years, however, between the stopping of the production of CFCs and the reduction of their concentration in the stratosphere. As the CFCs begin to decrease, it is expected that the Antarctic ozone amounts should begin to recover. Several types of measurements have been made on a global scale to monitor the ozone layer. Remotely sensed images from a variety of sources, including space-borne instruments, aircraft, and ground-based stations, were needed for the thorough analysis of cause and effect relationships required to support a decision about CFCs. Remotely sensed images from the TOMS instruments are being used for ongoing monitoring of the ozone concentration. The TOMS instruments have been flown on several spacecraft, starting with Nimbus-7 in 1978. The series of images shown in Figure 3.1 illustrates the progression of the ozone hole during the period 1970 through 1993 [before 1978, images were obtained from the backscatter ultraviolet (BUV) instrument on Nimbus-4]. Each image shows a color representation of the thickness of the total ozone column measured in DUs. The images represent the monthly means of thickness for October of each year displayed. As shown in the color scale, the thickness is smallest in the blue areas. Generation of these images involves several steps, starting with obtaining instrument-observed radiance values, deriving ozone concentration values, and mapping them to a standard polar projection for display. Access to image data from TOMS can be obtained through http://toms.gsfc.nasa.gov and http://daac.gsfc.nasa.gov.

Figure 3.1. The progression of the hole in the ozone layer between 1970 and 1993, imaged by the Total Ozone Measuring System (TOMS) instruments (panels show total ozone in DU for Octobers of 1970, 1971, 1972, 1979, 1984, 1989, 1991, and 1993; pre-1978 panels are from the BUV instrument). A color version of this figure can be downloaded from ftp://wiley.com/public/sci tech med/image databases.

Reduction of forest areas around the world due to natural causes, such as forest fires, and human activities, such as logging and converting of forested regions into agricultural or urban regions, is referred to as deforestation. Deforestation has significant impact on the global carbon cycle and biodiversity. The removal of large trees and the burning of debris to clear the land increase the carbon dioxide content of the atmosphere, which may have an impact on climate. Tropical rain forests occupy about 7 percent of the land area of the Earth. However, they are home to more than half of the living plant and animal species. Thus, deforestation can lead to massive extinction of plant and animal species. The largest tropical rain forest in the world is the Amazon Rain Forest. It covers parts of Bolivia, Brazil, Colombia, Ecuador, and Peru.

Figure 3.2. Example of deforestation in Rondonia, Brazil, as it appears in Landsat TM images. A color version of this figure can be downloaded from ftp://wiley.com/public/sci tech med/image databases.

An example of deforestation over time is shown in Figure 3.2, which shows two Landsat images of Rondonia, Brazil. The left image was obtained in 1975 and the right image in 1986. Significant increase in human population occurred between 1975 and 1986, resulting in colonization of the region adjacent to the main highway and conversion of forestland to agricultural use. It is easy to see these areas in the 1986 image: they appear as a fish-bone pattern. By accurately coregistering images such as these (i.e., overlaying them), comparing them, and using classification techniques discussed in Section 3.6, it is possible to obtain accurate estimates of the extent of deforestation. By analyzing a large number of Landsat images such as these (it takes about 210 Landsat images to cover the Brazilian Amazon Basin), it has been shown that between 1975 and 1988, about 5.6 percent of the Brazilian Amazon Basin became deforested. The impact on biodiversity is even greater than that indicated by this estimate of deforestation, because the natural plants and animals in the forest are adversely affected by isolation of previously contiguous habitats. Contiguous areas of less than 100 km2 are considered isolated. Greater exposure to winds and predators at the boundary between forested and deforested areas also affects the natural plants and animals. The habitat within 1 km of the forest boundary is considered to be affected in this manner. With these considerations, it is estimated that about 14.4 percent of the habitat for natural plant and animal life in Brazil was impacted [11].

Acquiring remotely sensed images on a global scale periodically and over a long period of time, and archiving them along with ancillary data (i.e., data other than the remotely sensed image data needed for analysis), metadata, and the results of analyses, will help monitor deforestation, assist in policy making, and aid in studying the impacts of changes in policy. Because of the importance of this issue, there are several global and regional scale initiatives to monitor the forests of the world over time. Examples of global initiatives are as follows:

• The international Global Observations of Forest Cover (GOFC) Project

• NASA Landsat Pathfinder Humid Tropical Forest Inventory Project (HTFIP)

• Commission of the European Communities Tropical Ecosystem Environment Observation by Satellite (TREES) Project

• High-Resolution Data Exchange Project of the Committee on Earth Observation Satellites (an international affiliation of space agencies) and the International Geosphere Biosphere Program

• The multinational Global Forest Mapping Program (GFMP) led by the Earth Observation Research Center (EORC) of the National Space Development Agency of Japan (NASDA)

Regional initiatives include the following:

• North American Landscape Characterization (NALC) Project

• Regional Multiresolution Land Characteristics (MRLC) covering the conterminous United States

• U.S. Forest Inventory and Analysis (FIA) and the Canadian National Forestry Database Program (NFDP)

More details about these and other programs can be found in Ref. [12].

3.4 DATA COLLECTION SYSTEMS

There are a large variety of systems for collecting remotely sensed data. These can be categorized in several ways according to the

• type of instrument (imager, sounder, altimeter, radiometer, spectrometer, etc.),

• mechanics of the instrument (push broom, whisk broom, serial cross-track, parallel cross-track, conical scan, step-staring, etc.),

• sensing mechanism (passive or active),

• viewing characteristics (pointable or fixed; nadir- or off-nadir-looking; single- or multiangle, i.e., mono or stereo),

• spectral characteristics measured (panchromatic, multispectral, hyperspectral),

• spatial resolution (high, moderate, low),

• observed wavelength range (UV, visible, near infrared, thermal, microwave, etc.),

• platform (aircraft or spacecraft), and

• altitude (in the case of airborne sensors) or orbit (in the case of space-borne sensors): sun-synchronous, geosynchronous, geostationary, low inclination.

Some of the foregoing terms, especially the ones contributing to data management issues, are defined in the following text. Because the focus of this chapter is on image data, the discussion will be limited to the terms relating to imaging instruments. For a more complete discussion of terminology, see Refs. [2,13].

Instruments used in remote sensing belong to one of the following categories:

• Imagers: instruments containing sensors (also called detectors) that measure characteristics of the remote object (e.g., the Earth) and that produce measurements that can be represented as an image. Most of the imagers used in remote sensing acquire images electronically by measuring the radiance incident at the sensor(s), convert the data into digital format, and transmit the results to a receiving system on the ground.

• Altimeters: instruments that measure the height of the platform above the surface. The altitudes are measured over a "reference ellipsoid," a standard for the zero-altitude surface of the Earth. The altitudes can be represented as a gray scale or color-coded two-dimensional image or as a three-dimensional surface.

• Radiometers: instruments that measure the radiance (reflected or emitted) from the Earth's surface in a given set of wavelength bands of the electromagnetic spectrum.

• Spectrometers: instruments that measure the spectral content of the radiance incident at their sensor(s).

• Spectroradiometers: instruments that measure both the radiance values and their spectral distribution.

Generally, unlike conventional cameras, an imaging instrument does not obtain an image as a "snapshot" at a single point in time. It uses a relatively small number of sensors and relies on some form of scanning mechanism and on the motion of the aircraft (or spacecraft) to measure radiance from different parts of the scene being imaged. Consider the general imaging geometry shown in Figure 3.3. Here, a detector D measures the radiance reflected from point P on Earth; S is the source of illumination, which is generally the Sun. The detector actually measures radiance from a finite region surrounding P, called the instantaneous field of view (IFOV). (Note that the size of an IFOV is loosely referred to as resolution. Each IFOV results in a pixel in the sensed image.) As the platform (spacecraft or aircraft) moves along its path, the IFOV traces a small strip on the ground. By using an array of detectors in an instrument, the radiance values from an array of IFOVs can be measured simultaneously.

Figure 3.3. General imaging geometry. D: detector; N: nadir point (the point on Earth vertically beneath D); P: point being viewed by D; S: source of illumination (usually the Sun). The flight line defines the along-track direction; cross-track is perpendicular to it.

Wide ground swaths are imaged using various combinations of scanning mechanisms, arrays of detectors, and platform motion. Two of the commonly used combinations and the synthetic-aperture radar instrument are discussed in the following list:

• Across-track scanners: By means of a rotating or oscillating mirror, an across-track scanner traces a scan line along which the detector measures radiance values of an array of IFOVs. The continuous signal measured by the detector is sampled and converted to digital counts through an analog-to-digital conversion process. Thus, a scan line results in a row of pixels in the image. As the platform (spacecraft or aircraft) moves, successive scan lines are traced. The instrument can have several detectors, so that, as the mirror oscillates, n scan lines can be traced simultaneously. The platform velocity and scanning period are matched to avoid overlaps or gaps between successive sets of n scan lines. If n = 1, the instrument is called a serial cross-track scanner; if n > 1, it is called a parallel cross-track scanner.

• Along-track scanners: An along-track scanner uses a linear array of detectors arranged perpendicular to the direction of platform motion. There is no scanning mirror. Each detector measures the radiance values along a track parallel to the platform motion, thus generating a column of pixels in the resulting image. The set of detectors in the linear array generates a row of pixels at a time.

• Synthetic-aperture radar: Some radar instruments (see the following section on active sensing) measure reflected signals from the ground as the platform moves and mathematically reconstruct high-resolution images, creating images as if obtained with a very large antenna. These are called synthetic-aperture radar instruments (SAR).

Sensing mechanisms can be either passive or active:

• Passive: A passive sensor measures radiance reflected (or emitted) from the Earth without an active source of radiation in the sensor. The source of illumination with passive sensors is usually the Sun.

• Active: An active sensor illuminates the target with radiation in a narrow spectral band (e.g., radar or laser) and measures the reflected (echo) radiance.

Depending on the acquisition direction, instruments can be categorized as follows:

• Nadir-looking: Nadir is the point (point N in Figure 3.3) on Earth that is vertically below the sensor. The set of all points vertically below a platform as it moves is referred to as the nadir track. A nadir-looking instrument images a swath (narrow or broad) on either side of the nadir track. An off-nadir- or side-looking instrument images a swath on one side or the other of the nadir track. To describe the imaging more precisely, one needs to specify the maximum and minimum "look angles" viewed by the instrument in the cross-track direction. There are also instruments that view in the off-nadir directions along track.

• Pointable or fixed: Some instruments are fixed, so that the direction they point to does not change, except for scanning and minor variations in the platform attitude (its relative orientation with respect to the Earth). There are other instruments whose pointing direction can be controlled so that areas of interest are imaged using commands from the ground.

• Single- or multiangle: Some instruments can image a given ground location from different angles within a short span of time. This permits a "stereo" capability useful for developing digital elevation maps. Multiangle views of Earth are also useful to study the effects of varying lengths of atmospheric column and "bidirectional" reflectance properties (i.e., differences in the reflectance of objects when viewed from two different directions).

An instrument may measure reflected or emitted radiance values in broad or narrow spectral bands and in a small or large set of spectral bands. The spectral resolution of a sensor is defined as the narrowest spectral bandwidth that it can measure. Depending on the spectral resolution and number of bands measured, the instruments are categorized as follows:

• Panchromatic: A panchromatic imager measures radiance in a single band covering a wide spectral range (usually covering the entire visible spectrum and part of the ultraviolet spectrum).

• Multispectral: A multispectral imager measures radiance in a small number of relatively broad spectral bands. It may have several detectors sensitive to each spectral band, and, if there are n spectral bands, they generate n radiance values for each pixel observed.

• Hyperspectral: A hyperspectral imager is a multispectral imager in which the number n of spectral bands is large (usually greater than 200) and the width of each spectral band is small. The spectral bands are narrow enough and sufficiently closely spaced to allow each n-dimensional vector of measured radiance values to approximate the continuous spectrum corresponding to the observed pixel.

The spatial resolution is defined as the minimum distance between two points on the ground that the sensor is able to distinguish. It depends on the sensor itself and on the distance between the sensor and the ground. Sometimes the resolution is specified in angular units (microradians). Most often, however, the spatial resolution is specified in meters (or kilometers) representing one side of a square area on the ground constituting a pixel. High resolution in remote sensing usually refers to pixel sizes less than or equal to 100 m. Moderate resolution ranges from 100 to 500 m. Low resolution refers to pixel sizes above 500 m. Table 3.1 shows a few examples of imaging instruments categorized according to the above characteristics. This table also shows the peak data rates from each of these instruments. The amount of data collected and transmitted by the instruments depends on the data rates and the "duty cycle" (percentage of time that they collect data).

3.5 ERRORS, ARTIFACTS, AND REQUIRED CORRECTIONS

In general, the goal of a remote sensing system is to measure certain parameters of interest that pertain to a given area of the Earth. Many distortions cause the measurements to differ from their ideal values or to be assigned to the wrong pixel locations. This is illustrated by two highly simplified examples in Figure 3.4.

Figure 3.4a shows how radiation from locations other than that desired enters a detector. The desired location here is a small neighborhood around point P. The Earth is illuminated by the Sun S, and the radiation from the desired location is reflected to the detector D. In addition, radiation scattered by point A in the atmosphere reaches the detector. Also, radiation reflected from the point P' is scattered by point B in the atmosphere and reaches the detector.

Figure 3.4b shows how a pixel can be "misplaced" or shifted if information about elevation is ignored. Geographic coordinates are assigned to each pixel, depending on the angular displacement of the observed point relative to a reference direction (e.g., the vertical). Suppose a feature (e.g., a mountain or a tall building) with elevation h is present at point P on Earth. Then, the radiation from that feature will be observed along the line DQ (rather than DP). Thus, if the elevation information is ignored, the observation is assigned to a pixel number corresponding to the angle NDQ (rather than NDP). In effect, the resulting image will appear as if the radiation at Q were arising from the point P on Earth.

Figure 3.4. (a) Atmospheric effects. (b) Elevation effect.

Figure 3.5. "Generic" data flow: radiance from the Earth is measured by the detector and digitized by an analog-to-digital converter on board the spacecraft; on the ground, radiometric and geometric corrections are computed (using the detector PSF and on-board calibration data) and applied, geophysical parameters are derived (with atmospheric corrections), and the results feed data assimilation/fusion.

Figure 3.5 shows a "generic" data flow diagram to illustrate the various steps in the acquisition, correction of errors and artifacts, and generation of useful information from image data. In the following sections, we will address the correction of telemetry errors and artifacts, radiometric errors, and geometric errors.

3.5.1 Telemetry Errors and Artifacts

Telemetry errors are likely to occur when the data are received from the satellite at a ground station. Because of errors in various parts of the sensing, recording, and communications systems, data bits may be "flipped." The bit flips are minimized using error-detecting and error-correcting codes (e.g., Reed-Solomon codes). Typical specifications for satellite telemetry data are that bit errors before correction be fewer than 1 in 10^5. Generally, the post-correction rates are required to be fewer than 1 in 10^6 to 1 in 10^7. Observed post-correction error rates tend to be significantly lower (e.g., 1 in 10^12).
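For a sense of scale, here is a back-of-the-envelope sketch (assuming a 200-MB data set, a file size quoted later in this section) of how many residual flipped bits these rates imply:

```python
# Expected residual (post-correction) bit errors in a 200-MB telemetry file,
# using the example error rates quoted in the text (illustrative only).
bits = 200e6 * 8                       # 1.6e9 bits in a 200-MB data set

for label, ber in [("required ceiling", 1e-6), ("observed", 1e-12)]:
    print(f"{label}: ~{bits * ber:.2g} expected bit errors")

# required ceiling: ~1.6e+03 expected bit errors
# observed: ~0.0016 expected bit errors
```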

In the case of satellites with multiple sensors on board, data from different sensors may be interleaved in the transmission to the ground stations. Some data may be downlinked directly and others may be played back from onboard recorders. The data may be out of time sequence, that is, some of the data observed earlier may be downlinked later. In fact, data may be received at multiple ground stations and may need to be assembled into a properly time-ordered data stream. All these effects are referred to as telemetry artifacts. Removal of such artifacts involves separating data from various instruments into individual data sets, arranging them in proper time order, and subdividing them into files covering some meaningful (usually fixed) time period. Such files are called production data sets (PDSs). The time period covered by PDSs may range from 2 hours to 24 hours and usually depends on the instrument, its data rate, and its application, so that the sizes of data sets are reasonable and the data are available for higher levels of processing and analysis at a reasonable time after observation. Typically, for EOS instruments, the files range in size from 200 MB to 6 GB.

3.5.2 Radiometric Errors

For purposes of illustration, consider a multispectral sensor imaging the Earth's surface with a cross-track scanning mechanism, such as the Landsat Thematic Mapper (TM). The sensor can have multiple spectral bands and multiple detectors in each of the spectral bands. Under given solar illumination conditions, each detector measures the reflected light in a given spectral band. The reflectance in multiple spectral bands characterizes the area being viewed on Earth. Ideally, the numbers recorded by each of the detectors in a given spectral band should be the same when they view the same target. However, there are several sources of detector-to-detector variation. The atmosphere affects the amount of reflected light reaching the detectors. The atmospheric effect depends on the viewing angle of the detectors: the farther from nadir a given location on Earth is, the longer the atmospheric column that the reflected light passes through, and the greater the attenuation. The effect of the atmosphere also depends on the spectral band in which a detector is measuring, because the atmospheric absorption is wavelength-dependent. Detectors also may have nonlinear responses to incident radiation. Ideally, a detector should record a point on Earth as a point in the image. However, real detectors generate a "spread" or "smear" in the image corresponding to a given point on Earth. This is a measurable characteristic of a detector and is called its point-spread function (PSF). Because point-spread functions of the detectors differ from the ideal, the values measured for a given location on Earth are affected by the values from its neighboring locations. Knowledge of all these effects is required to retrieve the detector's ideal response, that is, to radiometrically calibrate the detector's response. The calibration of the detector responses requires prelaunch measurements of a known "standard" source of radiation. Also, each detector, while in orbit, makes frequent measurements of an onboard calibration source. Some instruments, for purposes of calibration, may use occasional views of "constant" sources of radiance, such as deep space, the Sun, and/or the Moon. Such calibration data are generally stored as a part of the instrument data stream.

3.5.3 Geometric Errors

Acquired images are subject to geometric distortions. Ideally, we would like to know the exact location on Earth that a detector views at any given instant and associate the value recorded by the detector with that location. The geometric model that determines (estimates) the location to which a detector points is sometimes referred to as the look-point model. Some of the geometric effects to be considered in constructing the look-point model are detector-to-detector displacements; location (ephemeris) of the spacecraft relative to the Earth; orientation (attitude) of the spacecraft; scanning and detector sampling frequency; variation of pixel size (area viewed by a detector) due to viewing angle; pixel displacements due to atmospheric refraction; pixel displacements due to elevation effects; and the desired map projection.

Geometric calibration consists of applying knowledge of these effects to determine the pixel placements. Geometric calibration performed by modeling the known distortions indicated earlier is called systematic correction. Because of errors in the knowledge of parameters that define the look-point model, there generally are residual errors. These can be corrected using ground control points (GCPs). GCPs are identifiable objects in images for which corresponding ground coordinates can be accurately determined from a map (or another image that has been geometrically corrected). Prelaunch measurements of the detector configuration, validation of the scanning geometry model, and validation of the geometric correction process using targets with known geometry are essential. For each image that is observed, the geometric correction parameters need to be computed and stored along with the image data.

The specific types of prelaunch measurements, calibration procedures, onboard calibration mechanisms, and the radiometric and geometric calibration parameters to be stored with the data vary from instrument to instrument. However, in all cases, it is essential to maintain the calibration parameters in order to interpret the remotely sensed data meaningfully and obtain accurate information from them.

3.6 PROCESSING

3.6.1 Data Levels

Remotely sensed data products are commonly categorized by processing level, defined as follows:

Level 0: Reconstructed, unprocessed instrument or payload data at full resolution, with any and all communication artifacts (e.g., synchronization frames, communication headers, and duplicate data) removed. The PDSs defined in Section 3.5.1 are Level 0 data products.

Level 1A: Reconstructed, unprocessed instrument data at full resolution, time-referenced, and annotated with ancillary information, including radiometric and geometric calibration coefficients and georeferencing parameters (e.g., platform ephemeris) computed and appended but not applied to the Level 0 data.

Level 1B: Level 1A data that have been processed to sensor units (by applying the radiometric corrections).

Level 2: Derived geophysical variables at the same resolution and location as the Level 1 source data. Several examples of geophysical variables are given in the following text. For example, sea surface temperature and land surface moisture are geophysical variables derived from the measured radiance values constituting a Level 1B product.

Level 3: Derived geophysical variables mapped on uniform space–time grid scales, usually with some completeness and consistency.

Level 4: Model output or results from analyses of lower levels of data (e.g., variables derived from multiple measurements).

Not all instruments have products at all these levels. For convenience, some of the instrument teams have defined other intermediate levels also. For example, the science team responsible for the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on EOS defines a Level 2G product as an intermediate step in geometric corrections to map a Level 2 product onto a standard grid to produce a Level 3 product [14].

Several examples of the geophysical variables referred to in the foregoing definitions of Level 2 and Level 3 products are grouped as characteristics of the atmosphere, land, and oceans as follows:

Atmosphere: atmospheric temperature profile, atmospheric humidity profile, aerosol vertical structure, cloud height, cloud-top temperature, aerosol optical depth, top-of-the-atmosphere fluxes, lightning events, cirrus ice content, cloud-top altitude, and concentrations of various gases such as chlorofluorocarbon (CFC), methane, ozone, carbon monoxide, and nitrous oxide.

Land: emissivity, surface kinetic temperature, surface temperature, ice sheet elevation, ice sheet roughness, land surface moisture, snow water equivalent, day–night temperature difference, and snow cover.

Oceans: sea surface temperature, ocean surface wind speed, sea ice concentration, sea surface topography, chlorophyll-A pigment concentration, chlorophyll fluorescence, and water-leaving radiance.

In addition, Level 4 products are produced from models that combine (by data assimilation or fusion) several of the lower-level products to characterize certain phenomena such as weather, global circulation, primary productivity, carbon cycle, and so on.

Either individual geophysical variables or groups of several of them constitute data products. The term standard products is used in the EOS Program to refer to "data products that are generated as a part of a research investigation using EOS data that are of wide research utility, are routinely generated, and in general are produced for spatially and/or temporally extensive subsets of the data." The term special data products is used in the EOS Program to refer to "data products that are generated as part of a research investigation using EOS data and that are produced for a limited region or time period, or products that are not accepted as standard by the EOS Investigators' Working Group (IWG) and NASA Headquarters." Special products may be reclassified later as standard.

Several imaging instruments (including some of the instruments on EOS) acquire and process data at a high resolution over small regions of the Earth. The following discussion provides a sampling of methods used in deriving information from such multispectral (or hyperspectral) sensors. There are several textbooks on image processing and multispectral image analysis that cover these topics in much greater detail [13,15–18]. The discussion here is qualitative and avoids much of the mathematical detail needed for a more thorough treatment of this topic. The products containing the derived information in these cases are usually categorized as Level 3 and above, because they are typically mapped to standard ground coordinates (rather than sensor coordinates) on a grid.

From the point of view of accessing and manipulating data, the operations on multispectral images can be categorized as single-band (or single-image) operations, multiband (or multiimage) operations, and data merging (multitemporal, multisensor, image and ancillary data).

3.6.2 Single-Band Operations

In single-band or single-image operations, each of the spectral bands in a multispectral image is handled separately. These operations are used for enhancing the images for visual analysis and interpretation, for removing artifacts of the sensing system, for radiometric and geometric corrections (see Section 3.5), or for spatial feature extraction. Some of the most common single-band operations are described here.

3.6.2.1 Contrast Stretch. Sensor data in a single image generally fall within a narrower range than that of display devices. Color display devices with 8 bits per color permit 256 levels (0 through 255), but the values from a sensor might span a different range. The input pixel values of the image are modified to output values using a linear or nonlinear function that maps the smallest number to 0 and the largest to 255. This improves the contrast in the image and assists an analyst during manual interpretation.
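As a concrete illustration, here is a minimal sketch of a linear min-max stretch in Python with NumPy (the function name and the 8-bit output range are illustrative assumptions):

```python
import numpy as np

def linear_stretch(band: np.ndarray) -> np.ndarray:
    """Linearly map the smallest input value to 0 and the largest to 255."""
    lo, hi = float(band.min()), float(band.max())
    if hi == lo:                        # constant image: nothing to stretch
        return np.zeros_like(band, dtype=np.uint8)
    out = (band.astype(np.float64) - lo) / (hi - lo) * 255.0
    return out.round().astype(np.uint8)
```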

3.6.2.2 Histogram Equalization. A disadvantage of a linear stretch is that the output gray levels are assigned equally to input values regardless of whether they occur rarely or frequently. Frequently occurring gray levels represent a larger part of the image. Therefore, to increase the contrast in a large percentage of the image area, variable stretching of the input ranges is performed on the basis of the frequency of occurrence of input values. The larger the frequency, the wider the range of output values assigned. The result is that the histogram of the output image approximates a uniform distribution.
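A common way to realize this is to map each gray level through the normalized cumulative histogram; a minimal sketch, assuming an 8-bit integer input band:

```python
import numpy as np

def equalize(band: np.ndarray, levels: int = 256) -> np.ndarray:
    """Histogram equalization via the normalized cumulative distribution."""
    hist, _ = np.histogram(band, bins=levels, range=(0, levels))
    cdf = hist.cumsum() / band.size            # cumulative frequency, 0..1
    lut = np.round(cdf * (levels - 1)).astype(np.uint8)
    return lut[band]                           # apply the lookup table per pixel
```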

3.6.2.3 Selective Saturation. A user may be interested in a specific area of the image and may not care about the rest. In such a case, the image values in the area of interest may be stretched to the maximum output range (0 to 255) and the values in the other parts of the image allowed to "saturate." That is, the contrast-stretching function may map some pixel values outside the area of interest to numbers below 0 or above 255. Output values below 0 are set to 0 and those above 255 are set to 255.
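Reusing the linear-mapping idea above, a minimal sketch (the boolean-mask convention for the area of interest is an assumption for illustration):

```python
import numpy as np

def stretch_to_aoi(band: np.ndarray, aoi_mask: np.ndarray) -> np.ndarray:
    """Stretch using the min/max of the area of interest; saturate the rest."""
    lo, hi = float(band[aoi_mask].min()), float(band[aoi_mask].max())
    if hi == lo:
        return np.zeros_like(band, dtype=np.uint8)
    out = (band.astype(np.float64) - lo) / (hi - lo) * 255.0
    return np.clip(out, 0, 255).round().astype(np.uint8)  # clip below 0 / above 255
```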

3.6.2.4 Sensor Noise Removal. Occasional dropouts (or flipped bits) in sensor values may appear in images as individual pixels whose values are very different from those of their neighboring pixels. Such values are adjusted using median filtering: a given pixel's value is replaced by the median value of a set of pixels in a 3 × 3 or 5 × 5 neighborhood centered on the given pixel. This processing step facilitates visual interpretation. However, for accurate analyses, it may be more useful simply to mark such pixels as "wrong" values.
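A minimal sketch using SciPy's median filter (the synthetic test image and the outlier threshold of 50 gray levels are illustrative assumptions):

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)
band = rng.integers(90, 110, size=(64, 64)).astype(np.uint8)
band[10, 10] = 255                      # simulate an isolated dropout pixel

# Replace each pixel with the median of its 3 x 3 neighborhood.
denoised = median_filter(band, size=3)

# Alternatively, only overwrite pixels that deviate strongly from the local
# median, leaving the rest of the image untouched.
outliers = np.abs(band.astype(int) - denoised.astype(int)) > 50
cleaned = np.where(outliers, denoised, band)
```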

3.6.2.5 Destriping. With multiple detectors in a given spectral band (as in parallel cross-track and along-track scanners), the images may appear striped, because the different sensors have slightly different responses and therefore generate slightly different output values even while viewing the same target. For example, a uniform source of reflectance such as a desert or a lake may appear in the image as having a set of horizontal stripes. Although such images must be rigorously calibrated radiometrically for accurate analytical work (see the previous section on radiometric errors), it is possible to use simple algorithms to remove the striping in the image for visual analysis. An example of such an algorithm is based on computing the histograms of the subsets of the image obtained by individual detectors. Thus, if there are n detectors in the instrument, n histograms are obtained. The mean and standard deviation of each histogram are compared to, say, those of the first histogram. For each histogram except the first, two adjustment factors are constructed: a gain and a bias, derived from the spreads and means of the two histograms. Each of the subimages corresponding to the individual detectors is then corrected by applying its bias and gain factors.
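A minimal sketch of this idea follows. It assumes the detector assignment cycles through image rows (row i mod n came from detector i) and uses the matched mean/standard-deviation form of the gain and bias, one common way to realize the adjustment described above:

```python
import numpy as np

def destripe(img: np.ndarray, n_detectors: int) -> np.ndarray:
    """Match each detector's row statistics to those of detector 0."""
    out = img.astype(np.float64).copy()
    ref = out[0::n_detectors]                 # rows from the reference detector
    ref_mean, ref_std = ref.mean(), ref.std()
    for d in range(1, n_detectors):
        rows = out[d::n_detectors]
        gain = ref_std / rows.std()           # ratio of spreads
        # Center, rescale, then shift to the reference mean.
        out[d::n_detectors] = (rows - rows.mean()) * gain + ref_mean
    return out
```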

3.6.2.6 Correcting Line Dropouts. Occasionally, because of telemetry errors, images may have line dropouts: 0 values for all or part of a scan line. Such dropouts appear as black lines in an image. For visual purposes, they can be corrected by assigning the average of the pixel values from the lines above and below to each affected pixel.
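A minimal sketch (it ignores the corner case of two adjacent dropped lines and treats any zero-valued pixel as a dropout, both simplifying assumptions):

```python
import numpy as np

def fill_line_dropouts(img: np.ndarray) -> np.ndarray:
    """Replace zero-valued pixels with the mean of the rows above and below."""
    out = img.astype(np.float64).copy()
    rows, _ = np.nonzero(out == 0)
    for r in np.unique(rows):
        above = out[max(r - 1, 0)]
        below = out[min(r + 1, out.shape[0] - 1)]
        mask = out[r] == 0
        out[r, mask] = (above[mask] + below[mask]) / 2.0
    return out
```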

3.6.2.7 Spatial Filtering. Spatial filters emphasize or block image data at various spatial frequencies. Slow variations in brightness (e.g., over deserts and bodies of water) have low spatial frequency, whereas rapid variations (e.g., at edges between fields and roads, or lines of crops in agricultural fields in high-resolution images) have high spatial frequency. Low-pass (or low-emphasis) filters are designed to reduce the high-frequency components of the image and thus suppress local detail. High-pass (or high-emphasis) filters emphasize the local detail and suppress the overall variations in the image. Spatial filtering uses the input pixel values in a neighborhood of each pixel to determine the values of the output pixel. A simple low-pass filter is an averaging filter that replaces each pixel value by the average of the pixel values in a 3 × 3 or 5 × 5 neighborhood centered at the given pixel. A simple high-pass filter replaces each pixel value by the difference between the input value and the value of the output of an averaging filter at that pixel.

Both types of filters are particular cases of "convolution filters." Convolution filters are implemented using an n × n matrix of "weights," where n is usually an odd number. The matrix is used as a "moving window" that is positioned over each possible location in the input image. The weights and the corresponding input pixel values are multiplied, and the resulting products are all added to form the output pixel value. The 3 × 3 averaging filter uses all weights equal to 1/9. A simple 3 × 3 high-pass filter uses all weights equal to −1/9 except at the center, where the weight is 8/9. The effect of convolution filtering on an image depends on the set of weights and the size of the convolution matrix. For example, the larger the value of n for an averaging filter, the greater the low-frequency enhancement (and the more smeared the appearance of the output image). The high-pass filter discussed earlier removes all low-frequency variations in the image. However, by varying the center weight of the convolution matrix, it is possible to retain different degrees of low-frequency input data and hence vary the amount of edge enhancement obtained.

3.6.2.8 Fourier Analysis. The filters discussed earlier manipulate the image in the spatial domain [i.e., the (x, y) coordinate system in which the image is obtained]. It is also possible to manipulate images in a "transform domain." In the Fourier transform domain, an image is expressed as a sum of a two-dimensional set of sinusoidal functions with different frequencies, amplitudes, and phases. The frequencies (u, v) form the coordinate system for the Fourier transform domain. The amplitudes and phases are recorded for each point (u, v) in the Fourier domain. There are standard algorithms for fast computation of Fourier transforms. The Fourier transform is invertible: the original image values can be recovered from the transform. It can be shown that convolution in the spatial domain is equivalent to multiplication of corresponding values in the Fourier transform domain. Thus, to perform a convolution, one can obtain the Fourier transforms of the input image and the convolution weight matrix, multiply the two Fourier transforms point by point, and obtain the inverse Fourier transform of the result. This method is significantly faster than computing the convolution in the spatial domain unless the filter matrix is very small. In addition, it is sometimes convenient to design filters in the frequency domain. This is done by observing the image displayed in the Fourier domain to highlight some of the frequency anomalies in the original image. Those frequencies are then suppressed directly in the Fourier domain (i.e., they are set to zero). A "clean" image is then derived by obtaining the inverse [13].
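A minimal sketch of convolution by pointwise multiplication in the Fourier domain (this version uses circular boundary handling, an assumption that differs slightly from the spatial-domain filters above):

```python
import numpy as np

def fft_convolve(img: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Convolve via pointwise multiplication in the Fourier domain."""
    kpad = np.zeros_like(img, dtype=np.float64)  # zero-pad kernel to image size
    kh, kw = kernel.shape
    kpad[:kh, :kw] = kernel
    # Shift the kernel center to the origin so the output is not displaced.
    kpad = np.roll(kpad, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kpad)))

img = np.random.default_rng(1).random((64, 64))
avg = np.full((3, 3), 1 / 9)             # the 3 x 3 averaging kernel from above
smooth = fft_convolve(img, avg)
```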

3.6.3 Multiband Operations

Multiband (or multiimage) operations combine the data from two or more spectral bands (or images) to facilitate visual interpretation or automated information extraction. Usually, the purpose of interpretation or information extraction is to distinguish objects on the ground or land cover types. The most common multiband operations are now described briefly.

3.6.3.1 Spectral Ratios. Ratios of corresponding pixel values in a pair of spectral bands are used to compensate for varying illumination effects, such as those caused by surface slopes. In those cases, the measured reflectance values for a given object type may be significantly different in different parts of the image. However, the ratios between the values in two spectral bands remain approximately the same. The ratios are different for different object types. Therefore, it is often possible to discriminate between different types of objects using spectral ratios. However, it is also possible that the spectral ratios hide some of the differences between object types that are not due to variations in illumination. If the absolute reflectance of two objects is different but the slopes of their spectra are the same in a given spectral region, they may yield the same spectral ratio. Because pairs of spectral bands are used to generate ratios, many ratio images can be generated from a multispectral input image. For example, for a Landsat Thematic Mapper image with six nonthermal spectral bands, there are 30 possible ratio images. The ratio images can be displayed in combinations of three (or with two ratio images and one of the input spectral bands) to generate color composites. Thus, a very large number of combinations are possible. A criterion for selecting which ratios to display relies on the amount of variance in a given ratio image and the correlation between ratio images: it is desirable to use images with the highest variance and minimum correlation. An optimum index factor (OIF) is defined on the basis of these criteria to select ratios to display [19].
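Computing a single ratio image is straightforward; a minimal sketch with a guard against division by zero (the zero-fill behavior is an illustrative choice). Note that six bands give 6 × 5 = 30 ordered pairs, matching the count above:

```python
import numpy as np

def ratio_image(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Pixel-by-pixel ratio of two spectral bands, guarding against /0."""
    a = band_a.astype(np.float64)
    b = band_b.astype(np.float64)
    return np.divide(a, b, out=np.zeros_like(a), where=b != 0)
```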

3.6.3.2 Principal Components. Significant correlation may exist between spectral bands of a multispectral image; that is, two or more spectral bands in the image may convey essentially the same information. A scatter plot of the values in two spectral bands that are highly correlated shows most of the values close to a straight line. Thus, by using a linear combination of the two bands, the data values can be projected along the line that captures most of the variance in the data. Generalizing this to n dimensions, one can reduce the effective dimensionality of the data by "packing" the information into fewer dimensions that capture most of the variance in the image. This is done by computing the covariance matrix of the spectral bands. The covariance matrix is then diagonalized using eigenvector analysis [20]. The eigenvectors are used to produce linear combinations of spectral bands, called principal components, which are uncorrelated and whose variances are equal to the eigenvalues. The principal components that correspond to the largest eigenvalues capture most of the intensity variations across the image.
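A minimal sketch of this computation for a band-interleaved image cube (the (bands, height, width) layout is an assumed convention):

```python
import numpy as np

def principal_components(cube: np.ndarray) -> np.ndarray:
    """Principal components of a (bands, height, width) multispectral cube."""
    b, h, w = cube.shape
    x = cube.reshape(b, -1).astype(np.float64)   # one row per band
    x -= x.mean(axis=1, keepdims=True)           # center each band
    cov = np.cov(x)                              # b x b covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]            # largest variance first
    pcs = eigvecs[:, order].T @ x                # project onto eigenvectors
    return pcs.reshape(b, h, w)
```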

3.6.3.3 Canonical Components. As discussed earlier, principal components treat a multispectral image as a whole in determining the correlation between spectral bands. Canonical components are a variation of this concept, wherein the linear combinations of the original spectral bands are computed in order to maximize the visual dissimilarity between objects belonging to different user-selected classes [13].

3.6.3.4 Decorrelation Stretch. The principal components of a multispectral image can be used to perform a "decorrelation stretch" to enhance the color display of highly correlated data. The three most significant components are first obtained using principal components analysis. Each of the components is then independently contrast-stretched to take advantage of the full dynamic range of the display. The stretched data are then transformed back into the original spectral coordinates. This process increases the color saturation in the display better than a simple contrast stretch of the original spectral bands.
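A minimal sketch for a three-band image, using variance equalization of the components as the per-component "stretch" (one common realization; other stretch functions could be substituted):

```python
import numpy as np

def decorrelation_stretch(cube: np.ndarray) -> np.ndarray:
    """Decorrelation stretch of a (3, h, w) cube: rotate to principal
    components, equalize their spreads, rotate back to band coordinates."""
    b, h, w = cube.shape
    x = cube.reshape(b, -1).astype(np.float64)
    mean = x.mean(axis=1, keepdims=True)
    xc = x - mean
    _, eigvecs = np.linalg.eigh(np.cov(xc))      # orthonormal rotation
    pcs = eigvecs.T @ xc                         # into component space
    pcs /= pcs.std(axis=1, keepdims=True)        # stretch each component equally
    out = eigvecs @ pcs + mean                   # back to spectral coordinates
    return out.reshape(b, h, w)
```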

3.6.3.5 Vegetation Components. Certain combinations of spectral bands are found to emphasize differences between vegetation and other types of land cover, as well as the differences among various types of vegetation. For example, in the NOAA AVHRR sensor, combinations of Channel 1 (visible band) and Channel 2 (near-infrared band) are found to be indicative of green vegetation. Such combinations are therefore called vegetation indices. A simple vegetation index (VI) is given by the pixel-by-pixel difference (C2 − C1), where C2 and C1 are the image values in the two channels. The Normalized Difference Vegetation Index (NDVI) is defined as follows:

NDVI = (C2 − C1)/(C2 + C1)
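A minimal per-pixel implementation of this formula, with a guard against division by zero (function and variable names are illustrative):

```python
import numpy as np

def ndvi(c1: np.ndarray, c2: np.ndarray) -> np.ndarray:
    """NDVI = (C2 - C1) / (C2 + C1); C1 is the visible band and C2 the
    near-infrared band (AVHRR Channels 1 and 2 in the text)."""
    c1 = c1.astype(np.float64)
    c2 = c2.astype(np.float64)
    s = c1 + c2
    return np.divide(c2 - c1, s, out=np.zeros_like(s), where=s != 0)
```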

Both VI and NDVI have large values for healthy vegetation because of its high reflectivity in the infrared band. The NDVI is the preferred index for global vegetation monitoring because it is a ratio and, as discussed earlier (see the paragraph on spectral ratios), compensates for differences in illumination conditions, surface slopes, aspect, and so on. The NDVI has been used to produce extensive and frequent maps of global vegetation using AVHRR data [21–23] (see http://edcdaac.usgs.gov/dataproducts.html). In Ref. [24], an empirically determined linear transformation called the "Tasseled Cap" transformation is defined for Landsat multispectral scanner (MSS) data that maps most of the data into two components, brightness and greenness. The latter component is strongly correlated with the amount of vegetation. This concept has been extended by Crist and Cicone [25] to map six bands of Landsat TM data (other than the thermal band) into three components that emphasize soils, vegetation, and wetness. For examples of other work related to vegetation mapping, see Refs. [26–31].

Figure 3.6. Steps in image classification: feature vectors are extracted from the multispectral image; a sample set of "labeled" feature vectors (ground truth) is used for training and training-sample assessment; the resulting decision rule assigns labels.

3.6.4 Classification

Classification is the process of assigning a label representing a meaningful category (or class) to each pixel in an image, on the basis of the multispectral values of the pixel (and possibly its neighbors). For example, Labels 1, 2, 3, and 4 could be assigned, respectively, to pixels determined to be forest, agriculture, urban, and water. The label image could then be displayed by assigning a unique and distinctive color to each label to highlight the different classes. Although classification is a multiband operation, owing to the variety of techniques and its importance it is discussed here as a separate section. Typically, classification involves decision rules with parameters that must be estimated. Generally, classification algorithms have the following phases: feature extraction, training (or learning), labeling, and accuracy assessment (or validation). Figure 3.6 shows these steps and the inputs and outputs involved.

[Figure 3.6. Steps in image classification: multispectral feature vectors and ground truth yield a sample set of labeled feature vectors; training and training-sample assessment produce a decision rule.]

3.6.4.1 Feature Extraction. Features are numerical characteristics on the basis of which classification decisions are made. That is, they help discriminate between the different classes. They should have values as close as possible for pixels or regions of the same class and as dissimilar as possible for those of different classes. Generally, several features are used together (constituting a "feature vector") for a given pixel (or a region) to which the class label is to be assigned. A feature vector can simply be the one-dimensional array of multispectral measurements at a given pixel. One can apply a principal components or canonical components transformation (discussed earlier) and assemble a multichannel image consisting of the components accounting for a predetermined amount (e.g., at least 95 percent) of the image variance. The feature vector for a given pixel then is the one-dimensional array of values in that multichannel image. Alternatively, one can combine textural (i.e., local variations in the neighborhood of a pixel) and spectral values into feature vectors.
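As a sketch of the last option, the following assembles per-pixel feature vectors from the spectral bands plus one simple texture measure, the local standard deviation of the first band; the choice of texture measure and the names are ours:

```python
import numpy as np
from scipy.ndimage import generic_filter

def feature_vectors(bands, window=3):
    """Per-pixel feature vectors: the spectral values in each band plus
    one texture feature, the local standard deviation of band 0."""
    n, h, w = bands.shape
    texture = generic_filter(bands[0].astype(np.float64), np.std, size=window)
    stack = np.concatenate([bands.astype(np.float64), texture[None]], axis=0)
    return stack.reshape(n + 1, -1).T   # (H*W, n+1): one vector per pixel
```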

3.6.4.2 Training (or Learning). The training or learning step in a classification algorithm is where the parameters of a "decision rule" are estimated. A decision rule is essentially a set of mathematical functions (also called discriminant functions) that are evaluated to decide the label to be assigned to each feature vector. The corresponding curves (in two dimensions) or hypersurfaces (in higher dimensional spaces) are referred to as decision surfaces. Two examples of decision surfaces are shown in Figure 3.7. Here, the feature vectors are two-dimensional. In general, feature vectors from multispectral images have much higher dimensionality. The scatter plots show three distinct classes, denoted by o, +, and −, respectively. Figure 3.7a shows ellipses and Figure 3.7b shows straight lines separating these classes. In the higher-dimensional case, these become hyperellipsoids and hyperplanes, respectively. First, suppose that there is a known, complete, and valid physical model of the process that determines the observed feature vector given a particular class. Also, suppose that it is possible to invert the model. Then, given an observed feature vector, it is possible to use the inverse model and assign a class label to it. However, physical models are not always known. Even if physical models are known to characterize the observing process, deviations of the pixels from the ideal classes modeled cause the observations to be different from the predicted values. Examples of distorting phenomena that make it difficult to model the observations physically are mixed pixels, slope and aspect variations, and so forth. Even as the state of the art in measurements and physical modeling improves, it is necessary to account for the residuals through a statistical model. The decision rules can thus be based on a combination of physical and statistical models.

Examples of decision rules are as follows:

• Minimum distance to means: Each class is represented by the mean of the feature vectors of a set of samples from that class. An observed feature vector is assigned to the class that has the closest mean.
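A minimal sketch of this rule (the names are ours): training estimates one mean vector per class from the labeled samples, and labeling assigns each feature vector to the nearest mean in Euclidean distance.

```python
import numpy as np

def train_minimum_distance(features, labels):
    """Training: estimate one mean vector per class from labeled samples.
    features: (n_samples, n_features); labels: (n_samples,) class ids."""
    classes = np.unique(labels)
    means = np.stack([features[labels == c].mean(axis=0) for c in classes])
    return classes, means

def classify_minimum_distance(features, classes, means):
    """Labeling: assign each feature vector to the class whose mean is
    closest in Euclidean distance."""
    dists = np.linalg.norm(features[:, None, :] - means[None, :, :], axis=2)
    return classes[np.argmin(dists, axis=1)]
```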

[Figure 3.7. Two examples of decision surfaces separating three classes (o, +, −) in a two-dimensional feature space: (a) ellipses; (b) straight lines.]


References
3. G. Asrar and H.K. Ramapriyan, Data and information system for Mission to Planet Earth, Remote Sensing Rev. 13, 1–25 (1995).
5. J. Dozier and H.K. Ramapriyan, Planning for the EOS Data and Information System (EOSDIS), in The Science of Global Environmental Change, NATO ASI, 1991.
6. NASA, M. King and R. Greenstone, eds., 1999 EOS Reference Handbook: A Guide to NASA's Earth Science Enterprise and the Earth Observing System, NASA Goddard Space Flight Center, pp. 34–48, 1999; http://eos.nasa.gov/.
7. I.H. Rowlands, The fourth meeting of the parties to the Montreal Protocol: Report and reflection, Environment 35(6), 25–34 (1993).
10. J.C. Farman, B.G. Gardiner, and J.D. Shanklin, Large losses of total ozone in Antarctica reveal seasonal ClOx/NOx interaction, Nature 315, 207–210 (1985).
11. D.L. Skole and C.J. Tucker, Evidence for tropical deforestation, fragmented habitat, and adversely affected habitat in the Brazilian Amazon 1978–1988, Science 260, 1905–1910 (1993).
12. D.L. Skole, W.A. Salas, and V. Taylor, eds., Global Observations of Forest Cover: Fine Resolution Data and Product Design Strategy, Report of a Workshop, CNES Headquarters, Paris, France, 23–25 September 1998.
13. T.M. Lillesand and R.W. Kiefer, Remote Sensing and Image Interpretation, 4th ed., Wiley and Sons, New York, 2000.
14. R.E. Wolfe, D.P. Roy, and E. Vermote, MODIS land data storage, gridding and compositing methodology: Level 2 Grid, IEEE Trans. Geosci. Remote Sensing 35, 1324–1338 (1998).
15. J.A. Richards, Remote Sensing Digital Image Analysis: An Introduction, Springer-Verlag, Berlin, 1993.
17. J.R. Jensen, Introductory Digital Image Processing: A Remote Sensing Perspective, Prentice Hall, Englewood Cliffs, N.J., 1996.
18. P.J. Gibson and C.H. Power, Introductory Remote Sensing: Digital Image Processing and Applications, Routledge, London and New York, 2000.
19. P.S. Chavez, Jr. et al., Statistical method for selecting Landsat MSS ratios, J. App. Photo. Eng. 8, 23–30 (1982).
20. M.M. Nicholson, Fundamentals and Techniques of Mathematics for Scientists, Wiley and Sons, New York, 1961, pp. 460–464.
22. M.E. James and S. Kalluri, The Pathfinder AVHRR land data set: An improved coarse resolution data set for terrestrial monitoring, Int. J. Remote Sensing 15, 3347–3364 (1994).
23. J.R.G. Townshend et al., The 1-km AVHRR global data set: needs of the International Geosphere Biosphere Program, Int. J. Remote Sensing 15, 3319–3332 (1994).
24. R.J. Kauth and G.S. Thomas, The tasseled cap: a graphic description of the spectral-temporal development of agricultural crops as seen by Landsat, Proceedings of the 2nd International Symposium on Machine Processing of Remotely Sensed Data, Purdue University, West Lafayette, Ind., 1976, pp. 4B-41–51.
25. E.P. Crist and R.C. Cicone, Application of the tasseled cap concept to simulated Thematic Mapper data, Photo. Eng. Remote Sensing 50(3), 343–352 (1984).
26. A.R. Huete, A soil-adjusted vegetation index (SAVI), Remote Sensing Environ. 25, 295–309 (1988).
27. F.G. Hall, K.F. Huemmrich, and S.N. Goward, Use of narrow-band spectra to estimate the fraction of absorbed photosynthetically active radiation, Remote Sensing Environ. 33, 47–54 (1990).
