DOCUMENT INFORMATION

Title: Image Feature Extraction
Author: William K. Pratt
Publisher: John Wiley & Sons, Inc.
Subject: Digital Image Processing
Type: Reference book
Year of publication: 2001
Pages: 42
File size: 1.29 MB

Contents

16 IMAGE FEATURE EXTRACTION

An image feature is a distinguishing primitive characteristic or attribute of an image.

Some features are natural in the sense that they are defined by the visual appearance of an image, while other, artificial features result from specific manipulations of an image. Natural features include the luminance of a region of pixels and gray scale textural regions. Image amplitude histograms and spatial frequency spectra are examples of artificial features.

Image features are of major importance in the isolation of regions of common property within an image (image segmentation) and subsequent identification or labeling of such regions (image classification). Image segmentation is discussed in Chapter 17. References 1 to 4 provide information on image classification techniques.

This chapter describes several types of image features that have been proposed for image segmentation and classification. Before introducing them, however, methods of evaluating their performance are discussed.

16.1 IMAGE FEATURE EVALUATION

There are two quantitative approaches to the evaluation of image features: prototype performance and figure of merit. In the prototype performance approach for image classification, a prototype image with regions (segments) that have been independently categorized is classified by a classification procedure using various image features to be evaluated. The classification error is then measured for each feature set. The best set of features is, of course, that which results in the least classification error. The prototype performance approach for image segmentation is similar in nature. A prototype image with independently identified regions is segmented by a segmentation procedure using the image features to be evaluated, and the segmentation error is then measured for each feature set.

Digital Image Processing: PIKS Inside, Third Edition. William K. Pratt
Copyright © 2001 John Wiley & Sons, Inc.
ISBNs: 0-471-37407-5 (Hardback); 0-471-22132-5 (Electronic)


The figure-of-merit approach to feature evaluation involves the establishment of some functional distance measurement between sets of image features such that a large distance implies a low classification error, and vice versa. Faugeras and Pratt (5) have utilized the Bhattacharyya distance (3) figure of merit for texture feature evaluation. The method should be extensible to other features as well. The Bhattacharyya distance (B-distance for simplicity) is a scalar function of the probability densities of features of a pair of classes, defined as

    B(S₁, S₂) = −ln ∫ [ p(x|S₁) p(x|S₂) ]^{1/2} dx    (16.1-1)

where x denotes a vector containing individual image feature measurements with conditional density p(x|Sᵢ). It can be shown (3) that the B-distance is related monotonically to the Chernoff bound for the probability of classification error using a Bayes classifier. The bound on the error probability is

    P ≤ [ P(S₁) P(S₂) ]^{1/2} exp{ −B(S₁, S₂) }    (16.1-2)

where P(Sᵢ) represents the a priori class probability. For future reference, the Chernoff error bound is tabulated in Table 16.1-1 as a function of B-distance for equally likely feature classes.

For Gaussian densities, the B-distance becomes

    B(S₁, S₂) = (1/8) (u₁ − u₂)ᵀ [ (Σ₁ + Σ₂)/2 ]⁻¹ (u₁ − u₂) + (1/2) ln{ |(Σ₁ + Σ₂)/2| / ( |Σ₁|^{1/2} |Σ₂|^{1/2} ) }    (16.1-3)

where uᵢ and Σᵢ represent the feature mean vector and the feature covariance matrix of the classes, respectively. Calculation of the B-distance for other densities is generally difficult. Consequently, the B-distance figure of merit is applicable only for Gaussian-distributed feature data, which fortunately is the common case. In practice, features to be evaluated by Eq. 16.1-3 are measured in regions whose class has been determined independently. Sufficient feature measurements need be taken so that the feature mean vector and covariance matrix can be estimated accurately.
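Under the Gaussian assumption, Eqs. 16.1-2 and 16.1-3 are straightforward to compute. A minimal NumPy sketch (the function names and the two sample classes are illustrative, not from the text):

```python
import numpy as np

def bhattacharyya_distance(u1, cov1, u2, cov2):
    """Gaussian B-distance between two feature classes (Eq. 16.1-3)."""
    u1, u2 = np.asarray(u1, float), np.asarray(u2, float)
    cov1, cov2 = np.asarray(cov1, float), np.asarray(cov2, float)
    cov_avg = (cov1 + cov2) / 2.0
    diff = u1 - u2
    # Mean-separation term
    term1 = 0.125 * diff @ np.linalg.inv(cov_avg) @ diff
    # Covariance-mismatch term
    term2 = 0.5 * np.log(np.linalg.det(cov_avg)
                         / np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    return term1 + term2

def chernoff_bound(b, p1=0.5, p2=0.5):
    """Upper bound on the Bayes classification error probability (Eq. 16.1-2)."""
    return np.sqrt(p1 * p2) * np.exp(-b)

# Two hypothetical scalar feature classes with unit variance, means 0 and 2
b = bhattacharyya_distance([0.0], [[1.0]], [2.0], [[1.0]])
print(b)                  # 0.5
print(chernoff_bound(b))  # 0.5 * exp(-0.5)
```

A larger B-distance drives the bound down exponentially, which is why it serves as a figure of merit.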


TABLE 16.1-1 Relationship of Bhattacharyya Distance and Chernoff Error Bound. [Table values omitted.]

16.2 AMPLITUDE FEATURES

The mean luminance in a pixel neighborhood is given by

    M(j, k) = (1/W²) Σ_{m=−w}^{w} Σ_{n=−w}^{w} F(j + m, k + n)    (16.2-1)

where W = 2w + 1. An advantage of a neighborhood measurement, as opposed to a point measurement, is a diminishing of noise effects because of the averaging process. A disadvantage is that object edges falling within the neighborhood can lead to erroneous measurements.

The median of pixels within a neighborhood can be used as an alternative amplitude feature to the mean measurement of Eq. 16.2-1, or as an additional feature. The median is defined to be that pixel amplitude in the window for which one-half of the pixels are equal or smaller in amplitude, and one-half are equal or greater in amplitude. Another useful image amplitude feature is the neighborhood standard deviation, which can be computed as

    S(j, k) = [ (1/W²) Σ_{m=−w}^{w} Σ_{n=−w}^{w} [ F(j + m, k + n) − M(j, k) ]² ]^{1/2}    (16.2-2)

In the literature, the standard deviation image feature is sometimes called the image dispersion. Figure 16.2-1 shows an original image and the mean, median, and standard deviation of the image computed over a small neighborhood.

The mean and standard deviation of Eqs. 16.2-1 and 16.2-2 can be computed indirectly in terms of the histogram of image pixels within a neighborhood. This leads to a class of image amplitude histogram features. Referring to Section 5.7, the first-order probability distribution of the amplitude of a quantized image may be defined as

    P(b) = P_R[ F(j, k) = r_b ]    (16.2-3)

where r_b denotes the quantized amplitude level for b = 0, 1, …, B − 1. The first-order histogram estimate of P(b) is simply

FIGURE 16.2-1 Image amplitude features of the washington_ir image. (a) Original; (b) 7 × 7 pyramid mean; (c) 7 × 7 standard deviation; (d) 7 × 7 plus median.

    P(b) ≈ N(b)/M    (16.2-4)

where M represents the total number of pixels in a neighborhood window centered about (j, k), and N(b) is the number of pixels of amplitude r_b in the same window. The shape of an image histogram provides many clues as to the character of the image. For example, a narrowly distributed histogram indicates a low-contrast image. A bimodal histogram often suggests that the image contains an object with a narrow amplitude range against a background of differing amplitude. The following measures have been formulated as quantitative shape descriptors of a first-order histogram (6).

    Mean:                S_M ≡ b̄ = Σ_b b P(b)                        (16.2-5)
    Standard deviation:  S_D ≡ σ_b = [ Σ_b (b − b̄)² P(b) ]^{1/2}     (16.2-6)
    Skewness:            S_S = (1/σ_b³) Σ_b (b − b̄)³ P(b)            (16.2-7)
    Kurtosis:            S_K = (1/σ_b⁴) Σ_b (b − b̄)⁴ P(b) − 3        (16.2-8)
    Energy:              S_N = Σ_b [P(b)]²                           (16.2-9)
    Entropy:             S_E = −Σ_b P(b) log₂ P(b)                   (16.2-10)
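Assuming the standard forms of these shape descriptors (mean, standard deviation, skewness, kurtosis, energy, entropy), a NumPy sketch is:

```python
import numpy as np

def histogram_shape_features(window, levels=256):
    """First-order histogram shape descriptors: mean, standard deviation,
    skewness, kurtosis, energy, and entropy of the amplitude histogram."""
    counts = np.bincount(window.ravel().astype(int), minlength=levels)
    p = counts / counts.sum()                      # histogram estimate of P(b)
    b = np.arange(levels, dtype=float)
    mean = (b * p).sum()
    sd = np.sqrt(((b - mean) ** 2 * p).sum())
    skew = ((b - mean) ** 3 * p).sum() / sd ** 3
    kurt = ((b - mean) ** 4 * p).sum() / sd ** 4 - 3.0   # 0 for a Gaussian shape
    energy = (p ** 2).sum()
    nonzero = p[p > 0]
    entropy = -(nonzero * np.log2(nonzero)).sum()
    return mean, sd, skew, kurt, energy, entropy

# Two equally likely amplitude levels: a maximally bimodal 2-level histogram
feats = histogram_shape_features(np.array([[0, 1], [0, 1]]), levels=2)
print(feats)
```

The bimodal two-level example yields zero skewness, minimum kurtosis, and maximum entropy for its level count, matching the qualitative discussion above.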

The factor of 3 inserted in the expression for the kurtosis measure normalizes S_K to zero for a zero-mean, Gaussian-shaped histogram. Another useful histogram shape measure is the histogram mode, which is the pixel amplitude corresponding to the histogram peak (i.e., the most commonly occurring pixel amplitude in the window). If the histogram peak is not unique, the pixel at the peak closest to the mean is usually chosen as the histogram shape descriptor.

Second-order histogram features are based on the definition of the joint probability distribution of pairs of pixels. Consider two pixels F(j, k) and F(m, n) that are located at coordinates (j, k) and (m, n), respectively, and, as shown in Figure 16.2-2, are separated by r radial units at an angle θ with respect to the horizontal axis. The joint distribution of image amplitude values is then expressed as

    P(a, b) = P_R[ F(j, k) = r_a, F(m, n) = r_b ]    (16.2-11)

where r_a and r_b represent quantized pixel amplitude values. As a result of the discrete rectilinear representation of an image, the separation parameters (r, θ) may assume only certain discrete values. The histogram estimate of the second-order distribution is

    P(a, b) ≈ N(a, b)/M    (16.2-12)

where M is the total number of pixels in the measurement window and N(a, b) denotes the number of occurrences for which F(j, k) = r_a and F(m, n) = r_b.

If the pixel pairs within an image are highly correlated, the entries in P(a, b) will be clustered along the diagonal of the array. Various measures, listed below, have been proposed (6,7) as measures that specify the energy spread about the diagonal of P(a, b).
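A sketch of the second-order histogram estimate of Eq. 16.2-12 for one fixed pair separation, i.e., a normalized gray-level co-occurrence matrix (the function and its offset convention are illustrative):

```python
import numpy as np

def second_order_histogram(image, dy=0, dx=1, levels=4):
    """Second-order histogram estimate P(a, b) for a fixed pair separation
    (dy, dx), dy, dx >= 0 -- a co-occurrence matrix normalized by the
    number of pixel pairs M."""
    a = image[:image.shape[0] - dy, :image.shape[1] - dx]   # first pixel of pair
    b = image[dy:, dx:]                                     # its (dy, dx) neighbor
    counts = np.zeros((levels, levels))
    np.add.at(counts, (a.ravel(), b.ravel()), 1)            # N(a, b)
    return counts / counts.sum()                            # P(a, b) = N(a, b)/M

# A constant-row image is perfectly correlated along rows, so P(a, b)
# clusters entirely on the diagonal for a horizontal separation
img = np.array([[0, 0, 0],
                [1, 1, 1],
                [2, 2, 2]])
P = second_order_histogram(img, dy=0, dx=1, levels=3)
print(np.diag(P).sum())   # 1.0
```

Energy-spread measures applied to P then quantify how far this mass strays from the diagonal.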

16.3 TRANSFORM COEFFICIENT FEATURES

[…] indicate the degree of correspondence of a particular luminance pattern with an image field. If a basis pattern is of the same spatial form as a feature to be detected within the image, image detection can be performed simply by monitoring the value of the transform coefficient. The problem, in practice, is that objects to be detected within an image are often of complex shape and luminance distribution, and hence do not correspond closely to the more primitive luminance patterns of most image transforms.

Lendaris and Stanley (8) have investigated the application of the continuous two-dimensional Fourier transform of an image, obtained by a coherent optical processor, as a means of image feature extraction. The optical system produces an electric field radiation pattern proportional to

    F(ω_x, ω_y) = ∫∫ F(x, y) exp{ −i(ω_x x + ω_y y) } dx dy    (16.3-1)

where (ω_x, ω_y) are the image spatial frequencies. An optical sensor produces an output

    M(ω_x, ω_y) = |F(ω_x, ω_y)|²    (16.3-2)

proportional to the intensity of the radiation pattern. It should be observed that F(x, y) and F(ω_x, ω_y) are unique transform pairs, but M(ω_x, ω_y) is not uniquely related to F(x, y). For example, M(ω_x, ω_y) does not change if the origin of F(x, y) is shifted. In some applications, the translation invariance of M(ω_x, ω_y) may be a benefit. Angular integration of M(ω_x, ω_y) over the spatial frequency plane produces a spatial frequency feature that is invariant to translation and rotation. Representing M(ω_x, ω_y) in polar form, this feature is defined as

    N(ρ) = ∫ M(ρ, θ) dθ    (16.3-3)

where ρ and θ denote the radius and angle in the spatial frequency plane. Invariance to magnification is an attribute of the feature

    N(θ) = ∫ M(ρ, θ) dρ    (16.3-4)

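For a sampled intensity pattern, the radial and angular integrations of Eqs. 16.3-3 and 16.3-4 can be approximated by summing FFT energy over rings and sectors; the binning choices below are illustrative assumptions, not definitions from the text:

```python
import numpy as np

def fourier_ring_sector_features(image, n_rings=4, n_sectors=8):
    """Ring sums approximate N(rho) and sector sums approximate N(theta)
    for the discrete Fourier intensity pattern M of an image."""
    M = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2    # intensity pattern
    h, w = M.shape
    y, x = np.indices((h, w))
    cy, cx = h // 2, w // 2
    rho = np.hypot(y - cy, x - cx)
    theta = np.arctan2(y - cy, x - cx) % np.pi              # spectrum is symmetric
    ring_edges = np.linspace(0, rho.max() + 1e-9, n_rings + 1)
    sector_edges = np.linspace(0, np.pi, n_sectors + 1)
    rings = np.array([M[(rho >= ring_edges[i]) & (rho < ring_edges[i + 1])].sum()
                      for i in range(n_rings)])
    sectors = np.array([M[(theta >= sector_edges[i]) & (theta < sector_edges[i + 1])].sum()
                        for i in range(n_sectors)])
    return rings, sectors

img = np.arange(64.0).reshape(8, 8)
rings, sectors = fourier_ring_sector_features(img)
```

Both binnings partition the spectrum, so the ring features and sector features each account for the total spectral energy.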

The Fourier domain intensity pattern M(ω_x, ω_y) is normally examined in specific regions to isolate image features. As an example, Figure 16.3-1 defines regions for the following Fourier features: horizontal slit, vertical slit, ring, and sector (Eqs. 16.3-5 to 16.3-8).


For a discrete image array F(j, k), the discrete Fourier transform

    F(u, v) = (1/N) Σ_{j=0}^{N−1} Σ_{k=0}^{N−1} F(j, k) exp{ −2πi (u j + v k)/N }    (16.3-9)

FIGURE 16.3-2 Discrete Fourier spectra of objects; log magnitude displays. (a) Rectangle; (b) rectangle transform; (c) ellipse; (d) ellipse transform; (e) triangle; (f) triangle transform.

for u, v = 0, 1, …, N − 1 can be examined directly for feature extraction purposes. Horizontal slit, vertical slit, ring, and sector features can be defined analogous to Eqs. 16.3-5 to 16.3-8. This concept can be extended to other unitary transforms, such as the Hadamard and Haar transforms. Figure 16.3-2 presents discrete Fourier transform log magnitude displays of several geometric shapes.

16.4 TEXTURE DEFINITION

Many portions of images of natural scenes are devoid of sharp edges over large areas. In these areas, the scene can often be characterized as exhibiting a consistent structure analogous to the texture of cloth. Image texture measurements can be used to segment an image and classify its segments.

Several authors have attempted qualitatively to define texture. Pickett (9) states that "texture is used to describe two dimensional arrays of variations... The elements and rules of spacing or arrangement may be arbitrarily manipulated, provided a characteristic repetitiveness remains." Hawkins (10) has provided a more detailed description of texture: "The notion of texture appears to depend upon three ingredients: (1) some local 'order' is repeated over a region which is large in comparison to the order's size, (2) the order consists in the nonrandom arrangement of elementary parts, and (3) the parts are roughly uniform entities having approximately the same dimensions everywhere within the textured region." Although these descriptions of texture seem perceptually reasonable, they do not immediately lead to simple quantitative textural measures in the sense that the description of an edge discontinuity leads to a quantitative description of an edge in terms of its location, slope angle, and height.

Texture is often qualitatively described by its coarseness in the sense that a patch of wool cloth is coarser than a patch of silk cloth under the same viewing conditions. The coarseness index is related to the spatial repetition period of the local structure. A large period implies a coarse texture; a small period implies a fine texture. This perceptual coarseness index is clearly not sufficient as a quantitative texture measure, but can at least be used as a guide for the slope of texture measures; that is, small numerical texture measures should imply fine texture, and large numerical measures should indicate coarse texture. It should be recognized that texture is a neighborhood property of an image point. Therefore, texture measures are inherently dependent on the size of the observation neighborhood. Because texture is a spatial property, measurements should be restricted to regions of relative uniformity. Hence it is necessary to establish the boundary of a uniform textural region by some form of image segmentation before attempting texture measurements.

Texture may be classified as being artificial or natural. Artificial textures consist of arrangements of symbols, such as line segments, dots, and stars placed against a neutral background. Several examples of artificial texture are presented in Figure 16.4-1 (9). As the name implies, natural textures are images of natural scenes containing semirepetitive arrangements of pixels. Examples include photographs of brick walls, terrazzo tile, sand, and grass. Brodatz (11) has published an album of photographs of naturally occurring textures. Figure 16.4-2 shows several natural texture examples obtained by digitizing photographs from the Brodatz album.


FIGURE 16.4-1 Artificial texture.


16.5 VISUAL TEXTURE DISCRIMINATION

A discrete stochastic field is an array of numbers that are randomly distributed in amplitude and governed by some joint probability density (12). When converted to light intensities, such fields can be made to approximate natural textures surprisingly well by control of the generating probability density. This technique is useful for generating realistic-appearing artificial scenes for applications such as airplane flight simulators. Stochastic texture fields are also an extremely useful tool for investigating human perception of texture as a guide to the development of texture feature extraction methods.

In the early 1960s, Julesz (13) attempted to determine the parameters of stochastic texture fields of perceptual importance. This study was extended later by Julesz et al. (14–16). Further extensions of Julesz's work have been made by Pollack (17),

FIGURE 16.4-2 Brodatz texture fields.


Purks and Richards (18), and Pratt et al. (19). These studies have provided valuable insight into the mechanism of human visual perception and have led to some useful quantitative texture measurement methods.

Figure 16.5-1 is a model for stochastic texture generation. In this model, an array of independent, identically distributed random variables passes through a linear or nonlinear spatial operator to produce a stochastic texture array. By controlling the form of the generating probability density and the spatial operator, it is possible to create texture fields with specified statistical properties. Consider a continuous amplitude pixel F(j, k) at some coordinate (j, k) in the texture array. Let the set of pixels { F(j₁, k₁), …, F(j_J, k_J) } denote its neighbors, not necessarily nearest geometric neighbors, raster scanned in a conventional top-to-bottom, left-to-right fashion. The conditional probability density of F(j, k) conditioned on the state of its neighbors is given by

    p{ F(j, k) | F(j₁, k₁), …, F(j_J, k_J) }    (16.5-1)

The first-order density employs no conditioning, the second-order density implies that J = 1, the third-order density implies that J = 2, and so on.
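The generation model of Figure 16.5-1 can be sketched as an i.i.d. array passed through a linear spatial operator; the box-filter operator and field size below are illustrative choices, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_texture(shape=(64, 64), kernel_size=5):
    """Generate a texture field by passing an i.i.d. uniform array through
    a linear spatial operator (here, a unit-sum box filter applied as a
    circular convolution via the FFT)."""
    iid = rng.uniform(size=shape)                 # independent generating field
    kernel = np.zeros(shape)
    kernel[:kernel_size, :kernel_size] = 1.0 / kernel_size ** 2
    texture = np.fft.ifft2(np.fft.fft2(iid) * np.fft.fft2(kernel)).real
    return iid, texture

iid, tex = stochastic_texture()
# Smoothing preserves the mean but reduces the variance of the field
print(iid.var(), tex.var())
```

A nonlinear operator, or a different generating density, would be substituted at the same point in the pipeline to shape higher-order statistics.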

16.5.1 Julesz Texture Fields

In his pioneering texture discrimination experiments, Julesz utilized Markov process state methods to create stochastic texture arrays independently along rows of the array. The family of Julesz stochastic arrays is defined below.

1. Notation. Let […] denote a row neighbor of pixel […], and let P(m), for m = 1, 2, …, M, denote a desired probability generating function.

2. First-order process. Set […] for a desired probability function P(m). The resulting pixel probability is […].


3. Second-order process. Set […] for […], and set […], where the modulus function […] for integers p and q. This gives a first-order probability

    (16.5-3a)

and a transition probability

    (16.5-3b)

4. Third-order process. Set […] for […], and set […] for […]. Choose […] to satisfy […]. The governing probabilities then become

    (16.5-4a)    (16.5-4b)    (16.5-4c)

This process has the interesting property that pixel pairs along a row are independent, and consequently, the process is spatially uncorrelated.

Figure 16.5-2 contains several examples of Julesz texture field discrimination tests performed by Pratt et al. (19). In these tests, the textures were generated according to the presentation format of Figure 16.5-3. In these and subsequent visual texture discrimination tests, the perceptual differences are often small. Proper discrimination testing should be performed using high-quality photographic transparencies, prints, or electronic displays. The following moments were used as simple indicators of differences between the generating distributions and densities of the stochastic fields:

    (16.5-5a)    (16.5-5b)    (16.5-5c)
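A sketch of simple moment indicators in the spirit of Eqs. 16.5-5a to 16.5-5c, whose exact forms are not reproduced in the extraction; the mean, standard deviation, and lag-1 row autocorrelation below are assumed forms:

```python
import numpy as np

def field_indicators(field):
    """Mean, standard deviation, and normalized lag-1 row autocorrelation
    of a texture field, as simple comparison indicators."""
    f = np.asarray(field, float)
    mean = f.mean()
    std = f.std()
    centered = f - mean
    # Lag-1 correlation along rows, normalized so a constant-row field gives 1
    autocorr = (centered[:, :-1] * centered[:, 1:]).mean() / f.var()
    return mean, std, autocorr

# A constant-row field is perfectly correlated along rows
field = np.tile(np.arange(8.0)[:, None], (1, 8))
m, s, a = field_indicators(field)
print(m, s, a)
```

Comparing these indicators across a field pair mirrors the h, s, a values quoted in the figure captions of this section.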


The examples of Figure 16.5-2a and b indicate that texture field pairs differing in their first- and second-order distributions can be discriminated. The example of Figure 16.5-2c supports the conjecture, attributed to Julesz, that differences in third-order, and presumably higher-order, distribution texture fields cannot be perceived provided that their first-order and second-order distributions are pairwise identical.

FIGURE 16.5-2 Field comparison of Julesz stochastic fields: (a) different first order, s_A = 0.289, s_B = 0.204; (b) different second order, s_A = 0.289, s_B = 0.289.


16.5.2 Pratt, Faugeras, and Gagalowicz Texture Fields

Pratt et al. (19) have extended the work of Julesz et al. (13–16) in an attempt to study the discriminability of spatially correlated stochastic texture fields. A class of Gaussian fields was generated according to the conditional probability density

    (16.5-6a)

where

    (16.5-6b)

    (16.5-6c)

The covariance matrix of Eq. 16.5-6a is of the parametric form

FIGURE 16.5-3 Presentation format for visual texture discrimination experiments.



where […] denote correlation lag terms. Figure 16.5-4 presents an example of the row correlation functions used in the texture field comparison tests described below.

Figures 16.5-5 and 16.5-6 contain examples of Gaussian texture field comparison tests. In Figure 16.5-5, the first-order densities are set equal, but the second-order nearest neighbor conditional densities differ according to the covariance function plot of Figure 16.5-4a. Visual discrimination can be made in Figure 16.5-5, in which the correlation parameter differs by 20%. Visual discrimination has been found to be marginal when the correlation factor differs by less than 10% (19). The first- and second-order densities of each field are fixed in Figure 16.5-6, and the third-order

FIGURE 16.5-4 Row correlation factors for stochastic field generation. Dashed line, field A; solid line, field B. (a) Constrained second-order density; (b) constrained third-order density.


conditional densities differ according to the plan of Figure 16.5-4b. Visual discrimination is possible. The test of Figure 16.5-6 seemingly provides a counterexample to the Julesz conjecture. However, the general second-order density pairs […] and […] are not necessarily equal for an arbitrary neighbor, and therefore the conditions necessary to disprove Julesz's conjecture are violated.

To test the Julesz conjecture for realistically appearing texture fields, it is necessary to generate a pair of fields with identical first-order densities, identical Markovian type second-order densities, and differing third-order densities for every

FIGURE 16.5-5 Field comparison of Gaussian stochastic fields with different second-order densities.

FIGURE 16.5-6 Field comparison of Gaussian stochastic fields with different third-order densities.


pair of similar observation points in both fields. An example of such a pair of fields is presented in Figure 16.5-7 for a non-Gaussian generating process (19). In this example, the texture appears identical in both fields, thus supporting the Julesz conjecture.

Gagalowicz has succeeded in generating a pair of texture fields that disprove the Julesz conjecture (20). However, the counterexample, shown in Figure 16.5-8, is not very realistic in appearance. Thus, it seems likely that if a statistically based texture measure can be developed, it need not utilize statistics greater than second-order.

FIGURE 16.5-7 Field comparison of correlated Julesz stochastic fields with identical first- and second-order densities, but different third-order densities.

FIGURE 16.5-8 Gagalowicz counterexample. h_A = 0.500, h_B = 0.500; s_A = 0.167, s_B = 0.167; a_A = 0.850, a_B = 0.850; q_A = 0.040, q_B = −0.027.


Because a human viewer is sensitive to differences in the mean, variance, and autocorrelation function of the texture pairs, it is reasonable to investigate the sufficiency of these parameters in terms of texture representation. Figure 16.5-9 presents examples of the comparison of texture fields with identical means, variances, and autocorrelation functions, but different nth-order probability densities. Visual discrimination is readily accomplished between the fields. This leads to the conclusion that these low-order moment measurements, by themselves, are not always sufficient to distinguish texture fields.

16.6 TEXTURE FEATURES

As noted in Section 16.4, there is no commonly accepted quantitative definition of visual texture. As a consequence, researchers seeking a quantitative texture measure have been forced to search intuitively for texture features, and then attempt to evaluate their performance by techniques such as those presented in Section 16.1. The following subsections describe several texture features of historical and practical importance. References 20 to 22 provide surveys on image texture feature extraction. Randen and Husoy (23) have performed a comprehensive study of many texture feature extraction methods.

FIGURE 16.5-9 Field comparison of correlated stochastic fields with identical means, variances, and autocorrelation functions, but different nth-order probability densities, generated by different processing of the same input field. The input array consists of uniform random variables raised to the 256th power. Computed moments: h_A = 0.413, h_B = 0.412; s_A = 0.078, s_B = 0.078; a_A = 0.915, a_B = 0.917; q_A = 1.512, q_B = 0.006.

Date posted: 21/01/2014, 15:20
