


Methods of Multivariate Analysis Second Edition

ALVIN C. RENCHER

Brigham Young University

A JOHN WILEY & SONS, INC., PUBLICATION


Copyright © 2002 by John Wiley & Sons, Inc. All rights reserved.

Published simultaneously in Canada.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Sections 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4744. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 605 Third Avenue, New York, NY 10158-0012, (212) 850-6011, fax (212) 850-6008. E-mail: PERMREQ@WILEY.COM.

For ordering and customer service, call 1-800-CALL-WILEY.

Library of Congress Cataloging-in-Publication Data

Rencher, Alvin C., 1934–

Methods of multivariate analysis / Alvin C. Rencher.—2nd ed.

p. cm. — (Wiley series in probability and mathematical statistics)

“A Wiley-Interscience publication.”

Includes bibliographical references and index.


2.2 Notation and Basic Definitions, 5

2.2.1 Matrices, Vectors, and Scalars, 5

2.2.2 Equality of Vectors and Matrices, 7

2.2.3 Transpose and Symmetric Matrices, 7

2.2.4 Special Matrices, 8

2.3 Operations, 9

2.3.1 Summation and Product Notation, 9

2.3.2 Addition of Matrices and Vectors, 10

2.3.3 Multiplication of Matrices and Vectors, 11

2.10 Orthogonal Vectors and Matrices, 31

2.11 Eigenvalues and Eigenvectors, 32

2.11.1 Definition, 32

2.11.2 I + A and I − A, 33

2.11.3 tr(A) and |A|, 34

2.11.4 Positive Definite and Semidefinite Matrices, 34

2.11.5 The Product AB, 35

2.11.6 Symmetric Matrix, 35


2.11.7 Spectral Decomposition, 35

2.11.8 Square Root Matrix, 36

2.11.9 Square Matrices and Inverse Matrices, 36

2.11.10 Singular Value Decomposition, 36

3.1 Mean and Variance of a Univariate Random Variable, 43

3.2 Covariance and Correlation of Bivariate Random Variables, 45

3.2.1 Covariance, 45

3.2.2 Correlation, 49

3.3 Scatter Plots of Bivariate Samples, 50

3.4 Graphical Displays for Multivariate Samples, 52

3.8.2 Three or More Subsets, 64

3.9 Linear Combinations of Variables, 66

3.9.1 Sample Properties, 66

3.9.2 Population Properties, 72

3.10 Measures of Overall Variability, 73

3.11 Estimation of Missing Values, 74

3.12 Distance between Vectors, 76

4.1 Multivariate Normal Density Function, 82

4.1.1 Univariate Normal Density, 82

4.1.2 Multivariate Normal Density, 83

4.1.3 Generalized Population Variance, 83

4.1.4 Diversity of Applications of the Multivariate Normal, 85

4.2 Properties of Multivariate Normal Random Variables, 85

4.3 Estimation in the Multivariate Normal, 90

4.3.1 Maximum Likelihood Estimation, 90

4.3.2 Distribution of ȳ and S, 91

4.4 Assessing Multivariate Normality, 92

4.4.1 Investigating Univariate Normality, 92

4.4.2 Investigating Multivariate Normality, 96


4.5 Outliers, 99

4.5.1 Outliers in Univariate Samples, 100

4.5.2 Outliers in Multivariate Samples, 101

5.1 Multivariate versus Univariate Tests, 112

5.2 Tests on μ with Σ Known, 113

5.2.1 Review of Univariate Test for H0: µ = µ0

5.4.1 Review of Univariate Two-Sample t-Test, 121

5.4.2 Multivariate Two-Sample T2-Test, 122

5.4.3 Likelihood Ratio Tests, 126

5.5 Tests on Individual Variables Conditional on Rejection of H0 by the T2-Test, 126

5.6 Computation of T2, 130

5.6.1 Obtaining T2 from a MANOVA Program, 130

5.6.2 Obtaining T2 from Multiple Regression, 130

5.7 Paired Observations Test, 132

5.7.1 Univariate Case, 132

5.7.2 Multivariate Case, 134

5.8 Test for Additional Information, 136

5.9 Profile Analysis, 139

5.9.1 One-Sample Profile Analysis, 139

5.9.2 Two-Sample Profile Analysis, 141

6.1 One-Way Models, 156

6.1.1 Univariate One-Way Analysis of Variance (ANOVA), 156

6.1.2 Multivariate One-Way Analysis of Variance Model (MANOVA), 158

6.1.3 Wilks’ Test Statistic, 161

6.1.4 Roy’s Test, 164

6.1.5 Pillai and Lawley–Hotelling Tests, 166


6.1.6 Unbalanced One-Way MANOVA, 168

6.1.7 Summary of the Four Tests and Relationship to T2, 168

6.1.8 Measures of Multivariate Association, 173

6.2 Comparison of the Four MANOVA Test Statistics, 176

6.5.1 Review of Univariate Two-Way ANOVA, 186

6.5.2 Multivariate Two-Way MANOVA, 188

6.9 Repeated Measures Designs, 204

6.9.1 Multivariate vs Univariate Approach, 204

6.9.2 One-Sample Repeated Measures Model, 208

6.9.3 k-Sample Repeated Measures Model, 211

6.9.4 Computation of Repeated Measures Tests, 212

6.9.5 Repeated Measures with Two Within-Subjects Factors and One Between-Subjects Factor, 213

6.9.6 Repeated Measures with Two Within-Subjects Factors and Two Between-Subjects Factors, 219

6.9.7 Additional Topics, 221

6.10 Growth Curves, 221

6.10.1 Growth Curve for One Sample, 221

6.10.2 Growth Curves for Several Samples, 229

6.10.3 Additional Topics, 230

6.11 Tests on a Subvector, 231

6.11.1 Test for Additional Information, 231

6.11.2 Stepwise Selection of Variables, 233

7.1 Introduction, 248

7.2 Testing a Specified Pattern for Σ, 248

7.2.1 Testing H0: Σ = Σ0, 248


7.2.2 Testing Sphericity, 250

7.2.3 Testing H0: Σ = σ2[(1 − ρ)I + ρJ], 252

7.3 Tests Comparing Covariance Matrices, 254

7.3.1 Univariate Tests of Equality of Variances, 254

7.3.2 Multivariate Tests of Equality of Covariance Matrices, 255

7.4 Tests of Independence, 259

7.4.1 Independence of Two Subvectors, 259

7.4.2 Independence of Several Subvectors, 261

7.4.3 Test for Independence of All Variables, 265

8 Discriminant Analysis: Description of Group Separation 270

8.1 Introduction, 270

8.2 The Discriminant Function for Two Groups, 271

8.3 Relationship between Two-Group Discriminant Analysis and

8.6.1 Tests for the Two-Group Case, 284

8.6.2 Tests for the Several-Group Case, 285

8.7 Interpretation of Discriminant Functions, 288

8.7.1 Standardized Coefficients, 289

8.7.2 Partial F -Values, 290

8.7.3 Correlations between Variables and Discriminant Functions, 291

8.7.4 Rotation, 291

8.8 Scatter Plots, 291

8.9 Stepwise Selection of Variables, 293

9 Classification Analysis: Allocation of Observations to Groups 299

9.1 Introduction, 299

9.2 Classification into Two Groups, 300

9.3 Classification into Several Groups, 304

9.3.1 Equal Population Covariance Matrices: Linear Classification Functions, 304

9.3.2 Unequal Population Covariance Matrices: Quadratic Classification Functions, 306


9.4 Estimating Misclassification Rates, 307

9.5 Improved Estimates of Error Rates, 309

9.5.1 Partitioning the Sample, 310

9.5.2 Holdout Method, 310

9.6 Subset Selection, 311

9.7 Nonparametric Procedures, 314

9.7.1 Multinomial Data, 314

9.7.2 Classification Based on Density Estimators, 315

9.7.3 Nearest Neighbor Classification Rule, 318

10.1 Introduction, 322

10.2 Multiple Regression: Fixed x’s, 323

10.2.1 Model for Fixed x’s, 323

10.2.2 Least Squares Estimation in the Fixed-x Model, 324

10.3 Multiple Regression: Random x’s, 337

10.4 Multivariate Multiple Regression: Estimation, 337

10.4.1 The Multivariate Linear Model, 337

10.4.2 Least Squares Estimation in the Multivariate Model, 339

10.4.3 Properties of Least Squares Estimators B̂, 341

10.4.4 An Estimator for Σ, 342

10.4.5 Model Corrected for Means, 342

10.5 Multivariate Multiple Regression: Hypothesis Tests, 343

10.5.1 Test of Overall Regression, 343

10.5.2 Test on a Subset of the x’s, 347

10.6 Measures of Association between the y’s and the x’s, 349

10.7 Subset Selection, 351

10.7.1 Stepwise Procedures, 351

10.7.2 All Possible Subsets, 355

10.8 Multivariate Regression: Random x’s, 358

11.1 Introduction, 361

11.2 Canonical Correlations and Canonical Variates, 361


11.3 Properties of Canonical Correlations, 366

11.4 Tests of Significance, 367

11.4.1 Tests of No Relationship between the y’s and the x’s, 367

11.4.2 Test of Significance of Succeeding Canonical Correlations after the First, 369

11.5 Interpretation, 371

11.6.2 MANOVA and Discriminant Analysis, 376

12.1 Introduction, 380

12.2 Geometric and Algebraic Bases of Principal Components, 381

12.2.1 Geometric Approach, 381

12.2.2 Algebraic Approach, 385

12.3 Principal Components and Perpendicular Regression, 387

12.4 Plotting of Principal Components, 389

12.5 Principal Components from the Correlation Matrix, 393

12.6 Deciding How Many Components to Retain, 397

12.7 Information in the Last Few Principal Components, 401

12.8 Interpretation of Principal Components, 401

12.8.1 Special Patterns in S or R, 402

12.8.2 Rotation, 403

12.8.3 Correlations between Variables and Principal Components, 403

12.9 Selection of Variables, 404

13.1 Introduction, 408

13.2 Orthogonal Factor Model, 409

13.2.1 Model Definition and Assumptions, 409

13.2.2 Nonuniqueness of Factor Loadings, 414

13.3 Estimation of Loadings and Communalities, 415

13.3.1 Principal Component Method, 415

13.3.2 Principal Factor Method, 421


13.3.3 Iterated Principal Factor Method, 424

13.3.4 Maximum Likelihood Method, 425

13.4 Choosing the Number of Factors, m, 426

13.7 Validity of the Factor Analysis Model, 443

13.8 The Relationship of Factor Analysis to Principal Component

14.3.2 Single Linkage (Nearest Neighbor), 456

14.3.3 Complete Linkage (Farthest Neighbor), 459

14.3.4 Average Linkage, 463

14.3.5 Centroid, 463

14.3.6 Median, 466

14.3.7 Ward’s Method, 466

14.3.8 Flexible Beta Method, 468

14.3.9 Properties of Hierarchical Methods, 471

15.1.2 Metric Multidimensional Scaling, 505

15.1.3 Nonmetric Multidimensional Scaling, 508


15.3.2 Principal Component Plots, 531

15.3.3 Singular Value Decomposition Plots, 532

15.3.4 Coordinates, 533

15.3.5 Other Methods, 535


I have long been fascinated by the interplay of variables in multivariate data and by the challenge of unraveling the effect of each variable. My continuing objective in the second edition has been to present the power and utility of multivariate analysis in a highly readable format.

Practitioners and researchers in all applied disciplines often measure several variables on each subject or experimental unit. In some cases, it may be productive to isolate each variable in a system and study it separately. Typically, however, the variables are not only correlated with each other, but each variable is influenced by the other variables as it affects a test statistic or descriptive statistic. Thus, in many instances, the variables are intertwined in such a way that when analyzed individually they yield little information about the system. Using multivariate analysis, the variables can be examined simultaneously in order to access the key features of the process that produced them. The multivariate approach enables us to (1) explore the joint performance of the variables and (2) determine the effect of each variable in the presence of the others.

Multivariate analysis provides both descriptive and inferential procedures—we can search for patterns in the data or test hypotheses about patterns of a priori interest. With multivariate descriptive techniques, we can peer beneath the tangled web of variables on the surface and extract the essence of the system. Multivariate inferential procedures include hypothesis tests that (1) process any number of variables without inflating the Type I error rate and (2) allow for whatever intercorrelations the variables possess. A wide variety of multivariate descriptive and inferential procedures is readily accessible in statistical software packages.

My selection of topics for this volume reflects many years of consulting with researchers in many fields of inquiry. A brief overview of multivariate analysis is given in Chapter 1. Chapter 2 reviews the fundamentals of matrix algebra. Chapters 3 and 4 give an introduction to sampling from multivariate populations. Chapters 5, 6, 7, 10, and 11 extend univariate procedures with one dependent variable (including t-tests, analysis of variance, tests on variances, multiple regression, and multiple correlation) to analogous multivariate techniques involving several dependent variables. A review of each univariate procedure is presented before covering the multivariate counterpart. These reviews may provide key insights the student missed in previous courses.

Chapters 8, 9, 12, 13, 14, and 15 describe multivariate techniques that are not extensions of univariate procedures. In Chapters 8 and 9, we find functions of the variables that discriminate among groups in the data. In Chapters 12 and 13, we find functions of the variables that reveal the basic dimensionality and characteristic patterns of the data, and we discuss procedures for finding the underlying latent variables of a system. In Chapters 14 and 15 (new in the second edition), we give methods for searching for groups in the data, and we provide plotting techniques that show relationships in a reduced dimensionality for various kinds of data.

In Appendix A, tables are provided for many multivariate distributions and tests. These enable the reader to conduct an exact test in many cases for which software packages provide only approximate tests. Appendix B gives answers and hints for most of the problems in the book.

Appendix C describes an ftp site that contains (1) all data sets and (2) SAS command files for all examples in the text. These command files can be adapted for use in working problems or in analyzing data sets encountered in applications.

To illustrate multivariate applications, I have provided many examples and exercises based on 59 real data sets from a wide variety of disciplines. A practitioner or consultant in multivariate analysis gains insights and acumen from long experience in working with data. It is not expected that a student can achieve this kind of seasoning in a one-semester class. However, the examples provide a good start, and further development is gained by working problems with the data sets. For example, in Chapters 12 and 13, the exercises cover several typical patterns in the covariance or correlation matrix. The student’s intuition is expanded by associating these covariance patterns with the resulting configuration of the principal components or factors.

Although this is a methods book, I have included a few derivations. For some readers, an occasional proof provides insights obtainable in no other way. I hope that instructors who do not wish to use proofs will not be deterred by their presence. The proofs can be disregarded easily when reading the book.

My objective has been to make the book accessible to readers who have taken as few as two statistical methods courses. The students in my classes in multivariate analysis include majors in statistics and majors from other departments. With the applied researcher in mind, I have provided careful intuitive explanations of the concepts and have included many insights typically available only in journal articles or in the minds of practitioners.

My overriding goal in preparation of this book has been clarity of exposition. I hope that students and instructors alike will find this multivariate text more comfortable than most. In the final stages of development of both the first and second editions, I asked my students for written reports on their initial reaction as they read each day’s assignment. They made many comments that led to improvements in the manuscript. I will be very grateful if readers will take the time to notify me of errors or of other suggestions they might have for improvements.

I have tried to use standard mathematical and statistical notation as far as possible and to maintain consistency of notation throughout the book. I have refrained from the use of abbreviations and mnemonic devices. These save space when one is reading a book page by page, but they are annoying to those using a book as a reference.

Equations are numbered sequentially throughout a chapter; for example, (3.75) indicates the 75th numbered equation in Chapter 3. Tables and figures are also numbered sequentially throughout a chapter in the form “Table 3.8” or “Figure 3.1.” Examples are not numbered sequentially; each example is identified by the same number as the section in which it appears and is placed at the end of the section. When citing references in the text, I have used the standard format involving the year of publication. For a journal article, the year alone suffices, for example, Fisher (1936). But for books, I have usually included a page number, as in Seber (1984, p. 216).

This is the first volume of a two-volume set on multivariate analysis. The second volume is entitled Multivariate Statistical Inference and Applications (Wiley, 1998). The two volumes are not necessarily sequential; they can be read independently. I adopted the two-volume format in order to (1) provide broader coverage than would be possible in a single volume and (2) offer the reader a choice of approach.

The second volume includes proofs of many techniques covered in the first 13 chapters of the present volume and also introduces additional topics. The present volume includes many examples and problems using actual data sets, and there are fewer algebraic problems. The second volume emphasizes derivations of the results and contains fewer examples and problems with real data. The present volume has fewer references to the literature than the other volume, which includes a careful review of the latest developments and a more comprehensive bibliography. In this second edition, I have occasionally referred the reader to Rencher (1998) to note that added coverage of a certain subject is available in the second volume.

I am indebted to many individuals in the preparation of the first edition. My initial exposure to multivariate analysis came in courses taught by Rolf Bargmann at the University of Georgia and D. R. Jensen at Virginia Tech. Additional impetus to probe the subtleties of this field came from research conducted with Bruce Brown at BYU. I wish to thank Bruce Brown, Deane Branstetter, Del Scott, Robert Smidt, and Ingram Olkin for reading various versions of the manuscript and making valuable suggestions. I am grateful to the following students at BYU who helped with computations and typing: Mitchell Tolland, Tawnia Newton, Marianne Matis Mohr, Gregg Littlefield, Suzanne Kimball, Wendy Nielsen, Tiffany Nordgren, David Whiting, Karla Wasden, and Rachel Jones.

SECOND EDITION

For the second edition, I have added Chapters 14 and 15, covering cluster analysis, multidimensional scaling, correspondence analysis, and biplots. I also made numerous corrections and revisions (almost every page) in the first 13 chapters, in an effort to improve composition, readability, and clarity. Many of the first 13 chapters now have additional problems.

I have listed the data sets and SAS files on the Wiley ftp site rather than on a diskette, as in the first edition. I have made improvements in labeling of these files.

I am grateful to the many readers who have pointed out errors or made suggestions for improvements. The book is better for their caring and their efforts.


I thank Lonette Stoddard and Candace B. McNaughton for typing and J. D. Williams for computer support. As with my other books, I dedicate this volume to my wife, LaRue, who has supplied much needed support and encouragement.

ALVIN C. RENCHER


I thank the authors, editors, and owners of copyrights for permission to reproduce the following materials:

• Figure 3.8 and Table 3.2, Kleiner and Hartigan (1981), Reprinted by permission of Journal of the American Statistical Association
• Table 3.3, Kramer and Jensen (1969a), Reprinted by permission of Journal of Quality Technology
• Table 3.4, Reaven and Miller (1979), Reprinted by permission of Diabetologia
• Table 3.5, Timm (1975), Reprinted by permission of Elsevier North-Holland Publishing Company
• Table 3.6, Elston and Grizzle (1962), Reprinted by permission of Biometrics
• Table 3.7, Frets (1921), Reprinted by permission of Genetica
• Table 3.8, O’Sullivan and Mahan (1966), Reprinted by permission of American Journal of Clinical Nutrition
• Table 4.3, Royston (1983), Reprinted by permission of Applied Statistics
• Table 5.1, Beall (1945), Reprinted by permission of Psychometrika
• Table 5.2, Hummel and Sligo (1971), Reprinted by permission of Psychological Bulletin
• Table 5.3, Kramer and Jensen (1969b), Reprinted by permission of Journal of Quality Technology
• Table 5.5, Lubischew (1962), Reprinted by permission of Biometrics
• Table 5.6, Travers (1939), Reprinted by permission of Psychometrika
• Table 5.7, Andrews and Herzberg (1985), Reprinted by permission of Springer-Verlag
• Table 5.8, Tintner (1946), Reprinted by permission of Journal of the American Statistical Association
• Table 5.9, Kramer (1972), Reprinted by permission of the author
• Table 5.10, Cameron and Pauling (1978), Reprinted by permission of National Academy of Science
• Table 6.2, Andrews and Herzberg (1985), Reprinted by permission of Springer-Verlag
• Table 6.3, Rencher and Scott (1990), Reprinted by permission of Communications in Statistics: Simulation and Computation
• Table 6.6, Posten (1962), Reprinted by permission of the author
• Table 6.8, Crowder and Hand (1990, pp. 21–29), Reprinted by permission of Routledge Chapman and Hall
• Table 6.12, Cochran and Cox (1957), Timm (1980), Reprinted by permission of John Wiley and Sons and Elsevier North-Holland Publishing Company
• Table 6.14, Timm (1980), Reprinted by permission of Elsevier North-Holland
• Table 6.19, Burdick (1979), Reprinted by permission of the author
• Table 6.20, Box (1950), Reprinted by permission of Biometrics
• Table 6.21, Rao (1948), Reprinted by permission of Biometrika Trustees
• Table 6.22, Cameron and Pauling (1978), Reprinted by permission of National Academy of Science
• Table 6.23, Williams and Izenman (1989), Reprinted by permission of Colorado State University
• Table 6.24, Beauchamp and Hoel (1974), Reprinted by permission of Journal of Statistical Computation and Simulation
• Table 6.25, Box (1950), Reprinted by permission of Biometrics
• Table 6.26, Grizzle and Allen (1969), Reprinted by permission of Biometrics
• Table 6.27, Crepeau et al. (1985), Reprinted by permission of Biometrics
• Table 6.28, Zerbe (1979a), Reprinted by permission of Journal of the American Statistical Association
• Table 7.2, Reprinted by permission of R. J. Freund
• Table 8.1, Kramer and Jensen (1969a), Reprinted by permission of Journal of Quality Technology
• Table 8.3, Reprinted by permission of G. R. Bryce and R. M. Barker
• Table 10.1, Box and Youle (1955), Reprinted by permission of Biometrics
• Tables 12.2, 12.3, and 12.4, Jeffers (1967), Reprinted by permission of Applied Statistics
• Table 13.1, Brown et al. (1984), Reprinted by permission of the Journal of Pascal, Ada, and Modula
• Correlation matrix in Example 13.6, Brown, Strong, and Rencher (1973), Reprinted by permission of The Journal of the Acoustical Society of America
• Table 14.1, Hartigan (1975), Reprinted by permission of John Wiley and Sons
• Table 14.3, Dawkins (1989), Reprinted by permission of The American Statistician
• Table 14.7, Hand et al. (1994), Reprinted by permission of D. J. Hand
• Table 14.12, Sokol and Rohlf (1981), Reprinted by permission of W. H. Freeman and Co.
• Table 14.13, Hand et al. (1994), Reprinted by permission of D. J. Hand
• Table 15.1, Kruskal and Wish (1978), Reprinted by permission of Sage Publications
• Tables 15.2 and 15.5, Hand et al. (1994), Reprinted by permission of D. J. Hand
• Table 15.13, Edwards and Kreiner (1983), Reprinted by permission of Biometrika
• Table 15.15, Hand et al. (1994), Reprinted by permission of D. J. Hand
• Table 15.16, Everitt (1987), Reprinted by permission of the author
• Table 15.17, Andrews and Herzberg (1985), Reprinted by permission of Springer-Verlag
• Table 15.18, Clausen (1988), Reprinted by permission of Sage Publications
• Table 15.19, Andrews and Herzberg (1985), Reprinted by permission of Springer-Verlag
• Table A.1, Mulholland (1977), Reprinted by permission of Biometrika Trustees
• Table A.2, D’Agostino and Pearson (1973), Reprinted by permission of Biometrika Trustees
• Table A.3, D’Agostino and Tietjen (1971), Reprinted by permission of Biometrika Trustees
• Table A.4, D’Agostino (1972), Reprinted by permission of Biometrika Trustees
• Table A.5, Mardia (1970, 1974), Reprinted by permission of Biometrika
• Table A.10, Pearson and Hartley (1972) and Pillai (1964, 1965), Reprinted by permission of Biometrika Trustees
• Table A.11, Schuurmann et al. (1975), Reprinted by permission of Journal of Statistical Computation and Simulation
• Table A.12, Davis (1970a,b, 1980), Reprinted by permission of Biometrika Trustees
• Table A.13, Kleinbaum, Kupper, and Muller (1988), Reprinted by permission of PWS-KENT Publishing Company
• Table A.14, Lee et al. (1977), Reprinted by permission of Elsevier North-Holland Publishing Company
• Table A.15, Mathai and Katiyar (1979), Reprinted by permission of Biometrika Trustees


Multivariate analysis consists of a collection of methods that can be used when several measurements are made on each individual or object in one or more samples. We will refer to the measurements as variables and to the individuals or objects as units (research units, sampling units, or experimental units) or observations. In practice, multivariate data sets are common, although they are not always analyzed as such. But the exclusive use of univariate procedures with such data is no longer excusable, given the availability of multivariate techniques and inexpensive computing power to carry them out.

Historically, the bulk of applications of multivariate techniques have been in the behavioral and biological sciences. However, interest in multivariate methods has now spread to numerous other fields of investigation. For example, I have collaborated on multivariate problems with researchers in education, chemistry, physics, geology, engineering, law, business, literature, religion, public broadcasting, nursing, mining, linguistics, biology, psychology, and many other fields. Table 1.1 shows some examples of multivariate observations.

The reader will notice that in some cases all the variables are measured in the same scale (see 1 and 2 in Table 1.1). In other cases, measurements are in different scales (see 3 in Table 1.1). In a few techniques, such as profile analysis (Sections 5.9 and 6.8), the variables must be commensurate, that is, similar in scale of measurement; however, most multivariate methods do not require this.

Ordinarily the variables are measured simultaneously on each sampling unit. Typically, these variables are correlated. If this were not so, there would be little use for many of the techniques of multivariate analysis. We need to untangle the overlapping information provided by correlated variables and peer beneath the surface to see the underlying structure. Thus the goal of many multivariate approaches is simplification. We seek to express what is going on in terms of a reduced set of dimensions. Such multivariate techniques are exploratory; they essentially generate hypotheses rather than test them.

On the other hand, if our goal is a formal hypothesis test, we need a technique that will (1) allow several variables to be tested and still preserve the significance level and (2) do this for any intercorrelation structure of the variables. Many such tests are available.

Table 1.1 Examples of Multivariate Data

1. Students: Several exam scores in a single course
2. Students: Grades in mathematics, history, music, art, physics
3. People: Height, weight, percentage of body fat, resting heart rate
4. Skulls: Length, width, cranial capacity
5. Companies: Expenditures for advertising, labor, raw materials
6. Manufactured items: Various measurements to check on compliance with specifications
7. Applicants for bank loans: Income, education level, length of residence, savings account, current debt load
8. Segments of literature: Sentence length, frequency of usage of certain words and of style characteristics
9. Human hairs: Composition of various elements

As the two preceding paragraphs imply, multivariate analysis is concerned generally with two areas, descriptive and inferential statistics. In the descriptive realm, we often obtain optimal linear combinations of variables. The optimality criterion varies from one technique to another, depending on the goal in each case. Although linear combinations may seem too simple to reveal the underlying structure, we use them for two obvious reasons: (1) they have mathematical tractability (linear approximations are used throughout all science for the same reason) and (2) they often perform well in practice. These linear functions may also be useful as a follow-up to inferential procedures. When we have a statistically significant test result that compares several groups, for example, we can find the linear combination (or combinations) of variables that led to rejection of the hypothesis. Then the contribution of each variable to these linear combinations is of interest.
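The descriptive use of linear combinations can be sketched numerically: for a coefficient vector a and a data matrix Y (one row per unit, one column per variable), the combination z = Ya has sample variance equal to the quadratic form a'Sa, where S is the sample covariance matrix. This anticipates the sample properties covered in Section 3.9; the data and coefficients below are invented purely for illustration.

```python
import numpy as np

# Hypothetical sample: 5 units, 3 variables (rows = units, columns = variables)
Y = np.array([[72., 160., 21.],
              [68., 150., 24.],
              [75., 180., 18.],
              [70., 165., 22.],
              [74., 172., 20.]])

a = np.array([1.0, -0.5, 2.0])   # coefficients of the linear combination

z = Y @ a                        # z_i = a' y_i for each sampling unit
S = np.cov(Y, rowvar=False)      # sample covariance matrix (divisor n - 1)

# The sample variance of z equals a' S a
print(np.var(z, ddof=1))         # direct sample variance of z
print(a @ S @ a)                 # same value via the quadratic form
```

The two printed values agree; this identity is what lets multivariate methods reason about any linear combination using only the mean vector and covariance matrix of the original variables.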

In the inferential area, many multivariate techniques are extensions of univariate procedures. In such cases, we review the univariate procedure before presenting the analogous multivariate approach.

Multivariate inference is especially useful in curbing the researcher’s natural tendency to read too much into the data. Total control is provided for experimentwise error rate; that is, no matter how many variables are tested simultaneously, the value of α (the significance level) remains at the level set by the researcher.
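To see the inflation that a single multivariate test avoids, suppose each of p variables were instead tested separately at level α. If the test statistics were independent, the chance of at least one false rejection would be 1 − (1 − α)^p, which grows quickly with p, while the multivariate test holds the experimentwise rate at α. A minimal sketch (the choice of α and the values of p are arbitrary):

```python
# Experimentwise Type I error for p separate univariate tests at level
# alpha, assuming independent test statistics, versus one multivariate
# test conducted at level alpha.
alpha = 0.05
for p in (2, 5, 10):
    separate = 1 - (1 - alpha) ** p   # P(at least one false rejection)
    print(f"p = {p:2d}: separate tests {separate:.3f}   multivariate test {alpha:.3f}")
```

With p = 10 the separate-tests rate is already about 0.40; correlated variables change the exact number but not the qualitative inflation.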

Some authors warn against applying the common multivariate techniques to data for which the measurement scale is not interval or ratio. It has been found, however, that many multivariate techniques give reliable results when applied to ordinal data.

For many years the applications lagged behind the theory because the computations were beyond the power of the available desktop calculators. However, with modern computers, virtually any analysis one desires, no matter how many variables or observations are involved, can be quickly and easily carried out. Perhaps it is not premature to say that multivariate analysis has come of age.

1.2 PREREQUISITES

The mathematical prerequisite for reading this book is matrix algebra. Calculus is not used [with a brief exception in equation (4.29)]. But the basic tools of matrix algebra are essential, and the presentation in Chapter 2 is intended to be sufficiently complete so that the reader with no previous experience can master matrix manipulation up to the level required in this book.

The statistical prerequisites are basic familiarity with the normal distribution, t-tests, confidence intervals, multiple regression, and analysis of variance. These techniques are reviewed as each is extended to the analogous multivariate procedure.

This is a multivariate methods text. Most of the results are given without proof. In a few cases proofs are provided, but the major emphasis is on heuristic explanations. Our goal is an intuitive grasp of multivariate analysis, in the same mode as other statistical methods courses. Some problems are algebraic in nature, but the majority involve data sets to be analyzed.

1.3 OBJECTIVES

I have formulated three objectives that I hope this book will achieve for the reader. These objectives are based on long experience teaching a course in multivariate methods, consulting on multivariate problems with researchers in many fields, and guiding statistics graduate students as they consulted with similar clients.

The first objective is to gain a thorough understanding of the details of various multivariate techniques, their purposes, their assumptions, their limitations, and so on. Many of these techniques are related; yet they differ in some essential ways. We emphasize these similarities and differences.

The second objective is to be able to select one or more appropriate techniques for a given multivariate data set. Recognizing the essential nature of a multivariate data set is the first step in a meaningful analysis. We introduce basic types of multivariate data in Section 1.4.

The third objective is to be able to interpret the results of a computer analysis of a multivariate data set. Reading the manual for a particular program package is not enough to make an intelligent appraisal of the output. Achievement of the first objective and practice on data sets in the text should help achieve the third objective.

1.4 BASIC TYPES OF DATA AND ANALYSIS

We will list four basic types of (continuous) multivariate data and then briefly describe some possible analyses. Some writers would consider this an oversimplification and might prefer elaborate tree diagrams of data structure. However, many data sets can fit into one of these categories, and the simplicity of this structure makes it easier to remember. The four basic data types are as follows:

1. A single sample with several variables measured on each sampling unit (subject or object);

2. A single sample with two sets of variables measured on each unit;

3. Two samples with several variables measured on each unit;

4. Three or more samples with several variables measured on each unit.

Each data type has extensions, and various combinations of the four are possible. A few examples of analyses for each case are as follows:

1. A single sample with several variables measured on each sampling unit:

(a) Test the hypothesis that the means of the variables have specified values.

(b) Test the hypothesis that the variables are uncorrelated and have a common variance.

(c) Find a small set of linear combinations of the original variables that summarizes most of the variation in the data (principal components).

(d) Express the original variables as linear functions of a smaller set of underlying variables that account for the original variables and their intercorrelations (factor analysis).

2. A single sample with two sets of variables measured on each unit:

(a) Determine the number, the size, and the nature of relationships between the two sets of variables (canonical correlation). For example, you may wish to relate a set of interest variables to a set of achievement variables. How much overall correlation is there between these two sets?

(b) Find a model to predict one set of variables from the other set (multivariate multiple regression).

3. Two samples with several variables measured on each unit:

(a) Compare the means of the variables across the two samples (Hotelling's T²-test).

(b) Find a linear combination of the variables that best separates the two samples (discriminant analysis).

(c) Find a function of the variables that accurately allocates the units into the two groups (classification analysis).

4. Three or more samples with several variables measured on each unit:

(a) Compare the means of the variables across the groups (multivariate analysis of variance).

(b) Extension of 3(b) to more than two groups.

(c) Extension of 3(c) to more than two groups.


Matrix Algebra

2.1 INTRODUCTION

This chapter introduces the basic elements of matrix algebra used in the remainder of this book. It is essentially a review of the requisite matrix tools and is not intended to be a complete development. However, it is sufficiently self-contained so that those with no previous exposure to the subject should need no other reference. Anyone unfamiliar with matrix algebra should plan to work most of the problems entailing numerical illustrations. It would also be helpful to explore some of the problems involving general matrix manipulation.

With the exception of a few derivations that seemed instructive, most of the results are given without proof. Some additional proofs are requested in the problems. For the remaining proofs, see any general text on matrix theory or one of the specialized matrix texts oriented to statistics, such as Graybill (1969), Searle (1982), or Harville (1997).

2.2 NOTATION AND BASIC DEFINITIONS

2.2.1 Matrices, Vectors, and Scalars

A matrix is a rectangular or square array of numbers or variables arranged in rows and columns. We use uppercase boldface letters to represent matrices. All entries in matrices will be real numbers or variables representing real numbers. The elements of a matrix are displayed in brackets. For example, the ACT score and GPA for three students can be conveniently listed in the following matrix:

The elements of A can also be variables, representing possible values of ACT and GPA for three students:

    A = [ a11 a12
          a21 a22
          a31 a32 ].    (2.2)


In this double-subscript notation for the elements of a matrix, the first subscript indicates the row; the second identifies the column. The matrix A in (2.2) can also be expressed as

    A = (aij),    (2.3)

where aij is a general element.

With three rows and two columns, the matrix A in (2.1) or (2.2) is said to be 3 × 2. In general, if a matrix A has n rows and p columns, it is said to be n × p. Alternatively, we say the size of A is n × p.
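These size conventions can be sketched in Python with NumPy (used here purely as an illustration; it is not part of the text, and the numbers below are invented):

```python
import numpy as np

# A hypothetical 3 x 2 matrix: one row per student, one column per
# variable (ACT score, GPA). The values are illustrative only.
A = np.array([[23, 3.54],
              [29, 3.81],
              [18, 2.75]])

n, p = A.shape  # n = 3 rows, p = 2 columns, so A is 3 x 2
```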

A vector is a matrix with a single column or row. The following could be the test scores of a student in a course in multivariate analysis:

Geometrically, a vector with p elements identifies a point in a p-dimensional space. The elements in the vector are the coordinates of the point. In (2.35) in Section 2.3.3, we define the distance from the origin to the point. In Section 3.12, we define the distance between two vectors. In some cases, we will be interested in a directed line segment or arrow from the origin to the point.

A single real number is called a scalar, to distinguish it from a vector or matrix. Thus 2, −4, and 125 are scalars. A variable representing a scalar is usually denoted by a lowercase nonbolded letter, such as a = 5. A product involving vectors and matrices may reduce to a matrix of size 1 × 1, which then becomes a scalar.


2.2.2 Equality of Vectors and Matrices

Two matrices are equal if they are the same size and the elements in corresponding positions are equal. Thus if A = (aij) and B = (bij), then A = B if aij = bij for all i and j.

2.2.3 Transpose and Symmetric Matrices

The transpose of a matrix A, denoted by A′, is obtained from A by interchanging rows and columns. Thus the columns of A′ are the rows of A, and the rows of A′ are the columns of A.

If the transpose of a matrix is the same as the original matrix, the matrix is said to be symmetric; that is, A is symmetric if A′ = A.
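A small NumPy sketch of these two definitions (illustrative only; the matrices are invented): `.T` interchanges rows and columns, and a matrix is symmetric exactly when it equals its transpose.

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])      # 2 x 3
At = A.T                       # transpose: 3 x 2; columns of A' are rows of A

S = np.array([[3, -2],
              [-2, 9]])        # equal to its own transpose
is_symmetric = bool(np.array_equal(S, S.T))
```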


In the preceding matrix, the elements 5, 9, and 1 lie on the diagonal. If a matrix contains zeros in all off-diagonal positions, it is said to be a diagonal matrix.

A diagonal matrix with a 1 in each diagonal position is called an identity matrix and is denoted by I. For example, a 3 × 3 identity matrix is given by

    I = [ 1 0 0
          0 1 0
          0 0 1 ].
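In NumPy (an illustration, not part of the text), diagonal and identity matrices are built with `np.diag` and `np.eye`; multiplying by I leaves a conformable matrix unchanged. The matrix A below is invented.

```python
import numpy as np

D = np.diag([8, -3, 4])   # diagonal matrix: zeros in all off-diagonal positions
I = np.eye(3)             # 3 x 3 identity: 1 in each diagonal position

A = np.array([[1., 2., 0.],
              [4., 5., 6.],
              [7., 0., 9.]])
unchanged = bool(np.allclose(I @ A, A) and np.allclose(A @ I, A))
```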


2.3 OPERATIONS

2.3.1 Summation and Product Notation

For completeness, we review the standard mathematical notation for sums and products. The sum of a sequence of numbers a1, a2, . . . , an is indicated by

     n
     ∑ ai = a1 + a2 + · · · + an.
    i=1

If the n numbers are all the same, then

     n
     ∑ a = a + a + · · · + a = na.
    i=1

The sum of all the numbers in an array with double subscripts, such as aij, is indicated by the double sum ∑i ∑j aij.


The product of a sequence of numbers a1, a2, . . . , an is indicated by

     n
     ∏ ai = (a1)(a2) · · · (an).
    i=1
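These two notations map directly onto Python's built-in `sum` and `math.prod` (a sketch with invented numbers, not from the text):

```python
import math

a = [2, 5, 1, 6]          # a1, ..., an with n = 4
n = len(a)

total = sum(a)            # sum over i of a_i: 2 + 5 + 1 + 6
product = math.prod(a)    # product over i of a_i: 2 * 5 * 1 * 6
equal_sum = sum([7] * n)  # n copies of the same number a sum to n * a
```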

2.3.2 Addition of Matrices and Vectors

If two matrices (or two vectors) are the same size, their sum is found by adding

corresponding elements; that is, if A is n × p and B is n × p, then C = A + B is also

n × p and is found as (c i j ) = (a i j + b i j ) For example,

 +

 −153

 =

 6210

Similarly, the difference between two matrices or two vectors of the same size is

found by subtracting corresponding elements Thus C= A − B is found as (c i j ) = (ai j − b i j ) For example,
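Elementwise addition and subtraction of same-size matrices can be sketched in NumPy (illustrative matrices, not the text's examples):

```python
import numpy as np

A = np.array([[1, 3],
              [2, 4]])
B = np.array([[5, -1],
              [0, 2]])

C = A + B   # (c_ij) = (a_ij + b_ij): same 2 x 2 size
D = A - B   # (c_ij) = (a_ij - b_ij)
```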


2.3.3 Multiplication of Matrices and Vectors

In order for the product AB to be defined, the number of columns in A must be the

same as the number of rows in B, in which case A and B are said to be conformable Then the (i j )th element of C= AB is

ci j =

k

Thus c i j is the sum of products of the i th row of A and the j th column of B We

therefore multiply each row of A by each column of B, and the size of AB consists

of the number of rows of A and the number of columns of B Thus, if A is n × m and

B is m × p, then C = AB is n × p For example, if
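A NumPy sketch of conformability and of the row-by-column rule (the matrices are invented for illustration):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])       # n x m = 2 x 3
B = np.array([[1, 0],
              [0, 1],
              [2, 2]])          # m x p = 3 x 2: conformable with A

C = A @ B                       # C = AB is n x p = 2 x 2

# (i, j)th element: sum of products of row i of A and column j of B
c01 = sum(int(A[0, k]) * int(B[k, 1]) for k in range(3))
```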


Thus we must be careful to specify the order of multiplication. If we wish to multiply both sides of a matrix equation by a matrix, we must multiply on the left or on the right and be consistent on both sides of the equation.

Multiplication is distributive over addition or subtraction:

    A(B ± C) = AB ± AC,
    (A ± B)C = AC ± BC.

The transpose of a product is the product of the transposes in reverse order:

    (AB)′ = B′A′.    (2.27)

Note that (2.27) holds as long as A and B are conformable. They need not be square.
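These three facts — order matters, multiplication distributes over addition, and the transpose of a product reverses the order — can be checked numerically (NumPy sketch with invented matrices):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 1]])
C = np.array([[2, 0],
              [0, 2]])

order_matters = not np.array_equal(A @ B, B @ A)            # AB != BA in general
distributive = np.array_equal(A @ (B + C), A @ B + A @ C)   # A(B + C) = AB + AC
reverse_transpose = np.array_equal((A @ B).T, B.T @ A.T)    # (AB)' = B'A'
```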

Multiplication involving vectors follows the same rules as for matrices. Suppose A is n × p, a is p × 1, b is p × 1, and c is n × 1. Then some possible products are Ab, c′A, a′b, b′a, and ab′.


Note that Ab is a column vector, c′A is a row vector, c′Ab is a scalar, and a′b = b′a. The triple product c′Ab was obtained as c′(Ab). The same result would be obtained if we multiplied in the order (c′A)b:

    (c′A)b = (1 −19 −17)(2, 3, 4)′ = −123.

This is true in general for a triple product:

    A(BC) = (AB)C.    (2.28)

Thus multiplication of three matrices can be defined in terms of the product of two matrices, since (fortunately) it does not matter which two are multiplied first. Note that A and B must be conformable for multiplication, and B and C must be conformable. For example, if A is n × p, B is p × q, and C is q × m, then both multiplications are possible and the product ABC is n × m.

We can sometimes factor a sum of triple products on both the right and left sides. For example,

    ABC + ADC = A(B + D)C.    (2.29)

As another illustration, let X be n × p and A be n × n. Then

    X′X − X′AX = X′(X − AX) = X′(I − A)X.    (2.30)
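The factoring in (2.30) can be checked on random matrices (a NumPy sketch; the sizes and seed are arbitrary choices, not from the text):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
X = rng.standard_normal((4, 2))    # X is n x p
A = rng.standard_normal((4, 4))    # A is n x n
I = np.eye(4)

lhs = X.T @ X - X.T @ A @ X        # X'X - X'AX
rhs = X.T @ (I - A) @ X            # X'(I - A)X
identity_holds = bool(np.allclose(lhs, rhs))
```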


If a and b are both n × 1, then

    a′b = a1b1 + a2b2 + · · · + anbn    (2.31)

is a sum of products and is a scalar. On the other hand, ab′ is defined for any size a and b and is a matrix, either rectangular or square, whose (ij)th element is aibj:

    ab′ = [ a1b1 a1b2 · · · a1bp
            a2b1 a2b2 · · · a2bp
            . . .
            anb1 anb2 · · · anbp ].

Thus a′a is a sum of squares, and aa′ is a square (symmetric) matrix. The products a′a and aa′ are sometimes referred to as the dot product and matrix product, respectively.

The square root of the sum of squares of the elements of a is the distance from the origin to the point a and is also referred to as the length of a:

    length of a = √(a′a) = √(a1² + a2² + · · · + an²).    (2.35)
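The inner product a′b, the outer product ab′, and the length √(a′a) can be sketched in NumPy (illustrative vectors chosen so the length comes out to a whole number):

```python
import numpy as np

a = np.array([1., 2., 2.])
b = np.array([3., 0., 4.])

inner = a @ b               # a'b: a scalar sum of products
outer = np.outer(a, b)      # ab': a 3 x 3 matrix with (i, j)th element a_i * b_j
length = np.sqrt(a @ a)     # sqrt(a'a): distance from the origin to the point a
```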


Thus a′j (where j denotes a vector of 1's) is the sum of the elements in a, j′A contains the column sums of A, and Aj contains the row sums of A. In a′j, the vector j is n × 1; in j′A, the vector j is n × 1; and in Aj, the vector j is p × 1.
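The three uses of the vector of 1's can be sketched in NumPy (invented numbers, for illustration):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])          # n x p = 3 x 2
a = np.array([2., 5., 1.])
j3 = np.ones(3)                 # j of length n
j2 = np.ones(2)                 # j of length p

elem_sum = a @ j3               # a'j: sum of the elements of a
col_sums = j3 @ A               # j'A: column sums of A
row_sums = A @ j2               # Aj: row sums of A
```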

Since a′b is a scalar, it is equal to its transpose:

    a′b = (a′b)′ = b′(a′)′ = b′a.    (2.39)

This allows us to write (a′b)² in the form

    (a′b)² = (a′b)(a′b) = (a′b)(b′a) = a′(bb′)a.    (2.40)

From (2.18), (2.26), and (2.39) we obtain

    (x − y)′(x − y) = x′x − 2x′y + y′y.    (2.41)

Note that in analogous expressions with matrices, however, the two middle terms cannot be combined:

    (A − B)′(A − B) = A′A − A′B − B′A + B′B,
    (A − B)² = (A − B)(A − B) = A² − AB − BA + B².
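A numerical check of the contrast between (2.41) and its matrix analogue (NumPy, with invented operands): for vectors the middle terms combine because x′y = y′x, while for matrices A′B and B′A differ in general.

```python
import numpy as np

x = np.array([1., 4., 2.])
y = np.array([3., 0., 1.])

# Vectors: (x - y)'(x - y) = x'x - 2x'y + y'y
vec_ok = bool(np.isclose((x - y) @ (x - y), x @ x - 2 * (x @ y) + y @ y))

# Matrices: the four terms stay separate
A = np.array([[1., 2.],
              [0., 1.]])
B = np.array([[0., 1.],
              [2., 0.]])
mat_ok = bool(np.allclose((A - B).T @ (A - B),
                          A.T @ A - A.T @ B - B.T @ A + B.T @ B))
middle_differ = not np.array_equal(A.T @ B, B.T @ A)
```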

If a and x1, x2, . . . , xn are all p × 1 and A is p × p, we obtain the following factoring results as extensions of (2.21) and (2.29):

We can express matrix multiplication in terms of row vectors and column vectors. If a′i is the ith row of A and bj is the jth column of B, then the (ij)th element of AB is a′i bj.

Any matrix can be multiplied by its transpose. If A is n × p, then

AA′ is n × n and is obtained as products of rows of A [see (2.52)];

A′A is p × p and is obtained as products of columns of A [see (2.54)].

From (2.6) and (2.27), it is clear that both AA′ and A′A are symmetric.
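The sizes and symmetry of AA′ and A′A can be confirmed numerically (a NumPy sketch with an invented 3 × 2 matrix):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])      # n x p = 3 x 2

G = A @ A.T                 # AA': n x n, built from products of rows of A
H = A.T @ A                 # A'A: p x p, built from products of columns of A

both_symmetric = bool(np.array_equal(G, G.T) and np.array_equal(H, H.T))
```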

In the preceding illustration for AB in terms of row and column vectors, the rows of A were denoted by a′i and the columns of B by bj. If both rows and columns of a matrix A are under discussion, as in AA′ and A′A, we will use the notation a′i for rows and a(j) for columns. For example, a 3 × 4 matrix A can be written in terms of its rows or of its columns as

    A = [ a′1
          a′2
          a′3 ] = (a(1), a(2), a(3), a(4)).

With this notation for rows and columns of A, we can express the elements of A′A or of AA′ as products of the rows of A or of the columns of A. Thus if we write A in terms of its rows as

    A = [ a′1
          a′2
          . . .
          a′n ],
