Marcel Dekker, Inc. New York • Basel
Handbook of
Industrial Automation
edited by
Richard L. Shell
Ernest L. Hall
University of Cincinnati, Cincinnati, Ohio
ISBN: 0-8247-0373-1
This book is printed on acid-free paper.
Headquarters
Marcel Dekker, Inc.
270 Madison Avenue, New York, NY 10016
Copyright © 2000 by Marcel Dekker, Inc. All Rights Reserved.
Neither this book nor any part may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, microfilming, and recording, or by any information storage and retrieval system, without permission in writing from the publisher.
Current printing (last digit):
10 9 8 7 6 5 4 3 2 1
PRINTED IN THE UNITED STATES OF AMERICA
Preface

This handbook is designed as a comprehensive reference for the industrial automation engineer. Whether in a small or large manufacturing plant, the industrial or manufacturing engineer is usually responsible for using the latest and best technology in the safest, most economic manner to build products. This responsibility requires an enormous knowledge base that, because of changing technology, can never be considered complete. The handbook provides a handy starting reference covering technical and economic matters, certain legal standards, and guidelines, and it should be the first source for solutions to many problems. The book will also be useful to students in the field, as it provides a single source for information on industrial automation.

The handbook is also designed to present a related and connected survey of engineering methods useful in a variety of industrial and factory automation applications. Each chapter is arranged to permit review of an entire subject, with illustrations to provide guideposts for the more complex topics. Numerous references are provided to other material for more detailed study.

The mathematical definitions, concepts, equations, principles, and application notes for the practicing industrial automation engineer have been carefully selected to provide broad coverage. Selected subjects from both undergraduate- and graduate-level topics from industrial, electrical, computer, and mechanical engineering, as well as materials science, are included to provide continuity and depth on a variety of topics found useful in our work in teaching thousands of engineers who work in the factory environment. The topics are presented in a tutorial style, without detailed proofs, in order to incorporate a large number of topics in a single volume.
The handbook is organized into ten parts. Each part contains several chapters on important selected topics. Part 1 is devoted to the foundations of mathematical and numerical analysis. The rational thought process developed in the study of mathematics is vital in developing the ability to satisfy every concern in a manufacturing process. Chapters include: an introduction to probability theory, sets and relations, linear algebra, calculus, differential equations, Boolean algebra, and algebraic structures and applications. Part 2 provides background information on measurements and control engineering. Unless we measure, we cannot control any process. The chapter topics include: an introduction to measurements and control instrumentation, digital motion control, and in-process measurement.

Part 3 provides background on automatic control. Using feedback control, in which a desired output is compared to a measured output, is essential in automated manufacturing. Chapter topics include: distributed control systems, stability, digital signal processing, and sampled-data systems. Part 4 introduces modeling and operations research. Given a criterion or goal such as maximizing profit, using an overall model to determine the optimal solution subject to a variety of constraints is the essence of operations research. If an optimal goal cannot be obtained, then continually improving the process is necessary. Chapter topics include: regression, simulation and analysis of manufacturing systems, Petri nets, and decision analysis.
Part 5 deals with sensor systems. Sensors are used to provide the basic measurements necessary to control a manufacturing operation. Human senses are often used, but modern systems include important physical sensors. Chapter topics include: sensors for touch, force, and torque; fundamentals of machine vision; low-cost machine vision; and three-dimensional vision. Part 6 introduces the topic of manufacturing. Advanced manufacturing processes are continually improved in a search for faster and cheaper ways to produce parts. Chapter topics include: the future of manufacturing, manufacturing systems, intelligent manufacturing systems in industrial automation, measurements, intelligent industrial robots, industrial materials science, forming and shaping processes, and molding processes. Part 7 deals with material handling and storage systems. Material handling is often considered a necessary evil in manufacturing, but an efficient material handling system may also be the key to success. Topics include an introduction to material handling and storage systems, automated storage and retrieval systems, containerization, and robotic palletizing of fixed- and variable-size parcels.

Part 8 deals with safety and risk assessment. Safety is vitally important, and government programs monitor the manufacturing process to ensure the safety of the public. Chapter topics include: investigative programs, government regulation and OSHA, and standards. Part 9 introduces ergonomics. Even with advanced automation, humans are a vital part of the manufacturing process. Reducing risks to their safety and health is especially important. Topics include: human interface with automation, workstation design, and physical-strength assessment in ergonomics. Part 10 deals with economic analysis. Returns on investment are a driver of manufacturing systems. Chapter topics include: engineering economy and manufacturing cost recovery and estimating systems.
We believe that this handbook will give the reader an opportunity to quickly and thoroughly scan the field of industrial automation in sufficient depth to provide both specialized knowledge and a broad background of specific information required for industrial automation. Great care was taken to ensure the completeness and topical importance of each chapter.

We are grateful to the many authors, reviewers, readers, and support staff who helped to improve the manuscript. We earnestly solicit comments and suggestions for future improvements.

Richard L. Shell
Ernest L. Hall
Contents

Preface
Contributors
Part 1 Mathematics and Numerical Analysis
1.1 Some Probability Concepts for Engineers
Enrique Castillo and Ali S. Hadi
1.2 Introduction to Sets and Relations
Part 2 Measurements and Computer Control
2.1 Measurement and Control Instrumentation Error-Modeled Performance
Patrick H. Garrett
2.2 Fundamentals of Digital Motion Control
Ernest L. Hall, Krishnamohan Kola, and Ming Cao
2.3 In-Process Measurement
William E. Barkman
Part 3 Automatic Control
3.1 Distributed Control Systems
Dobrivoje Popovic
3.2 Stability
Allen R. Stubberud and Stephen C. Stubberud
3.3 Digital Signal Processing
Richard Brook and Denny Meyer
Part 4 Modeling and Operations Research
4.2 A Brief Introduction to Linear and Dynamic Programming
Part 5 Sensor Systems
5.1 Sensors: Touch, Force, and Torque
Richard M. Crowder
5.2 Machine Vision Fundamentals
Prasanthi Guda, Jin Cao, Jeannine Gailey, and Ernest L. Hall
Part 6 Manufacturing
6.2 Manufacturing Systems
Jon Marvel and Ken Bloemer
6.3 Intelligent Manufacturing in Industrial Automation
George N. Saridis
6.4 Measurements
John Mandel
6.5 Intelligent Industrial Robots
Wanek Golnazarian and Ernest L. Hall
6.6 Industrial Materials Science and Engineering
Part 7 Material Handling and Storage
7.1 Material Handling and Storage Systems
William Wrennall and Herbert R. Tuttle
7.2 Automated Storage and Retrieval Systems
Stephen L. Parsley
7.3 Containerization
A. Kader Mazouz and C. P. Han
7.4 Robotic Palletizing of Fixed- and Variable-Size/Content Parcels
Hyder Nihal Agha, William H. DeCamp, Richard L. Shell, and Ernest L. Hall
Part 8 Safety, Risk Assessment, and Standards
Part 9 Ergonomics
9.1 Perspectives on Designing Human Interfaces for Automated Systems
Anil Mital and Arunkumar Pennathur
9.2 Workstation Design
Christin Shoaf and Ashraf M. Genaidy
9.3 Physical Strength Assessment in Ergonomics
Sean Gallagher, J. Steven Moore, Terrence J. Stobbe, James D. McGlothlin, and Amit Bhattacharya
Part 10 Economic Analysis
10.1 Engineering Economy
Thomas R. Huston
10.2 Manufacturing-Cost Recovery and Estimating Systems
Eric M. Malstrom and Terry R. Collins
Index
Contributors

William H. DeCamp Motoman, Inc., West Carrollton, Ohio
Steve Dickerson Department of Mechanical Engineering, Georgia Institute of Technology, Atlanta, Georgia
Verna Fitzsimmons Department of Mechanical, Industrial, and Nuclear Engineering, University of Cincinnati, Cincinnati, Ohio
Jeannine Gailey Department of Mechanical, Industrial, and Nuclear Engineering, University of Cincinnati, Cincinnati, Ohio
Sean Gallagher Pittsburgh Research Laboratory, National Institute for Occupational Safety and Health, Pittsburgh, Pennsylvania
Patrick H. Garrett Department of Electrical and Computer Engineering and Computer Science, University of Cincinnati, Cincinnati, Ohio
Ashraf M. Genaidy Department of Mechanical, Industrial, and Nuclear Engineering, University of Cincinnati, Cincinnati, Ohio
Wanek Golnazarian General Dynamics Armament Systems, Burlington, Vermont
Prasanthi Guda Department of Mechanical, Industrial, and Nuclear Engineering, University of Cincinnati, Cincinnati, Ohio
Ali S. Hadi Department of Statistical Sciences, Cornell University, Ithaca, New York
Ernest L. Hall Department of Mechanical, Industrial, and Nuclear Engineering, University of Cincinnati, Cincinnati, Ohio
C. P. Han Department of Mechanical Engineering, Florida Atlantic University, Boca Raton, Florida
Thomas R. Huston Department of Mechanical, Industrial, and Nuclear Engineering, University of Cincinnati, Cincinnati, Ohio
Avraam I. Isayev Department of Polymer Engineering, The University of Akron, Akron, Ohio
Ki Hang Kim Mathematics Research Group, Alabama State University, Montgomery, Alabama
Krishnamohan Kola Department of Mechanical, Industrial, and Nuclear Engineering, University of Cincinnati, Cincinnati, Ohio
Eric M. Malstrom† Department of Industrial Engineering, University of Arkansas, Fayetteville, Arkansas
John Mandel National Institute of Standards and Technology, Gaithersburg, Maryland
Jon Marvel Padnos School of Engineering, Grand Valley State University, Grand Rapids, Michigan
A. Kader Mazouz Department of Mechanical Engineering, Florida Atlantic University, Boca Raton, Florida
James D. McGlothlin Purdue University, West Lafayette, Indiana
M. Eugene Merchant Institute of Advanced Manufacturing Sciences, Cincinnati, Ohio
Denny Meyer Institute of Information and Mathematical Sciences, Massey University–Albany, Palmerston North, New Zealand
Angelo B. Mingarelli School of Mathematics and Statistics, Carleton University, Ottawa, Ontario, Canada
Anil Mital Department of Industrial Engineering, University of Cincinnati, Cincinnati, Ohio
J. Steven Moore Department of Occupational and Environmental Medicine, The University of Texas Health Center, Tyler, Texas
* Retired
† Deceased
Diego A. Murio Department of Mathematical Sciences, University of Cincinnati, Cincinnati, Ohio
Lawrence E. Murr Department of Metallurgical and Materials Engineering, The University of Texas at El Paso, El Paso, Texas
Joseph H. Nurre School of Electrical Engineering and Computer Science, Ohio University, Athens, Ohio
Stephen L. Parsley ESKAY Corporation, Salt Lake City, Utah
Arunkumar Pennathur University of Texas at El Paso, El Paso, Texas
Dobrivoje Popovic Institute of Automation Technology, University of Bremen, Bremen, Germany
Shivakumar Raman Department of Industrial Engineering, University of Oklahoma, Norman, Oklahoma
George N. Saridis Professor Emeritus, Electrical, Computer, and Systems Engineering Department, Rensselaer Polytechnic Institute, Troy, New York
Richard L. Shell Department of Mechanical, Industrial, and Nuclear Engineering, University of Cincinnati, Cincinnati, Ohio
Christin Shoaf Department of Mechanical, Industrial, and Nuclear Engineering, University of Cincinnati, Cincinnati, Ohio
J. B. Srivastava Department of Mathematics, Indian Institute of Technology, Delhi, New Delhi, India
Terrence J. Stobbe Industrial Engineering Department, West Virginia University, Morgantown, West Virginia
Allen R. Stubberud Department of Electrical and Computer Engineering, University of California Irvine, Irvine, California
Stephen C. Stubberud ORINCON Corporation, San Diego, California
Hiroyuki Tamura Graduate School of Engineering Science, Osaka University, Toyonaka, Osaka, Japan
Fred J. Taylor Department of Electrical and Computer Engineering and Department of Computer and Information Science Engineering, University of Florida, Gainesville, Florida
Herbert R. Tuttle Graduate Engineering Management, University of Kansas, Lawrence, Kansas
William Wrennall The Leawood Group Ltd., Leawood, Kansas
1.1 INTRODUCTION

Many engineering applications involve some element of uncertainty [1]. Probability is one of the most commonly used ways to measure and deal with uncertainty. In this chapter we present some of the most important probability concepts used in engineering applications.

The chapter is organized as follows. Section 1.2 first introduces some elementary concepts, such as random experiments, types of events, and sample spaces. Then it introduces the axioms of probability and some of the most important properties derived from them, as well as the concepts of conditional probability and independence. It also includes the product rule, the total probability theorem, and Bayes' theorem.

Section 1.3 deals with unidimensional random variables and introduces three types of variables (discrete, continuous, and mixed) and the corresponding probability mass, density, and distribution functions. Sections 1.4 and 1.5 describe the most commonly used univariate discrete and continuous models, respectively.

Section 1.6 extends the above concepts of univariate models to the case of bivariate and multivariate models. Special attention is given to joint, marginal, and conditional probability distributions. Section 1.7 discusses some characteristics of random variables, such as the moment-generating function and the characteristic function.

Section 1.8 treats the techniques of variable transformations, that is, how to obtain the probability distribution function of a set of transformed variables when the probability distribution function of the initial set of variables is known. Section 1.9 uses the transformation techniques of Sec. 1.8 to simulate univariate and multivariate data.

Section 1.10 is devoted to order statistics, giving methods for obtaining the joint distribution of any subset of order statistics. It also deals with the problem of limit or asymptotic distributions of maxima and minima. Finally, Sec. 1.11 introduces probability plots and how to build and use them in making inferences from data.
1.2 BASIC PROBABILITY CONCEPTS

In this section we introduce some basic probability concepts and definitions. These are easily understood from examples. Classic examples include whether a machine will malfunction at least once during the first month of operation, whether a given structure will last for the next 20 years, or whether a flood will occur during the next year. Other examples include how many cars will cross a given intersection during a given rush hour, how long we will have to wait for a certain event to occur, and how much stress a given structure can withstand. We start our exposition with some definitions in the following subsection.
1.2.1 Random Experiment and Sample Space
Each of the above examples can be described as a random experiment because we cannot predict in advance the outcome at the end of the experiment. This leads to the following definition:

Definition 1. Random Experiment and Sample Space: Any activity that will result in one and only one of several well-defined outcomes, but does not allow us to tell in advance which one will occur, is called a random experiment. Each of these possible outcomes is called an elementary event. The set of all possible elementary events of a given random experiment is called the sample space and is denoted by $\Omega$.

Therefore, for each random experiment there is an associated sample space. The following are examples of random experiments and their associated sample spaces:

Rolling a six-sided fair die once yields $\Omega = \{1, 2, 3, 4, 5, 6\}$.
Waiting for a machine to malfunction yields $\Omega = \{x : x \ge 0\}$.
How many cars will cross a given intersection yields $\Omega = \{0, 1, 2, \ldots\}$.
Definition 2. Union and Intersection: If C is a set containing all elementary events found in A or in B or in both, then we write $C = A \cup B$ to denote the union of A and B, whereas if C is a set containing all elementary events found in both A and B, then we write $C = A \cap B$ to denote the intersection of A and B.

Referring to the six-sided die, for example, if $A = \{1, 3, 5\}$, $B = \{2, 4, 6\}$, and $C = \{1, 2, 3\}$, then $A \cup C = \{1, 2, 3, 5\}$, $A \cap C = \{1, 3\}$, and $A \cap B = \emptyset$, where $\emptyset$ denotes the empty set.
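These set operations are easy to experiment with. The following minimal Python sketch (illustrative only, not part of the original text) re-creates the die events above using built-in sets:

```python
# Events from the six-sided-die example, represented as Python sets.
omega = {1, 2, 3, 4, 5, 6}   # sample space
A = {1, 3, 5}                # odd outcomes
B = {2, 4, 6}                # even outcomes
C = {1, 2, 3}

print(A | C)   # union A ∪ C -> {1, 2, 3, 5}
print(A & C)   # intersection A ∩ C -> {1, 3}
print(A & B)   # A ∩ B -> set(), the empty set
```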
Random events in a sample space associated with a random experiment can be classified into several types:

1. Elementary vs. composite events. An event which contains more than one elementary event is called a composite event. Thus, for example, observing an odd number when rolling a six-sided die once is a composite event because it consists of three elementary events.

2. Compatible vs. mutually exclusive events. Two events A and B are said to be compatible if they can simultaneously occur; otherwise they are said to be mutually exclusive or incompatible events. For example, referring to rolling a six-sided die once, the events $A = \{1, 3, 5\}$ and $B = \{2, 4, 6\}$ are incompatible because if one event occurs, the other does not, whereas the events A and $C = \{1, 2, 3\}$ are compatible because if we observe 1 or 3, then both A and C occur.

3. Collectively exhaustive events. If the union of several events is the sample space, then the events are said to be collectively exhaustive. For example, $A = \{1, 3, 5\}$ and $B = \{2, 4, 6\}$ are collectively exhaustive events, but $A = \{1, 3, 5\}$ and $C = \{1, 2, 3\}$ are not.

4. Complementary events. If two events A and B are mutually exclusive and collectively exhaustive, then A and B are said to be complementary events, or B is the complement of A (or vice versa). The complement of A is usually denoted by $\bar{A}$. For example, in the six-sided die example, if $A = \{1, 2\}$, then $\bar{A} = \{3, 4, 5, 6\}$. Note that an event and its complement are always defined with respect to the sample space $\Omega$. $A$ and $\bar{A}$ are always mutually exclusive and collectively exhaustive events, hence $A \cap \bar{A} = \emptyset$ and $A \cup \bar{A} = \Omega$.

1.2.2 Probability Measure
To measure uncertainty we start with a given sample space $\Omega$, in which all mutually exclusive and collectively exhaustive outcomes of a given experiment are included, and with a class of subsets of $\Omega$ that is closed under the union, intersection, complementation, and limit operations. Such a class is called a $\sigma$-algebra. Then, the aim is to assign to every subset in the $\sigma$-algebra a real value measuring the degree of uncertainty about its occurrence. In order to obtain measures with clear physical and practical meanings, some general and intuitive properties are used to define a class of measures known as probability measures.

Definition 3. Probability Measure: A function p mapping any subset $A \subseteq \Omega$ into the interval $[0, 1]$ is called a probability measure if it satisfies the following axioms:

Axiom 1. Normalization: $p(\Omega) = 1$.
Axiom 2. Additivity: For any (possibly infinite) sequence $A_1, A_2, \ldots$ of disjoint subsets of $\Omega$,
$$p\left(\bigcup_i A_i\right) = \sum_i p(A_i)$$

Axiom 1 states that, despite our degree of uncertainty, at least one element in the sample space $\Omega$ will occur. Axiom 2 provides an aggregation formula that can be used to compute the probability of a union of disjoint subsets. It states that the uncertainty of a given subset is the sum of the uncertainties of its disjoint parts.
From the above axioms, many interesting properties of the probability measure can be derived. For example:

Property 1. Boundary: $p(\emptyset) = 0$.

Property 2. Monotonicity: If $A \subseteq B \subseteq \Omega$, then $p(A) \le p(B)$.

Property 3. Continuity–Consistency: For every increasing sequence $A_1 \subseteq A_2 \subseteq \cdots$ or decreasing sequence $A_1 \supseteq A_2 \supseteq \cdots$ of subsets of $\Omega$ we have
$$\lim_{i \to \infty} p(A_i) = p\left(\lim_{i \to \infty} A_i\right)$$

Property 4. Inclusion–Exclusion: Given any pair of subsets A and B of $\Omega$, the following equality always holds:
$$p(A \cup B) = p(A) + p(B) - p(A \cap B) \qquad (1)$$
Property 1 states that the evidence associated with a complete lack of information is defined to be zero. Property 2 shows that the evidence of the membership of an element in a set must be at least as great as the evidence that the element belongs to any of its subsets. In other words, the certainty of an element belonging to a given set A must not decrease with the addition of elements to A.

Property 3 can be viewed as a consistency or continuity property. If we choose two sequences converging to the same subset of $\Omega$, we must get the same limit of uncertainty. Property 4 states that the probabilities of the sets $A$, $B$, $A \cap B$, and $A \cup B$ are not independent; they are related by Eq. (1).

Note that these properties respond to the intuitive notion of probability that makes the mathematical model valid for dealing with uncertainty. Thus, for example, the fact that probabilities cannot be larger than one is not an axiom but a consequence of Axioms 1 and 2.
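For equally likely elementary events, $p(A) = |A| / |\Omega|$, and Property 4 can be verified numerically. A minimal Python sketch (illustrative, using the die events above):

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}

def p(event):
    # Classical probability for equally likely outcomes: |A| / |Omega|.
    return Fraction(len(event), len(omega))

A, C = {1, 3, 5}, {1, 2, 3}
# Property 4 (inclusion-exclusion): p(A ∪ C) = p(A) + p(C) - p(A ∩ C)
assert p(A | C) == p(A) + p(C) - p(A & C)
print(p(A | C))  # 2/3
```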
Definition 4. Conditional Probability: Let A and B be two events such that $p(B) > 0$. Then, the conditional probability distribution (CPD) of A given B is given by
$$p(A \mid B) = \frac{p(A \cap B)}{p(B)} \qquad (2)$$
Equation (2) implies that the probability of $A \cap B$ can be written as
$$p(A \cap B) = p(B)\,p(A \mid B) \qquad (3)$$
This can be generalized to several events as follows:
$$p(A \mid B_1, \ldots, B_k) = \frac{p(A, B_1, \ldots, B_k)}{p(B_1, \ldots, B_k)} \qquad (4)$$

1.2.3 Dependence and Independence
Definition 5. Independence of Two Events: Let A and B be two events. Then A is said to be independent of B if and only if
$$p(A \mid B) = p(A) \qquad (5)$$
otherwise A is said to be dependent on B.

Equation (5) means that if A is independent of B, then our knowledge of B does not affect our knowledge about A, that is, B has no information about A. Also, if A is independent of B, we can then combine Eqs. (2) and (5) and obtain
$$p(A \cap B) = p(A)\,p(B) \qquad (6)$$
Equation (6) indicates that if A is independent of B, then the probability of $A \cap B$ is equal to the product of their probabilities. Actually, Eq. (6) provides a definition of independence equivalent to that in Eq. (5).

One important property of the independence relation is its symmetry, that is, if A is independent of B, then B is independent of A. This is because
$$p(B \mid A) = \frac{p(A \cap B)}{p(A)} = \frac{p(A)\,p(B)}{p(A)} = p(B)$$
Because of the symmetry property, we say that A and B are independent or mutually independent. The practical implication of symmetry is that if knowledge of B is relevant (irrelevant) to A, then knowledge of A is relevant (irrelevant) to B.

The concepts of dependence and independence of two events can be extended to the case of more than two events as follows:
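Before generalizing, the conditional-probability and independence definitions can be checked on the die example. A short Python sketch (illustrative only; the events were chosen for this example):

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}

def p(event):
    return Fraction(len(event), len(omega))

def p_given(a, b):
    # Eq. (2): p(A | B) = p(A ∩ B) / p(B)
    return p(a & b) / p(b)

A = {1, 3, 5}                # odd outcome
B = {1, 2}                   # outcome of at most 2
C = {1, 2, 3}
print(p_given(A, B), p(A))   # 1/2 and 1/2: A is independent of B
print(p_given(A, C), p(A))   # 2/3 and 1/2: A depends on C
```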
Definition 6. Independence of a Set of Events: The events $A_1, \ldots, A_m$ are said to be independent if and only if
$$p(A_1 \cap \cdots \cap A_m) = \prod_{i=1}^{m} p(A_i) \qquad (7)$$
otherwise they are said to be dependent.

In other words, $\{A_1, \ldots, A_m\}$ are said to be independent if and only if their intersection probability is equal to the product of their individual probabilities. Note that Eq. (7) is a generalization of Eq. (6).

An important implication of independence is that it is not worthwhile gathering information about independent (irrelevant) events. That is, independence means irrelevance.

From Eq. (3) we get
$$p(A_1 \cap A_2) = p(A_1 \mid A_2)\,p(A_2) = p(A_2 \mid A_1)\,p(A_1)$$
This property can be generalized, leading to the so-called product or chain rule:
$$p(A_1 \cap \cdots \cap A_n) = p(A_1)\,p(A_2 \mid A_1) \cdots p(A_n \mid A_1 \cap \cdots \cap A_{n-1})$$

1.2.4 Total Probability Theorem
Theorem 1. Total Probability Theorem: Let $\{A_1, \ldots, A_n\}$ be a class of events which are mutually incompatible and such that $\bigcup_{i=1}^{n} A_i = \Omega$. Then, for any event B,
$$p(B) = \sum_{i=1}^{n} p(B \mid A_i)\,p(A_i)$$

Theorem 2. Bayes' Theorem: Let $\{A_1, \ldots, A_n\}$ be a class of events which are mutually incompatible and such that $\bigcup_{i=1}^{n} A_i = \Omega$. Then, for any event B with $p(B) > 0$,
$$p(A_i \mid B) = \frac{p(B \mid A_i)\,p(A_i)}{\sum_{j=1}^{n} p(B \mid A_j)\,p(A_j)}$$
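As a worked illustration of both theorems, consider a two-event partition: a machine is either working ($A_1$) or faulty ($A_2$), and an alarm B may sound in either state. The numbers below are hypothetical, chosen only to make the computation concrete:

```python
# Hypothetical machine-diagnosis numbers (illustrative, not from the handbook).
p_fault = 0.02                 # p(A2): machine faulty
p_alarm_given_fault = 0.95     # p(B | A2)
p_alarm_given_ok = 0.10        # p(B | A1)

# Theorem 1 (total probability): p(B) = sum_i p(B | Ai) p(Ai)
p_alarm = p_alarm_given_fault * p_fault + p_alarm_given_ok * (1 - p_fault)

# Theorem 2 (Bayes): p(A2 | B) = p(B | A2) p(A2) / p(B)
p_fault_given_alarm = p_alarm_given_fault * p_fault / p_alarm
print(round(p_fault_given_alarm, 4))  # ~0.1624
```

Note how a fairly reliable alarm still yields a modest posterior fault probability, because faults are rare a priori.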
1.3 UNIDIMENSIONAL RANDOM VARIABLES
In this section we define random variables, distinguish among three of their types, and present various ways of representing their probability distributions.

Definition 7. Random Variable: A possibly vector-valued function $X: \Omega \to \mathbb{R}^n$, which assigns to each element $\omega$ of the sample space $\Omega$ a vector $X(\omega) = x$, is called an n-dimensional random variable. The set of all possible values of a random variable X is also known as the support of X.

When $n = 1$ in Definition 7, the random variable is said to be unidimensional, and when $n > 1$, it is said to be multidimensional. In this and Secs. 1.4 and 1.5, we deal with unidimensional random variables. Multidimensional random variables are treated in Sec. 1.6.
Example 1. Suppose we roll two dice once. Let A be the outcome of the first die and B the outcome of the second. The sample space consists of 36 possible pairs (A, B), as shown in Fig. 2. Suppose we define a random variable $X = A + B$, that is, X is the sum of the two numbers observed when we roll two dice once. Then X is a unidimensional random variable. The support of this random variable is the set $\{2, 3, \ldots, 12\}$, consisting of 11 elements. This is also shown in Fig. 2.
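The support and pmf of this example are easy to enumerate by brute force. A minimal Python sketch (illustrative only):

```python
from collections import Counter
from fractions import Fraction

# Enumerate the 36 equally likely outcomes (a, b) and tabulate X = a + b.
counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
for x, n in sorted(counts.items()):
    print(x, Fraction(n, 36))   # e.g., X = 7 has probability 6/36 = 1/6
```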
1.3.1 Types of Random Variables

Random variables can be classified into three types: discrete, continuous, and mixed. We define and give examples of each type below.
Figure 1. Graphical illustration of the total probability rule.
Definition 8. Discrete Random Variables: A random variable is said to be discrete if it can take a finite or countable set of real values.

As an example of a discrete random variable, let X denote the outcome of rolling a six-sided die once. Since the support of this random variable is the finite set $\{1, 2, 3, 4, 5, 6\}$, X is a discrete random variable. The random variable $X = A + B$ in Fig. 2 is another example of a discrete random variable.
Definition 9. Continuous Random Variables: A random variable is said to be continuous if it can take an uncountable set of real values.

For example, let X denote the weight of an object; then X is a continuous random variable because it can take values in the set $\{x : x > 0\}$, which is an uncountable set.
Definition 10. Mixed Random Variables: A random variable is said to be mixed if it can take an uncountable set of values and the probability of at least one value x is positive.

Mixed random variables are encountered often in engineering applications which involve some type of censoring. Consider, for example, a life-testing situation where n machines are put to work for a given period of time, say 30 days. Let $X_i$ denote the time at which the ith machine malfunctions. Then $X_i$ is a random variable which can take the values $\{x : 0 < x \le 30\}$. This is clearly an uncountable set. But at the end of the 30-day period some machines may still be functioning. For each of these machines all we know is that $X_i \ge 30$, and we record $X_i = 30$. Then the probability that $X_i = 30$ is positive. Hence the random variable $X_i$ is of the mixed type. The data in this example are known as right-censored. Of course, there are situations where both right and left censoring are present.
Figure 2. Graphical illustration of an experiment consisting of rolling two dice once and an associated random variable defined as the sum of the two numbers observed.
1.3.2 Probability Distributions of Random Variables

So far we have defined random variables and their support. In this section we are interested in measuring the probability of each of these values and/or the probability of a subset of these values. In other words, we are interested in finding the probability distribution of a given random variable. Three equivalent ways of representing the probability distributions of these random variables are: tables, graphs, and mathematical functions (also known as mathematical models).
1.3.3 Probability Distribution Tables
As an example of a probability distribution that can be displayed in a table, let us flip a fair coin twice and let X be the number of heads observed. Then the sample space is $\Omega = \{TT, TH, HT, HH\}$, where TH, for example, denotes the outcome: first coin turned up a tail and second a head. The sample space of the random variable X is then $\{0, 1, 2\}$. For example, $X = 0$ occurs when we observe TT. The probability of each of these possible values of X is found simply by counting how many elements of $\Omega$ correspond to it. We can see that $X = 0$ occurs when we observe the outcome TT, $X = 1$ occurs when we observe either HT or TH, and $X = 2$ occurs when we observe HH. Since there are four equally likely elementary events in $\Omega$, we have $p(X = 0) = 1/4$, $p(X = 1) = 2/4$, and $p(X = 2) = 1/4$. This probability distribution of X can be displayed in a table as in Table 1. For obvious reasons, such tables are called probability distribution tables. Note that to denote the random variable itself we use an uppercase letter (e.g., X), but for its realizations we use the corresponding lowercase letter (e.g., x).
Obviously, it is possible to use tables to display the probability distributions of only discrete random variables. For continuous random variables, we have to use one of the other two means: graphs or mathematical functions. Even for discrete random variables with a large number of elements in their support, tables are not the most efficient way of displaying the probability distribution.
1.3.4 Graphical Representation of Probabilities

The probability distribution of a random variable can equivalently be represented graphically by displaying the values in the support of X on a horizontal line and erecting a vertical line or bar on top of each of these values. The height of each line or bar represents the probability of the corresponding value of X. For example, Fig. 3 shows the probability distribution of the random variable X defined in Example 1.

For continuous random variables, we have infinitely many possible values in their support, each of which has a probability equal to zero. To avoid this difficulty, we represent the probability of a subset of values by an area under a curve (known as the probability density curve) instead of heights of vertical lines on top of each of the values in the subset.

For example, let X represent a number drawn randomly from the interval (0, 10). The probability distribution of X can be displayed graphically as in Fig. 4. The area under the curve on top of the support of X has to equal 1 because it represents the total probability. Since all values of X are equally likely, the curve is a horizontal line with height equal to 1/10. The height of 1/10 will make the total area under the curve equal to 1. This type of random variable is called a continuous uniform random variable.

Table 1. The probability distribution of the random variable X defined as the number of heads resulting from flipping a fair coin twice.

x    p(x)
0    0.25
1    0.50
2    0.25

Figure 3. Graphical representation of the probability distribution of the random variable X in Example 1.
The cumulative distribution function (cdf) of a random variable X is defined as
$$F(x) = p(X \le x) = \int_{-\infty}^{x} f(t)\,dt$$
Note that the cdfs are denoted by the uppercase letters $P(x)$ and $F(x)$ to distinguish them from the pmf $p(x)$ and the pdf $f(x)$. Note also that since $p(X = x) = 0$ in the continuous case, $p(X \le x) = p(X < x)$. The cdf has a number of properties that follow directly from the definitions; in particular, every distribution function can be written as a linear convex combination of continuous distributions and step functions.
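For the uniform example above, the integral has the closed form $F(x) = x/10$ on $[0, 10]$, which a few lines of Python (illustrative only) make concrete:

```python
# cdf of the U(0, 10) density f(x) = 1/10 used in the example above:
# F(x) = integral from 0 to x of (1/10) dt = x / 10 for 0 <= x <= 10.
def F(x):
    if x < 0:
        return 0.0
    if x > 10:
        return 1.0
    return x / 10

# p(2 < X <= 5) = F(5) - F(2)
print(F(5) - F(2))  # 0.3
```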
1.3.7 Moments of Random Variables
The pmf or pdf of a random variable contains all the information about the random variable. For example, given the pmf or the pdf of a given random variable, we can find the mean, the variance, and other moments of the random variable. The results in this section are presented for continuous random variables using the pdf and cdf, $f(x)$ and $F(x)$, respectively. For discrete random variables, the results are obtained by replacing $f(x)$, $F(x)$, and the integration symbol by $p(x)$, $P(x)$, and the summation symbol, respectively.
Definition 11. Moments of Order k: Let X be a random variable with pdf $f(x)$, cdf $F(x)$, and support A. Then the kth moment $m_k$ around $a \in A$ is the real number
$$m_k = \int_A (x - a)^k f(x)\,dx \qquad (10)$$
Note that the Stieltjes–Lebesgue integral, Eq. (10), does not always exist. In such a case we say that the corresponding moment does not exist. However, Eq. (10) implies the existence of
$$\int_A |x - a|^k f(x)\,dx$$
which leads to the following theorem:
Theorem 3. Existence of Moments of Lower Order: If the tth moment around a of a random variable X exists, then the sth moment around a also exists for $0 < s \le t$.

The first moment around the origin is called the mean or the expected value of the random variable X, and is denoted by $\mu$ or $E[X]$. Let X and Y be random variables; then the expectation operator has the following important properties:

$E[c] = c$, where c is a constant.
$E[aX + bY + c] = a\,E[X] + b\,E[Y] + c$ for all constants a, b, c.
$a \le Y \le b \Rightarrow a \le E[Y] \le b$.
$|E[Y]| \le E[|Y|]$.

The second moment around the mean is called the variance of the random variable, and is denoted by $\mathrm{Var}(X)$ or $\sigma^2$. The square root of the variance, $\sigma$, is called the standard deviation of the random variable. The physical meanings of the mean and the variance are similar to the center of gravity and the moment of inertia used in mechanics. They are the central and dispersion measures, respectively.
Using the above properties we can write
$$E[(X - a)^2] = \sigma^2 + (\mu - a)^2$$
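This identity is easy to check by simulation. A small Monte Carlo sketch in Python (illustrative; for $X \sim U(0, 10)$, $\mu = 5$ and $\sigma^2 = 100/12$):

```python
import random

# Monte Carlo check of E[(X - a)^2] = sigma^2 + (mu - a)^2 for X ~ U(0, 10).
random.seed(1)
xs = [random.uniform(0, 10) for _ in range(200_000)]
a = 2.0
lhs = sum((x - a) ** 2 for x in xs) / len(xs)
rhs = 100 / 12 + (5 - a) ** 2
print(lhs, rhs)  # both close to 17.33
```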
1.4 UNIVARIATE DISCRETE MODELS
In this section we present several important discrete probability distributions that often arise in engineering applications. Table 2 shows the pmf of these distributions. For additional probability distributions, see Christensen [2] and Johnson et al. [3].

1.4.3 The Binomial Distribution
Suppose now that we repeat a Bernoulli experiment n times under identical conditions (that is, the outcome of one trial does not affect the outcomes of the others). In this case the trials are said to be independent. Suppose also that the probability of success is p and that we are interested in the number of trials, X, in which the outcomes are successes. The random variable giving the number of successes after n realizations of independent Bernoulli experiments is called a binomial random variable and is denoted as $B(n, p)$. Its pmf is given in Table 2. Figure 6 shows some examples of pmfs associated with binomial random variables.

In certain situations the event $X = 0$ cannot occur. The pmf of the binomial distribution can be modified to accommodate this case. The resultant random variable is called the nonzero binomial. Its pmf is given in Table 2.
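The binomial pmf has the standard closed form $p(X = x) = \binom{n}{x} p^x (1-p)^{n-x}$ (the form tabulated in Table 2), which can be evaluated directly; a minimal Python sketch:

```python
from math import comb

def binomial_pmf(x, n, p):
    # p(X = x) = C(n, x) * p^x * (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Probability of exactly 3 successes in 10 independent trials with p = 0.4:
print(binomial_pmf(3, 10, 0.4))  # ~0.215
```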
1.4.4 The Geometric or Pascal Distribution

Suppose again that we repeat a Bernoulli experiment, but now we are interested in the random variable X, defined to be the number of Bernoulli trials that are required until we get the first success. Note that if the first success occurs in trial number x, then the first $x - 1$ trials must be failures (see Fig. 7). Since the probability of a success is p and the probability of the $x - 1$ failures is $(1 - p)^{x-1}$ (because the trials are independent), the pmf is
$$p(X = x) = p\,(1 - p)^{x-1}$$
This random variable is called the geometric or Pascal random variable and is denoted by $G(p)$.
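The geometric pmf can be confirmed by simulating Bernoulli trials until the first success; a short Python sketch with illustrative parameters:

```python
import random

def trials_until_first_success(p):
    # Repeat independent Bernoulli trials; return the index of the first success.
    x = 1
    while random.random() >= p:
        x += 1
    return x

random.seed(2)
p = 0.3
xs = [trials_until_first_success(p) for _ in range(100_000)]
# p(X = 2) = p(1 - p) = 0.21; the simulated frequency should be close.
print(xs.count(2) / len(xs))
```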
1.4.5 The Negative Binomial Distribution

The geometric distribution arises when we are interested in the number of Bernoulli trials that are required until we get the first success. Now suppose that we define the random variable X as the number of Bernoulli trials that are required until we get the rth success. For the rth success to occur at the xth trial, we must have $r - 1$ successes in the $x - 1$ previous trials and one success in the xth trial (see Fig. 8). This random variable is called the negative binomial random variable and is denoted by $NB(r, p)$. Its pmf is given in Table 2. Note that the geometric distribution is a special case of the negative binomial distribution obtained by setting $r = 1$, that is, $G(p) = NB(1, p)$.

1.4.6 The Hypergeometric Distribution

Consider a set of N items (products, machines, etc.), D of which are defective and the remaining $N - D$ acceptable. Obtaining a random sample of size n from this finite population is equivalent to withdrawing the items one by one without replacement.
Figure 5. A graph of the pmf and cdf of a Bernoulli distribution.
Figure 6. Examples of the pmf of binomial random variables.
Figure 7. Illustration of the Pascal or geometric random variable, where s denotes success and f denotes failure.

1.5.3 The Gamma Distribution
Let Y be a Poisson random variable with parameter $\lambda$. Let X be the time up to the kth Poisson event, that is, the time it takes for Y to be equal to k. Thus the probability that X is in the interval $(x, x + dx)$ is $f(x)\,dx$. But this probability is equal to the probability of there having occurred $k - 1$ Poisson events in a period of duration x, times the probability of occurrence of one event in a period of duration dx. Thus, we have
$$f(x)\,dx = \frac{e^{-\lambda x}(\lambda x)^{k-1}}{(k-1)!}\,\lambda\,dx$$
from which we obtain
$$f(x) = \frac{\lambda^k x^{k-1} e^{-\lambda x}}{(k-1)!} \qquad 0 \le x < \infty \qquad (12)$$
Expression (12), taking into account that the gamma function for an integer k satisfies $\Gamma(k) = (k-1)!$, generalizes to noninteger k, giving the gamma distribution.
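Equivalently, the waiting time to the kth event of a Poisson process of rate $\lambda$ is the sum of k independent exponential interarrival times, which gives a quick simulation check (illustrative Python sketch):

```python
import random

# The time of the k-th event of a Poisson process with rate lam is the
# sum of k independent Exponential(lam) variables, i.e., Gamma(k, lam).
random.seed(3)
k, lam = 3, 2.0
xs = [sum(random.expovariate(lam) for _ in range(k)) for _ in range(100_000)]
print(sum(xs) / len(xs))  # close to the gamma mean k / lam = 1.5
```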
Table 3. Some continuous probability density functions that arise in engineering applications.

Two particular cases of the beta distribution are interesting. Setting $r = 1$, $s = 1$ gives the standard uniform $U(0, 1)$ distribution, while setting $r = 1$, $s = 2$ or $r = 2$, $s = 1$ gives the triangular random variable, whose cdf is given in Table 3.
1.5.5 The Normal or Gaussian Distribution
One of the most important distributions in probability and statistics is the normal distribution (also known as the Gaussian distribution), which arises in various applications. For example, consider the random variable X which is the sum of n identically and independently distributed (iid) random variables $X_i$. Then, by the central limit theorem, X is asymptotically normal, regardless of the form of the distribution of the random variables $X_i$.

The normal random variable with parameters $\mu$ and $\sigma^2$ is denoted by $N(\mu, \sigma^2)$ and its pdf is
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x - \mu)^2}{2\sigma^2}\right) \qquad -\infty < x < \infty$$
The change of variable $Z = (X - \mu)/\sigma$ transforms a normal $N(\mu, \sigma^2)$ random variable X into another random variable Z, which is $N(0, 1)$. This variable is called the standard normal random variable. The main interest of this change of variable is that we can use tables for the standard normal distribution to calculate probabilities for any other normal distribution. For example, if X is $N(\mu, \sigma^2)$, then
$$p(X < x) = p\left(\frac{X - \mu}{\sigma} < \frac{x - \mu}{\sigma}\right) = p\left(Z < \frac{x - \mu}{\sigma}\right) = \Phi\left(\frac{x - \mu}{\sigma}\right)$$
where $\Phi(z)$ is the cdf of the standard normal distribution. The cdf $\Phi(z)$ cannot be given in closed form. However, it has been computed numerically, and tables for $\Phi(z)$ are found at the end of probability and statistics textbooks. Thus we can use the tables for the standard normal distribution to calculate probabilities for any other normal distribution.
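Since $\Phi(z) = \tfrac{1}{2}\left(1 + \operatorname{erf}(z/\sqrt{2})\right)$, the table lookup can also be done in code; a minimal Python sketch with hypothetical parameter values:

```python
from math import erf, sqrt

def phi(z):
    # cdf of the standard normal N(0, 1), via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

# For X ~ N(mu=100, sigma=15): p(X < 130) = phi((130 - 100) / 15)
print(phi((130 - 100) / 15))  # ~0.9772
```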
1.5.6 The Log-Normal Distribution
We have seen in the previous subsection that the sum of iid random variables gives rise to a normal distribution. In some cases, however, some random variables are defined to be the products instead of sums of iid random variables. In these cases, taking the logarithm of the product yields the log-normal distribution, because the logarithm of a product is the sum of the logarithms of its components. Thus, we say that a random variable X is log-normal when its logarithm $\ln X$ is normal.

Using Theorem 7, the pdf of the log-normal random variable can be expressed as
$$f(x) = \frac{1}{x\sigma\sqrt{2\pi}} \exp\left(-\frac{(\ln x - \mu)^2}{2\sigma^2}\right) \qquad x \ge 0$$
where the parameters $\mu$ and $\sigma$ are the mean and the standard deviation of the initial normal random variable. The mean and variance of the log-normal random variable are $e^{\mu + \sigma^2/2}$ and $(e^{\sigma^2} - 1)\,e^{2\mu + \sigma^2}$, respectively.
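The mean formula can be checked by simulation; a short Python sketch with illustrative parameter values:

```python
import math
import random

# Check the log-normal mean e^(mu + sigma^2/2) by simulation.
random.seed(4)
mu, sigma = 0.5, 0.8
xs = [random.lognormvariate(mu, sigma) for _ in range(200_000)]
print(sum(xs) / len(xs))             # simulated mean
print(math.exp(mu + sigma**2 / 2))   # theoretical mean, ~2.27
```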
1.5.7 The Chi-Squared and Related Distributions

Let $Y_1, \ldots, Y_n$ be independent random variables, where $Y_i$ is distributed as $N(\mu_i, 1)$. Then the variable
$$X = \sum_{i=1}^{n} Y_i^2$$
is called a noncentral chi-squared random variable with n degrees of freedom and noncentrality parameter $\lambda = \sum_{i=1}^{n} \mu_i^2$, and is denoted by $\chi_n^2(\lambda)$. When $\lambda = 0$ we obtain the central chi-squared random variable, which is denoted by $\chi_n^2$. The pdf of the central chi-squared random variable with n degrees of freedom is given in Table 3, where $\Gamma(\cdot)$ is the gamma function defined in Eq. (13).

The positive square root of a $\chi_n^2(\lambda)$ random variable is called a chi random variable and is denoted by $\chi_n(\lambda)$. An interesting particular case of the $\chi_n(\lambda)$ is the Rayleigh random variable, which is obtained for $n = 2$ and $\lambda = 0$. The pdf of the Rayleigh random variable is given in Table 3. The Rayleigh distribution is used, for example, to model wave heights [5].
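These constructions are straightforward to simulate: squaring and summing standard normals gives a central chi-squared variable, and with $n = 2$ its square root is Rayleigh. A Python sketch (illustrative only; recall $E[\chi_n^2] = n$, and the Rayleigh mean here is $\sqrt{\pi/2} \approx 1.2533$):

```python
import math
import random

# Central chi-squared with n degrees of freedom: sum of n squared N(0, 1)s.
random.seed(5)
n = 2
chi2 = [sum(random.gauss(0, 1) ** 2 for _ in range(n)) for _ in range(100_000)]
print(sum(chi2) / len(chi2))          # close to E[chi-squared_n] = n = 2

# With n = 2, the square root is a Rayleigh variable.
rayleigh = [math.sqrt(x) for x in chi2]
print(sum(rayleigh) / len(rayleigh))  # close to sqrt(pi / 2) ~ 1.2533
```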
1.5.8 The t Distribution

Let $Y_1$ be a normal $N(\mu, 1)$ random variable and $Y_2$ an independent $\chi_n^2$ random variable. Then, the random variable
$$T = \frac{Y_1}{\sqrt{Y_2 / n}}$$
is called a t random variable with n degrees of freedom.