
STATISTICAL THERMODYNAMICS: FUNDAMENTALS AND APPLICATIONS

Statistical Thermodynamics: Fundamentals and Applications discusses the fundamentals and applications of statistical thermodynamics for beginning graduate students in the engineering sciences. Building on the prototypical Maxwell–Boltzmann method and maintaining a step-by-step development of the subject, this book makes few presumptions concerning students' previous exposure to statistics, quantum mechanics, or spectroscopy. The book begins with the essentials of statistical thermodynamics, pauses to recover needed knowledge from quantum mechanics and spectroscopy, and then moves on to applications involving ideal gases, the solid state, and radiation. A full introduction to kinetic theory is provided, including its applications to transport phenomena and chemical kinetics. A highlight of the textbook is its discussion of modern applications, such as laser-based diagnostics. The book concludes with a thorough presentation of the ensemble method, featuring its use for real gases. Each chapter is carefully written to address student difficulties in learning this challenging subject, which is fundamental to combustion, propulsion, transport phenomena, spectroscopic measurements, and nanotechnology. Students are made comfortable with their new knowledge by the inclusion of both example and prompted homework problems.

Normand M. Laurendeau is the Ralph and Bettye Bailey Professor of Combustion at Purdue University. He teaches at both the undergraduate and graduate levels in the areas of thermodynamics, combustion, and engineering ethics. He conducts research in the combustion sciences, with particular emphasis on laser diagnostics, pollutant formation, and flame structure. Dr. Laurendeau is well known for his pioneering research on the development and application of both nanosecond and picosecond laser-induced fluorescence strategies to quantitative species concentration measurements in laminar and turbulent flames. He has authored or coauthored over 150 publications in the archival scientific and engineering literature. Professor Laurendeau is a Fellow of the American Society of Mechanical Engineers and a member of the Editorial Advisory Board for the peer-reviewed journal Combustion Science and Technology.


CAMBRIDGE UNIVERSITY PRESS
Cambridge, New York, Melbourne, Madrid, Cape Town, Singapore, São Paulo

Cambridge University Press
The Edinburgh Building, Cambridge CB2 2RU, UK

Published in the United States of America by Cambridge University Press, New York
www.cambridge.org
Information on this title: www.cambridge.org/9780521846356

© Cambridge University Press 2005

This publication is in copyright. Subject to statutory exception and to the provision of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University Press.

First published in print format 2005

ISBN-13 978-0-511-14062-4 eBook (NetLibrary)
ISBN-10 0-511-14062-2 eBook (NetLibrary)
ISBN-13 978-0-521-84635-6 hardback
ISBN-10 0-521-84635-8 hardback

Cambridge University Press has no responsibility for the persistence or accuracy of URLs for external or third-party internet websites referred to in this publication, and does not guarantee that any content on such websites is, or will remain, accurate or appropriate.


I dedicate this book to my parents,

Maurice and Lydia Roy Laurendeau.

Their gift of bountiful love and support

continues to fill me with the joy of discovery.


PART ONE FUNDAMENTALS OF STATISTICAL

THERMODYNAMICS

2.1 Probability: Definitions and Basic Concepts 7

2.3 Probability Distributions: Discrete and Continuous 11

2.7 Combinatorial Analysis for Statistical Thermodynamics 18

Problem Set I Probability Theory and Statistical Mathematics

3.1 Essential Concepts from Quantum Mechanics 30
3.2 The Ensemble Method of Statistical Thermodynamics 31
3.3 The Two Basic Postulates of Statistical Thermodynamics 32
3.3.1 The M–B Method: System Constraints and Particle Distribution 32

3.3.2 The M–B Method: Microstates and Macrostates 33


3.5 Bose–Einstein and Fermi–Dirac Statistics 37

3.5.3 The Most Probable Particle Distribution 39
3.6 Entropy and the Equilibrium Particle Distribution 40
3.6.1 The Boltzmann Relation for Entropy 40
3.6.2 Identification of Lagrange Multipliers 41
3.6.3 The Equilibrium Particle Distribution 42

4.4 Internal Energy and Entropy in the Dilute Limit 51
4.5 Additional Thermodynamic Properties in the Dilute Limit

PART TWO QUANTUM MECHANICS AND SPECTROSCOPY

5.2 The Bohr Model for the Spectrum of Atomic Hydrogen 72

5.4 A Heuristic Introduction to the Schrödinger Equation 78

5.6 The Steady-State Schrödinger Equation 83

6.1 Schrödinger Wave Equation for Two-Particle System 97
6.1.1 Conversion to Center-of-Mass Coordinates 98
6.1.2 Separation of External from Internal Modes 99
6.2 The Internal Motion for a Two-Particle System 99
6.3 The Rotational Energy Mode for a Diatomic Molecule 100
6.4 The Vibrational Energy Mode for a Diatomic Molecule 104


6.5 The Electronic Energy Mode for Atomic Hydrogen 108
6.6 The Electronic Energy Mode for Multielectron Species 115
6.6.1 Electron Configuration for Multielectron Atoms 116
6.6.2 Spectroscopic Term Symbols for Multielectron Atoms

7.1 Rotational Spectroscopy Using the Rigid-Rotor Model 130
7.2 Vibrational Spectroscopy Using the Harmonic-Oscillator Model
7.3 Rovibrational Spectroscopy: The Simplex Model 132
7.4 The Complex Model for Combined Rotation and Vibration 136
7.5 Rovibrational Spectroscopy: The Complex Model 138

7.7 Energy-Mode Parameters for Diatomic Molecules 144

Problem Set III Quantum Mechanics and Spectroscopy

PART THREE STATISTICAL THERMODYNAMICS IN THE

DILUTE LIMIT

8.4 The Partition Function and Thermodynamic Properties 161
8.5 Energy-Mode Contributions in Classical Mechanics 163

9.2.1 Translational and Electronic Modes 176

9.2.4 Quantum Origin of Rotational Symmetry Factor 182


9.3 Rigorous and Semirigorous Models for the Diatomic Gas 187

9.4.3 Property Calculations for Polyatomic Molecules 198

Problem Set IV Thermodynamic Properties of the Ideal Gas

10 Statistical Thermodynamics for Ideal Gas Mixtures 205
10.1 Equilibrium Particle Distribution for the Ideal Gas Mixture 205
10.2 Thermodynamic Properties of the Ideal Gas Mixture 208

10.3.1 Equilibrium Particle Distribution for Reactive Ideal Gas

Problem Set V Chemical Equilibrium and Diagnostics

PART FOUR STATISTICAL THERMODYNAMICS BEYOND THE

DILUTE LIMIT

12.5 Spray Size Distribution from Information Theory 256


13.1 Statistical Thermodynamics of the Crystalline Solid 259
13.2 Einstein Theory for the Crystalline Solid 262
13.3 Debye Theory for the Crystalline Solid 263
13.4 Critical Evaluation of the Debye Formulation 266

13.6 Thermodynamic Properties of the Electron Gas 270
13.7 The Metallic Crystal near Absolute Zero 273

14.1 Bose–Einstein Statistics for the Photon Gas 275

14.4 Thermodynamics of Blackbody Radiation 278
14.5 The Influence of Wavelength for the Planck Distribution 280

Problem Set VI The Solid State and Radiation (Chapters 13–14) 283

PART FIVE NONEQUILIBRIUM STATISTICAL

THERMODYNAMICS

15.1 The Maxwell–Boltzmann Velocity Distribution 289
15.2 The Maxwell–Boltzmann Speed Distribution 291
15.3 The Maxwell–Boltzmann Energy Distribution 294

16.3.1 Dimensionless Transport Parameters 312

16.3.4 Rigorous Expressions for Transport Properties 316

17.3 Chemical Kinetics from Collision Theory 321
17.4 The Significance of Internal Energy Modes 324
17.5 Chemical Kinetics from Transition State Theory 325

Problem Set VII Kinetic Theory and Molecular Transport


PART SIX THE ENSEMBLE METHOD OF STATISTICAL

THERMODYNAMICS

18.2.1 The Equilibrium Distribution for the Canonical

18.2.2 Equilibrium Properties for the Canonical Ensemble 342
18.2.3 Independent Particles in the Dilute Limit 345

18.3.1 The Equilibrium Distribution for the Grand Canonical

18.3.2 Equilibrium Properties for the Grand Canonical

18.3.3 Independent Particles in the Dilute Limit Revisited 355

19.2.1 Canonical Partition Function for Real Gases 361

19.3.1 Rigid-Sphere and Square-Well Potentials 366
19.3.2 Implementation of Lennard–Jones Potential 367

Problem Set VIII Ensemble Theory and the Nonideal Gas

20.3 The Continuing Challenge of Thermodynamics 385

PART SEVEN APPENDICES


O Force Constants for the Lennard–Jones Potential 436

P Collision Integrals for Calculating Transport Properties from the Lennard–Jones Potential


My intention in this textbook is to provide a self-contained exposition of the fundamentals and applications of statistical thermodynamics for beginning graduate students in the engineering sciences. Especially within engineering, most students enter a course in statistical thermodynamics with limited exposure to statistics, quantum mechanics, and spectroscopy. Hence, I have found it necessary over the years to "start from the beginning," not leaving out intermediary steps and presuming little knowledge in the discrete, as compared to the continuum, domain of physics. Once these things are done carefully, I find that good graduate students can follow the ideas, and that they leave the course excited and satisfied with their newfound understanding of both statistical and classical thermodynamics.

Nevertheless, a first course in statistical thermodynamics remains challenging and sometimes threatening to many graduate students. Typically, all their previous experience is with the equations of continuum mechanics, whether applied to thermodynamics, fluid mechanics, or heat transfer. For most students, therefore, the mathematics of probability theory, the novelty of quantum mechanics, the confrontation with entropy, and indeed the whole new way of thinking that surrounds statistical thermodynamics are all built-in hills that must be climbed to develop competence and confidence in the subject. For this reason, although I introduce the ensemble method at the beginning of the book, I have found it preferable to build on the related Maxwell–Boltzmann method so that novices are not confronted immediately with the conceptual difficulties of ensemble theory. In this way, students tend to become more comfortable with their new knowledge earlier in the course. Moreover, they are prepared relatively quickly for applications, which is very important to maintaining an active interest in the subject for most engineering students. Using this pedagogy, I find that the ensemble approach then becomes very easy to teach later in the semester, thus effectively preparing the students for more advanced courses that apply statistical mechanics to liquids, polymers, and semiconductors.

To hold the students' attention, I begin the book with the fundamentals of statistical thermodynamics, pause to recover needed knowledge from quantum mechanics and spectroscopy, and then move on to applications involving ideal gases, the solid state, and radiation. An important distinction between this book and previous textbooks is the inclusion of an entire chapter devoted to laser-based diagnostics, as applied to the thermal sciences. Here, I cover the essentials of absorption, emission, and laser-induced fluorescence techniques for the measurement of species concentrations and temperature. A full introduction to kinetic theory is also provided, including its applications to transport phenomena and chemical kinetics.

During the past two decades, I have developed many problems for this textbook that are quite different from the typical assignments found in other textbooks, which are often either too facile or too ambiguous. Typically, the students at Purdue complete eight problem sets during a semester, with 4–6 problems per set. Hence, there are enough problems included in the book for approximately three such course presentations. My approach has been to construct problems using integrally related subcomponents so that students can learn the subject in a more prompted fashion. Even so, I find that many students need helpful hints at times, and the instructor should indeed be prepared to provide them. In fact, I trust that the instructor will find, as I have, that these interactions with students, showing you what they have done and where they are stuck, invariably end up being one of the most rewarding parts of conducting the course. The reason is obvious. One-on-one discussions give the instructor the opportunity to get to know a person and to impart to each student his or her enthusiasm for the drama and subtleties of statistical thermodynamics.

As a guide to the instructor, the following table indicates the number of 50-minute lectures devoted to each chapter in a 42-lecture semester at Purdue University.

Chapter    Number of Lectures

of embellished portions of his course notes to the text. I thank Professor Michael Renfro for his reading of the drafts and for his helpful suggestions. Many useful comments were also submitted by graduate students who put up with preliminary versions of the book at Purdue University and at the University of Connecticut. I appreciate Professor Robert Lucht, who aided me in maintaining several active research projects during the writing of the book. Finally, I thank the School of Mechanical Engineering at Purdue for providing me with the opportunity and the resources over these many years to blend my enthusiasm for statistical thermodynamics with that for my various research programs in combustion and optical diagnostics.


1 Introduction

To this point in your career, you have probably dealt almost exclusively with the behavior of macroscopic systems, either from a scientific or engineering viewpoint. Examples of such systems might include a piston–cylinder assembly, a heat exchanger, or a battery. Typically, the analysis of macroscopic systems uses conservation or field equations related to classical mechanics, thermodynamics, or electromagnetics. In this book, our focus is on thermal devices, as usually described by thermodynamics, fluid mechanics, and heat transfer. For such devices, first-order calculations often employ a series of simple thermodynamic analyses. Nevertheless, you should understand that classical thermodynamics is inherently limited in its ability to explain the behavior of even the simplest thermodynamic system. The reason for this deficiency rests with its inadequate treatment of the atomic behavior underlying the gaseous, liquid, or solid states of matter. Without proper consideration of constituent microscopic systems, such as a single atom or molecule, it is impossible for the practitioner to understand fully the evaluation of thermodynamic properties, the meaning of thermodynamic equilibrium, or the influence of temperature on transport properties such as the thermal conductivity or viscosity. Developing this elementary viewpoint is the purpose of a course in statistical thermodynamics. As you will see, such fundamental understanding is also the basis for creative applications of classical thermodynamics to macroscopic devices.

Since a typical thermodynamic system is composed of an assembly of atoms or molecules, we can surely presume that its macroscopic behavior can be expressed in terms of the microscopic properties of its constituent particles. This basic tenet provides the foundation for the subject of statistical thermodynamics. Clearly, statistical methods are mandatory, as even one cm³ of a perfect gas contains some 10¹⁹ atoms or molecules. In other words, the huge number of particles forces us to eschew any approach based on having an exact knowledge of the position and momentum of each particle within a macroscopic thermodynamic system.

The properties of individual particles can be obtained only through the methods of quantum mechanics. One of the most important results of quantum mechanics is that the energy of a single atom or molecule is not continuous, but discrete.


Figure 1.1 Schematic of simplified (a) continuous spectrum and (b) discrete spectrum.

Discreteness arises from the distinct energy values permitted for either an atom or molecule. The best evidence for this quantized behavior comes from the field of spectroscopy. Consider, for example, the simplified emission spectra shown in Fig. 1.1. Spectrum (a) displays a continuous variation of emissive signal versus wavelength, while spectrum (b) displays individual "lines" at specific wavelengths. Spectrum (a) is typical of the radiation given off by a hot solid while spectrum (b) is typical of that from a hot gas. As we will see in Chapter 7, the individual lines of spectrum (b) reflect discrete changes in the energy stored by an atom or molecule. Moreover, the height of each line is related to the number of particles causing the emissive signal. From the point of view of statistical thermodynamics, the number of relevant particles (atoms or molecules) can only be determined by using probability theory, as introduced in Chapter 2.

The total energy of a single molecule can be taken, for simplicity, as the sum of individual contributions from its translational, rotational, vibrational, and electronic energy modes. The external or translational mode specifies the kinetic energy of the molecule's center of mass. In comparison, the internal energy modes reflect any molecular motion with respect to the center of mass. Hence, the rotational mode describes energy stored by molecular rotation, the vibrational mode energy stored by vibrating bonds, and the electronic mode energy stored by the motion of electrons within the molecule. By combining predictions from quantum mechanics with experimental data obtained via spectroscopy, it turns out that we can evaluate the contributions from each mode and thus determine the microscopic properties of individual molecules. Such properties include bond distances, rotational or vibrational frequencies, and translational or electronic energies. Employing statistical methods, we can then average over all particles to calculate the macroscopic properties associated with classical thermodynamics. Typical phenomenological properties include the temperature, the internal energy, and the entropy.

Figure 1.2 summarizes the above discussion and also provides a convenient road map for our upcoming study of statistical thermodynamics. Notice that the primary subject of this book plays a central role in linking the microscopic and macroscopic worlds. Moreover, while statistical thermodynamics undoubtedly constitutes an impressive application of probability theory, we observe that the entire subject can be founded on only two major postulates.

Figure 1.2 Flow chart for statistical thermodynamics.

As for all scientific adventures, our acceptance of these basic postulates as objective truths rests solely on their predictive power; fortunately, the plethora of resulting predictions invariably comports well with experimental observations in classical thermodynamics. Therefore, despite its analytical nature, the study of statistical thermodynamics is well worth the effort, as the final results are indeed quite practical. In fact, as we will see, much of classical thermodynamics ultimately rests on the conceptual bridge provided by statistical thermodynamics, a bridge linking the real world of compressors and gas turbines to the quantized world of ultimate uncertainty and molecular behavior.

The framework of statistical thermodynamics can be divided into three conceptual themes. The first is equilibrium statistical thermodynamics with a focus on independent particles. Here, we assume no intermolecular interactions among the particles of interest; the resulting simplicity permits excellent a priori calculations of macroscopic behavior. Examples include the ideal gas, the pure crystalline metal, and blackbody radiation. The second theme is again equilibrium statistical thermodynamics, but now with a focus on dependent particles. In this case, intermolecular interactions dominate as, for example, with real gases, liquids, and polymers. Typically, such intermolecular interactions become important only at higher densities; because of the resulting mathematical difficulties, calculations of macroscopic properties often require semi-empirical procedures, as discussed in Chapters 19 and 20.

The third conceptual theme might be labeled nonequilibrium statistical thermodynamics. Here, we are concerned with the dynamic behavior that arises when shifting between different equilibrium states of a macroscopic system. Although time-correlation methods presently constitute an active research program within nonequilibrium statistical thermodynamics, we focus in this book on those dynamic processes that can be linked to basic kinetic theory. As such, we will explore the molecular behavior underlying macroscopic transport of momentum, energy, and mass. In this way, kinetic theory can provide a deeper understanding of the principles of fluid mechanics, heat transfer, and molecular diffusion. As we will see in Part Five, nonequilibrium statistical thermodynamics also provides an important path for the understanding and modeling of chemical kinetics, specifically, the rates of elementary chemical reactions.

While the above classification scheme might please the engineering mind, it does little to acquaint you with the drama and excitement of both learning and applying statistical thermodynamics. Yes, you will eventually be able to calculate from atomic and molecular properties the thermodynamic properties of ideal gases, real gases, and metals. Examples might include equations of state, measurable properties such as specific heats and the internal energy, and also ephemeral properties such as the entropy and free energies. And yes, you will learn how to calculate various transport properties, such as the thermal conductivity and the diffusion coefficient. Furthermore, with respect to chemical reactions, you will eventually be able to determine equilibrium constants and estimate elementary rate coefficients.

While these pragmatic aspects of statistical thermodynamics are important, the real drama of the subject lies instead in its revelations about our natural world. As you work through this book, you will slowly appreciate the limitations of classical thermodynamics. In particular, the first, second, and third laws of thermodynamics should take on a whole new meaning for you. You will understand that volumetric work occurs because of microscopic energy transfers and that heat flow occurs because of redistributions in molecular population. You will realize that entropy rises in isolated systems because of a fundamental enhancement in molecular probabilities. You will also appreciate in a new way the important underlying link between absolute property values and crystalline behavior near absolute zero.

Perhaps more importantly, you will come to understand in a whole new light the real meaning of thermodynamic equilibrium and the crucial role that temperature plays in defining both thermal and chemical equilibrium. This new understanding of equilibrium will pave the path for laser-based applications of statistical thermodynamics to measurements of both temperature and species concentrations, as discussed in Chapter 11. Such optical measurements are extremely important to current research in all of the thermal sciences, including fluid mechanics, heat transfer, combustion, plasmas, and various aspects of nanotechnology and manufacturing.

In summary, the goal of this book is to help you master classical thermodynamics from a molecular viewpoint. Given information from quantum mechanics and spectroscopy, statistical thermodynamics provides the analytical framework needed to determine important thermodynamic and transport properties associated with practical systems and processes. A significant feature of such calculations is that they can bypass difficult experimental conditions, such as those involving very high or low temperatures, or chemically unstable materials. More fundamentally, however, a study of statistical thermodynamics can provide you with a whole new understanding of thermodynamic equilibrium and of the crucial role that entropy plays in the operation of our universe. That universe surely encompasses both the physical and biological aspects of both humankind and the surrounding cosmos. As such, you should realize that statistical thermodynamics is of prime importance to all students of science and engineering as we enter the postmodern world.


PART ONE

FUNDAMENTALS OF STATISTICAL

THERMODYNAMICS


2 Probability and Statistics

In preparation for our study of statistical thermodynamics, we first review some fundamental notions of probability theory, with a special focus on those statistical concepts relevant to atomic and molecular systems. Depending on your background, you might be able to scan quickly Sections 2.1–2.3, but you should pay careful attention to Sections 2.4–2.7.

Probability theory is concerned with predicting statistical outcomes. Simple examples of such outcomes include observing a head or tail when tossing a coin, or obtaining the numbers 1, 2, 3, 4, 5, or 6 when throwing a die. For a fairly-weighted coin, we would, of course, expect to see a head for 1/2 of a large number of tosses; similarly, using a fairly-weighted die, we would expect to get a four for 1/6 of all throws. We can then say that the probability of observing a head on one toss of a fairly-weighted coin is 1/2 and that for obtaining a four on one throw of a fairly-weighted die is 1/6. This heuristic notion of probability can be given mathematical formality via the following definition:

Given Nₛ mutually exclusive, equally likely points in sample space, with Nₑ of these points corresponding to the random event A, then the probability P(A) = Nₑ/Nₛ.

Here, sample space designates the available Nₛ occurrences while random event A denotes the subset of sample space given by Nₑ ≤ Nₛ. The phrase mutually exclusive indicates that no two outcomes can occur simultaneously in a single sample space; this criterion is obviously required if we are to convert our heuristic understanding of chance to a well-defined mathematical probability.

As a further example, for a standard deck of playing cards, we have 52 points in sample space, of which four represent aces. Hence, the probability of drawing a single ace from a well-mixed deck is P(A) = 4/52 = 1/13, where the event A designates the random drawing of an ace. Visually, the relation between event A and sample space can be described by a so-called Venn diagram, as shown in Fig. 2.1. Here, sample points resulting in event A fall within the area A, while those not resulting in event A fall elsewhere in the surrounding box, whose total area represents the entire sample space. Hence, assuming a uniform point density, we find that the ratio of the cross-hatched area to the total area in Fig. 2.1 provides a visual representation of P(A). Similarly, from the viewpoint of set theory, we observe

Figure 2.1 Venn diagram representing that portion of sample space which corresponds to random event A.

that for a fairly-weighted die the random event of obtaining an even number E = {2, 4, 6} from within the entire sample space S = {1, 2, 3, 4, 5, 6} clearly occurs with probability P(E) = 1/2.
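The ratio definition P(A) = Nₑ/Nₛ can be checked by direct enumeration of a small sample space. The following minimal Python sketch does so for the die and card examples above; it is illustrative only, and the helper name is not from the text.

from fractions import Fraction

def event_probability(sample_space, event):
    # P(A) = N_e / N_s for equally likely, mutually exclusive sample points
    favorable = sum(1 for outcome in sample_space if outcome in event)
    return Fraction(favorable, len(sample_space))

# Fairly-weighted die: probability of an even number E = {2, 4, 6}
die = range(1, 7)
print(event_probability(die, {2, 4, 6}))      # 1/2

# Standard deck: probability of drawing an ace
deck = [(rank, suit) for rank in range(1, 14) for suit in "SHDC"]
aces = {(1, suit) for suit in "SHDC"}
print(event_probability(deck, aces))          # 1/13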

Our notion of probability becomes more complicated if we consider two different random events, A and B, which can both occur within a given sample space. On this basis, we may define the compound probability, P(AB), which represents events A and B, and also the total probability, P(A + B), which represents events A or B (including both). From the viewpoint of set theory, P(AB) is called the intersection of A and B (A ∩ B), while P(A + B) is labeled the union of A and B (A ∪ B). Pictorial displays of the (a) intersection and (b) union of A and B are given by the two Venn diagrams shown in Fig. 2.2.

If the events A and B are mutually exclusive, a single trial by definition permits no overlap in sample space. Therefore, P(AB) = 0 so that

P(A + B) = P(A) + P(B),  (2.1)

as displayed by the Venn diagram of Fig. 2.3(a). As an example, the probability of picking a king (K) or a queen (Q) from a single deck of playing cards is given by the total probability P(K + Q) = P(K) + P(Q) = 2/13. In comparison, the probability of picking a king from one deck and a queen from a different deck is P(KQ) = (1/13)². In the latter case, we have two different sample spaces, as indicated by the Venn diagram of Fig. 2.3(b), so that the events are now mutually independent. Hence, in general, the compound probability for mutually independent events is

P(AB) = P(A) · P(B).  (2.2)

These expressions are readily extended to more than two events; for three events, for example,


P(A + B + C) = P(A) + P(B) + P(C)   Mutual Exclusivity, (2.3)

P(ABC) = P(A) · P(B) · P(C)   Mutual Independence. (2.4)
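The distinction between Eqs. (2.1) and (2.2) can be checked with a quick Monte Carlo sketch using the card examples above; this is an illustrative aside, and the sampling setup is my own rather than the text's.

import random

random.seed(0)
TRIALS = 200_000
ranks = list(range(1, 14)) * 4          # a 52-card deck by rank (1 = ace, ..., 13 = king)

# Total probability: a king OR a queen on one draw from a single deck (mutually exclusive)
hits = sum(1 for _ in range(TRIALS) if random.choice(ranks) in (12, 13))
print(hits / TRIALS)                    # ~ 2/13 = 0.154

# Compound probability: a king from one deck AND a queen from a second deck (independent)
hits = sum(1 for _ in range(TRIALS)
           if random.choice(ranks) == 13 and random.choice(ranks) == 12)
print(hits / TRIALS)                    # ~ (1/13)**2 = 0.0059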

To determine the probability that two particular people among five arranged at random in a row end up next to each other or one person apart, we must account for the fact that there are many ways of achieving either specification, each of which will enhance the previously unconstrained compound probability. As for many probability analyses, a combination of visual and conceptual approaches often constitutes the most fruitful tactic for solving the problem.

(a) Visualization indicates that for five people in a row, four possible pairs of people can exist next to each other. Conceptually, the persons comprising each pair can also be switched, thus giving eight independent ways of obtaining two people next to each other among five people in a row. Hence, the final probability that two people will be next to each other when five people are arranged in a row at random must be (1/5)(1/4)(8) = 2/5.

Possible pairs of people for (a) two people next to each other and (b) two people one person apart.

(b) Similarly, for two people separated by another person, a combined visual and conceptual analysis gives a final probability of (1/5)(1/4)(3)(2) = 3/10. Here, three pairs of people are possible one person apart and the individuals comprising each pair can again be switched.

Suppose instead that the five people are arranged in a circle. You should be able to convince yourself that the probability for two people to be either next to each other or separated by another person is now always 1/2.
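Both row results, 2/5 and 3/10, can be confirmed by brute-force enumeration of all 5! orderings. The sketch below is an illustrative check only; the variable names are my own.

from itertools import permutations

arrangements = list(permutations(range(5)))   # all 5! = 120 orderings of five people
adjacent = separated_by_one = 0

for order in arrangements:
    # gap between the seats of two particular people, labeled 0 and 1
    gap = abs(order.index(0) - order.index(1))
    adjacent += (gap == 1)
    separated_by_one += (gap == 2)

print(adjacent / len(arrangements))           # 0.4 = 2/5
print(separated_by_one / len(arrangements))   # 0.3 = 3/10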

We now apply probability theory to a sequence of distinguishable objects. Consider, for example, an urn containing four marbles labeled A, B, C, and D, respectively. Our aim is to randomly select marbles from the urn without replacement. The first marble chosen can be any of four possibilities, the second can be any of the three remaining possibilities, the third chosen must be one of the two remaining possibilities, and the fourth can only be one possibility. Hence, the number of ways that the four sequential but independent choices can be made must be 4 · 3 · 2 · 1 = 24. These 24 possible ways of randomly selecting the four original marbles can be taken as the number of possible arrangements or permutations of any single sequence of the four marbles, e.g., ACDB. If, on the other hand, the marbles were not labeled, then the 24 possible rearrangements would be irrelevant as the marbles would be indistinguishable. In this case, the 24 permutations would become only one combination. Moreover, only a single collection or combination of the four marbles would exist, even if labeled, if we simply chose to disregard any ordering of the random objects.

This distinction between permutations and combinations can be pursued further by considering the number of ways by which we can choose M items from a sample of N available objects without replacement, in one case including and in the other case excluding the effect of labeling or ordering. The objects, for example, could be M marbles chosen from an urn containing N marbles, or M cards chosen from a deck of N cards. Following the procedure outlined in the previous paragraph, the number of permutations is P(N, M) = N(N − 1) · · · (N − M + 1) or

P(N, M) = N!/(N − M)!,  (2.5)

which is defined as the number of permutations of N objects taken M at a time. We note, by the way, that P(N, M) = N! when M = N, so that Eq. (2.5) requires that we define 0! = 1.

In comparison, the number of combinations represents all the different subsets containing M items that can be sampled from N distinct objects. Here, the particular arrangement of M objects within a subset is irrelevant; thus, the number of combinations can be obtained from the number of permutations of Eq. (2.5) via division by the number of permutations, M!, for the subset of M distinct objects. Hence, C(N, M) = P(N, M)/M! or

C(N, M) = N!/[M! (N − M)!],  (2.6)

which is defined as the number of combinations of N objects taken M at a time. We note that C(N, M) can also be interpreted as the number of different arrangements of N objects when M of these objects are of one distinct type and (N − M) are of a second type. This interpretation of C(N, M) will be employed in Section 2.4, when we consider the binomial distribution.
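Equations (2.5) and (2.6) map directly onto Python's standard library; the short sketch below simply cross-checks them against explicit enumeration for a small case, as an illustrative aside.

import math
from itertools import permutations, combinations

N, M = 4, 2          # choose 2 marbles from the labeled set {A, B, C, D}
marbles = "ABCD"

perm_count = math.factorial(N) // math.factorial(N - M)   # Eq. (2.5): N!/(N - M)!
comb_count = math.comb(N, M)                              # Eq. (2.6): N!/[M!(N - M)!]

print(perm_count, len(list(permutations(marbles, M))))    # 12 12
print(comb_count, len(list(combinations(marbles, M))))    # 6 6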

(b) The four combinations are ABC, ABD, ACD, and BCD. Each of the four combinations can be permuted 3! = 6 ways, for a total of 24 permutations. Consider, for example, the ABC combination, which offers the following six permutations: ABC, ACB, BAC, BCA, CAB, and CBA.

You are no doubt familiar with the concept of a grade distribution as a way of reporting results for a course examination. Consider, for example, the simplified distribution of test scores shown for a class of 20 students in Table 2.1. If we convert the number of students associated with each test score to an appropriate fraction of students, we obtain

Table 2.1 Simplified grade distribution (columns: test score, number of students, fraction of students)

Figure 2.4 Histogram representing Table 2.1.

the histogram shown in Fig. 2.4. This histogram is an example of a discrete probability distribution, for which the fractional probabilities must sum to unity over all possible outcomes. For a much greater number of possible scores, we would eventually approach a more continuous distribution, as displayed by the probability density function, f(x), in Fig. 2.5. Normalization would now be given by

∫ f(x) dx = 1

Figure 2.5 Continuous distribution function.

when integrated over all possible values of x. Therefore, the probability density function itself does not represent a probability; rather, the probability must be evaluated from knowledge of f(x) via integration. As an example, the probability of achieving values of x between a and b would be obtained from

P(a ≤ x ≤ b) = ∫ from a to b of f(x) dx.

More generally, the expected value of any function H(x) can be evaluated from

H̄(x) = ∫ H(x) f(x) dx,  (2.13)

so that f(x) represents a statistical weighting function for H(x). Hence, for H(x) = x, H̄(x) represents the mean, in analogy with Eq. (2.8). Similarly, for H(x) = (x − x̄)², H̄(x) represents the variance, in accord with this same statistical parameter for a discrete distribution.

The binomial distribution is of fundamental importance in probability theory, as it describes quite simply any sequence of experiments having two possible outcomes. As an example, consider the tossing of an unfairly-weighted coin, for which the two outcomes are either a head or tail. Suppose that the probability of obtaining a head is p, while that for a tail is q = 1 − p. Now, for a sequence of N tosses, the probability of M heads and (N − M) tails in a particular sequence is p^M q^(N−M), as each toss is an independent event in a new sample space. However, because M heads and (N − M) tails can be achieved in more than one way, we must determine the number of possible sequences of heads and tails if we wish to determine the final probability. But the number of possible sequences is just the number of ways N total objects can be arranged into M identical objects of one type and (N − M) identical objects of a second type. This description defines the number of combinations of N objects taken M at a time, C(N, M), as specified by Eq. (2.6). Hence, the probability of tossing M heads and (N − M) tails regardless of order becomes

B(M) = C(N, M) p^M q^(N−M)

or

B(M) = N!/[M! (N − M)!] p^M (1 − p)^(N−M),  (2.14)

where B(M) represents the well-known binomial probability distribution. This discrete distribution can be interpreted in many different ways. For example, the probabilities p and (1 − p) can indicate the chances of success and failure or right and left steps of a random walk, as well as of heads and tails in the tossing of a coin. Therefore, in general, N always represents the total number of repeated trials in any binary sequence.
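Equation (2.14) is simple to evaluate numerically. The minimal sketch below (illustrative only; the helper name is my own) tabulates B(M) for a short sequence of biased coin tosses and confirms that the distribution sums to unity.

from math import comb

def binomial_pmf(M, N, p):
    # Eq. (2.14): probability of M successes in N independent binary trials
    return comb(N, M) * p**M * (1 - p)**(N - M)

N, p = 10, 0.3                        # ten tosses of a coin weighted toward tails
dist = [binomial_pmf(M, N, p) for M in range(N + 1)]

print(sum(dist))                                   # ~1.0 (normalization)
print(max(range(N + 1), key=lambda M: dist[M]))    # most probable number of heads, here 3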

EXAMPLE 2.3
Determine the probability that, in six throws of a fairly-weighted die, the side with four pips will land upright at least twice.

Solution
The probability of landing a four on any throw is 1/6 (success) and thus the probability of not landing a four on any throw is 5/6 (failure). Consequently, the probability that four pips will not appear (M = 0) in six throws (N = 6) must be

B(0) = 6!/[0! (6 − 0)!] (1/6)⁰ (5/6)⁶ ≈ 0.335.

Similarly, the probability that four pips will appear exactly once (M = 1) in six throws must be

B(1) = 6!/[1! (6 − 1)!] (1/6)¹ (5/6)⁵ ≈ 0.402.

As B(0) and B(1) represent mutually exclusive events, the probability from Eq. (2.3) that four pips will appear at least twice in a sequence of six throws must be

P(M ≥ 2) = 1 − B(0) − B(1) ≈ 1 − 0.335 − 0.402 = 0.263.

The mean and variance of the binomial distribution can be shown to be

M̄ = Np  (2.15)

and

σ² = Np(1 − p).  (2.16)

Hence, for a fairly-weighted coin (p = 1/2), the mean number of heads is M̄ = N/2, as expected. The standard deviation is σ = √N/2, so that σ/M̄ = 1/√N and thus the relative width of the binomial distribution always narrows with an increasing number of trials.
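The 1/√N narrowing is easy to see numerically; the brief sketch below (illustrative only) evaluates σ/M̄ from Eqs. (2.15) and (2.16) for increasing N.

from math import sqrt

p = 0.5                               # fairly-weighted coin
for N in (10, 100, 1000, 10000):
    mean = N * p                      # Eq. (2.15)
    sigma = sqrt(N * p * (1 - p))     # Eq. (2.16)
    print(N, sigma / mean, 1 / sqrt(N))   # relative width tracks 1/sqrt(N)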

While the binomial distribution holds for any finite number of repeated trials, physical processes involving large numbers of particles, such as in statistical thermodynamics, imply N → ∞. For such circumstances, the binomial distribution can be simplified to two more familiar distributions, one discrete and the other continuous. We now proceed with these simplifications by first assuming p → 0, which we will find leads to the Poisson distribution. This distribution is particularly applicable to photon-counting processes, for which the total number of photons counted, N → ∞, while the probability of observing any single photon, p → 0. In this limit, the binomial distribution of Eq. (2.14) can be written as

B(M) = N!/[M! (N − M)!] (µ/N)^M (1 − p)^(N−M),

where the mean µ ≡ M̄ = Np from Eq. (2.15). We then have

lim(N→∞) B(M) = (µ^M/M!) (1 − p)^(µ/p),

since N!/[(N − M)! N^M] → 1 and (1 − p)^(−M) → 1 in this limit. From the fundamental mathematical definition of the quantity e = 2.71828 . . . , it can be shown that

lim(p→0) (1 − p)^(1/p) = e⁻¹.

Hence, for N → ∞ and p → 0, the binomial distribution becomes the discrete Poisson distribution,

P(M) = e^(−µ) µ^M / M!.  (2.17)
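The limiting behavior embodied in Eq. (2.17) can be verified directly; the sketch below (an illustrative check, not part of the original text) compares the exact binomial probabilities with the Poisson result for large N and small p.

from math import comb, exp, factorial

N, p = 10_000, 0.0002          # many trials, rare events
mu = N * p                     # mean count from Eq. (2.15)

for M in range(6):
    binom = comb(N, M) * p**M * (1 - p)**(N - M)     # exact, Eq. (2.14)
    poisson = exp(-mu) * mu**M / factorial(M)        # limit, Eq. (2.17)
    print(M, round(binom, 6), round(poisson, 6))     # agreement to ~4-5 decimals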

Because P(M) is based on B(M), the standard deviation for the Poisson distribution can be obtained from Eq. (2.16) by invoking p → 0, thus giving

σ = √µ,  (2.18)

as can also be demonstrated (Prob. 1.4) from direct application of Eq. (2.10). We thus find, from Eq. (2.18), that a greater mean value implies a broader range of expected outcomes when the physical system of interest follows Poisson statistics. Employing Eq. (2.17), we also note that

When the number of trials N → ∞, but p is not small, the binomial distribution becomes the continuous Gaussian distribution rather than the discrete Poisson distribution (p → 0). The Gaussian distribution is particularly applicable to various diffusive processes, for which the total number of molecules N → ∞.

We begin by applying the natural logarithm to the binomial distribution, Eq. (2.14), thus obtaining

ln B(M) = ln N! − ln M! − ln(N − M)! + M ln p + (N − M) ln q,

where q = 1 − p. Employing Stirling's approximation (Appendix D.2) for each factorial and introducing the deviation from the mean, y ≡ M − Np, the logarithm of the binomial distribution can be expressed in terms of ln(Np/M)^M and ln[Nq/(N − M)]^(N−M); the remaining contributions can be neglected as y/N scales with the relative width of the binomial distribution, which we previously found to display a 1/√N dependence. For the remaining two terms,

ln(Np/M)^M = −M ln(M/Np) = −(Np + y) ln(1 + y/Np),

and similarly for the second term. Employing the logarithmic series, ln(1 ± z) ≃ ±z − z²/2 for |z| < 1, at moderate values of p we subsequently find, to second order in y,

lim(N→∞) ln{(Np/M)^M [Nq/(N − M)]^(N−M)} = −y²/(2Npq),

so that the binomial distribution passes over to the continuous Gaussian distribution,

G(z) = [1/(√(2π) σ)] exp(−z²/2),  (2.25)

with σ² = Npq. For a continuous distribution, the discrete variable M must be replaced by its continuous analog x so that, from Eq. (2.20), z = (x − µ)/σ, where again µ ≡ Np. Note that G(z) is symmetrical about z̄ because of its dependence on z², unlike many cases for the discrete binomial or Poisson distributions. Equation (2.25) also indicates that the peak value for G(z) is always 1/(√(2π) σ).

In general, the Gaussian distribution can be shown to be a satisfactory approximation to the binomial distribution if both Np ≥ 5 and Nq ≥ 5. If the Gaussian distribution holds, the probability for a specified range of the independent variable x can be determined from

P(x₁ ≤ x ≤ x₂) = [1/√(2π)] ∫ from z₁ to z₂ of exp(−z²/2) dz,  (2.26)

where zᵢ = (xᵢ − µ)/σ. Equation (2.26) represents the most convenient method for calculating probabilities when the physical system of interest follows Gaussian statistics.
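In practice, the integral of Eq. (2.26) is evaluated from tabulations of the normal probability integral or, equivalently, the error function. The sketch below is an illustrative aside; it assumes the standard identity relating the normal integral to erf.

from math import erf, sqrt

def gaussian_probability(x1, x2, mu, sigma):
    # Eq. (2.26): P(x1 <= x <= x2) for a Gaussian with mean mu and standard deviation sigma
    z1, z2 = (x1 - mu) / sigma, (x2 - mu) / sigma
    return 0.5 * (erf(z2 / sqrt(2)) - erf(z1 / sqrt(2)))

# e.g., the probability of falling within one standard deviation of the mean
print(gaussian_probability(-1.0, 1.0, mu=0.0, sigma=1.0))   # ~0.683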

EXAMPLE 2.4
Verify by direct calculation that the mean and variance of the Gaussian distribution are equivalent to µ and σ², respectively.

Solution
The Gaussian distribution is given by Eq. (2.25), where z = (x − µ)/σ. Direct calculation of the mean and variance requires application of Eq. (2.13).

(a) For the mean, substituting x = µ + σz, we obtain

x̄ = [1/√(2π)] ∫ from −∞ to ∞ of (µ + σz) exp(−z²/2) dz = [1/√(2π)] ∫ from −∞ to ∞ of µ exp(−z²/2) dz,

as the term odd in z integrates to zero. Hence,

x̄ = [2µ/√(2π)] ∫ from 0 to ∞ of exp(−z²/2) dz = [2µ/√(2π)] · (2π)^(1/2)/2 = µ.

(b) For the variance, the same substitution gives

σₓ² = [σ²/√(2π)] ∫ from −∞ to ∞ of z² exp(−z²/2) dz = [2σ²/√(2π)] ∫ from 0 to ∞ of z² exp(−z²/2) dz = [2σ²/√(2π)] · (1/4)(8π)^(1/2) = σ²,

where the Gaussian integrations have been evaluated by using Appendix B.
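The same result can be confirmed by crude numerical quadrature; the following sketch (illustrative only) integrates Eq. (2.13) against the Gaussian weighting function.

from math import exp, pi, sqrt

mu, sigma = 3.0, 2.0
dx = 1e-3
xs = [mu - 10 * sigma + i * dx for i in range(int(20 * sigma / dx))]

def f(x):
    # Gaussian probability density, Eq. (2.25), written in terms of x
    return exp(-((x - mu) / sigma) ** 2 / 2) / (sqrt(2 * pi) * sigma)

mean = sum(x * f(x) * dx for x in xs)                 # Eq. (2.13) with H(x) = x
var = sum((x - mean) ** 2 * f(x) * dx for x in xs)    # H(x) = (x - mean)^2
print(round(mean, 4), round(var, 4))                  # ~3.0 and ~4.0 (= sigma**2)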

We have previously indicated that quantum mechanics ultimately predicts discrete energy levels for molecular systems. As we will see later, each such level is actually composed of a finite number of allowed energy states. The number of energy states per energy level is called the degeneracy. For our purposes, we can model each energy level, of energy εⱼ, as an independent bookshelf holding a specified number of baskets equal to the value of the degeneracy, gⱼ, as shown in Fig. 2.6. The height of each individual shelf represents its energy. The equivalent containers denote potential storage locations for the molecules of the thermodynamic system at each energy level.

For statistical purposes, we will eventually need to know the distribution of molecules among these energy states, as discussed further in Chapter 3. We now move toward this goal by considering the number of ways that N objects (molecules) can be placed in M containers (energy states) on a single shelf (energy level). Before we can make such combinatorial calculations, however, we must introduce two other important features of quantum mechanics, the details of which we again defer to later discussion (Chapter 5).

The first feature concerns our fundamental inability to specify precisely the position and momentum of atomic or molecular particles. You may recall from your undergraduate chemistry that the motion of electrons within atoms or molecules is often described in terms of an electron cloud. This cloud analogy reflects the probabilistic nature of fundamental atomic particles; for this reason, such particles are labeled indistinguishable. In contrast, the motion of larger bodies such as billiard balls or planets can be determined precisely by solving the equations of classical mechanics. Such bodies can obviously be tracked by observation; hence, in comparison to atomic particles, classical objects are labeled distinguishable.

The second important feature of quantum mechanics required for statistical calculations concerns the existence of potential limitations on the population within each energy state. We will show in Chapter 5 that some atomic or molecular particles are inherently limited to one particle per energy state. Other particles, in comparison, have no limit on their occupancy. For proper statistical calculations, we must account for both of these cases, as well as for objects that can be either distinguishable or indistinguishable.

2.7.1 Distinguishable Objects

Combinatorial analyses for distinguishable objects encompass three significant cases. Each case can be considered by posing and answering a different fundamental query.

1. In how many ways may N identical, distinguishable objects be placed in M different containers with a limit of one object per container?

The limitation of one object per container requires N ≤ M. The first object may be placed in any of M available containers, the second in (M − 1) available containers, and so on. Hence, the number of ways for this case becomes

W₁ = M(M − 1) · · · (M − N + 1) = M!/(M − N)!.  (2.27)

2. In how many ways may N identical, distinguishable objects be placed in M different containers when the population of each container is specified, with Nᵢ objects in the i-th container?

The total number of permutations for N objects is N!. However, within each container, permutations are irrelevant as we are concerned only with their number rather than their identity. Hence, the number of permutations, N!, overcounts the number of ways by the number of permutations, Nᵢ!, for each container. Therefore, the number of ways becomes

W₂ = N!/(N₁! N₂! · · · N_M!).  (2.28)

3. In how many ways may N identical, distinguishable objects be placed in M different containers with no limitation on the number per container?

Because no limit exists, each object can be placed in any of the M containers. Therefore,

W₃ = M^N.  (2.29)

2.7.2 Indistinguishable Objects

Combinatorial analyses for indistinguishable objects encompass two rather than three cases of significance. Each case can again be considered by posing and answering a fundamental query.

4. In how many ways may N identical, indistinguishable objects be placed in M different containers with a limit of one object per container?

A similar query for distinguishable objects previously led to Eq. (2.27). For indistinguishable objects, however, any rearrangement among the N objects is unrecognizable. Hence, W₁ overcounts the number of ways for indistinguishable objects by a factor of N!. Therefore,

W₄ = W₁/N! = M!/[N! (M − N)!].  (2.30)

5. In how many ways may N identical, indistinguishable objects be placed in M different containers with no limitation on the number per container?

This fully unconstrained case (indistinguishable objects, no limitation) mandates a totally different approach from that used for W₄. We begin by initially assuming distinguishable objects labeled 1, 2, 3, . . . , N. Let us now arrange these N objects in a row, with the M containers identified and separated by partitions. As an example,

1, 2, 3 | 4, 5 | 6 | . . . | N − 1, N

specifies that objects 1, 2, and 3 are in the first container, objects 4 and 5 are in the second container, object 6 is in the third container, and so on. Now, regardless of their actual arrangement, the maximum number of rearrangements among the N objects and M − 1 partitions is (N + M − 1)!. However, interchanging the partitions produces no new arrangements; thus, we have overcounted by a factor of (M − 1)!. Similarly, because the N objects are actually indistinguishable, we have again overcounted by a factor of N!, as in query 4. Therefore, the number of ways for this case becomes

W₅ = (N + M − 1)!/[N! (M − 1)!].  (2.31)


The combinatorial analyses conducted for Cases 3–5 will prove to be of most interest to us for practical calculations. As we will see in the following chapter, Eq. (2.29) corresponds to Boltzmann statistics, Eq. (2.30) corresponds to Fermi–Dirac statistics, and Eq. (2.31) corresponds to Bose–Einstein statistics.


(c) For Bose–Einstein statistics, the balls are indistinguishable, with no limit on the number of balls per container. Hence, using Eq. (2.32), we have

W₅ = (N + M − 1)!/[N! (M − 1)!] = 4!/(2! 2!) = 6.

These six distributions are as follows:

Way    Container 1    Container 2    Container 3
1      2              0              0
2      0              2              0
3      0              0              2
4      1              1              0
5      1              0              1
6      0              1              1
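The same six arrangements can be generated programmatically; the sketch below (an illustrative check only) enumerates all occupancy vectors for N = 2 indistinguishable balls in M = 3 containers and compares the count against the W₅ formula.

from itertools import product
from math import factorial

N, M = 2, 3   # two indistinguishable balls among three containers

# every occupancy vector (n1, n2, n3) with n1 + n2 + n3 = N
ways = [occ for occ in product(range(N + 1), repeat=M) if sum(occ) == N]
print(len(ways), ways)

# Bose-Einstein counting formula: W5 = (N + M - 1)! / [N! (M - 1)!]
W5 = factorial(N + M - 1) // (factorial(N) * factorial(M - 1))
print(W5)     # 6, matching the enumeration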
