STP 1450
Probabilistic Aspects of
Life Prediction
W Steven Johnson and Ben M Hillberry, editors
ASTM Stock Number: STP1450
ASTM International
100 Barr Harbor Drive
PO Box C700 West Conshohocken, PA 19428-2959
Copyright © 2004 ASTM International, West Conshohocken, PA. All rights reserved. This material may not be reproduced or copied, in whole or in part, in any printed, mechanical, electronic, film, or other distribution and storage media, without the written consent of the publisher.
Photocopy Rights
Authorization to photocopy items for internal, personal, or educational classroom use, or the internal, personal, or educational classroom use of specific clients, is granted by ASTM International (ASTM) provided that the appropriate fee is paid to the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923; Tel: 978-750-8400; online: http://www.copyright.com/
Peer Review Policy
Each paper published in this volume was evaluated by two peer reviewers and at least one editor. The authors addressed all of the reviewers' comments to the satisfaction of both the technical editor(s) and the ASTM International Committee on Publications.
To make technical information available as quickly as possible, the peer-reviewed papers in this publication were prepared "camera-ready" as submitted by the authors.
The quality of the papers in this publication reflects not only the obvious efforts of the authors and the technical editor(s), but also the work of the peer reviewers. In keeping with long-standing publication practices, ASTM International maintains the anonymity of the peer reviewers. The ASTM International Committee on Publications acknowledges with appreciation their dedication and contribution of time and effort on behalf of ASTM International.
Foreword
The Symposium on Probabilistic Aspects of Life Prediction was held in Miami, FL on 6-7 November 2002. ASTM International Committee E8 on Fatigue and Fracture served as sponsor. Symposium chairmen and co-editors of this publication were W. Steven Johnson, Georgia Institute of Technology, Atlanta, GA, and Ben Hillberry, Purdue University, West Lafayette, IN.
Contents

Overview

SECTION I: PROBABILISTIC MODELING

Probabilistic Life Prediction Isn't as Easy as It Looks - C. ANNIS
Probabilistic Fatigue: Computational Simulation - C. C. CHAMIS AND S. S. PAI
The Prediction of Fatigue Life Distributions from the Analysis of Plain Specimen Data - D. P. SHEPHERD
Modeling Variability in Service Loading Spectra - D. F. SOCIE AND M. A. POMPETZKI

SECTION II: MATERIAL VARIABILITY

Probabilistic Fracture Toughness and Fatigue Crack Growth Estimation Resulting From Material Uncertainties - B. FARAHMAND AND F. ABDI
Predicting Fatigue Life Under Spectrum Loading in 2024-T3 Aluminum Using a Measured Initial Flaw Size Distribution - E. A. DEBARTOLO AND B. M. HILLBERRY
Extension of a Microstructure-Based Fatigue Crack Growth Model for Predicting Fatigue Life Variability - M. P. ENRIGHT AND K. S. CHAN
Scatter in Fatigue Crack Growth Rate in a Directionally Solidified Nickel-Base Superalloy - S. HIGHSMITH, JR. AND W. S. JOHNSON
Mechanism-Based Variability in Fatigue Life of Ti-6Al-2Sn-4Zr-6Mo - S. K. JHA, J. M. LARSEN, A. H. ROSENBERGER, AND G. A. HARTMAN
Predicting the Reliability of Ceramics Under Transient Loads and Temperatures with CARES/Life - N. N. NEMETH, O. M. JADAAN, T. PALFI, AND E. H. BAKER
Fatigue Life Variability Prediction Based on Crack Forming Inclusions in a High Strength Alloy Steel - P. S. SHAME, B. M. HILLBERRY, AND B. A. CRAIG

SECTION III: APPLICATIONS

Preliminary Results of the United States Nuclear Regulatory Commission's Pressurized Thermal Shock Rule Reevaluation Project - T. L. DICKSON, ET AL.
Corrosion Risk Assessment of Aircraft Structures - M. LIAO AND J. P. KOMOROWSKI
A Software Framework for Probabilistic Fatigue Life Assessment of Gas Turbine Engine Rotors - R. CRAIG MCCLUNG, M. P. ENRIGHT, H. R. MILLWATER, ET AL.
Application of Probabilistic Fracture Mechanics in Structural Design of Magnet Components Parts Operating Under Cyclic Loads at Cryogenic Temperatures - M. YATOMI, A. NYILAS, A. PORTONE, C. SBORCHIA, N. MITCHELL, AND K. NIKBIN
A Methodology for Assessing Fatigue Crack Growth in Reliability of Railroad Tank Cars - W. ZHAO, M. A. SUTTON, AND J. PEÑA
Effect of Individual Component Life Distribution on Engine Life Prediction - E. V. ZARETSKY, R. C. HENDRICKS, AND S. M. SODITUS

Author Index
Subject Index
Overview

As fatigue and fracture mechanics approaches are used more often for determining the useful life and/or inspection intervals for complex structures, the realization sets in that all factors are not well known or characterized. Indeed, inherent scatter exists in initial material quality and in material performance. Furthermore, projections of component usage in determination of applied stresses are inexact at best and are subject to much discrepancy between projected and actual usage. Even the models for predicting life contain inherent sources of error based on assumptions and/or empirically fitted parameters. All of these factors need to be accounted for to determine a distribution of potential lives based on a combination of the aforementioned variables, as well as other factors. The purpose of this symposium was to create a forum for assessment of the state of the art in incorporating these uncertainties and inherent scatter into systematic probabilistic methods for conducting life assessment.
This is not the first ASTM symposium on this subject. On 19 October 1981 ASTM Committees E9 on Fatigue and E24 on Fracture Testing (today they are combined into Committee E8 on Fatigue and Fracture) jointly sponsored a symposium in St. Louis, MO. The symposium resulted in ASTM STP 798, "Probabilistic Fracture Mechanics and Fatigue Methods: Applications for Structural Design and Maintenance." The STP contained 11 papers. Both of the editors of this current STP were present. At that time, we were very involved with deterministic crack growth predictions under spectrum loading, trying to be as accurate as possible. We had little use for the statistics and probability. One thing that stood out in my listening to the speakers was the level of probability that they were predicting using the ASME boiler and pressure vessel code (the author was G. M. Jouris). Some of their estimated probabilities of failure were on the order of 1 × 10^-11. A member of the audience noted that the inverse of this number was greater than the number of atoms in the universe. The audience laughed.
As time went by, a greater appreciation was developed for all the uncertainties in real world applications (as opposed to a more controlled laboratory testing environment). This, compounded by the needs to assure safety, avoid costly litigation, set meaningful inspection intervals, and establish economic risks, has brought more emphasis to the need to use probability in the lifing of components. Since the aforementioned symposium was almost 20 years ago, ASTM Committee E8 agreed to sponsor this symposium. The response was outstanding.
On 6-7 November 2002, in Miami, FL, 29 presentations were given. Lively discussions followed essentially all the talks. The presentations collectively did a great job of assessing the current state of the art in probabilistic fatigue life prediction methodology. We would like to take this opportunity to recognize and thank our session chairs: Dr. Christos Chamis, Dr. Duncan Shepherd, Dr. James Larsen, Prof. Wole Soboyejo, Mr. Shelby Highsmith, Jr., Dr. Fred Holland, and Mr. Bill Abbott. A special thanks to Dr. Chamis for organizing a session.
Due to a number of factors, including paper attrition and a tough peer review process, only 17 papers have made it through the process to be included in this Special Technical Publication. The 17 papers have been divided into three topical groups for presentation in this publication: four papers are in the section on Probabilistic Modeling; seven papers are in the section on Material Variability; and six papers are in the section on Applications.
We sincerely hope that you find this publication useful and that it helps make the world a safer place.

Prof. W. Steven Johnson
School of Materials Science and Engineering and
George W. Woodruff School of Mechanical Engineering
Georgia Institute of Technology
Atlanta, GA
Prof Ben M Hillberry
School of Mechanical Engineering
Purdue University
West Lafayette, IN
Journal of ASTM International, Feb 2004, Vol. 1, No. 2
Paper ID JAI11557 Available online at: www.astm.org
Charles Annis¹

Probabilistic Life Prediction Isn't as Easy as It Looks
ABSTRACT: Many engineers effect "probabilistic life prediction" by replacing constants with probability distributions and carefully modeling the physical relationships among the parameters. Surprisingly, the statistical relationships among the "constants" are often given short shrift, if not ignored altogether. Few recognize that while this simple substitution of distributions for constants will indeed produce a nondeterministic result, the corresponding "probabilities" are often woefully inaccurate. In fact, even the "trend" can be wrong, so these results can't even be used for sensitivity studies. This paper explores the familiar Paris equation relating crack growth rate and applied stress intensity to illustrate many statistical realities that are often ignored by otherwise careful engineers. Although the examples are Monte Carlo, the lessons also apply to other methods of probabilistic life prediction, including FORM/SORM (First/Second Order Reliability Method) and related "fast probability integration" methods.
KEYWORDS: life prediction, crack growth, Paris equation, probability, statistics, simulation, Monte Carlo, nondeterministic, probabilistic, joint, conditional, marginal, multivariate
There is more to probabilistic life prediction than replacing constants with probability densities. The purpose of this study is to demonstrate this by comparing the observed distribution of lives of 68 nominally identical crack growth specimens with Monte Carlo (MC) simulations of lives based on the distributions of their Paris law parameters. It will be shown that several common MC sampling techniques produce wildly inaccurate results, one with a standard deviation that is 7× larger than was exhibited by the specimen lives themselves. The cause of such aberrant behavior is explained. It is further observed that the Paris law parameters are jointly distributed as bivariate normal, and a Monte Carlo simulation using this joint density reproduces the specimen mean and standard deviation. These observations are not limited just to these data, nor only to crack growth rate models, nor are they limited only to MC.
The Data
In the mid-1970s Dennis Virkler, then a Ph.D. student of Professor Ben Hillberry at Purdue, conducted 68 crack growth tests of 2024-T3 aluminum [1,2]. These tests were unusual for several reasons. They were conducted expressly to observe random behavior in fatigue. While almost all crack growth tests measure crack length after some number of cycles, Virkler measured cycle count at 164 specific crack lengths. This provided a direct measure of variability in cycles, rather than the usually observed variability in crack length at arbitrary cyclic intervals. While two of the specimens appear to stand out from their brethren, the purpose of this investigation is not to play Monday Morning
Manuscript received Aug. 29, 2002; accepted for publication Aug. 29, 2003; published February 2004. Presented at ASTM Symposium on Probabilistic Aspects of Life Prediction on Nov. 6, 2002 in Miami Beach, FL; W. S. Johnson and B. Hillberry, Guest Editors.
¹ Principal, Charles Annis, P.E., Statistical Engineering, Palm Beach Gardens, FL 33418-7161, Charles.Annis@StatisticalEngineering.com
Quarterback 25 years after the game, and there is no reason not to consider all 68 specimens here. In any event their exclusion changes only the numeric details. The fundamental results are not affected, nor are they affected by using a normal, rather than lognormal, density to describe them.
It is common practice to fit a single Paris curve to data from a collection of specimens of the same material tested under the same conditions of temperature, stress ratio, and frequency. In the study reported here, however, 68 individual Paris models were used. Fitting a single curve describes the mean trend behavior very well, but it obscures the specimen-to-specimen variability; since the goal here is to simulate similar randomness, it is necessary to capture that effect as well.
Fatigue Lives Are Lognormal
It has been long recognized that fatigue lives are satisfactorily modeled using the lognormal density. For these 68 specimens that model is less than optimal, and there is some evidence that the probability density may be a mixture of two densities. It is not the purpose of this paper to repeat the earlier work by Virkler, Hillberry, and Goel [2], and as it turns out, the actual form of the distribution of the specimen lives themselves only influences the numeric details of this study, since each specimen's crack growth rate curve was treated individually. (Treating the data as normal, however, results in a bias in the simulated mean of about 5%. The bias using the lognormal is negligible.)
Conventional Monte Carlo Simulation
Unlike many engineering analytical results, probability estimates are difficult to verify experimentally. This unfortunate reality has perpetuated the misuse of a valid statistical tool, and the consequences may not be apparent for years to come.
Most engineering Monte Carlo simulations are performed this way:
1. Set up a conventional deterministic analysis;
2. Replace constants with probability distributions;
3. Sample once from each distribution;
4. Compute the deterministic result and store the answer;
5. Repeat steps 3 and 4 many times;
6. Compute the mean and standard deviation of the collected results.
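To make the recipe concrete, here is a minimal, purely illustrative sketch of these six steps (not code from the paper): the deterministic_model function and the distribution parameters are placeholders, and the independent sampling in step 3 is precisely the habit examined below.

    import numpy as np

    rng = np.random.default_rng(1)

    def deterministic_model(c, n):
        # Step 1: stand-in for any deterministic analysis that maps input
        # "constants" to a scalar result (a placeholder, not a real model).
        return c * n

    # Step 2: replace the constants with probability densities (assumed
    # independent here, which is the usual, questionable, choice).
    # Steps 3-5: sample once from each density, compute, store, repeat.
    results = np.array([
        deterministic_model(rng.normal(-6.5, 0.1), rng.normal(2.9, 0.2))
        for _ in range(10_000)
    ])

    # Step 6: summarize the collected results.
    print(results.mean(), results.std(ddof=1))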
Several statistical assumptions are implicit in these steps. They lie at the foundation of Monte Carlo simulation, but as has been observed elsewhere [3], "Simply not understanding the nature of the assumptions being made does not mean that they do not exist."
What possibly could be wrong with this paradigm? Luckily we (the engineering community) have an opportunity to find out, using the data collected by Virkler and Hillberry as part of Virkler's Ph.D. dissertation. Professor Hillberry graciously made these available for further study.
Monte Carlo Modeling Specifics
After fitting individual Paris equations to each of the 68 specimens, the mean and standard deviation for the individual Paris parameters, intercept, C, and slope, n, were computed. The well-known Paris model for fatigue crack growth is given in equation 1,
da/dN = 10^C (ΔK)^n    (1)

where ΔK is the applied stress intensity factor, in MPa√m, given by equation 2,

ΔK = Δσ √(πa) f(a | geometry)    (2)

where Δσ is the applied stress range and f(a | geometry) is a function of the specimen (or component) geometry and crack length. Of course, when equations 1 and 2 are combined, the crack growth rate becomes

da/dN = 10^C [Δσ √(πa) f(a | geometry)]^n

Assuming for the sake of simplicity that there was no variation in the starting crack size, the final crack size, or the test stress, the calculated cyclic lifetime can be computed from the individual Paris fits using equation 3,

N = ∫ dN = ∫ da / {10^C [Δσ √(πa) f(a | geometry)]^n}    (3)

In practice this integration is usually carried out numerically; each sampled parameter pair (C, n)_i yields a simulated life N_i from equation 3, where i ranges from 1 to, say, 1000 (or 10 000).
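As one hedged illustration of that numerical integration (not the author's code), the sketch below evaluates equation 3 by trapezoidal quadrature for a single (C, n) pair; the unit geometry factor, crack sizes, and stress range are invented, order-of-magnitude inputs rather than the Virkler test conditions.

    import numpy as np

    def paris_life(C, n, a_init, a_final, delta_sigma, steps=2000):
        """Equation 3: N = integral of da / {10^C [delta_sigma*sqrt(pi*a)*f(a)]^n}."""
        a = np.linspace(a_init, a_final, steps)
        f_geom = np.ones_like(a)  # assumed f(a | geometry) = 1, for illustration only
        delta_K = delta_sigma * np.sqrt(np.pi * a) * f_geom
        dN_da = 1.0 / (10.0**C * delta_K**n)      # cycles per unit crack extension
        # trapezoidal quadrature of dN/da over the crack-length grid
        return float(np.sum(0.5 * (dN_da[1:] + dN_da[:-1]) * np.diff(a)))

    # Hypothetical inputs (MPa and metres); C is the log10 intercept at delta_K = 1.
    N = paris_life(C=-10.0, n=2.9, a_init=0.009, a_final=0.050, delta_sigma=50.0)
    print(f"{N:.3e} cycles")

Looping such a calculation over the sampled (C, n)_i pairs gives the simulated life distribution discussed next.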
Many MC practitioners then calculate a mean and standard deviation for N, or log10(N), report the results, and stop there, since there is nothing against which to compare the distribution of computed values for N. Virkler's data, however, show the observed distribution of actual specimen lives and thus provide a direct comparison for these calculations.
The Paris Law is Adequate
Before going further it is prudent to check the goodness-of-fit of the Paris equation itself. If the underlying model for crack growth rate is inadequate there is little hope for accurate life prediction based on it. The sigmoidal shape of the da/dN vs ΔK data (Fig 1) suggests a model such as the SINH [4] might do a better job than the straight-line Paris model (and it does, increasing the ratio of standard deviations of calculated lives from 0.918 for Paris to 0.957 for the SINH by reducing the disagreement between calculated and observed specimen lives from 8.2% to 4.3%). The added model complexity, however, obscures the real issue here, namely the abysmal performance of a rather common Monte Carlo simulation (700% error in predicted scatter). Since the Paris law is adequate it is used here for simplicity.
A Note on Modeling
Statisticians often assess the efficacy of a mathematical model by decomposing the sums-of-squares of differences between the model and the observations. We, however, are less interested in the differences between individual specimens and their Paris model than we are in their integrated collective behavior, as given by equation 3. Such an integrated metric summarizes all sources of "error" - material variability, lack-of-fit, testing uncertainties - into the difference between the observed specimen life and that provided by equation 3. We thus have traded the potential for better arithmetic diagnostics (scrutiny of the Paris model) for a more direct measure of what we are really interested in: life prediction performance.
How Well Does the Conventional Monte Carlo Algorithm Perform?
The conventional MC simulation of 1000 samples, with independent model parameters, C and n, did an acceptable job predicting the mean lifetime, after the log transform. Because the data are skewed to the right, as all fatigue data are, the untransformed simulated results slightly overestimate the means of the symmetrical normal models.
The simulated standard deviations were another matter. The actual observed standard deviation for the 68 specimens is 0.03015 log10 units (18 447 cycles).² The conventional MC simulation of 1000 samples, with independent model parameters, C and n, produced a standard deviation of 0.19778 log10 units (140 261 cycles), 6.6× too large!
A closer look shows the situation gets even worse. To be fair, the best possible Paris model would use the 68 individual Paris fits, since no simulation could be expected to be better than the actual specimens' behavior. Using the 68 Paris equations in equation 3 produces a standard deviation of 0.02769 log10 units (16 332 cycles), which is smaller than the observed standard deviation by about 8%. Why?
Of the 68 specimens, two seemed to exhibit longer lives than might have been inferred from the behavior of the other 66. All 68 specimens were used here. Since the actual specimen life doesn't directly influence its da/dN vs ΔK behavior, predicted lives based on these two Paris fits would be more like their sister specimens, resulting in the smaller standard deviation for the integrated Paris equations.

² The analyses were carried out using log10(cycles), and again using untransformed cycles. The reported log10 results cannot, of course, be determined simply by taking the log of the mean and standard deviation of the untransformed results. All calculations are summarized in Tables 1 and 2 and Fig 5.
So to provide a fair comparison with simulated Paris models, the behavior of the 68 integrated Paris laws is used as the baseline. Comparison with this integrated Paris law baseline shows the simulation to have overestimated the scatter by a factor of about 7.1. Such a result is worse than useless since it would likely compel a costly redesign. Put in perspective, the probability of failure before about 207 000 cycles is 0.1%, determined from the mean and standard deviation of the specimen lives.
This absurd simulation result has been observed by every engineer who has performed similar MC simulations, since it doesn't require any statistics to detect an answer that is wrong by a factor approaching an order of magnitude in standard deviation. Sadly, the most common palliatives proposed as remedies do not perform much better.
What Went Wrong?
The model parameters, C and n, are assumed to be normally distributed. Is this a good assumption in this case? While the normality is only approximate, the normal density is not an altogether improper model; surely these parameters could be summarized adequately by a mean and standard deviation. A closer look at the figures provides a clue. There are two observations that are high for parameter C, and two that are low for parameter n. Perhaps these should be considered as pairs, rather than as independent observations. Figure 3, a schematic plot, shows why the two parameters must move in tandem: when the slope, n, is shallow, the intercept, C, must be larger for the resulting line to go through the data. Similarly, a steeper slope requires a smaller intercept.
FIG 3 Schematic showing why Paris parameters must be correlated. Note that in this schematic the intercept is C = log10(da/dN) = -10 at log10(ΔK) = 0.
Possible Remedies (All of Them Wrong)
Assuming C and n to be independent, when they obviously are not (the most common error in Monte Carlo modeling), results in unacceptable error in simulated lifetime scatter. Possible remedies that have been suggested are:
1. n assumed fixed, C is normal;
2. C assumed fixed, n is normal;
3. C assumed a linear function of n.
Fixing either n or C seems at first blush like a reasonable solution, and it does reduce the over-prediction of scatter from 7.1× to 5.1× (n fixed) or 5.4× (C fixed). While this is an obvious improvement, the error remains wildly unacceptable. Sadly, it is at this stage when the standard deviation of C or n is arbitrarily "adjusted," i.e., fudged until a believable result is achieved.
Figure 4 also shows why assuming either C or n as fixed is not reasonable. The horizontal line is at n = 2.87, the average of the 68 Paris slopes. This is a reasonable value only when -6.58 < C < -6.45. When C is outside this range, as it will be often, the resulting simulated combination is very, very improbable. In fact, observations in either the first or third quadrants (large n with large C, or small n with small C) are exceedingly unlikely in reality but occur about half the time in uncorrelated simulation.
Another option for remedy suggests itself since the two parameters are obviously so closely related: let one be a function of the other. A linear fit of C = b1 + b2 n, with n being sampled from a normal density, does indeed improve things. But this time the resulting error ratio is 0.51, i.e., the scatter has been over-corrected, and now is underestimated by almost half. Clearly this nonconservative result is also unacceptable.
FIG 4 Paris parameters C and n are obviously correlated (r = 0.982); the 95% confidence ellipse for the (C, n) pairs is shown.
To understand why such an appealing suggestion should have such an undesirable result, look again at Fig 4, which also shows the 95% confidence ellipse for the C and n pairs. Assuming that one is a linear function of the other in effect collapses this ellipse into a line, thus underestimating the overall variability. (The confidence ellipse also suggests that two of the tests may be different from the others, as was noted earlier.)
The Right Way
We have considered four very common oversights in Monte Carlo modeling. So, how do you do it correctly?
Parameter estimates for C and n are jointly distributed. (Notice that this is not optional. It is how regression model parameters naturally behave. You can't choose the ratio of a circle's circumference to its diameter to be an integer because it might be more convenient. The fact is that π is inconveniently transcendental. Similarly, regression parameter estimates are asymptotically multivariate normal, and correlated, so any realistic simulation must sample from their correlated joint density.) Modeling them as bivariate normal in a MC simulation produces a standard deviation of 0.02802 in log10 integrated lifetime for 1000 samples, which is very close to the standard deviation of the integrated individual Paris fits, 0.02769. The ratio of standard deviations is 1.012. In other words, correctly modeling the joint behavior reduces the greater than 700% error in the estimate of the standard deviation to about 1%.
Notice, too, that replacing a constant n (the horizontal line in Fig 4) with a (conditional) probability density has the paradoxical effect of decreasing the resulting variability in calculated lifetime, since it corrects Mistake #2 (see Tables 1 and 2). This refutes the common misconception that replacing a constant with a probability density in a Monte Carlo simulation always results in increased scatter in the output. All these results are summarized in Tables 1 and 2 and in Fig 5.
TABLE 1 MC Results Assuming Cycles are LogNormally Distributed

                            Correct      Mistake #1    Mistake #2  Mistake #3  Mistake #4
       Actual N   Eqn 3 N   C, n joint   C, n indept   n fixed     C fixed     C = b0 + b1*n
mean   5.40916    5.39773   5.39909      5.41404       5.39414     5.39911     5.42217
How to Sample from a Joint Probability Density:
As a consequence of the Central Limit Theorem in statistics (see the Appendix), regression model parameters are asymptotically multivariate normal. Thus, while the assumption of Gaussian behavior isn't always appropriate for physical parameters, it is often justified for regression parameters. The following algorithm can be used to sample from a bivariate normal density.
Let z1, z2 be iid (independent and identically distributed, i.e., from the same probability density) N(0, 1), and let

x1 = μ1 + s1 z1
x2 = μ2 + s2 (ρ1,2 z1 + z2 √(1 - ρ1,2²))

Then

(x1, x2)' ~ BN[(μ1, μ2)', s1, s2, ρ1,2]    (4)

where the symbol "~" is read "is distributed as," N(μ, σ²) is a normal density with mean μ and variance σ², and BN[(μ1, μ2)', σ1², σ2², ρ] represents a bivariate normal density with means μ1, μ2, variances σ1², σ2², and correlation ρ. Equation 4 can be generalized to higher dimension regression models, which will of course require the parameter covariance matrix as the extension of s1, s2, and ρ1,2 here.
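The following sketch implements equation 4 and contrasts it with the independent sampling of Mistake #1; the means, standard deviations, and correlation below are only illustrative values read roughly from Fig 4, not the fitted Virkler statistics.

    import numpy as np

    rng = np.random.default_rng(7)

    def sample_bivariate_normal(mu1, mu2, s1, s2, rho, size):
        """Equation 4: correlated (x1, x2) pairs built from iid standard normals."""
        z1 = rng.standard_normal(size)
        z2 = rng.standard_normal(size)
        x1 = mu1 + s1 * z1
        x2 = mu2 + s2 * (rho * z1 + np.sqrt(1.0 - rho**2) * z2)
        return x1, x2

    # Joint sampling (the right way) versus independent sampling (Mistake #1).
    C_joint, n_joint = sample_bivariate_normal(-6.5, 2.87, 0.15, 0.20, 0.982, 1000)
    C_indep = rng.normal(-6.5, 0.15, 1000)
    n_indep = rng.normal(2.87, 0.20, 1000)

    print(np.corrcoef(C_joint, n_joint)[0, 1])   # close to the target 0.982
    print(np.corrcoef(C_indep, n_indep)[0, 1])   # near zero: quadrants I and III appear

Feeding each sampled pair through a life integration such as the paris_life sketch given earlier (with C defined on consistent units) reproduces, qualitatively, the contrast summarized in Tables 1 and 2.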
Sampling from Higher Dimension, Non-Normal, Joint Densities
Sampling is more difficult for joint densities of physical parameters, since such situations are rarely multivariate normal. Rare too is statistical independence. It is lamentable then that many Monte Carlo users hope to avoid these difficulties by assuming them away. (If all the variables are assumed to be independent, then their marginal densities can be used.) Convenience, however, is scant justification, and consensus is a poor measure of veracity. Mother Nature will do as she will whether our simulations portend effectively or not.
All is not hopeless, however, and great progress has been made by taking advantage of conditional independence, and modeling the joint density as a network connected by statements of conditional probability [5]. A practical example is presented in [6].
Another underappreciated difficulty with direct-sampling Monte Carlo is what is referred to in the Bayesian literature as the "curse of dimensionality" [cf. 7]. This is the requirement that the number of sampled points must increase exponentially with the number of random variables to maintain a given level of precision. This places a practical limit on direct-sampling Monte Carlo.
A "new" method, Markov Chain Monte Carlo (fifty years old but only recently rediscovered [8]), isn't encumbered by this impediment. Direct-sampling methods must sample directly from the entire probability space to obtain a sample from the joint probability density of interest. In contrast, Markov Chain Monte Carlo methods can sample directly from the desired joint probability density itself. Because they do not have to sample everywhere in the probability space, and only sample where the variables most probably reside, MCMC methods are not fettered by the problem of large dimensions. MCMC has revolutionized Bayesian statistics during the past decade, yet remains almost unknown to the engineering community, where it is sometimes misunderstood to be "just importance sampling."
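As a hedged illustration of the idea, and not of any particular MCMC package, the sketch below is a minimal random-walk Metropolis sampler; the bivariate-normal target is a toy stand-in for whatever joint density is actually of interest.

    import numpy as np

    rng = np.random.default_rng(0)

    def metropolis(log_density, x0, n_samples, step=0.5):
        """Random-walk Metropolis: draw from a joint density known only up to a
        constant, sampling where the variables most probably reside."""
        x = np.asarray(x0, dtype=float)
        logp = log_density(x)
        samples = np.empty((n_samples, x.size))
        for i in range(n_samples):
            proposal = x + step * rng.standard_normal(x.size)
            logp_new = log_density(proposal)
            if np.log(rng.uniform()) < logp_new - logp:   # accept w.p. min(1, ratio)
                x, logp = proposal, logp_new
            samples[i] = x
        return samples

    # Toy target: an (un-normalized) correlated bivariate normal log-density.
    cov_inv = np.linalg.inv(np.array([[1.0, 0.9], [0.9, 1.0]]))
    draws = metropolis(lambda v: -0.5 * v @ cov_inv @ v, [0.0, 0.0], 20_000)
    print(np.corrcoef(draws[:, 0], draws[:, 1])[0, 1])    # approaches 0.9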
Putting Things in Perspective
If engineering Monte Carlo analysis is vulnerable to such enormous errors, why do so many MC studies produce reasonable results? Here, flouting Murphy's Law, serendipity often intervenes: only analyses whose inputs include correlated model parameters, such as regression parameters, are vulnerable to the errors illustrated here. (For at least 75 years it has been well known in the applied statistics community that regression model parameters are correlated [cf. 9], yet that fact is almost universally unknown to, or worse, ignored by, us engineers.) The effects of ignoring model parameter correlations are sometimes mitigated by a second piece of good luck: the effects of scatter in model parameters may be overwhelmed by other sources of variability in the system being simulated. For example, the erroneously large standard deviation caused by treating the Paris model parameters, C and n, as independent was about 0.2 log10 units; if the other sources of variability contributed, say, 0.5 log10 units, the resulting effect of having ignored the correlation would be about (0.2² + 0.5²)^½ = 0.54 log10 units, an increase in the total error of less than 10%.
Summary
There is more to Monte Carlo simulation than replacing constants with probability densities. We have illustrated several of the most common mistakes and demonstrated their unacceptable consequences, using the 68-specimen Virkler-Hillberry data as an example. These errors and their consequences are not confined to the example presented here; they afflict any Monte Carlo analysis that relies on regression models (and most do).
We have further demonstrated that correctly modeling the regression parameters as multivariate normal nearly eliminates the MC model error in this example.
Monte Carlo simulation is a powerful engineering analysis tool. Used properly it can provide insights that are otherwise unattainable. Lamentably, many practitioners are not aware of the statistical assumptions they are making, and that violating any one of them could eviscerate their analysis.
Acknowledgments
I wish to thank Professor Ben Hillberry of Purdue University for graciously making the data, as well as specimen geometry and testing details, available for this study. I also wish to thank my longtime friend and colleague Dr. Al Berens of the University of Dayton Research Institute for suggesting the data to me.
³ Under some circumstances, for example when the data are centered at (X̄, Ȳ), some of the model covariances are zero.
APPENDIX

joint probability: f(x, y | θ), where f is the probability of x and y together as a pair, given the distribution parameters, θ.

multivariate distribution: A joint probability density of two or more variables. It is often summarized by a vector of parameters, Θ. For example, the MVnormal is summarized (sufficiently) by a mean vector and covariance matrix.

marginal probability: f(x | θ), where f is the probability density of x, for all possible values of y, given the distribution parameters, θ. The marginal probability is determined from the joint distribution of x and y by integrating over all values of y, thus integrating out the variable y. In applications of Bayes's Theorem, y is often a matrix of possible parameter values.

conditional probability: f(x | y; θ), where f is the probability of x by itself, given a specific value of variable y, and the distribution parameters, θ. If x and y represent events A and B, then P(A|B) = n_AB / n_B, where n_AB is the number of times both A and B occur, and n_B is the number of times B occurs. P(A|B) = P(AB)/P(B), since P(AB) = n_AB/N and P(B) = n_B/N, so that P(A|B) = (n_AB/N) / (n_B/N) = n_AB/n_B. Note that in general the conditional probability of A given B is not the same as B given A. The probability of both A and B together is P(AB), and P(A|B) × P(B) = P(AB) = P(B|A) × P(A), if both P(A) and P(B) are non-zero. This leads to a statement of Bayes's Theorem: P(B|A) = P(A|B) × P(B) / P(A). Conditional probability is also the basis for statistical dependence and independence.

Joint, marginal, and conditional densities are summarized in Fig A-1.
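A small numeric check of the count-based definitions above, with hypothetical counts, may make the relationships concrete:

    # Hypothetical counts, chosen only to illustrate the definitions above.
    N = 1000        # total observations
    n_A = 250       # occurrences of A
    n_B = 400       # occurrences of B
    n_AB = 100      # occurrences of A and B together

    P_A, P_B, P_AB = n_A / N, n_B / N, n_AB / N
    P_A_given_B = P_AB / P_B               # = n_AB / n_B = 0.25
    P_B_given_A = P_A_given_B * P_B / P_A  # Bayes's Theorem: = n_AB / n_A = 0.40

    print(P_A_given_B, P_B_given_A)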
The Central Limit Theorem justifies using a multivariate normal density to model the collective behavior of regression model parameters. The CLT states that the distribution of an average tends to be normal, even when the distribution from which the average is computed is decidedly non-normal. The distribution of the average has the same mean as the parent distribution, and variance equal to the variance of the parent divided by the sample size, n. The parent density itself need not be normal; the CLT requires only that the mean and variance are finite. And "large" n may be on the order of a dozen. Formally, if x1, ..., xn are iid with mean μ and finite variance σ², and zn = (x̄n - μ)/(σ/√n), then zn converges in distribution to N(0, 1) as n becomes large.
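The tendency is easy to see by simulation; the following sketch (illustrative only) averages n = 12 draws from a decidedly non-normal exponential parent and standardizes them as above.

    import numpy as np

    rng = np.random.default_rng(3)

    n, trials = 12, 100_000
    parent_mean, parent_var = 1.0, 1.0       # the exponential(1) parent density
    xbar = rng.exponential(1.0, size=(trials, n)).mean(axis=1)

    z = (xbar - parent_mean) / np.sqrt(parent_var / n)
    print(z.mean(), z.var())                 # both approach 0 and 1
    print(np.mean(np.abs(z) < 1.96))         # approaches 0.95, as for N(0, 1)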
References
Fatigue Crack Propagation," AFFDL-TR-78-43, Air Force Flight Dynamics
Laboratory, April, 1978
Methods: Applications for Structural Design and Maintenance, ASTM STP 798, J. M. Bloom and J. C. Ekvall, Eds., American Society for Testing and Materials, 1983, pp 97-115
Elevated Temperature Fatigue Crack Propagation," AFML-TR-76-176, Part I,
November 1976, presented at 1977 Society for Experimental Stress Analysis
Spring Meeting, Dallas, Texas, May 1977
Acyclic Graph Paradigm for Probabilistic High Cycle Fatigue Risk Assessment,"
supported under Independent Contractor Agreement 01-$441-48-01-C4, Universal
Technology Corporation Prime Contract F33615-98-C-2807, September, 2001
Analysis, Chapman and Hall/CRC, 1996
Look at an Old Idea," AIAA 2002-13800, presented at 43rd AIAA/ASME/ASCE/AHS Structures and Dynamics Conference, Denver, CO, 22-25 April 2002
14th edition was ready for publication in 1962, when Fisher died, and was
and Scientific Inference, as a single volume.)
Journal of ASTM International, September 2004, Vol. 1, No. 8
Paper ID JAI11560 Available online at www.astm.org
Christos C. Chamis¹ and Shantaram S. Pai²
Probabilistic Fatigue: Computational Simulation
ABSTRACT: Probabilistic computational simulation of fatigue-life is illustrated in terms of several sample cases that have been generated over the past ten years. The cases are selected to illustrate applications to multi-scale, multi-discipline, and multi-physics problems. These cases include composite laminate; coupled thermal, mechanical, fatigue, and creep; pressurized tank; engine blades; engine rotor; and composite combustor liner. The fundamentals for probabilistic computational fatigue are briefly described, and general comments are included on what it takes to perform probabilistic computational fatigue and to validate it. Typical results show that fatigue-life can be evaluated for complex components and for complex loadings. Probability of survival curves can be generated, and probabilistic sensitivities influencing fatigue-life can be determined. The paper describes what can be done rather than details of a specific case.
KEYWORDS: composite, metals, components, sensitivities, results
Introduction
Fatigue is a primary consideration in the design of aerospace structures for long-term durability and reliability. There are several types of fatigue that must be considered in the design, including low cycle, high cycle, and combined, for different cyclic loading conditions - for example, mechanical, thermal, and erosion. The traditional approach to evaluate fatigue has been to conduct many tests in the various service environmental conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design. However, it is time consuming and costly, and it must be repeated for designs in different operating conditions in general.
Recent research has demonstrated that fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. The main features in this novel paradigm are progressive telescoping scale mechanics, progressive scale substructuring, and progressive structural fracture, encompassed by probabilistic simulation. The generic features of this approach are to probabilistically scale-telescope, to scale local material point damage all the way up to the structural component, and to probabilistically scale-decompose structural loads and boundary conditions all the way down to the material point. Additional features include a multi-factor interaction model that probabilistically describes material properties evolution, any changes due to various cyclic loads, and other mutually interacting effects. The objective of this paper is to describe this novel paradigm of computational simulation and present typical fatigue results for structural components that have been generated over the past ten years. Additionally, advantages, versatility, and inclusiveness of computational simulation versus testing are discussed. Guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in composite and
Manuscript received 6 September 2002; accepted for publication 24 February 2004; published September 2004. Presented at ASTM Symposium on Probabilistic Aspects of Life Prediction on 6 November 2002 in Miami Beach, FL; W. S. Johnson and B. M. Hillberry, Guest Editors.
¹ Senior Aerospace Scientist, NASA Glenn Research Center, Cleveland, OH 44135.
² Aerospace Engineer, NASA Glenn Research Center, Cleveland, OH 44135.
metallic structures to demonstrate the versatility of this novel paradigm in predicting a priori fatigue-life.
One interesting and perhaps unexpected result is that the survival probability of disk burst is about 1.00, while disk burst is considered to be the most catastrophic fracture. However, fracture at the bore has a survivability of about 0.85, and fracture at the rim has a survivability of about 0.70, which is the same as that for the system multi-failure mode. According to this evaluation, the two-stage rotor will most probably fracture at the rim, which has the lowest survival probability rating. Other specific sample cases include fatigue-life for a composite laminate, an engine blade, an internally pressurized tank, and a combustor liner. Discussion of the significance of the results is included, and general comments are made on what is required to perform probabilistic computational simulation of fatigue-life, especially in generating probability of survival with limited data. In this context, the paper presents a review of what has been done at NASA Glenn Research Center in order to demonstrate what can be done in general. Specifics about individual cases are described in the references. The authors consider the paper a major contribution because of its inclusiveness in this new and emerging area.
Fundamentals
The fundamentals that lead to computational simulation of probabilistic fatigue have evolved over three decades [1,2]. Here a brief description is summarized for completeness. The description is multi-discipline, multi-scale [4], and multi-factor for material interaction effects [5]. In the evolution timeline, multi-discipline is simulated by CSTEM (Coupled Structural Thermal Electro-magnetic Acoustic Tailoring) [6]; multi-scale is simulated by scale telescoping/tunneling mechanics [7]; and multi-factor material properties interaction is represented by the multi-factor interaction model [8].
A schematic of scale telescoping mechanics in composites is illustrated in Fig 1, where incorporation of the uncertainties at each scale is depicted in the Bell Diagram schematic. The schematic illustrates that uncertainties from a lower scale contribute to the uncertainties in the scale of observation, as well as uncertainties unique to the scale of observation. A schematic of the disciplines included in CSTEM is shown in Fig 2. CSTEM includes discipline modules for structural/stress analysis (static, transient), heat transfer (conductive, convective, radiation), electromagnetics (Maxwell's equations and appropriate approximations), acoustics (structural vibration generated), optimization (optimal feasible direction), composite mechanics (micro, macro, laminate), and a finite element model generator (8-, 16-, and 20-node brick elements). CSTEM may be viewed as "virtual coupled discipline interaction." A schematic of the probabilistic multi-factor interaction model (MFIM) is shown in Fig 3. The schematic depicts the MFIM to be a surface in space defined by a set of vectors for each effect. Each effect has its respective uncertainties that are represented in the surface. Probabilistic component structural fatigue is a complex manifestation of coupled known and unknown effects, and therefore requires coupled multi-scale, multi-discipline, and multi-factor material models to be described "adequately." The equation for the MFIM is unique to the simulations described herein; it is instructive to show a form of this equation (Fig 3). Application to select examples follows in subsequent sections.
(Figure annotations, in part: Integrated Composite ANalyzer (PICAN), with integrated material behavior models.)
(FIG 3 legend, in part: T, thermal cyclic load; superscripts m, n, q, r, u, and v are exponents for the factors that describe the effect on that material property.)
Probabilistic Fatigue in Composite Laminates
A typical probabilistic fatigue resistance of a composite laminate is shown in Fig 4. The details of the simulation are described in [9]; here we present select results and describe some of their significance. The schematics at the top show the panel and the loading. In the middle left of the figure, the probability of occurrence of remaining life in terms of frequency and cycle ratio is shown. In the middle right, the probabilistic sensitivity factors that affect the probability are shown, as noted under the figure. The dominant failure mode is noted below the figure. Important observations from the results in Fig 4 are: (1) probabilistic composite fatigue can be simulated as ratios of cycles to failure and effects of frequency; (2) for a given fatigue cycle, the higher the cyclic frequency, the higher the probability of occurrence; (3) conversely, for a given probability, the higher the frequency, the lower the fatigue cycle ratio (fatigue-life); (4) frequency has negligible effect on the sensitivity factors; and (5) the higher the frequency, the greater the scatter range on the fatigue cycle ratio (fatigue-life) of the composite laminate.
Coupled Thermal, Fatigue, and Creep Analysis
The effectiveness of the MFIM in representing complex material behavior is illustrated by its application to coupled thermal, fatigue, and creep problems. The details of how that was done are described in [10]. Here, it suffices to comment on the significance of the computational simulation and the results obtained therein. Typical results are shown in Fig 5 for a nickel-based
superalloy suitable for high temperature space shuttle engine turbines. As can be seen, the probability of occurrence is plotted versus the ratio of lifetime strength to reference strength. Reference strength is that obtained by uniaxial static test to material fracture. It is important to state that: (1) there is no data fit for this simulation, and (2) the results represent material qualifications at those conditions. As expected, the strength decreases as the use temperature increases. However, the important and subtle points to observe are that: (1) the probability curves are parallel; (2) there is a greater spread between the 781°F and the 1562°F curves than there is between the 68°F and the 781°F curves; (3) the scatter in lifetime strength for each curve may be obtained from the difference between a high probability of about 0.99 and a low probability of about 0.01 (roughly, from the curves, 0.31-0.23 for 68°F, 0.28-0.19 for 781°F, and 0.23-0.13 for 1562°F); (4) there is considerable overlap between the scatters among the three different probability curves, which, in part, explains the difficulties encountered in setting service environment allowables by testing; and (5) since the probability curves are parallel, only the 68°F curve needs to be probabilistically evaluated. Other higher/lower temperature curves may be obtained by a parallel shift of the 68°F curve.
It should be evident from the above discussion that the computational simulation method described represents practical applications and has the potential for substantial savings in material, time, and cost during the material characterization and acceptance phase of development programs.
(FIG 4 dominant failure modes, as noted below the figure: transverse tension in 90° ply at low frequency; compression in 90° ply at high frequency.)
FIG 5 Simulated lifetime strength for a nickel-based superalloy (subjected to 3162 stress cycles and 100 h of creep). Abscissa: lifetime strength / reference strength.
Space Shuttle Main Engine Blade
Space shuttle main engine blades are subjected to severe thermomechanical loads. The finite element model of the blade simulated is shown in Fig 6, where the loading conditions are also shown. The details of the probabilistic computational simulation are described in [11]. It is important to note that the material behavior in those conditions was modeled by using the MFIM as described in the previous section. We present the results and comment on their relevance and significance.
Two sets of probability levels are shown in Fig 7 for damage initiation and progression for survival in the operating conditions. The most probable path to occur first is the one with the largest probability (0.0002). This implies one occurrence in 5000 flight cycles, with a "safety factor" of 10 since the engine blades are designed for about 500 flight cycles. It is interesting to observe in Fig 7 that both initiation paths have the same end point. This is significant because it shows in part that a specific structure will sustain a certain amount of damage starting from an undamaged state and operating in specified loading conditions. This tentative conclusion may have profound implications in evaluating the damage tolerance of critical (load bearing) components in complex loading environments. For example, the strain energy released along this path can be plotted as shown in Fig 8; note the rapid increase from state 3 to state 4, at which the blade separated into two parts (structural fracture). The results in Fig 8 illustrate the following important points: (1) Structural fracture is imminent when a rapid increase in strain energy release rate occurs, which is state 3 to 4 in Fig 8. (2) Structural fracture parameters can be inferred from this figure. The amount of damage prior to rapid propagation is obtained by extending the "time" curve from state 4 to state 3 until it intersects the abscissa. This then would be the critical damage. The corresponding strain energy is obtained by drawing a line parallel to the abscissa to intersect the ordinate. That value is the critical strain energy release. The values for critical damage and strain energy are about 2.9 and 25, respectively. The significance of this inference is that critical fracture toughness parameters can be probabilistically simulated computationally without recourse to complex, if not impossible, testing. Another inference is that health monitoring systems can be designed based on the information from that shown in Figs 7 and 8, or other information, such as displacements and vibration frequencies, that can be evaluated along the probable fracture path. The major conclusion from the above discussion is that probabilistic computational simulation of fatigue-life provides a wealth of information that enriches the knowledge of a component design operating in complex environments.
Trang 29STATE I DAMAGE INITIATED AT NODE 10 STATE 2
DAMAGE EXTENDED TO NODE 9 STATE, 3
DAMAGe' EXTENDED TO NODE 14 STATE 4
DAMAGE EXTENDED TO NODE 18
P r e s s u r i z e d T a n k
The fatigue-life of a pressurized tank is evaluated by using conventional finite element modeling in conjunction with the Paris law for local fatigue crack growth and node unzipping for the progressive fracture of the tank. The details are described in [12]. This sample case is included herein to illustrate an approach to computational fatigue-life alternative to that described previously with the use of the MFIM. The finite element model of the tank is shown in Fig 9. The tank is subjected to internal pressure. The tank bottom is evaluated for probabilistic fatigue-life. The progressive opening of the crack is shown in Fig 10. As can be seen, the crack progressed in a self-similar manner along the nodal line. The results are summarized in Fig 11, where the Paris Crack Growth Law is also included. The results summary shows the number of fatigue cycles required to grow the crack from initiation to the point where it became unstable. It is worth noting that the number of cycles to initiate the crack and grow it to the next node was about 75 % of the fatigue-life, while the number of cycles to grow it to the unstable state was only 25 % (about 28 000 and 9000 of 37 000 cycles total).
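A hedged sketch of the cycle counting implied by this approach (not the tank analysis of [12], and with invented inputs) sums the cycles needed to grow the crack across successive nodal increments using da/dN = C (ΔK)^M with ΔK = Y Δσ √(πa):

    import numpy as np

    def cycles_per_increment(a_nodes, C, M, delta_sigma, Y=1.0):
        """Cycles to grow the crack across each nodal increment of the model."""
        a_nodes = np.asarray(a_nodes, dtype=float)
        a_mid = 0.5 * (a_nodes[:-1] + a_nodes[1:])     # crack size at mid-increment
        delta_K = Y * delta_sigma * np.sqrt(np.pi * a_mid)
        return np.diff(a_nodes) / (C * delta_K**M)     # cycles for each increment

    # Hypothetical crack sizes (m), Paris constants, and stress range (MPa).
    dN = cycles_per_increment([0.005, 0.010, 0.015, 0.020],
                              C=5.0e-12, M=3.0, delta_sigma=120.0)
    print(dN, dN.sum())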
The major differences of this approach compared to that using the MFIM include the following: (1) This approach requires empirical data for C, ΔK, M, and Y in addition to the data required for the finite element model; the use of the MFIM requires only the data that is used in the finite element model. (2) This approach does not account for frequency or temperature effects; the use of the MFIM accounts for both. (3) This approach will require comparable data for C, ΔK, M, and Y for thermal cycles, while the MFIM does not.
One major conclusion of this sample case and the discussion is that the "how" of "probabilistic computational fatigue" is not unique. It is very much dependent on the knowledge of the evaluator and on the information available at the time of the evaluation. It is also important to note that the MFIM is more inclusive in the representation of material behavior in complex conditions than other comparable single or segmented representations.
FIG 11 Number of cycles to grow the crack, computed using the crack growth law for given crack increments (in part: crack initiation at node 144, about 1.00 × 10^4 cycles; fracture, node 143 to node 142, about 5.99 × 10^3 cycles; fracture, node 142 to node 141, about 2.99 × 10^3 cycles).
Engine Rotor System
This sample case is presented to illustrate that multiple failure modes can be probabilistically simulated. The schematic of the two-stage rotor, the failure cracks evaluated, and the results from the evaluation are summarized in Fig 12. The details on how that was done, the traditional equations used for each of the three failure modes, and the progressive fracture evaluation are described in the references; here we comment on their significance. The schematic of the rotor and respective dimensions are shown in the upper left of Fig 12. The survival probability is shown in the upper right, while the description of failure modes is shown at the bottom of Fig 12. The survival cumulative probability is plotted versus the remaining resistance to initial resistance ratio for each of the failure modes in the right upper part of Fig 12. Note that fracture at the rim and system failure coincide, indicating that rim fracture is the dominant failure mode for that rotor.
(FIG 12 failure-mode descriptions, in part: burst strength; 10 000 cycles; 10 000 cycles; yield strength.)
It was mentioned previously that an important by-product of probabilistic computational simulation is the prediction of probabilistic sensitivities. The probabilistic sensitivities for the rotor fatigue-life are summarized in Table 1. All of the fundamental physics variables are included, and their numerical effects are normalized such that the sum of their squares is unity. By doing so, the effects can be ordered, and their contributions can be identified by their relative magnitudes. It can be observed in Table 1 that the rotor speed (applied load) has the greatest effect, followed by the rotor density (about half of the speed) and the rotor temperature (about one-fourth of the speed). The empirical constant, C, in the Paris Crack Growth Law is about one-seventh of the speed. The rest have about 10 % effect or less relative to speed, including all the other parameters used traditionally for evaluating fatigue-life: a0, the initial crack length; the exponent in the Paris Law; Kt, the notch sensitivity; and A, used in low cycle fatigue evaluations.
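The normalization described above amounts to dividing the raw sensitivity magnitudes by the square root of their sum of squares; the values below are made up purely to show the bookkeeping.

    import numpy as np

    raw = np.array([0.70, 0.35, 0.18, 0.10, 0.05])   # e.g., speed, density, temperature, ...
    normalized = raw / np.sqrt(np.sum(raw**2))        # sum of squares becomes unity

    print(normalized)
    print(np.sum(normalized**2))                      # 1.0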
Combustor Liner
Typical results obtained are for the CMC combustor liner shown schematically in Fig 13. Probabilistic stresses versus fatigue cycles are shown in Fig 14. Observe that the mean stress (at about 0.5 probability) and the probable scatter decrease as the fatigue cycles increase. Corresponding sensitivity factors are shown in Fig 15. Note that these sensitivities are grouped into location (field point), near field, far field, combustor thickness, and combustor hoop stiffness, since this stiffness contributes minimally to the probability of the stress. It is worth noting that the location has the greatest contribution, followed by near field and far field. In these regions, the sensitivity appears to be independent of the fatigue cycles as well as of the hoop stiffness. However, the thickness sensitivity is dependent on the fatigue cycles and increases as the thickness increases.
FIG 13 Finite element model of engine hot section (ceramic-matrix-composite component).
(FIG 14 and FIG 15 captions, in part: for different load cycles; for different load cycles at 0.01 probability.)
The probabilistic strength versus fatigue cycles is shown in Fig 16. Note that the mean strength (at about 0.5 probability) and the probable scatter decrease as the fatigue cycles increase. Though the decrease in the mean strength is expected, the decrease in the probable scatter is not. The authors have no rational explanation for this behavior at this time. The variations of hoop stress and strength versus fatigue cycles are plotted in Fig 17 for 0.01 and 0.001 probabilities. It can be seen that the 0.01 probability strength is considerably higher than the 0.01 probability stress. However, the 0.001 probability strength curve is progressively lower than the 0.001 probability stress curve except at very low cycles. The results in Fig 17 simply demonstrate the importance of probabilistic evaluations: large margins at some probability may diminish and even reverse at lower probabilities. In other words, probabilistic evaluations reveal pitfalls about structural fatigue-life.
Information from Figs 14, 16, and 17 can be combined to produce a probability of survival master curve versus fatigue cycles. This type of curve is illustrated in Fig 18. It can be observed that this curve is limited to high probability of survival, 0.999 (rounded off). Also, the curve shows three distinct fatigue-life regions: few cycles (less than 25 % of life), with progressive survival degradation; intermediate (25-60 % of life), with negligible survival degradation; and high (greater than 60 % of life), with rapid survival degradation. Another interesting point is that the probabilistic fatigue-life indicates 14 losses in 10 000 for about 95 % of the total fatigue-life.
This sample case illustrates what can be done to estimate probabilistic fatigue-life with limited data, but with relevant knowledge and resources. Relevant knowledge is knowledge about the technical disciplines involved: finite element analysis, composite mechanics, fatigue, damage initiation and growth, and representation of multiple interaction factors on material behavior. Relevant resources include available computer codes (computational simulation and probability evaluation algorithms) used to implement the relevant knowledge, such as those cited.
FIG 16 Probabilistic compressive hoop strength (ksi) for different load cycles (cycles degrade strength and reduce scatter).
FIG 17 Probabilistic evolution of local strength and stress for high cycle fatigue environment.
(FIG 18 probability-of-survival master curve versus fatigue cycles; the ordinate spans roughly 0.9988 to 1.0000.)
Conclusion
Several sample cases of probabilistic computational fatigue were presented and discussed. These sample cases include fatigue-life of composite laminates, hot engine blades, pressurized tanks, a simulated engine rotor, and a ceramic matrix composite combustor liner. Results from these simulations show that fatigue-life can be predicted for just about any situation. Probabilistic sensitivity factors can be used to identify factors that have a significant effect on the probability of fatigue-life. A multi-factor interaction model is available to represent the complex material behavior in a multi-scale, multi-discipline, multi-physics environment. Results also show there is no unique way to perform probabilistic computational simulation of fatigue-life. Fatigue-life cycles can be predicted, and survival curves can be generated, by available methods with an understanding of the complexity involved and with knowledge of available relevant technology. Probabilistic computational simulation has the potential to minimize the present effort required to evaluate fatigue-life experimentally.
References
[1] Chamis, C. C., Murthy, P. L. N., and Minnetyan, L., "Progressive Fracture in Composite Structures," Composite Materials: Fatigue and Fracture, ASTM STP 1285, ASTM International, West Conshohocken, PA, 1997, pp. 70-84.
[2] Chamis, C. C., "Probabilistic Composite Design," Composite Materials: Testing and Design, ASTM STP 1242, ASTM International, West Conshohocken, PA, 1997, pp. 23-42.
[3] Singhal, S. N. and Chamis, C. C., "Multidisciplinary Tailoring of Hot Composite Structures," NASA-TM 106027, 1993.
[4] Liaw, D. G., Shiao, M. C., Singhal, S. N., and Chamis, C. C., "Probabilistic Simulation of Multi-Scale Composite Behavior," NASA-TM 10696, 1992.
[5] Tong, M. T., Singhal, S. N., Chamis, C. C., and Murthy, P. L. N., "Simulation of Fatigue Behavior of High Temperature Metal Matrix Composites," ASTM STP 1253, ASTM International, West Conshohocken, PA, 1996, pp. 540-551.
[6] Singhal, S. N., Murthy, P. L. N., Chamis, C. C., Nagpal, V. R., and Sutjahjo, E., "Computational Simulation of Acoustic Fatigue for Hot Composite Structures," NASA-TM 104379, 1991.
[7] Chamis, C. C., Murthy, P. L. N., Gotsis, P. K., and Mital, S. K., "Telescoping Composite Mechanics for Composite Behavior Simulation," Computer Methods in Applied Mechanics and Engineering, 2000, pp. 399-411.
[8] Murthy, P. L. N., Chamis, C. C., and Singhal, S. N., "Hierarchical Nonlinear Behavior of Hot Composite Structures," NASA-TM 106229, 1993.
[9] Shah, A. R. and Chamis, C. C., "Cyclic Load Frequency Effect on Fatigue Reliability of Polymer Matrix Composites," 37th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference, Part 4, 1996, pp. 2133-2143.
[10] Boyce, L. and Chamis, C. C., "Quantification of Uncertainties in Coupled Material Degradation Processes: High Temperature, Fatigue and Creep," 32nd AIAA/ASME/ASCE/AHS Structures, Structural Dynamics and Materials Conference, Part 1, 1991, pp. 66-72.
[11] Shiao, M. C. and Chamis, C. C., "Probability of Failure and Risk Assessment of Propulsion Structural Components," NASA-TM 102119, 1989.
[12] Millwater, H. R. and Wu, Y. T., "Structural Reliability Analysis Using a Probabilistic Finite Element Program," 30th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference, Part 4, 1989, pp. 1846-1851.
[13] Mahadevan, S. and Chamis, C. C., "Structural System Reliability Under Multiple Failure Modes," 34th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference, 1993, pp. 707-713.
[14] Pai, S. S. and Chamis, C. C., "Quantification of Thermal-Structural Uncertainties in Engine Combustor Composite Liners," ASME Technical Paper No. 97-GT-108, 1997.
Duncan P. Shepherd 1
The Prediction of Fatigue Life Distributions from the Analysis of Plain Specimen Data
ABSTRACT: For any structurally critical component subject to fatigue, the safety of the structure depends on an accurate prediction of the life under this failure mode. However, in such circumstances it is insufficient to consider only the mean behavior of the material. To ensure structural integrity, a model for the distribution of life to failure is required, which will allow lives to be assessed relative to acceptable safety levels.

In previous work, a methodology for deriving fatigue life estimates for arbitrary specimen and component geometries from plain specimen data has been developed [4]. The methodology is based on a procedure for developing a model for the initiation behavior of the material from the specimen data and for applying this to an arbitrary material geometry or stress field. In the current paper, this method is further developed to allow the associated distribution of fatigue lives to be calculated. This involves direct consideration of the statistical relationship between crack initiation and crack propagation, so that the distribution of initiation lives can be derived accurately. However, incorporating these considerations directly into the methodology reveals some inconsistencies in the formulation of the original model. These relate to the fact that, at high stresses, the specimens fail in tension rather than by classical fracture, thus altering the interpretation of the data. It is shown that a more robust model can be developed, but only by including the distribution of tensile strength as an additional variable and by considering the statistical relationship between this and the other fundamental variables.

The methodology which arises from the incorporation of these considerations into the basic calculation scheme is then developed, including a means for estimating the distribution of life to failure at all points on the stress against cyclic life curve.
KEYWORDS: low cycle fatigue, statistics, probability distribution, 3-parameter Weibull, S-N curve
Introduction
Within an aero gas turbine engine, a component is designated as "fracture critical" if the fatigue failure of the component could result in the loss of the entire aircraft. Airworthiness regulations require that in-service lives for these components be managed such that the probability of failure within the allowed lifetime is kept below an acceptable threshold (often characterized as "extremely remote"). The derivation of this service lifetime is a highly complex task, involving a detailed understanding of the material behavior under all the relevant conditions likely to be experienced within the engine.
Manuscript received 11 December 2002; accepted for publication 24 February 2004; published September 2004. Presented at the ASTM Symposium on Probabilistic Aspects of Life Prediction on 6 November 2002 in Miami Beach, FL; W. S. Johnson and B. M. Hillberry, Guest Editors.
1 Senior Mathematician, Airworthiness and Structural Integrity Group, QinetiQ Ltd, Cody Technology Park, Ively Road, Farnborough, Hants, GU14 0LX, UK.
Historically, the regulations governing the life management of aeroengines within the UK have required that such service lifetimes be derived exclusively from full-scale component rig tests. The success of this method in providing very high levels of safety can be gauged from the extremely rare occurrence of such failures in both civil and military aeroengine operation. However, due to the very high cost of performing the required rig tests, there is increasing pressure to broaden the basis on which component lives are derived. While this has occurred with the introduction of databank lifing for civil engines, there is considerable interest from within the industry in expanding this further. The aim is to exploit the very great advances in materials modeling and behavioral understanding which have occurred since the original regulations were derived, to ensure that optimum component lives are derived in the most efficient manner possible.
The work reported here is the continuation of a program of research conducted in collaboration with Rolls-Royce, developing a methodology for deriving fracture critical component lives from plain specimen data (plain cylindrical specimens). The proposed methodology, together with a series of initial results describing the derivation of mean life predictions for a number of featured specimens and components, has been described previously [1-4]. In this paper, the question of how these methods should be extended to provide a prediction of the distribution of fatigue life is addressed. In particular, issues relating to the interpretation of the S-N curve are discussed, and methods for describing (in probabilistic terms) the behavior close to the ultimate tensile strength are developed. In the second section, "Integrated Lifing Methodology," the basic framework of the Integrated Lifing Methodology (ILM) is described, and a summary of the results obtained to date is given. In the third section, "Extension to Fatigue Life Distribution Prediction," the problems to be addressed in extending this framework to the prediction of fatigue life distributions are described, and certain difficulties with the original model are highlighted. In the fourth section, "Redefinition of the Model," these problems are addressed, and it is demonstrated that any probabilistic model of the S-N curve which seeks to describe the behavior close to the tensile strength of the material must include the distribution of tensile strengths. In the fifth section, "Description of the Fatigue Life Distribution," a possible method for modeling the necessary distributions is described, which is followed by the conclusions.
Integrated Lifing Methodology
The development of the ILM is an ongoing research activity, the aim of which is to bring together the most recent advances in material modelling and finite element stress analysis to provide an accurate life estimation tool. This tool must provide an accurate fatigue life distribution capability in the most efficient manner possible, yet it must still be flexible enough to be applied to an arbitrary component geometry or stress field. Details of the methodology, including the validation of the technique using the Rolls-Royce database on Waspaloy, have been given in a number of previous publications [1-4]. However, to make the current paper as self-contained as possible, a brief summary of the main features will be provided here, together with some details of how these methods have been implemented. From this, the particular aspects of the methodology to be considered further in this paper will be highlighted.
A schematic outline of the ILM process is shown in Fig. 1 and is divided into a number of distinct stages. The starting point for the process is a database of fatigue test results in the material to be studied, which must include sufficient plain specimen results to characterize
adequately the material under all conditions of interest. However, the inclusion of featured specimen and component tests within the database offers a considerable advantage, in that these results can be used to validate the model that is developed. The Waspaloy database used in the validation contains over 1500 round bar plain specimen results, 33 notched specimen tests (Kt = 1.66 and 2.29), 10 component rig tests, and 29 washer specimen results.
FIG. 1 Schematic outline of the ILM process: size effect model integration; calculation of the crack initiation life for an arbitrary specimen or component.
The first stage of the analysis consists of a finite element stress analysis of all the specimen and component geometries in the database. This includes not only the specimen/component geometries for which it is intended to develop fatigue life predictions, but also any results which will be used in the validation process. A fundamental feature of the ILM is that the finite element analysis itself should incorporate the most advanced methods available, and a nonlinear
elastic/plastic/creep analysis has been conducted for each set of conditions under which predictions are required. The plasticity element of the model utilizes the Mroz multilayer hardening rule [5], which extends the Prager linear hardening rule by the superposition of several yield surfaces of different sizes, each exhibiting linear behavior with a different gradient. To model the shakedown behavior accurately, a technique has been developed whereby the material is allowed to relax over several cycles [6]. The model shakedown is controlled as a linear function of the total plastic strain and is stabilized according to certain preset limits. The model is combined with standard creep algorithms, assuming that the two phenomena are uncoupled. As a part of this process, any test results within the plain specimen dataset which are suspected of being due to creep (or for which the observed lifetime is a combined fatigue and creep failure) are removed. This is to ensure that the fatigue life prediction model is not biased by the inclusion of non-fatigue-related results.
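The multilayer idea is easier to see in one dimension. The sketch below is a simplified overlay (Besseling-type) analogue of a multilayer hardening rule, not the Mroz implementation used in the ILM: several elastic/perfectly-plastic layers share the strain, their weighted stresses sum to the total, and the result is the multilinear kinematic-hardening response described above. The modulus, layer yield stresses, weights, and strain path are all assumed values.

```python
# Simplified 1-D overlay analogue of a multilayer hardening rule (illustrative values only).
class OverlayModel:
    """Weighted sum of elastic/perfectly-plastic layers -> multilinear kinematic hardening."""

    def __init__(self, modulus: float, yields: list, weights: list):
        assert abs(sum(weights) - 1.0) < 1e-9, "layer weights must sum to 1"
        self.modulus = modulus
        self.yields = list(yields)            # layer yield stresses, MPa
        self.weights = list(weights)          # layer volume fractions
        self.stresses = [0.0] * len(yields)   # per-layer stress state

    def step(self, d_strain: float) -> float:
        """Apply a strain increment to every layer and return the total stress."""
        for i, sy in enumerate(self.yields):
            trial = self.stresses[i] + self.modulus * d_strain
            self.stresses[i] = max(-sy, min(sy, trial))   # clamp at the layer yield surface
        return sum(w * s for w, s in zip(self.weights, self.stresses))

model = OverlayModel(modulus=200e3, yields=[400.0, 700.0, 1000.0], weights=[0.5, 0.3, 0.2])

# Drive one load / unload / reload excursion and print the piecewise-linear response.
strain, path = 0.0, [0.0001] * 40 + [-0.0001] * 80 + [0.0001] * 80
for k, d_eps in enumerate(path):
    strain += d_eps
    stress = model.step(d_eps)
    if k % 20 == 19:
        print(f"strain = {strain:+.4f}   stress = {stress:+7.1f} MPa")
```

Reversing the strain makes each layer unload elastically and re-yield in the opposite direction, which produces the kinematic (Bauschinger-type) hysteresis that a multilayer rule is intended to capture.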
The second stage of the process involves the development of a model for the mean 2 fatigue crack initiation behavior for the material from the plain specimen results alone. This is obtained by fitting a 3-parameter Weibull regression model to the observed lives to failure, for which the Weibull modulus is fixed, and the characteristic life and threshold parameter lines are parallel. The calculated mean crack propagation lifetime for the given test conditions is then subtracted from the characteristic life to give the initiation characteristic life. The crack size which defines initiation, and hence propagation, is treated as an additional calibration parameter within the fitting process and assumes a value of 0.35 mm for the Waspaloy data. The distribution of initiation lives is obtained by using the Weibull modulus obtained from the total life data, together with a threshold parameter obtained by translating the characteristic life line the same distance as for the total life data. Within the current implementation, fatigue life is characterized in terms of the Walker strain parameter
$$\varepsilon_{w} = \frac{\sigma_{\max}}{E}\left[\frac{E\,\Delta\varepsilon}{\sigma_{\max}}\right]^{m}$$
where E is the material modulus, Δε denotes the strain range, σ_max is the maximum stress in the cycle, and m is an empirical factor to account for the effects of mean stress [7].
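Assuming the reconstruction of the expression above is the intended form, the parameter is a one-line computation; the numerical inputs in the example are illustrative, not Waspaloy values.

```python
# Walker strain parameter as reconstructed above (illustrative inputs only).
def walker_strain(sigma_max: float, strain_range: float, modulus: float, m: float) -> float:
    """(sigma_max / E) * (E * delta_eps / sigma_max) ** m, collapsing to delta_eps when m = 1."""
    return (sigma_max / modulus) * (modulus * strain_range / sigma_max) ** m

# Example: sigma_max = 900 MPa, strain range = 0.8 %, E = 200 GPa, m = 0.6 (assumed values).
print(f"{walker_strain(900.0, 0.008, 200e3, 0.6):.4%}")
```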
The crack propagation lives were calculated using standard linear elastic fracture mechanics principles with a linear Paris relation, and failure is predicted by the fracture toughness. Handbook stress intensity factors for elliptical cracks in round bars were used [8]. The resulting crack initiation model for the Waspaloy results is shown in Fig. 2 (the meaning of the vertical line is explained on page 9).
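A compact sketch of the bookkeeping just described is given below. It replaces the handbook elliptical-crack solutions [8] with a constant geometry factor, and every number (Paris constants, geometry factor, fracture toughness, stress range, and the 3-parameter Weibull fit) is an assumed placeholder rather than a Waspaloy calibration; only the structure of the calculation follows the text: integrate the Paris law from the 0.35 mm initiation size to the critical size, then translate the total-life Weibull fit by that propagation life to obtain the initiation-life distribution.

```python
# Sketch only: constant-Y Paris-law propagation life plus the Weibull translation
# described in the text.  All numerical values are assumed placeholders.
import math

def propagation_life(d_sigma: float, a_init: float = 0.35e-3, geom_y: float = 0.7,
                     paris_c: float = 1e-10, paris_n: float = 3.0, k_ic: float = 80.0) -> float:
    """Cycles to grow a crack from a_init (m) until K reaches the fracture toughness.

    d_sigma in MPa, K in MPa*sqrt(m); geom_y is a constant geometry factor standing in
    for the handbook solutions for elliptical cracks in round bars.
    """
    a, cycles, da = a_init, 0.0, 1e-5                     # march the crack in 0.01 mm steps
    while geom_y * d_sigma * math.sqrt(math.pi * a) < k_ic:
        dk = geom_y * d_sigma * math.sqrt(math.pi * a)    # stress intensity range
        cycles += da / (paris_c * dk ** paris_n)          # cycles consumed growing by da
        a += da
    return cycles

def weibull_cdf(n: float, modulus: float, char_life: float, threshold: float) -> float:
    """3-parameter Weibull CDF with the characteristic life quoted as an absolute life."""
    if n <= threshold:
        return 0.0
    return 1.0 - math.exp(-((n - threshold) / (char_life - threshold)) ** modulus)

# Assumed 3-parameter Weibull fit to TOTAL life at one test condition.
beta, eta_total, gamma_total = 2.0, 30_000.0, 8_000.0

n_prop = propagation_life(d_sigma=600.0)
eta_init = eta_total - n_prop          # initiation characteristic life
gamma_init = gamma_total - n_prop      # threshold translated by the same distance

print(f"propagation life (assumed inputs) = {n_prop:8.0f} cycles")
print(f"P(total life < 20 000)      = {weibull_cdf(20_000, beta, eta_total, gamma_total):.3f}")
print(f"P(initiation life < 20 000) = {weibull_cdf(20_000, beta, eta_init, gamma_init):.3f}")
```

In the methodology itself the subtraction is applied to the fitted characteristic-life line across the stress range, with the Weibull modulus carried over unchanged from the total-life fit, as stated above; the sketch only illustrates that arithmetic at a single test condition.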
2 The word "mean" is used here to denote an appropriate measure of location, rather than the actual statistical mean. Because the total life is modeled using a Weibull distribution, it is much more convenient to use the characteristic life rather than the mean itself.