
Manufacturing Handbook of Best Practices, Part 3



DOCUMENT INFORMATION

Title: Design of Experiments
Author: Jack B. ReVelle, Ph.D.
Publisher: CRC Press
Subject: Manufacturing
Type: Book
Year: 2001
City: Boca Raton
Pages: 20
Size: 1.89 MB


Jack B. ReVelle, Ph.D.

3.1 OVERVIEW

Design of experiments (DOE) does not sound like a production tool. Most people who are not familiar with the subject might think that DOE sounds more like something from research and development. The fact is that DOE is at the very heart of a process improvement flow that will help a manufacturing manager obtain what he or she most wants in production: a smooth and efficient operation. DOE can appear complicated at first, but many researchers, writers, and software engineers have turned this concept into a useful tool for application in every manufacturing operation. Don't let the concept of an experiment turn you away from the application of this most useful tool. DOEs can be structured to obtain useful information in the most efficient way possible.

3.2 BACKGROUND

DOEs grew out of the need to plan efficient experiments in agriculture in England during the early part of the 20th century. Agriculture poses unique problems for experimentation. The farmer has little control over the quality of soil and no control whatsoever over the weather. This means that a promising new hybrid seed in a field with poor soil could show a reduced yield when compared with a less effective hybrid planted in a better soil. Alternatively, weather or soil could cause a new seed to appear better, prompting a costly change for farmers when the results actually stemmed from more favorable growing conditions during the experiment. Although these considerations are more exaggerated for farmers, the same factors affect manufacturing. We strive to make our operations consistent, but there are slight differences from machine to machine, operator to operator, shift to shift, supplier to supplier, lot to lot, and plant to plant. These differences can affect results during experimentation with the introduction of a new material or even a small change in a process, thus leading to incorrect conclusions.

In addition, the long lead time necessary to obtain results in agriculture (the growing season) and to repeat an experiment if necessary requires that experiments be efficient and well planned. After the experiment starts, it is too late to include another factor; it must wait until the next season. This same discipline is useful in manufacturing. We want an experiment to give us the most useful information in the shortest time so our resources (personnel and equipment) can return to production. One of the early pioneers in this field was Sir Ronald Fisher. He determined the initial methodology for separating the experimental variance between the factors and the underlying process, and began his experimentation in biology and agriculture.

SL3003Ch03Frame Page 49 Tuesday, November 6, 2001 6:11 PM


50 The Manufacturing Handbook of Best Practices

The method he proposed we know today as ANalysis Of VAriance (ANOVA). There is more discussion on ANOVA later in this chapter. Other important researchers have been Box, Hunter, and Behnken. Each contributed to what are now known as classical DOE methods. Dr. Genichi Taguchi developed methods for experimentation that were adopted by many engineers. These methods and other related tools are now known as robust design, robust engineering, and Taguchi Methods™.

3.3 GLOSSARY OF TERMS AND ACRONYMS

TABLE 3.1 Glossary of Terms and Acronyms

Confounding: When a design is used that does not explore all the factor level setting combinations, some interactions may be mixed with each other or with experimental factors such that the analysis cannot tell which factor contributes to or influences the magnitude of the response effect. When responses from interactions or factors are mixed, they are said to be confounded.

DOE: Design of experiments, also known as industrial experiments, experimental design, and design of industrial experiments.

Factor: A process setting or input to a process. For example, the temperature setting of an oven is a factor, as is the type of raw material used.

Factor level settings: The combinations of factors and their settings for one or more runs of the experiment. For example, consider an experiment with three factors, each with two levels (H and L = high and low). The possible factor level settings are H-H-H, H-L-L, etc.

Factor space: The hypothetical space determined by the extremes of all the factors considered in the experiment. If there are k factors in the experiment, the factor space is k-dimensional.

Interaction: Factors are said to have an interaction when changes in one factor cause an increased or reduced response to changes in another factor or factors.

Randomization: After an experiment is planned, the order of the runs is randomized. This reduces the effect of uncontrolled changes in the environment such as tool wear, chemical depletion, warm-up, etc.

Replication: When each factor level setting combination is run more than one time, the experiment is replicated. Each run beyond the first one for a factor level setting combination is a replicate.

Response: The result to be measured and improved by the experiment. In most experiments there is one response, but it is certainly possible to be concerned about more than one response.
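The factor level setting combinations described in the glossary can be enumerated mechanically. A minimal sketch (the factor names and levels are illustrative, not from the chapter):

```python
from itertools import product

def factor_level_settings(levels_per_factor):
    """Enumerate every factor level setting combination (a full factorial)."""
    return list(product(*levels_per_factor))

# Three factors, each at two levels (H = high, L = low), as in the glossary example.
settings = factor_level_settings([["H", "L"], ["H", "L"], ["H", "L"]])
print(len(settings))   # 8 combinations: H-H-H, H-H-L, ..., L-L-L
print(settings[0])     # ('H', 'H', 'H')
```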


Design of Experiments 51

3.4 THEORY

This section approaches theory in two parts. The first part is a verbal, nontechnical discussion. The second part covers a more technical, algebraic presentation that may be skipped if the reader desires.

Here is the question facing a manager considering an experiment for a manufacturing line: What are my optimal process factors for the most efficient operation possible? There may be many factors to be considered in the typical process. One approach may be to choose a factor and change it to observe the result. Another approach might change two or three factors at the same time. It is possible that an experimenter will be lucky with either of these approaches and find an improvement. It is also possible that the real improvement is not discovered, is masked by other changes, or that a cheaper alternative is not discovered. In a true DOE, the most critical two, three, or four factors (although more factors are certainly possible, most experiments are in this range) are identified and an experiment is designed to modify these factors in a planned, systematic way. The result can be not only knowledge about how the factors affect the process, but also how the factors interact with each other.

The following is a simple, more technical way of looking at the theory algebraically. Let's consider the situation of a process with three factors: A, B, and C. For now we'll ignore interactions. The response of the system in algebraic form is given by

Y = β0 + β1XA + β2XB + β3XC    (3.1)

where β0 is the intercept; β1, β2, and β3 are the coefficients for the factor levels represented by XA, XB, and XC; and ε represents the inherent process variability. Setting aside ε for a while, we remember from basic algebra that we need four distinct experimental runs to obtain an estimate for β0, β1, β2, and β3 (note that ε and β0 are both constants and cannot be separated in this example). This is based on the need for at least four different equations to solve for four unknowns. The algebraic explanation in the previous paragraph is close to the underlying principles of experimentation but, like many explanations constructed for simplicity, it is incomplete. The point is that we need at least four pieces of information (four equations) to solve for four unknowns. However, an experiment is constructed to provide sufficient information to solve for the unknowns and to help the experimenter determine if the results are statistically significant. In most cases this requires that an experiment consist of more runs than would be required from the algebraic perspective.

TABLE 3.1 (continued) Glossary of Terms and Acronyms

Statistically significant: A factor or interaction is said to be statistically significant if its contribution to the variance of the experiment appears to be larger than would be expected from the normal variance of the process.
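The four-runs-for-four-unknowns argument can be checked numerically. In this sketch the coded run matrix and the coefficient values are invented for illustration; the responses are generated without noise so the exactly determined system recovers the coefficients:

```python
import numpy as np

# Four runs of a three-factor experiment, coded -1/+1.
# Columns: intercept, XA, XB, XC (Equation 3.1 without the noise term).
X = np.array([
    [1, -1, -1, -1],
    [1,  1, -1,  1],
    [1, -1,  1,  1],
    [1,  1,  1, -1],
], dtype=float)

# Invented true coefficients: beta0=10, beta1=2, beta2=-1, beta3=0.5.
y = X @ np.array([10.0, 2.0, -1.0, 0.5])

# Exactly determined: four equations, four unknowns.
beta = np.linalg.solve(X, y)
print(beta)   # recovers [10.  2. -1.  0.5]
```

With real (noisy) responses, more than four runs would be needed to also judge statistical significance, which is exactly the chapter's point.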


3.5 EXAMPLE APPLICATIONS AND PRACTICAL TIPS

3.5.1 Using Structured DOEs to Optimize Process-Setting Targets

The most useful application for DOEs is to optimize a process. This is achieved by determining which factors in a process may have the greatest effect on the response. The target factors are placed in a DOE so the factors are adjusted in a planned way, and the output is analyzed with respect to the factor level setting combination.

An example that the author was involved in dealt with a UV-curing process for a medical product. This process used intense ultraviolet (UV) light to cure an adhesive applied to two plastic components. The process flow was for an operator to assemble the parts, apply the adhesive, and place the assembly on a conveyor belt that passed the assembly under a bank of UV lights. The responses of concern were the degree of cure as well as bond strength. An additional response involved the color of the assembly, since the UV light had a tendency to change the color of some components if the light was too intense. The team involved with developing this process determined that the critical factors were most likely conveyor speed, strength of the UV source (the bulb output diminishes over time), and the height of the UV source. Additionally, some thought that placement of the assembly on the belt (orientation with respect to the UV source bulbs) could have an effect, so this factor was added.

An experiment was planned and the results analyzed for this UV-curing process. The team learned that the orientation of the assemblies on the belt was significant and that one particular orientation led to a more consistent adhesive cure. This type of finding is especially important in manufacturing because there is essentially no additional cost to this benefit. Occasionally, an experiment result indicates that the desired process improvement can be achieved, but only at a cost that must be balanced against the gain from improvement. The team acquired additional information: the assembly color was affected least when the UV source was farther from the assemblies (not surprising), and sufficient cure and bond strength were attainable when the assemblies were either quickly passed close to the source or dwelt longer at a greater distance from it. What surprised the team was the penalty they would pay for process speed. When the assembly was passed close to the light, they could speed the conveyor up and obtain sufficient cure, but there were always a small number of discolored assemblies. In addition, the shorter time made the process more sensitive to degradation of the UV light, requiring more preventive maintenance to change the source bulbs. The team chose to set the process up with a slower conveyor speed and the light source farther from the belt. This created an optimal balance between assembly throughput, reduction in defective assemblies, and preventive line maintenance.

Another DOE with which the author was involved was aimed at improving a laser welding process. This was an aerospace application wherein a laser welder was used to assemble a microwave waveguide and antenna assembly. The process was plagued with a significant amount of rework, ranging from 20 to 50% of the assemblies. The reworked assemblies required hand filing of nubs created on the back of the assembly if the weld beam had burned through the parts. The welder had gone through numerous adjustments and refurbishment over the years. Support engineering believed that the variation they were experiencing was due to attempted piecemeal improvements, and that they must develop an optimum setting that would still probably result in rework, but with steady performance. The experiment was conducted using focus depth, power level, and laser pulse width (the laser was not continuous; rather, it fired at a given power level for a controlled time period, or pulse). The team found that the power level and pulse width ranges they had been using over the years had an essentially negligible impact on the weld. The key parameter was the beam focus depth. What's more, upon further investigation, the team found that the method of setting the focus depth was imprecise and thus dependent on operator experience and visual acuity. To fix this process, the team had a small tool fabricated and installed in the process to help the operator consistently set the proper laser beam focus. This resulted in a reduction of rework to nearly zero!

3.5.2 Using Structured DOEs to Establish Process Limits

Manufacturers know it is difficult to maintain a process when the factor settings are not permitted any variation and the limits on the settings are quite small. Such a process, often called a "point" process, may be indicative of high sensitivity to input parameters. Alternatively, it may indicate a lack of knowledge of the effect of process settings and a desire to control the process tightly just in case.

To determine allowable process settings for key parameters, place these factors in a DOE and monitor the key process outputs. If the process outputs remain in specification, and especially if the process outputs exhibit significant margin within the factor space, the process settings are certainly acceptable for manufacturing. To determine the output margin, an experimenter can run sufficient experimental replicates to assess process capability (Cpk) or process performance (Ppk). If the output is not acceptable in parts of the factor space, the experimenter can determine which portion of the factor space would yield acceptable results.
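The Cpk assessment mentioned above can be sketched from replicate data using the common definition Cpk = min(USL − mean, mean − LSL) / (3 · sigma). The specification limits and readings below are invented for illustration:

```python
import statistics

def cpk(samples, lsl, usl):
    """Estimate process capability from replicate measurements."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Invented: ten replicate readings of a response against limits 8.0 to 12.0.
readings = [10.1, 9.9, 10.3, 10.0, 9.8, 10.2, 10.1, 9.9, 10.0, 10.1]
print(round(cpk(readings, lsl=8.0, usl=12.0), 2))
```

A Cpk well above 1.33 would suggest the settings used for these replicates leave comfortable margin within the factor space.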

3.5.3 Using Structured DOEs to Guide New Design Features and Tolerances

As stated previously, DOE is often used in development work to assess the differences between two potential designs, materials, etc. This sounds like development work only, not manufacturing. Properly done, DOE can serve both purposes.

Planning for a DOE is not particularly challenging, but there are some approaches that help to avoid pitfalls. The first and most important concept is to include many process stakeholders in the planning effort. Ideally, the planning group should include at least one representative each from design, production technical support, and production operators. It is not necessary to assemble a big group, but these functions should all be represented.


The rationale for their inclusion is to obtain their input in both the planning and the execution of the experiment. As you can imagine, experiments are not done every day, and communication is necessary to understand the objective, the plan, and the order of execution.

When the planning team is assembled, start by brainstorming the factors that may be included in the experiment. These may be tabulated (listed) and then prioritized. One tool that is frequently used for brainstorming factors is a cause-and-effect diagram, also known as a fishbone or Ishikawa diagram. This tool helps prompt the planning team on some elements to be considered as experimental factors. Newcomers to DOE may be overly enthusiastic and want to include too many factors in the experiment. Although it is desirable to include as many factors as are considered significant, it must be remembered that each factor brings a cost. For example, consider an experiment with five factors, each at two levels. When all possible combinations are included in the experiment (this is called a full factorial design), the experiment will take 2^5 = 32 runs to complete each factor level setting combination just once! As will be discussed later, replicating an experiment at least once is very desirable. For this experiment, one replication will take 64 runs. In general, if an experiment has k factors at two levels, l factors at three levels, and m factors at four levels, the number of runs to complete every experimental factor level setting is given by 2^k · 3^l · 4^m. As you can see, the size of the experiment can grow quickly. It is important to prioritize the possible factors for the experiment and include what are thought to be the most significant ones with respect to the time and material that can be devoted to the DOE on the given process.
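The run-count formula above is easy to encode; a small sketch (the function name is mine, not the chapter's):

```python
def full_factorial_runs(k2=0, k3=0, k4=0, replicates=0):
    """Runs needed for a full factorial: 2^k2 * 3^k3 * 4^k4 combinations,
    each run (1 + replicates) times."""
    return (2 ** k2) * (3 ** k3) * (4 ** k4) * (1 + replicates)

print(full_factorial_runs(k2=5))                # 32: five two-level factors, once each
print(full_factorial_runs(k2=5, replicates=1))  # 64: the same design replicated once
```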

If it is desirable to experiment with a large number of factors, there are ways to reduce the size of the experiment. Some methods involve reducing the number of levels for the factors. It is not usually necessary to run factors at levels higher than three, and often three levels are unnecessary. In most cases, responses are linear over the range of experimental values and two levels are sufficient. As a rule of thumb, it is not necessary to experiment with factors at more than two levels unless the factors are qualitative (material types, suppliers, etc.) or the response is expected to be nonlinear (quadratic, exponential, or logarithmic) due to known physical phenomena.

Another method to reduce the size of the experiment is somewhat beyond the scope of this chapter, but is discussed in sufficient detail to provide some additional guidance. A full factorial design is generally desirable because it allows the experimenter to assess not only the significance of each factor, but all the interactions between the factors. For example, given factors T (temperature), P (pressure), and M (material) in an experiment, a full factorial design can detect the significance of T, P, and M as well as interactions TP, TM, PM, and TPM. There is a class of experiments wherein the experimenter deliberately reduces the size of the experiment, and gives up some of the resulting potential information, by a strategic reduction in factor level setting combinations. This class is generally called "fractional factorial" experiments because the result is a fraction of the full factorial design. For example, a half-fraction experiment would consist of 2^(n-1) factor level setting combinations. Many fractional factorial designs have been developed such that the design gives up information on some or all of the potential interactions (the formal term for this


loss of information is confounding: the interaction is not lost, it is confounded or mixed with another interaction's or factor's result). To use one of these designs, the experimenter should consult one or more of the reference books listed at the end of this chapter or employ one of the enumerated software applications. These will have guidance tables or selection options to guide you to a design. In general, employ designs that confound higher-level interactions (three-way, four-way, etc.). Avoid designs that confound individual factors with each other or with two-way interactions (AB, AC, etc.) and, if possible, use a design that preserves two-way interactions. Most experimental practitioners will tell you that three-way or higher interactions are not detected often and are not usually of engineering significance even if noted.

The next part of planning the experiment is to determine the factor levels. Factor levels fall into two general categories. Some factors are quantitative and cover a range of possible settings; temperature is one example. Often these factors are continuous. A subset of this type of factor is one with an ordered set of levels; an example is high-medium-low fan settings. Other experimental factors are known as attribute or qualitative factors. These include material types, suppliers, operators, etc. The distinction between these two types of factors really drives the experimental analysis and sometimes the experimental planning. For example, while experimenting with the temperatures 100, 125, and 150°C, a regression could be performed, and it could identify the optimum temperature as something between the three experimental settings, say 133°C. While experimenting with three materials, A, B, and C, one does not often have the option of selecting a material partway between A and B if such a material is not on the market!
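The half-fraction idea discussed above can be sketched by generating only the runs that satisfy a defining relation. This toy version keeps the runs whose coded levels multiply to +1 (i.e., it aliases the highest-order interaction); it is illustrative only, and real designs should come from the referenced texts or software:

```python
from itertools import product

def half_fraction(n_factors):
    """2^(n-1) fractional factorial: keep runs whose coded levels multiply to +1."""
    keep = []
    for run in product([-1, 1], repeat=n_factors):
        sign = 1
        for level in run:
            sign *= level
        if sign == 1:  # defining relation: product of all factor levels = +1
            keep.append(run)
    return keep

runs = half_fraction(3)
print(len(runs))  # 4 of the 8 full-factorial runs
```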

Continuing our discussion of factor levels, the attribute factors are generally given. Quantitative factors pose the problem of selecting the levels for the experiment. Generally, the levels should be set wide enough apart to allow identification of differences, but not so wide as to ruin the experiment or cause misleading settings. Consider curing a material at ~100°C. If your oven maintains temperature to ±5°C, then an experiment at 95, 100, and 105°C may be a waste of time. At the same time, an experiment at 50, 100, and 150°C may be so broad that the lower-temperature material doesn't cure and the higher-temperature material burns. Experimental levels of 90, 100, and 110°C are likely to be more appropriate.

After the experiment is planned, it is important to randomize the order of the runs. Randomization is the key to preventing some environmental factor that changes over time from confounding with an experimental factor. For example, let's suppose you are experimenting with reducing chatter on a milling machine, varying cutting speed and material from two suppliers, A and B. If you run all of A's samples first, would you expect tool wear to affect the output when B is run? Using randomization, the order would be mixed so that each material sample has an equal probability of the application of either a fresh or a dulled cutting edge. Randomization can be accomplished by sorting on random numbers added to the rows in a spreadsheet. Another method is to add telephone numbers taken sequentially from the phone book to each run and sort the runs by these numbers. You can also draw the numbers from a hat or use any other method that removes human bias.
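The sort-on-random-numbers trick above is equivalent to a simple shuffle of the planned run list. A sketch (run labels are invented; a seed is used only to make the order reproducible for the record):

```python
import random

def randomize_runs(runs, seed=None):
    """Return the planned runs in a random execution order."""
    rng = random.Random(seed)
    order = list(runs)  # leave the plan itself untouched
    rng.shuffle(order)
    return order

planned = ["A-slow", "A-fast", "B-slow", "B-fast"]
print(randomize_runs(planned, seed=42))
```

Recording the seed in the lab notebook lets anyone reconstruct the exact run order later.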


When you conduct an experiment that includes replicates, you may be tempted to randomize the factor level setting combinations and run the replicates back-to-back while at the combination setting. This is less desirable than full randomization for the reasons given previously. Sometimes an experiment is difficult to fully randomize due to the nature of the experimental elements. For example, an experiment on a heat-treat oven or a furnace for ceramics may be difficult to fully randomize because of the time involved with changing the oven temperature. In this case, one can relax the randomization somewhat and randomize the factor level combinations while allowing the replicates at each factor level setting combination to go back-to-back. Randomization can also be achieved by randomizing how material is assigned to the individual runs.
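The relaxed scheme described above (randomize the combinations, keep replicates together) can be sketched as follows; the combinations and counts are invented for illustration:

```python
import random

def randomize_with_grouped_replicates(combinations, replicates, seed=None):
    """Randomize factor level setting combinations, running replicates back-to-back."""
    rng = random.Random(seed)
    order = list(combinations)
    rng.shuffle(order)
    runs = []
    for combo in order:
        runs.extend([combo] * (1 + replicates))  # first run plus its replicates
    return runs

combos = [("low", "A"), ("low", "B"), ("high", "A"), ("high", "B")]
runs = randomize_with_grouped_replicates(combos, replicates=1, seed=7)
print(len(runs))  # 8 runs: 4 combinations x (1 + 1 replicate)
```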

3.5.5 Executing the DOE Efficiently

The experimenter will find it important to bring all the personnel who may handle experimental material into the planning at some point for training. Every experimenter has had one or more experiments ruined by someone who didn't understand the objective or significance of the experimental steps. Errors of this sort include mixing the material (not maintaining traceability to the experimental runs), running all the material at the same setting (not changing process settings according to plan), and other instances of Murphy's Law that may enter the experiment. It is also advisable to train everyone involved with the experiment to write down times, settings, and variances that may be observed. The latter might include maintenance performed on a process during the experiment, erratic gauge readings, shift changes, power losses, etc. The astute experimenter must also recognize that when an operator makes errors, you can't berate the operator and expect cooperation on the next trial of the experiment. Everyone involved will know what happened, and the next time there is a problem with your experiment, you'll be the last to know exactly what went wrong!

3.5.6 Interpreting the DOE Results

In the year 2000, DOEs were most often analyzed using a statistical software package that provided analysis capabilities such as ANalysis Of VAriance (ANOVA) and regression. ANOVA is a statistical analysis technique that decomposes the variation of experimental results into the variance from the experimental factors (and their interactions, if the experiment supported such analysis) and the underlying variation of the process. Using statistical tests, ANOVA designates which factors (and interactions) are statistically significant and which are not. In this context, if a factor is statistically significant, it means that the observed data are not likely to result normally from the process. Stated another way, the factor had a discernible effect on the process. If a factor or interaction is not determined to be statistically significant, its effect is not discernible from the background process variation under the experimental conditions. The way most statistical software packages implementing ANOVA identify significance is by estimating a p-value for factors and interactions.

A p-value indicates the probability that the resulting variance from the given factor or interaction would normally occur, given the underlying process. When the p-value is low, the variance shown by the factor or interaction is less likely to have occurred normally. Generally, experimenters use a p-value of 0.05 as a cut-off point. When a p-value is less than 0.05, that factor or interaction is said to be statistically significant.
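ANOVA's variance decomposition can be sketched by hand for one factor at two levels: split the total variation into a between-level part and a within-level part and form their ratio, the F statistic. The response data here are invented; a real package would convert F to a p-value using the F distribution:

```python
import statistics

def one_way_anova_F(groups):
    """Decompose variation into between-group and within-group parts; return F."""
    all_data = [x for g in groups for x in g]
    grand_mean = statistics.mean(all_data)
    k = len(groups)
    n = len(all_data)
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)   # variance attributed to the factor
    ms_within = ss_within / (n - k)     # underlying process variance
    return ms_between / ms_within

# Invented responses at the low and high levels of one factor.
low = [9.8, 10.1, 9.9, 10.0]
high = [11.0, 11.3, 10.9, 11.2]
print(round(one_way_anova_F([low, high]), 1))
```

A large F (here far above typical critical values) corresponds to a small p-value, i.e., a statistically significant factor.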

Regression is an experimental technique that attempts to fit an equation to the data. For example, if the experiment involves two factors, A and B, the experimenter would be interested in fitting the following equation:

Y = β0 + βAXA + βBXB + βABXAXB    (3.2)

Regression software packages develop estimates for the constant (β0) as well as the coefficients (βA, βB, and βAB) of the variable terms. If there are sufficient experimental runs, regression packages also provide an estimate for the process standard deviation (ε). As with ANOVA, regression identifies which factors and interactions are significant; the way regression packages do this is to identify a p-value for each coefficient. As with ANOVA, experimenters generally use a p-value of 0.05 as a cut-off point. Any coefficient p-value that is less than 0.05 indicates that the corresponding factor or interaction is statistically significant.
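Fitting Equation 3.2 is ordinary least squares on a design matrix whose columns are the intercept, the two coded factors, and their product. The coded levels and responses below are invented for a replicated 2^2 experiment:

```python
import numpy as np

# Coded factor levels (-1/+1) for a 2^2 experiment run twice (invented responses).
A = np.array([-1, 1, -1, 1, -1, 1, -1, 1])
B = np.array([-1, -1, 1, 1, -1, -1, 1, 1])
y = np.array([8.1, 12.0, 9.9, 17.8, 7.9, 12.2, 10.1, 18.2])

# Design matrix for Y = b0 + bA*XA + bB*XB + bAB*XA*XB (Equation 3.2).
X = np.column_stack([np.ones_like(A), A, B, A * B])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, bA, bB, bAB = coef
print(np.round(coef, 3))  # intercept, A effect, B effect, AB interaction
```

Because the coded design is orthogonal, each coefficient is simply half the difference between the average responses at the two levels, which is why the fit is so stable.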

These are powerful tools and quite useful, but further detailed discussion is beyond the scope of this chapter. See some of the references provided for a more detailed explanation of these tools. If you do not have a statistical package to support ANOVA or regression, there are two options available for your analysis. The first option is to use the built-in ANOVA and regression packages in an office spreadsheet such as Microsoft Excel. The regression package in Excel is quite good; however, the ANOVA package is somewhat limited. Another option is to analyze the data graphically. For example, suppose you conduct an experiment with two factors (A and B) at two levels (2^2) and you do three replicates (a total of 16 runs). Use a bar chart or a scatter plot of factor A at both of its levels (each of the two levels will have eight data points). Then use a bar chart or scatter plot of factor B at both of its levels (again, eight data points per level). Finally, to show interactions, create a line chart with one line representing factor A and one line for factor B, where each line shows the average at the corresponding factor's level. Although this approach will not have statistical support, it may give you a path to pursue.
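The level averages those charts would plot can be computed directly. A sketch for the 2^2 experiment with three replicates described above (16 runs; all responses invented):

```python
from statistics import mean

# Responses for a 2^2 experiment with three replicates: keys are
# (A level, B level); values are the four runs at that combination.
data = {
    ("low", "low"):   [8.0, 8.2, 7.9, 8.1],
    ("high", "low"):  [12.1, 11.9, 12.0, 12.2],
    ("low", "high"):  [10.0, 10.2, 9.9, 10.1],
    ("high", "high"): [18.0, 17.8, 18.1, 18.3],
}

def level_means(data, factor_index):
    """Average response at each level of one factor: the points a main-effects chart plots."""
    pooled = {}
    for combo, runs in data.items():
        pooled.setdefault(combo[factor_index], []).extend(runs)
    return {level: round(mean(runs), 2) for level, runs in pooled.items()}

print(level_means(data, 0))  # factor A: eight points averaged per level
print(level_means(data, 1))  # factor B: eight points averaged per level
```

A large gap between a factor's level means suggests a main effect; comparing the cell means of one factor across the levels of the other is the interaction chart.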

3.5.7 Types of Experiments

As stated in previous paragraphs, there are two main types of experiments found in the existing literature: full factorial experiments and fractional factorial experiments. The pros and cons of these experiments have already been discussed and will not be covered again. However, there are other types of DOEs that are frequently mentioned in other writings.

Before discussing the details of these other types, let's look at Figure 3.1a. We see a Venn diagram with three overlapping circles. Each circle represents a specific school or approach to designed experiments: classical methods (one thinks of Drs. George Box and Douglas Montgomery), Taguchi Methods (referring to Dr. Genichi Taguchi), and statistical engineering (established and taught by Dorian Shainin). In Figure 3.1b we see that all three approaches share a common focus, i.e., the factorial principle referred to earlier in this chapter. Figure 3.1c demonstrates that each pairing of approaches shares a common focus or orientation, one approach with another. Finally, in Figure 3.1d, it is clear that each individual approach possesses its own unique focus or orientation.

The predominant type of nonclassical experiment that is most often discussed is named after Dr. Genichi Taguchi and is usually referred to as Taguchi Methods or robust design, and occasionally as quality engineering. Taguchi experiments are fractional factorial experiments. In that regard, the experimental structures are not as significantly different as are Dr. Taguchi's presentation of the experimental arrays and his approach to the analysis of results. Some practicing statisticians do not promote Dr. Taguchi's experimental arrays, in the opinion that other experimental approaches are superior. Despite this, many knowledgeable DOE professionals have noted that practicing engineers seem to grasp experimental methods as presented by Dr. Taguchi more readily than methods advocated by classical statisticians and quality engineers. It may be that Dr. Taguchi's use of graphical analysis is a help. Although ANOVA and regression have strong grounds in statistics and are very powerful, telling an engineer which factors and interactions are important is less effective than showing him or her the direction of effects using graphical analysis.

Despite the relatively small controversy regarding Taguchi Methods, Dr. Taguchi's contributions to DOE thinking remain. This influence runs from the promotion of his experimental tools, such as the signal-to-noise ratio and the orthogonal array, to, perhaps more importantly, his promotion of experiments designed to reduce the influence of process variation and uncontrollable factors. Dr. Taguchi would describe uncontrollable factors, often called noise factors, as elements in a process

FIGURE 3.1a Design of experiments — I. (Venn diagram: three overlapping circles labeled Taguchi Methods, Classical Methods, and Shainin Methods.)
