
Modern Analytical Chemistry - Chapter 15


Quality Assurance

In Chapter 1 we noted that each field of chemistry brings a unique perspective to the broader discipline of chemistry. For analytical chemistry this perspective was identified as an approach to solving problems, which was presented as a five-step process: (1) Identify and define the problem; (2) Design the experimental procedure; (3) Conduct an experiment and gather data; (4) Analyze the experimental data; and (5) Propose a solution to the problem. The analytical approach, as presented thus far, appears to be a straightforward process of moving from problem to solution. Unfortunately (or perhaps fortunately for those who consider themselves to be analytical chemists!), an analysis is seldom routine. Even a well-established procedure, carefully followed, can yield poor data of little use.

An important feature of the analytical approach, which we have neglected thus far, is the presence of a “feedback loop” involving steps 2, 3, and 4. As a result, the outcome of one step may lead to a reevaluation of the other two steps. For example, after standardizing a spectrophotometric method for the analysis of iron we may find that its sensitivity does not meet the original design criteria. Considering this information we might choose to select a different method, to change the original design criteria, or to improve the sensitivity.

The “feedback loop” in the analytical approach is maintained by a quality assurance program (Figure 15.1), whose objective is to control systematic and random sources of error.1–5 The underlying assumption of a quality assurance program is that results obtained when an analytical system is in statistical control are free of bias and are characterized by well-defined confidence intervals. When used properly, a quality assurance program identifies the practices necessary to bring a system into statistical control, allows us to determine if the system remains in statistical control, and suggests a course of corrective action when the system has fallen out of statistical control.

The focus of this chapter is on the two principal components of a quality assurance program: quality control and quality assessment. In addition, considerable attention is given to the use of control charts for routinely monitoring the quality of analytical data.


15A QUALITY CONTROL

Quality control encompasses all activities used to bring a system into statistical control. The most important facet of quality control is a set of written directives describing all relevant laboratory-specific, technique-specific, sample-specific, method-specific, and protocol-specific operations.1,3,6 Good laboratory practices (GLPs) describe the general laboratory operations that need to be followed in any analysis. These practices include properly recording data and maintaining records, using chain-of-custody forms for samples that are submitted for analysis, specifying and purifying chemical reagents, preparing commonly used reagents, cleaning and calibrating glassware, training laboratory personnel, and maintaining the laboratory facilities and general laboratory equipment.

Good measurement practices (GMPs) describe operations specific to a technique. In general, GMPs provide instructions for maintaining, calibrating, and using the equipment and instrumentation that form the basis for a specific technique. For example, a GMP for a titration describes how to calibrate a buret (if necessary), how to fill a buret with the titrant, the correct way to read the volume of titrant in the buret, and the correct way to dispense the titrant.

Figure 15.1
Schematic diagram of the analytical approach to problem solving, showing the role of the quality assurance program. The figure lays out the five steps (identify the problem; design the experimental procedure; conduct an experiment; analyze the experimental data; propose a solution), with the quality assurance program linking steps 2, 3, and 4.

quality control
Those steps taken to ensure that an analysis is under statistical control.

good laboratory practices
Those general laboratory procedures that, when followed, help ensure the quality of analytical work.

good measurement practices
Those instructions outlining how to properly use equipment and instrumentation to ensure the quality of measurements.

quality assurance
The steps taken during an analysis to ensure that the analysis is under control and that it is properly monitored.

standard operations procedure
The procedure followed in collecting and analyzing samples and in interpreting the results of an analysis.

The operations that need to be performed when analyzing a specific analyte in a specific matrix are defined by a standard operations procedure (SOP). The SOP describes all steps taken during the analysis, including: how the sample is processed in the laboratory, the analyte's separation from potential interferents, how the method is standardized, how the analytical signal is measured, how the data are transformed into the desired result, and the quality assessment tools that will be used to maintain quality control. If the laboratory is responsible for sampling, then the SOP will also state how the sample is to be collected and preserved and the nature of any prelaboratory processing. An SOP may be developed and used by a single laboratory, or it may be a standard procedure approved by an organization such as the American Society for Testing and Materials or the Federal Food and Drug Administration. A typical SOP is provided in the following example.

EXAMPLE 15.1

Provide an SOP for the determination of cadmium in lake sediments by atomic absorption spectrophotometry using a normal calibration curve.

SOLUTION

Sediment samples should be collected using a bottom grab sampler and stored at 4 °C in acid-washed polyethylene bottles during transportation to the laboratory. Samples should be dried to constant weight at 105 °C and ground to a uniform particle size. The cadmium in a 1-g sample of the sediment is extracted by adding the sediment and 25 mL of 0.5 M HCl to an acid-washed 100-mL polyethylene bottle and shaking for 24 h. After filtering, the sample is analyzed by atomic absorption spectrophotometry using an air–acetylene flame, a wavelength of 228.8 nm, and a slit width of 0.5 nm. A normal calibration curve is prepared using five standards with nominal concentrations of 0.20, 0.50, 1.00, 2.00, and 3.00 ppm. The accuracy of the calibration curve is checked periodically by analyzing the 1.00-ppm standard. An accuracy of ±10% is considered acceptable.

Although an SOP provides a written procedure, it is not necessary to follow the procedure exactly as long as any modifications are identified. On the other hand, a protocol for a specific purpose (PSP), which is the most detailed of the written quality control directives, must be followed exactly if the results of the analysis are to be accepted. In many cases the required elements of a PSP are established by the agency sponsoring the analysis. For example, labs working under contract with the Environmental Protection Agency must develop a PSP that addresses such items as sampling and sample custody, frequency of calibration, schedules for the preventive maintenance of equipment and instrumentation, and management of the quality assurance program.

Two additional aspects of a quality control program deserve mention. The first is the physical inspection of samples, measurements, and results by the individuals responsible for collecting and analyzing the samples.1 For example, sediment samples might be screened during collection, and samples containing “foreign objects,” such as pieces of metal, may be discarded without being analyzed. Samples that are discarded can then be replaced with additional samples.

protocol for a specific purpose
A precisely written protocol for an analysis that must be followed exactly.

When a sudden change in the performance of an instrument is observed, the analyst may choose to repeat those measurements that might be adversely influenced. The analyst may also decide to reject a result and reanalyze the sample when the result is clearly unreasonable. By identifying samples, measurements, and results that may be subject to gross errors, inspection helps control the quality of an analysis.

A final component of a quality control program is the certification of an analyst's competence to perform the analysis for which he or she is responsible.7 Before an analyst is allowed to perform a new analytical method, he or she may be required to successfully analyze an independent check sample with acceptable accuracy and precision. The check sample should be similar in composition to samples that the analyst will routinely encounter, with a concentration that is 5 to 50 times that of the method's detection limit.

The written directives of a quality control program are a necessary, but not a sufficient, condition for obtaining and maintaining an analysis in a state of statistical control. Although quality control directives explain how an analysis should be properly conducted, they do not indicate whether the system is under statistical control. This is the role of quality assessment, which is the second component of a quality assurance program.

15B QUALITY ASSESSMENT

The goals of quality assessment are to determine when a system has reached a state of statistical control; to detect when the system has moved out of statistical control; and, if possible, to suggest why a loss of statistical control has occurred so that corrective actions can be taken. For convenience, the methods of quality assessment are divided into two categories: internal methods that are coordinated within the laboratory and external methods for which an outside agency or individual is responsible. The incorporation of these methods into a quality assurance program is covered in Section 15C.

The most useful methods for quality assessment are those that are coordinated by the laboratory and that provide the analyst with immediate feedback about the system's state of statistical control. Internal methods of quality assessment included in this section are the analysis of duplicate samples, the analysis of blanks, the analysis of standard samples, and spike recoveries.

Analysis of Duplicate Samples  An effective method for determining the precision of an analysis is to analyze duplicate samples. In most cases the duplicate samples are taken from a single gross sample (also called a split sample), although in some cases the duplicates must be independently collected gross samples. The results from the duplicate samples, X1 and X2, are evaluated by determining the difference

d = X1 – X2

The standard deviation for the analysis is estimated from the differences for n pairs of duplicates as

s = (Σdi²/2n)^1/2

quality assessment
The steps taken to evaluate whether an analysis is under statistical control.

duplicate samples
Two samples taken from a single gross sample and used to evaluate an analytical method's precision.

Table 15.1  Selected Quality Assessment Limits for the Analysis of Waters and Wastewaters
[Column headings: limits for spike recovery; (d)r when [Analyte] < 20×MDL; (d)r when [Analyte] > 20×MDL. The numerical limits are not preserved in this copy. Abbreviation: MDL = method's detection limit.]

where di is the difference between the ith pair of duplicates. The degrees of freedom for the standard deviation is the same as the number of duplicate samples. If duplicate samples from several sources are combined, then the precision of the measurement process must be approximately the same for each. The precision obtained is then compared with the precision needed to accept the results of the analysis.
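This calculation is easy to script. The sketch below is a minimal illustration that assumes the difference-based estimate s = (Σdi²/2n)^1/2 given above; the duplicate values are invented for the example and are not data from the text.

```python
import math

def duplicate_precision(pairs):
    """Estimate the standard deviation of an analysis from n duplicate
    results (X1, X2), using s = sqrt(sum(d_i**2) / (2*n)).  The estimate
    carries n degrees of freedom, one per duplicate pair."""
    diffs = [x1 - x2 for x1, x2 in pairs]
    n = len(diffs)
    return math.sqrt(sum(d * d for d in diffs) / (2 * n))

# Invented duplicate results (ppm) for illustration only.
print(round(duplicate_precision([(1.02, 0.98), (2.10, 2.03), (0.51, 0.49)]), 3))
```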

EXAMPLE 15.2

To evaluate the precision for the determination of potassium in blood serum, duplicate analyses were performed on six samples, yielding the following results. [The duplicate data and the worked solution are not preserved in this copy.]

The Analysis of Blanks  The use of a blank was introduced in Chapter 3 as a means of correcting the measured signal for contributions from sources other than the analyte. The most common blank is a method, or reagent, blank, in which an analyte-free sample, usually distilled water, is carried through the analysis using the same reagents, glassware, and instrumentation. Method blanks are used to identify and correct systematic errors due to impurities in the reagents and contamination in the glassware and instrumentation. At a minimum, method blanks should be analyzed whenever new reagents are used, although a more frequent analysis provides an ongoing monitoring of the purity of the reagents. A new method blank should also be run whenever a sample with a high concentration of the analyte is analyzed, because any residual carryover of the analyte may contaminate the glassware or instrumentation.

When samples are collected in the field, the method blank may be augmented with field and trip blanks.8 A field blank is an analyte-free sample carried from the laboratory to the sampling site. At the sampling site the blank is transferred to a clean sample container, exposing it to the local environment, preserved, and transported back to the laboratory for analysis. Field blanks are used to identify and correct systematic errors due to sampling, transport, and analysis. Trip blanks are analyte-free samples carried from the laboratory to the sampling site and returned to the laboratory without being opened. A trip blank is used to identify and correct systematic errors due to cross-contamination of volatile organic compounds during transport, handling, storage, and analysis.

Analysis of Standards  The analysis of a standard containing a known concentration of analyte also can be used to monitor a system's state of statistical control. Ideally, a standard reference material (SRM) should be used, provided that the matrix of the SRM is similar to that of the samples being analyzed. A variety of appropriate SRMs are available from the National Institute of Standards and Technology (NIST). If a suitable SRM is not available, then an independently prepared synthetic sample can be used if it is prepared from reagents of known purity. At a minimum, a standardization of the method is verified by periodically analyzing one of the calibration standards. In all cases, the analyte's experimentally determined concentration in the standard must fall within predetermined limits if the system is to be considered under statistical control.

Spike Recoveries  One of the most important quality assessment tools is the recovery of a known addition, or spike, of analyte to a method blank, field blank, or sample. To determine a spike recovery, the blank or sample is split into two portions, and a known amount of a standard solution of the analyte is added to one portion. The concentration of the analyte is determined for both the spiked, F, and unspiked, I, portions, and the percent recovery, %R, is calculated as

%R = (F – I)/A × 100

where A is the concentration of the analyte added to the spiked portion.

trip blank
A blank prepared in the laboratory that accompanies a set of sample containers in the field and laboratory.

spike recovery
An analysis of a sample after spiking with a known amount of analyte.

EXAMPLE 15.3

A spike recovery for the analysis of chloride in well water was performed by adding 5.00 mL of a 25,000-ppm solution of Cl– to a 500-mL volumetric flask and diluting to volume with the sample. Analysis of the sample and the spiked sample resulted in chloride concentrations of 183 ppm and 409 ppm, respectively. Determine the percent recovery of the spike.

SOLUTION

The concentration of the added spike is calculated by taking into account the effect of dilution

A = 25,000 ppm × (5.00 mL/500.0 mL) = 250 ppm

Thus, the spike recovery is

%R = (409 – 183)/250 × 100 = 90.4%
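The same arithmetic can be checked with a short script. The sketch below assumes only the dilution relationship and the %R formula given above; the function names are illustrative.

```python
def added_concentration(c_spike_ppm, v_spike_mL, v_total_mL):
    """Concentration of analyte added to the spiked portion,
    accounting for dilution of the spike into the final volume."""
    return c_spike_ppm * v_spike_mL / v_total_mL

def percent_recovery(spiked_F, unspiked_I, added_A):
    """%R = (F - I) / A x 100, as defined in the text."""
    return (spiked_F - unspiked_I) / added_A * 100

A = added_concentration(25_000, 5.00, 500.0)        # 250 ppm
print(A, round(percent_recovery(409, 183, A), 1))   # 250.0 90.4
```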

Spike recoveries on method blanks and field blanks are used to evaluate the general performance of an analytical procedure. The concentration of analyte added to the blank should be between 5 and 50 times the method's detection limit. Systematic errors occurring during sampling and transport will result in an unacceptable recovery for the field blank, but not for the method blank. Systematic errors occurring in the laboratory, however, will affect the recoveries for both the field and method blanks.

Spike recoveries for samples are used to detect systematic errors due to the sample matrix or the stability of the sample after its collection. Ideally, samples should be spiked in the field at a concentration between 1 and 10 times the expected concentration of the analyte or 5 to 50 times the method's detection limit, whichever is larger. If the recovery for a field spike is unacceptable, then a sample is spiked in the laboratory and analyzed immediately. If the recovery for the laboratory spike is acceptable, then the poor recovery for the field spike may be due to the sample's deterioration during storage. When the recovery for the laboratory spike also is unacceptable, the most probable cause is a matrix-dependent relationship between the analytical signal and the concentration of the analyte. In this case the samples should be analyzed by the method of standard additions. Typical limits for acceptable spike recoveries for the analysis of waters and wastewaters are shown in Table 15.1.7
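The interpretation rules in the last two paragraphs can be summarized in a small helper. This is a hedged sketch of the reasoning as stated in the text, not part of any standard method; the function name, arguments, and return strings are invented for illustration.

```python
def diagnose_recoveries(field_blank_ok, method_blank_ok=None,
                        field_spike_ok=None, lab_spike_ok=None):
    """Qualitative reading of spike-recovery results, following the text.
    Arguments are True/False for 'recovery within limits'; None means the
    corresponding sample was not analyzed."""
    if field_blank_ok is False:
        if method_blank_ok is False:
            return "systematic error originates in the laboratory"
        if method_blank_ok is True:
            return "systematic error during sampling or transport"
        return "analyze a method blank to locate the systematic error"
    if field_spike_ok is False:
        if lab_spike_ok is True:
            return "sample probably deteriorated during storage"
        if lab_spike_ok is False:
            return "matrix effect suspected; use the method of standard additions"
        return "spike the sample in the laboratory and reanalyze immediately"
    return "no systematic error indicated by the spike recoveries"
```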

Internal methods of quality assessment should always be viewed with some level of skepticism because of the potential for bias in their execution and interpretation. For this reason, external methods of quality assessment also play an important role in quality assurance programs. One external method of quality assessment is the certification of a laboratory by a sponsoring agency. Certification is based on the successful analysis of a set of proficiency standards prepared by the sponsoring agency. For example, laboratories involved in environmental analyses may be required to analyze standard samples prepared by the Environmental Protection Agency.

A second example of an external method of quality assessment is the voluntary participation of the laboratory in a collaborative test (Chapter 14) sponsored by a professional organization such as the Association of Official Analytical Chemists. Finally, individuals contracting with a laboratory can perform their own external quality assessment by submitting blind duplicate samples and blind standard samples to the laboratory for analysis. If the results for the quality assessment samples are unacceptable, then there is good reason to consider the results suspect for other samples provided by the laboratory.

15C EVALUATING QUALITY ASSURANCE DATA

In the previous section we described several internal methods of quality assessment that provide quantitative estimates of the systematic and random errors present in an analytical system. Now we turn our attention to how this numerical information is incorporated into the written directives of a complete quality assurance program. Two approaches to developing quality assurance programs have been described9: a prescriptive approach, in which an exact method of quality assessment is prescribed; and a performance-based approach, in which any form of quality assessment is acceptable, provided that an acceptable level of statistical control can be demonstrated.

With a prescriptive approach to quality assessment, duplicate samples, blanks, standards, and spike recoveries are measured following a specific protocol. The result for each analysis is then compared with a single predetermined limit. If this limit is exceeded, an appropriate corrective action is taken. Prescriptive approaches to quality assurance are common for programs and laboratories subject to federal regulation. For example, the Food and Drug Administration (FDA) specifies quality assurance practices that must be followed by laboratories analyzing products regulated by the FDA.

A good example of a prescriptive approach to quality assessment is the protocol outlined in Figure 15.2, published by the Environmental Protection Agency (EPA) for laboratories involved in monitoring studies of water and wastewater.10 Independent samples A and B are collected simultaneously at the sample site. Sample A is split into two equal-volume samples, labeled A1 and A2. Sample B is also split into two equal-volume samples, one of which, BSF, is spiked with a known amount of analyte. A field blank, DF, also is spiked with the same amount of analyte. All five samples (A1, A2, B, BSF, and DF) are preserved if necessary and transported to the laboratory for analysis.

The first sample to be analyzed is the field blank. If its spike recovery is unacceptable, indicating that a systematic error is present, then a laboratory method blank, DL, is prepared and analyzed. If the spike recovery for the method blank is also unsatisfactory, then the systematic error originated in the laboratory. An acceptable spike recovery for the method blank, however, indicates that the systematic error occurred in the field or during transport to the laboratory. Systematic errors in the laboratory can be corrected, and the analysis continued. Any systematic errors occurring in the field, however, cast uncertainty on the quality of the samples, making it necessary to collect new samples.

If the field blank is satisfactory, then sample B is analyzed. If the result for B is above the method's detection limit and is within the range of 0.1 to 10 times the amount of analyte spiked into BSF, then a spike recovery for BSF is determined. An unacceptable spike recovery for BSF indicates the presence of a systematic error involving the sample. To determine the source of the systematic error, a laboratory spike, BSL, is prepared using sample B and analyzed. If the spike recovery for BSL is acceptable, then the systematic error requires a long time to have a noticeable effect on the spike recovery. One possible explanation is that the analyte has not been properly preserved or has been held beyond the acceptable holding time. An unacceptable spike recovery for BSL suggests an immediate systematic error, such as that due to the influence of the sample's matrix. In either case, the systematic errors are fatal and must be corrected before the sample is reanalyzed.

If the spike recovery for BSF is acceptable, or if the result for sample B is below the method's detection limit or outside the range of 0.1 to 10 times the amount of analyte spiked in BSF, then the duplicate samples A1 and A2 are analyzed. The results for A1 and A2 are discarded if the difference between their values is excessive. If the difference between the results for A1 and A2 is within the accepted limits, then the results for samples A1 and B are compared. Since samples collected from the same sampling site at the same time should be identical in composition, the results are discarded if the difference between their values is unsatisfactory, and accepted if the difference is satisfactory.
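One compact way to follow the protocol just described is to encode its branch points directly. The sketch below mirrors the decision sequence of Figure 15.2 as described in the text; the argument names (df_ok, dl_ok, and so on) are invented, and None simply marks a test that has not been run.

```python
def figure_15_2_protocol(df_ok, dl_ok=None, b_in_range=None,
                         bsf_ok=None, bsl_ok=None,
                         a1_a2_ok=None, a1_b_ok=None):
    """Sketch of the EPA decision sequence described above (Figure 15.2).
    Each flag is True/False for 'within limits'; None means not yet tested."""
    if not df_ok:                              # spiked field blank fails
        if dl_ok is False:
            return "systematic error in the laboratory; correct and continue"
        return "systematic error in the field or transport; collect new samples"
    if b_in_range and bsf_ok is False:         # field-spiked sample fails
        if bsl_ok:                             # laboratory spike acceptable
            return "time-dependent error (preservation or holding time)"
        return "immediate systematic error, such as a matrix effect"
    if a1_a2_ok is False:
        return "poor duplicate precision; discard the results"
    if a1_b_ok is False:
        return "poor agreement between A1 and B; discard the results"
    return "results accepted"
```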

Figure 15.2
Example of a prescriptive approach to quality assurance. Adapted from Environmental Monitoring and Support Laboratory, U.S. Environmental Protection Agency, “Handbook for Analytical Quality Control in Water and Wastewater Laboratories,” March 1979. The flowchart traces the spike recoveries for the field blank (DF) and laboratory method blank (DL), the check on sample B, the spike recoveries for BSF and BSL, and the comparisons of A1 with A2 and of A1 with B.

This protocol requires four to five evaluations of quality assessment data before the result for a single sample can be accepted; a process that must be repeated for each analyte and for each sample. Other prescriptive protocols are equally demanding. For example, Figure 3.7 in Chapter 3 shows a portion of the quality assurance protocol used for the graphite furnace atomic absorption analysis of trace metals in aqueous solutions. This protocol involves the analysis of an initial calibration verification standard and an initial calibration blank, followed by the analysis of samples in groups of ten. Each group of samples is preceded and followed by continuing calibration verification (CCV) and continuing calibration blank (CCB) quality assessment samples. Results for each group of ten samples can be accepted only if both sets of CCV and CCB quality assessment samples are acceptable.

The advantage to a prescriptive approach to quality assurance is that a single consistent set of guidelines is used by all laboratories to control the quality of analytical results. A significant disadvantage, however, is that the ability of a laboratory to produce quality results is not taken into account when determining the frequency of collecting and analyzing quality assessment data. Laboratories with a record of producing high-quality results are forced to spend more time and money on quality assessment than is perhaps necessary. At the same time, the frequency of quality assessment may be insufficient for laboratories with a history of producing results of poor quality.

In a performance-based approach to quality assurance, a laboratory is free to use its experience to determine the best way to gather and monitor quality assessment data. The quality assessment methods remain the same (duplicate samples, blanks, standards, and spike recoveries) since they provide the necessary information about precision and bias. What the laboratory can control, however, is the frequency with which quality assessment samples are analyzed, and the conditions indicating when an analytical system is no longer in a state of statistical control. Furthermore, a performance-based approach to quality assessment allows a laboratory to determine if an analytical system is in danger of drifting out of statistical control. Corrective measures are then taken before further problems develop.

The principal tool for performance-based quality assessment is the control chart. In a control chart the results from the analysis of quality assessment samples are plotted in the order in which they are collected, providing a continuous record of the statistical state of the analytical system. Quality assessment data collected over time can be summarized by a mean value and a standard deviation. The fundamental assumption behind the use of a control chart is that quality assessment data will show only random variations around the mean value when the analytical system is in statistical control. When an analytical system moves out of statistical control, the quality assessment data is influenced by additional sources of error, increasing the standard deviation or changing the mean value.

Control charts were originally developed in the 1920s as a quality assurance tool for the control of manufactured products.11 Two types of control charts are commonly used in quality assurance: a property control chart, in which results for single measurements, or the means for several replicate measurements, are plotted sequentially; and a precision control chart, in which ranges or standard deviations are plotted sequentially. In either case, the control chart consists of a line representing the mean value for the measured property or the precision, and two or more boundary lines whose positions are determined by the precision of the measurement process. The position of the data points about the boundary lines determines whether the system is in statistical control.

control chart
A graph showing the time-dependent change in the results of an analysis that is used to monitor whether an analysis is in a state of statistical control.

Construction of Property Control Charts  The simplest form for a property control chart is a sequence of points, each of which represents a single determination of the property being monitored. To construct the control chart, it is first necessary to determine the mean value of the property and the standard deviation for its measurement. These statistical values are determined using a minimum of 7 to 15 samples (although 30 or more samples are desirable), obtained while the system is known to be under statistical control. The center line (CL) of the control chart is determined by the average of these n points

CL = (ΣXi)/n

The positions of the boundary lines are determined by the standard deviation, S, of the points used to determine the center line

S = [Σ(Xi – CL)²/(n – 1)]^1/2

with the upper and lower warning limits (UWL and LWL), and the upper and lower control limits (UCL and LCL), given by

UWL = CL + 2S    LWL = CL – 2S    UCL = CL + 3S    LCL = CL – 3S
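Constructing these limits is a one-line calculation in most environments. The sketch below assumes the sample standard deviation of the preliminary points is used for S, as in the equations above.

```python
import statistics

def property_chart_limits(points):
    """Center line, warning limits, and control limits for a property
    control chart built from single determinations collected while the
    system was in statistical control."""
    cl = statistics.mean(points)
    s = statistics.stdev(points)
    return {"CL": cl,
            "UWL": cl + 2 * s, "LWL": cl - 2 * s,
            "UCL": cl + 3 * s, "LCL": cl - 3 * s}
```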

EXAMPLE 15.4

Construct a property control chart for the following spike recovery data (all values are for percentage of spike recovered). [The 20 recovery values are not preserved in this copy.]

SOLUTION

The mean and the standard deviation for the 20 data points are 99.4 and 1.6, respectively, giving the UCL as 104.2, the UWL as 102.6, the LWL as 96.2, and the LCL as 94.6. The resulting property control chart is shown in Figure 15.3.

Property control charts can also be constructed using points that are the mean value, Xi, for a set of r replicate determinations on a single sample. The mean for the ith sample is given by

Xi = (ΣXij)/r

where Xij is the jth replicate. The center line for the control chart, therefore, is

CL = (ΣXi)/n

To determine the standard deviation for the warning and control limits, it is necessary to calculate the variance for each sample, si²

si² = Σ(Xij – Xi)²/(r – 1)

The overall standard deviation, S, is the square root of the average variance for the samples used to establish the control plot

S = (Σsi²/n)^1/2

Finally, the resulting warning and control limits are

UWL = CL + 2S/√r    LWL = CL – 2S/√r    UCL = CL + 3S/√r    LCL = CL – 3S/√r

where the factor of √r reflects the improved precision of a mean of r replicates.
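For replicate-based charts the same idea applies, with S taken as the square root of the average within-sample variance. The sketch below follows the equations as reconstructed above, including the assumed 1/√r scaling of the limits.

```python
import math
import statistics

def replicate_chart_limits(samples):
    """Limits when each plotted point is the mean of r replicates.
    samples is a list of replicate lists, all of the same length r.
    Assumes the limits are scaled by 1/sqrt(r), as in the equations above."""
    r = len(samples[0])
    means = [statistics.mean(s) for s in samples]
    cl = statistics.mean(means)
    S = math.sqrt(statistics.mean([statistics.variance(s) for s in samples]))
    w, c = 2 * S / math.sqrt(r), 3 * S / math.sqrt(r)
    return {"CL": cl, "UWL": cl + w, "LWL": cl - w,
            "UCL": cl + c, "LCL": cl - c}
```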

Figure 15.3
Property control chart for Example 15.4.

Table 15.2  Statistical Factors for the Upper Warning Limit and Upper Control Limit
[The factors are not preserved in this copy; for duplicate analyses (two replicates), Example 15.5 uses fUWL = 2.512 and fUCL = 3.267.]

Constructing a Precision Control Chart  The most common measure of precision used in constructing a precision control chart is the range, R, between the largest and smallest results for a set of j replicate analyses on a sample

R = Xlargest – Xsmallest

To construct the control chart, ranges for a minimum of 15–20 samples (preferably 30 or more samples) are obtained while the system is known to be in statistical control. The line for the average range, R̄, is determined by the mean of these n samples

R̄ = (ΣRi)/n

The upper control line and the upper warning line are given by

UCL = fUCL × R̄    UWL = fUWL × R̄

where fUCL and fUWL (Table 15.2) are statistical factors determined by the number of replicates used to determine the range. Because the range always is greater than or equal to zero, there is no lower control limit or lower warning limit.
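A range-based chart needs only the average range and the two statistical factors. The sketch below assumes the factors are supplied from Table 15.2 (for duplicates, Example 15.5 uses fUWL = 2.512 and fUCL = 3.267).

```python
def precision_chart_limits(ranges, f_uwl, f_ucl):
    """Average range and upper limits for a precision control chart.
    The factors f_uwl and f_ucl depend on the number of replicates and
    are taken from Table 15.2."""
    r_bar = sum(ranges) / len(ranges)
    return {"R_bar": r_bar, "UWL": f_uwl * r_bar, "UCL": f_ucl * r_bar}

# For duplicate analyses (Example 15.5): precision_chart_limits(ranges, 2.512, 3.267)
```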

EXAMPLE 15.5

Construct a precision control chart using the following 20 ranges, each determined from a duplicate analysis of a 10-ppm calibration standard. [The 20 range values are not preserved in this copy.]

SOLUTION

The average range for the 20 duplicate samples is 0.177. Because two replicates were used for each point, the UWL and UCL are

UWL = (2.512)(0.177) = 0.44    UCL = (3.267)(0.177) = 0.58

The complete control chart is shown in Figure 15.4.

The precision control chart is strictly valid only for the replicate analysis of identical samples, such as a calibration standard or a standard reference material. Its use for the analysis of nonidentical samples, such as a series of clinical or environmental samples, is complicated by the fact that the range usually is not independent of the magnitude of Xlarge and Xsmall. For example, Table 15.3 shows the relationship between R̄ and the concentration of chromium in water.10 Clearly the significant difference in the average range for these concentrations of Cr makes a single precision control chart impossible. One solution to this problem is to prepare separate precision control charts, each of which covers a range of concentrations for which R̄ is approximately constant (Figure 15.5).

Interpreting Control Charts  The purpose of a control chart is to determine if a system is in statistical control. This determination is made by examining the location of individual points in relation to the warning limits and the control limits, and the distribution of the points around the central line. If we assume that the data are normally distributed, then the probability of finding a point at any distance from the mean value can be determined from the normal distribution curve. The upper and lower control limits for a property control chart, for example, are set to ±3S, which, if S is a good approximation for σ, includes 99.74% of the data. The probability that a point will fall outside the UCL or LCL, therefore, is only 0.26%.

Figure 15.4
Precision control chart for Example 15.5.

Trang 15

Figure 15.5

Example of the use of subrange precisioncontrol charts for samples that span arange of analyte concentrations Theprecision control charts are used for (a) low concentrations of analyte;

(b) intermediate concentrations of analyte;and (c) high concentrations of analyte

The most likely explanation when a point exceeds a control limit is that a systematic error has occurred or that the precision of the measurement process has deteriorated. In either case the system is assumed to be out of statistical control.

Rule 1. A system is considered to be out of statistical control if any single point exceeds either the UCL or the LCL.

The upper and lower warning limits, which are located at ±2S, should only be exceeded by 5% of the data; thus

Rule 2. A system is considered to be out of statistical control if two out of three consecutive points are between the UWL and UCL or between the LWL and LCL.

When a system is in statistical control, the data points should be randomly distributed about the center line. The presence of an unlikely pattern in the data is another indication that a system is no longer in statistical control.4,12 Thus,

Rule 3. A system is considered to be out of statistical control if a run of seven consecutive points is completely above or completely below the center line (Figure 15.6a).

Table 15.3  Average Range for Duplicate Samples for Different Concentrations of Chromium in Water
[The table values are not preserved in this copy.]

Figure 15.6
Examples of property control charts that show a run of data (highlighted in box) indicating that the system is out of statistical control. Panels (a), (b), and (c) correspond to the patterns described in Rules 3, 4, and 5, respectively.

Rule 4. A system is considered to be out of statistical control if six consecutive points are all increasing in value or all decreasing in value (Figure 15.6b). The points may be on either side of the center line.

Rule 5. A system is considered to be out of statistical control if 14 consecutive points alternate up and down in value (Figure 15.6c). The points may be on either side of the center line.

Rule 6. A system is considered to be out of statistical control if any obvious “nonrandom” pattern is observed.

The same rules apply to precision control charts with the exception that there are no lower warning and lower control limits.
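Rules 1 through 5 are mechanical enough to check automatically. The sketch below is one possible implementation of those rules for a property control chart; Rule 6 (any obvious nonrandom pattern) is a judgment call and is not coded.

```python
def out_of_control(points, cl, s):
    """Check a sequence of property-control-chart points against
    Rules 1-5 given above.  Returns a list of triggered rules."""
    ucl, lcl = cl + 3 * s, cl - 3 * s
    uwl, lwl = cl + 2 * s, cl - 2 * s
    hits = []
    if any(p > ucl or p < lcl for p in points):
        hits.append("rule 1: point beyond a control limit")
    for i in range(len(points) - 2):
        trio = points[i:i + 3]
        if sum(1 for p in trio if uwl < p <= ucl or lcl <= p < lwl) >= 2:
            hits.append("rule 2: two of three points in a warning zone")
            break
    for i in range(len(points) - 6):
        run = points[i:i + 7]
        if all(p > cl for p in run) or all(p < cl for p in run):
            hits.append("rule 3: run of seven points on one side of CL")
            break
    for i in range(len(points) - 5):
        run = points[i:i + 6]
        diffs = [b - a for a, b in zip(run, run[1:])]
        if all(d > 0 for d in diffs) or all(d < 0 for d in diffs):
            hits.append("rule 4: six points steadily increasing or decreasing")
            break
    for i in range(len(points) - 13):
        run = points[i:i + 14]
        diffs = [b - a for a, b in zip(run, run[1:])]
        if all(d1 * d2 < 0 for d1, d2 in zip(diffs, diffs[1:])):
            hits.append("rule 5: fourteen points alternating up and down")
            break
    return hits
```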

Using Control Charts for Quality Assurance  Control charts play an important role in a performance-based program of quality assurance because they provide an easily interpreted picture of the statistical state of an analytical system. Quality assessment samples such as blanks, standards, and spike recoveries can be monitored with property control charts. A precision control chart can be used to monitor duplicate samples.

The first step in using a control chart for quality assurance is to determine the mean value and the standard deviation (except when using the range) for the quality assessment data while the system is under statistical control. These values must be established under the same conditions that will be present during the normal use of the control chart. Thus, preliminary data should be randomly collected throughout the day, as well as over several days, to account for short-term and long-term variability.

The preliminary data are used to construct an initial control chart, and discrepant points are determined using the rules discussed in the previous section. Questionable points are dropped, and the control chart is replotted. As the control chart is used, it may become apparent that the original limits need adjusting. Control limits can be recalculated if the number of new data points is at least equivalent to the amount of data used to construct the original control chart. For example, if 15 points were initially used, the limits can be reevaluated after 15 additional points are collected. The 30 points are pooled together to calculate the new limits. A second modification can be made after a further 30 points have been collected. Another indication that a control chart needs to be modified is when points rarely exceed the warning limits. In this case the new limits can be recalculated using the last 20 points.

Once a control chart is in use, new quality assessment data should be added at a rate sufficient to ensure that the system remains in statistical control. As with prescriptive approaches to quality assurance, when a quality assessment sample is found to be out of statistical control, all samples analyzed since the last successful verification of statistical control must be reanalyzed. The advantage of a performance-based approach to quality assurance is that a laboratory may use its experience, guided by control charts, to determine the frequency for collecting quality assessment samples. When the system is stable, quality assessment samples can be acquired less frequently.

15D KEY TERMS

control chart
duplicate samples
field blank
good laboratory practices
good measurement practices

15E SUMMARY

Few analyses are so straightforward that high-quality results are easily obtained. Good analytical work requires careful planning and an attention to detail. Creating and maintaining a quality assurance program is one way to help ensure the quality of analytical results. Quality assurance programs usually include elements of quality control and quality assessment.

Quality control encompasses all activities used to bring a system into statistical control. The most important facet of quality control is written documentation, including statements of good laboratory practices, good measurement practices, standard operating procedures, and protocols for a specific purpose.

Quality assessment includes the statistical tools used to determine whether an analysis is in a state of statistical control and, if possible, to suggest why an analysis has drifted out of statistical control. Among the tools included in quality assessment are the analysis of duplicate samples, the analysis of blanks, the analysis of standards, and the analysis of spike recoveries.

Another important quality assessment tool, which provides an ongoing evaluation of an analysis, is a control chart. A control chart plots a property, such as a spike recovery, as a function of time. Results exceeding warning and control limits, or unusual patterns of data, indicate that an analysis is no longer under statistical control.

15F SUGGESTED EXPERIMENTS

The following three experiments introduce aspects of quality assurance and quality control.

Bell, S. C.; Moore, J. "Integration of Quality Assurance/Quality Control into Quantitative Analysis," J. Chem. Educ. 1998, 75, 874–877.
The use of several QA/QC methods is described in this article, including control charts for monitoring the concentration of solutions of thiosulfate that have been prepared and stored with and without proper preservation; the use of method blanks and standard samples to determine the presence of determinate error and to establish single-operator characteristics; and the use of spiked samples and recoveries to identify the presence of determinate errors associated with collecting and preparing samples.

Marcos, J.; Ríos, A.; Valcárcel, M. "Practicing Quality Control in a Bioanalytical Experiment," J. Chem. Educ. 1995, 72, 947–949.
This experiment demonstrates how control charts and an analysis of variance can be used to evaluate the quality of results in a quantitative analysis for chlorophyll a and b in plant material.

15G PROBLEMS

1. Make a list of good laboratory practices for the lab accompanying this course (or another lab if this course does not have an associated laboratory). Explain the rationale for each item on your list.

2. Write directives outlining good measurement practices for (a) a buret, (b) a pH meter, and (c) a spectrophotometer.

3. A method for the analysis of lead in industrial wastewater has a method detection limit of 10 ppb. The relationship between the analytical signal and the concentration of lead, as determined from a calibration curve, is

Smeas = 0.349 × (ppm Pb)

Analysis of a sample in duplicate gives Smeas as 0.554 and 0.516. Is the precision between these two duplicates acceptable based on the limits shown in Table 15.1?

4. The following data were obtained for the duplicate analysis of a 5.00-ppm NO3– standard. [The duplicate data are not preserved in this copy.] Calculate the standard deviation for the analysis of these duplicate samples. If the maximum limit for the relative standard deviation is 1.5%, are these results acceptable?

5. Gonzalez and colleagues developed a voltammetric procedure for the determination of tert-butylhydroxyanisole (BHA) in chewing gum.13 Analysis of a commercial chewing gum gave results of 0.20 mg/g. To evaluate the accuracy of their results, they performed five spike recoveries, adding an amount of BHA equivalent to 0.135 mg/g to each sample. The experimentally determined concentrations of BHA in these samples were reported as 0.342, 0.340, 0.340, 0.324, and 0.322 mg/g. Determine the percent recovery for each sample and the average percent recovery.

6. A sample is to be analyzed following the protocol shown in Figure 15.2, using a method with a detection limit of 0.05 ppm. The relationship between the analytical signal and the concentration of the analyte, as determined from a calibration curve, is

Smeas = 0.273 × (ppm analyte)

Answer the following questions if the limits for a successful spike recovery are ±10%.

(a) A field blank is spiked with the analyte to a concentration of 2.00 ppm and returned to the lab. Analysis of the spiked field blank gives a signal of 0.573. Is the spike recovery for the field blank acceptable?

(b) The analysis of a spiked field blank is found to be unacceptable. To determine the source of the problem, a spiked method blank is prepared by spiking distilled water with the analyte to a concentration of 2.00 ppm. Analysis of the spiked method blank gives a signal of 0.464. Is the source of the problem in the laboratory or in the field?

(c) The analysis for a spiked field sample, BSF, was found to be unacceptable. To determine the source of the problem, the sample was spiked in the laboratory by adding sufficient analyte to increase the concentration by 2.00 ppm. Analysis of the sample before and after the spike gave signals of 0.456 for B and 1.03 for sample BSL. Considering these data, what is the most likely source of the systematic error?

7. The following data were obtained for the repetitive analysis of [the data and the remainder of this problem are not preserved in this copy].

8. The following data were obtained for the repetitive spike recoveries of field samples.15 [The recovery data and the remainder of this problem are not preserved in this copy.]

15H SUGGESTED READINGS

The following texts and articles may be consulted for an additional discussion of the various aspects of quality assurance and quality control.

Amore, F. "Good Analytical Practices," Anal. Chem. 1979, 51, 1105A–1110A.
Barnard, A. J., Jr.; Mitchell, R. M.; Wolf, G. E. "Good Analytical Practices in Quality Control," Anal. Chem. 1978, 50, 1079A–1086A.
Cairns, T.; Rogers, W. M. "Acceptable Analytical Data for Trace Analysis," Anal. Chem. 1983, 55, 54A–57A.
Taylor, J. K. Quality Assurance of Chemical Measurements. Lewis Publishers: Chelsea, MI, 1987.

Additional information about the construction and use of control charts may be found in the following sources.

Miller, J. C.; Miller, J. N. Statistics for Analytical Chemistry, 3rd ed. Ellis Horwood Limited: Chichester, England, 1993.
Ouchi, G. I. "Creating Control Charts with a Spreadsheet."

15I REFERENCES

1. Taylor, J. K. Anal. Chem. 1981, 53, 1588A–1596A.
2. Taylor, J. K. Anal. Chem. 1983, 55, 600A–608A.
3. Taylor, J. K. Am. Lab. October 1985, 67–75.
4. Nadkarni, R. A. Anal. Chem. 1991, 63, 675A–682A.
5. Valcárcel, M.; Ríos, A. Trends Anal. Chem. 1994, 13, 17–23.
6. ACS Committee for Environmental Improvement, "Principles of Environmental Analysis," Anal. Chem. 1983, 55, 2210–2218.
7. American Public Health Association, Standard Methods for the Analysis of Water and Wastewater, 18th ed. Washington, D.C., 1992.
8. Keith, L. H. Environmental Sampling and Analysis: A Practical Guide. Lewis Publishers: Chelsea, MI, 1991.
9. Poppiti, J. Environ. Sci. Technol. 1994, 28, 151A–152A.
10. Environmental Monitoring and Support Laboratory, U.S. Environmental Protection Agency, "Handbook for Analytical Quality Control in Water and Wastewater Laboratories," March 1979.
11. Shewhart, W. A. Economic Control of the Quality of Manufactured Products. Macmillan: London, 1931.
12. Mullins, E. Analyst 1994, 119, 369–375.
13. Gonzalez, A.; Ruiz, M. A.; Yanez-Sedeno, P.; et al. Anal. Chim. Acta 1994, 285, 63–71.
14. American Public Health Association, Standard Methods for the Analysis of Water and Wastewater, 18th ed. Washington, D.C., 1992. Data from Table 1030:I on page 1–10.
15. American Public Health Association, Standard Methods for the Analysis of Water and Wastewater, 18th ed. Washington, D.C., 1992. Data adapted from Table 1030:II on page 1–10.

For example, the proportion of the area under a normal distribution curve that lies to the right of a deviation of 0.04 is 0.4840, or 48.40%. The area to the left of the deviation is given as 1 – P. Thus, 51.60% of the area under the normal distribution curve lies to the left of a deviation of 0.04. When the deviation is negative, the values in the table give the proportion of the area under the normal distribution curve that lies to the left of z; therefore, 48.40% of the area lies to the left, and 51.60% of the area lies to the right, of a deviation of –0.04.
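The tabulated areas can be reproduced with the complementary error function. The sketch below assumes only the standard relationship between the normal distribution and erfc; it is simply a way to check the table values.

```python
import math

def area_right_of(z):
    """Area under the standard normal curve to the right of z."""
    return 0.5 * math.erfc(z / math.sqrt(2))

print(round(area_right_of(0.04), 4))       # about 0.4840
print(round(1 - area_right_of(0.04), 4))   # about 0.5160
```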

Appendix 1B  t-Table
Values of t for confidence intervals of 90%, 95%, 98%, and 99% (critical values of |t| for α = 0.10, 0.05, 0.02, and 0.01). [The table values are not preserved in this copy.]

F-Table for a Two-Tailed Test at α = 0.05
[The table values are not preserved in this copy.]

Trang 25

KSbOC4H4O6 324.92 compound prepared by drying KSbOC4H4O6·1/2H2O

at 110 °C and storing in desiccator

Bi2O3 465.96 not considered a primary standard

continued

constant weight at 110 °C Most compounds can be dissolved in dilute acid (1:1 HCl or 1:1 HNO3), with heating if necessary; some of the compounds are water- soluble.

A ll compounds should be of the highest available

purity Metals should be cleaned with dilute acid to

remove surface impurities and rinsed with distilled water.

Unless otherwise indicated, compounds should be dried to

Trang 26

730 Modern Analytical Chemistry

MnSO4• H2O 169.01 not considered a primary standard; may be dried at

110 °C without loss of hydrated water

Source: Information compiled from Moody, J R.; Greenberg, R R.; Pratt, K W.; et al Anal Chem 1988, 60, 1203A–1218A; and Smith, B W.; Parsons,

M L J Chem Educ 1973, 50, 679–681.

Appendix 3B  Acid Dissociation Constants
[The table of compounds, conjugate acids, pKa, and Ka values, including the accompanying structures, is not preserved in this copy.]
Source: All values are from Martell, A. E.; Smith, R. M. Critical Stability Constants, Vols. 1–4. Plenum Press: New York, 1976. Unless otherwise stated, values are for 25 °C and zero ionic strength. Values in parentheses are considered less reliable.

Metal–Ligand Formation Constants
[The table values are not preserved in this copy.]
Source: All values are from Martell, A. E.; Smith, R. M. Critical Stability Constants, Vols. 1–4. Plenum Press: New York, 1976. Unless otherwise stated, values are for 25 °C and zero ionic strength. Values in parentheses are considered less reliable.

Standard Reduction Potentials (fragment; potentials in volts)
BrO3– + 6H+ + 6e– ⇌ Br– + 3H2O    1.478
Cd2+ + 2e– ⇌ Cd(s)    –0.4030
Cd(CN)42– + 2e– ⇌ Cd(s) + 4CN–    –0.943
Cd(NH3)42+ + 2e– ⇌ Cd(s) + 4NH3    –0.622
Cl2(g) + 2e– ⇌ 2Cl–    1.396
ClO– + H2O + e– ⇌ 1/2Cl2(g) + 2OH–    0.421 (1 M NaOH)
ClO– + H2O + 2e– ⇌ Cl– + 2OH–    0.890 (1 M NaOH)
HClO2 + 2H+ + 2e– ⇌ HOCl + H2O    1.64
ClO3– + 2H+ + e– ⇌ ClO2(g) + H2O    1.175
ClO3– + 3H+ + 2e– ⇌ HClO2 + H2O    1.181
ClO4– + 2H+ + 2e– ⇌ ClO3– + H2O    1.201
MnO4– + 8H+ + 5e– ⇌ Mn2+ + 4H2O    1.51
MnO4– + 2H2O + 3e– ⇌ MnO2(s) + 4OH–    0.60
Hg2+ + 2e– ⇌ Hg(l)    0.8535
2Hg2+ + 2e– ⇌ Hg22+    0.911
[The remaining entries are not preserved in this copy.]
