
Contaminated Ground Water and Sediment - Chapter 3



DOCUMENT INFORMATION

Basic information

Title: Contaminated Ground Water and Sediment
Authors: George F. Pinder, David E. Dougherty, Robert M. Greenwald, George P. Karatzas, Peter K. Kitanidis, Hugo A. Loaiciga, Reed M. Maxwell, Alexander S. Mayer, Dennis B. McLaughlin, Richard C. Peralta, Donna M. Rizzo, Brian J. Wagner, Kathleen M. Yager, William W.-G. Yeh
Institution: University of California, Berkeley
Specialization: Environmental Engineering
Document type: monograph
Year of publication: 2003
City: Berkeley
Format
Number of pages: 72
File size: 1.77 MB



3 Optimization and Modeling for Remediation and Monitoring

prepared by George F. Pinder

with contributions by David E. Dougherty, Robert M. Greenwald, George P. Karatzas, Peter K. Kitanidis, Hugo A. Loaiciga, Reed M. Maxwell, Alexander S. Mayer, Dennis B. McLaughlin, Richard C. Peralta, Donna M. Rizzo, Brian J. Wagner, Kathleen M. Yager, William W.-G. Yeh

CONTENTS

3.1 Introduction
3.2 The User's Perspective
3.2.1 The View from the U.S. Environmental Protection Agency (USEPA)
3.2.2 The View from the U.S. Department of Energy (DOE)
3.2.2.1 Application of Site Characterization and Monitoring Technologies
3.2.2.2 Numerical and Optimization Models
3.2.2.3 Innovative Technologies and the Regulatory Process
3.2.2.4 Technology Needs
3.2.3 The View from the U.S. Department of Defense (DoD)
3.2.3.1 Optimization Efforts
3.2.3.2 Model Development Efforts
3.2.3.3 Monitoring Efforts
3.2.4 The View from Industry
3.3 State of Knowledge and Practice
3.3.1 The Simulation Optimization Approach
3.3.1.1 Gradient Control Remediation Technology
3.3.1.2 Concentration Constraints Remediation Technology


3.3.2 Stochastic Optimization to Accommodate Potential Design Failure
3.3.2.1 Chance-Constrained Ground Water Management Model
3.3.2.3 Alternative Stochastic Optimization Methods
3.3.3 Uncertainty
3.3.3.1 Sources
3.3.3.2 Examples
3.3.4 Design-Risk Cost Tradeoff
3.3.5 Long-Term Ground Water Monitoring
3.3.5.1 The Relationship between Remedy and Monitoring
3.3.5.2 Performance Monitoring Problems
3.3.5.3 Methods
3.4 Gaining Acceptance
3.4.1 Remediation System Design Optimization Demonstrations
3.4.1.1 Dissolved TCE Cleanup at Central Base Area, Norton Air Force Base, California
3.4.1.2 Model Calibration and TCE/PCE Plume Containment at March AFB, California
3.4.1.3 Containment and Cleanup of TCE and DCE Plumes, Wurtsmith AFB, Michigan
3.4.1.4 Dissolved TCE Cleanup at Massachusetts Military Reservation
3.4.2 Long-Term Monitoring Field Studies
3.4.3 Communication Improvements
3.5 Challenges and Emerging Issues
3.5.1 Optimization Algorithmic Challenges Identified through Application Needs
3.5.1.1 Natural Variability Over Space and Time
3.5.1.2 Multiple Constituents
3.5.1.3 Multiple Phases
3.6 Summary
Acknowledgments
References

3.1 INTRODUCTION

The focus of this chapter is optimization and modeling for remediation and monitoring. The goal is to provide the reader with insights into the optimization and modeling tools available for cost-effective resolution of environmental problems, especially as they pertain to ground water contamination and its long-term impacts. To achieve this goal, the technical and practical challenges inherent in this approach are presented as well as documented accomplishments. Utilizing this organizational approach, the reader should comprehend both the financial benefits and the anticipated …


While the discussion of each specific issue reflects the views of the authors, the issues have been defined in such a way as to provide an integrated discussion of the main topics. Nevertheless, the styles, formats, and levels of technical detail found in the various presentations are, by their nature, different.

3.2 THE USER'S PERSPECTIVE

3.2.1 THE VIEW FROM THE U.S. ENVIRONMENTAL PROTECTION AGENCY (USEPA)

… remediation technologies. One such promising innovative technology is mathematical optimization for the design and redesign of remediation and monitoring systems. However, as with many innovative technologies, the regulated community has been reluctant to adopt these approaches readily due to the lack of cost and performance data and concern over regulatory acceptance.

In 1999, the USEPA completed a demonstration project applying hydraulic optimization techniques for pump-and-treat systems (Greenwald, 1999). The scope of this study included selecting three sites with existing pump-and-treat systems, screening the sites for optimization potential, and applying a hydraulic optimization code at each site. At two of the sites, pumping solutions were obtained that had the potential to yield millions of dollars in savings relative to current pumping rate costs. At the third site, no substantial improvement over the current design was identified with optimization. The general conclusions from this study were that hydraulic optimization has the potential to improve operating pump-and-treat systems and that more complicated sites (i.e., large ground water plumes and many extraction and injection wells) are more likely to benefit from hydraulic optimization. It is important to note that there are many mathematical optimization algorithms available and that this study evaluated only one hydraulic optimization approach.

Although this study conclusively determined that mathematical optimization can be beneficial at improving pump-and-treat system design, very few applications of this technology have been observed at USEPA or other state-led sites. This lack of application of optimization algorithms can be attributed to several factors, including lack of technology awareness, lack of well-trained optimization modelers in the consulting engineering community, and cost.


Certainly the lack of awareness of optimization techniques in the remediation community is the primary factor contributing to low use. Although optimization algorithms are widespread in many industries, the remediation community has not adopted these techniques as standard practice for remediation. Furthermore, there are few trained users or real-world examples of their applications. For this reason, industry and the consulting engineering and government communities are not fully aware of the benefits of optimization algorithms and do not have personnel trained in these applications. Without the pull from problem holders requesting these techniques or the push from consulting engineers recommending their use, there is minimal demand for applying optimization algorithms in hazardous waste site cleanup. From a regulatory perspective, because few sites have requested the use of mathematical optimization algorithms, regulators have not been widely exposed to their applications.

Another problem associated with the lack of use of mathematical optimization is cost or perceived cost. Many sites have developed simple flow models based on limited site-characterization information. These models are generally used as one tool in the decision-making process for the site, but often are not adequate models for use with an optimization algorithm. In order to ensure a worthwhile optimization analysis, the base model(s) might need to be updated or completely redone, which is an additional (and sometimes unforeseen) cost. This additional step prior to an optimization analysis can discourage continuing with the optimization analysis.

Although there are several reasons for the lack of use of optimization in the remediation field, there remains a definitive need to improve remediation systems using mathematical optimization or other approaches. The USEPA estimates that over 700 pump-and-treat systems are under construction or operating at Superfund sites across the country per the Records of Decision (RODs) (USEPA, 1999). Many of these systems are anticipated to operate for years to decades at substantial cost to industry and the government. Furthermore, many of these systems were designed based on limited site information and limited knowledge of the capabilities of pump-and-treat systems. All stakeholders can benefit substantially by implementing mathematical optimization techniques in these cases. Other continuous improvement techniques, such as periodically evaluating system performance, labor and monitoring practices, aboveground treatment components, and data management, also should be considered. There is a tremendous need to ensure that pump-and-treat systems and other remedial systems are properly designed, maintained, and monitored; the remediation community should consider the use of optimization approaches to this end.

3.2.2 THE VIEW FROM THE U.S. DEPARTMENT OF ENERGY (DOE)

The DOE is a major partner in managing the nation's toxic substances in the subsurface. The DOE has administrative jurisdiction over several sites that contain remnants of radioactive and other toxic wastes generated during the Cold War's nuclear race. Those sites include, but are not limited to, Hanford Reservation (Washington), Idaho National Environmental Engineering Laboratory (INEEL, Idaho), Oak Ridge National Laboratory (ORNL, Tennessee), Rocky Flats (Colorado), and the Savannah River site (SRS, South Carolina and Georgia).


Taken in conjunction, these DOE sites represent perhaps the most significant repository of radioactive compounds in the subsurface environment in the U.S. The cost of containment, abatement, and remediation (i.e., environmental restoration) associated with DOE sites outranks that of any other agency, public or private, in the nation. The monitoring, characterization, and modeling of subsurface pollutants at DOE sites present enormous challenges due to the nature of the pollutants and the complexity and heterogeneity of the transport environment. On the other hand, the challenges present opportunities to use innovative optimization methods to help identify environmental restoration technologies at DOE sites.

This section summarizes the results of a recent survey of subsurface characterization, environmental monitoring, and modeling technologies at DOE sites. Numerical modeling technologies included optimization models as well. Although the focus of this chapter is on optimization methods, information gathered on all subsurface characterization and environmental monitoring technologies is presented to demonstrate that the application of optimization methods at DOE sites cannot be examined in isolation from other technologies. In fact, optimization methods are only beginning to be tested at large-scale DOE sites. In this respect, their usefulness and effectiveness in large-scale and complex subsurface pollution situations at DOE sites is still experimental. It should be noted that the information and opinions presented in this section do not reflect the DOE's official position on subsurface contamination management at its sites.

3.2.2.1 Application of Site Characterization and Monitoring Technologies

Table 3.1 shows a summary of technologies currently in use or those that have been used at INEEL, ORNL, and SRS in subsurface characterization, environmental monitoring, and modeling. This table also summarizes the survey responses obtained from the three sites. An "X" in Table 3.1 indicates that the technology is currently used. A blank space indicates neither current nor past use of a specific technology.

As seen in the table, a wide range of remote sensing, geophysical technologies, nuclear logging, drilling, ground water and vadose zone sampling, analytical technologies, and numerical/statistical technologies, as well as optimization methods, are currently in use or have been used at all three sites. INEEL and ORNL reported the use of 12 to 13 of the 30 listed analytical technologies. These two sites rely largely on off-site analytical laboratories for sample analysis. Thus, many of the listed analytical technologies are not deployed as functional units within INEEL or ORNL. SRS, on the other hand, reported the application of 24 of the 30 listed analytical technologies. This reflects the fact that Westinghouse, the management and operation (M&O) contractor at SRS, maintains fully equipped and staffed analytical laboratories within the SRS boundaries, where many of the field samples undergo analysis.

Ecological monitoring, an aspect of site characterization, was overlooked initially and not mentioned in the survey. ORNL and SRS actively monitor vegetation, fish, mammals, and other biota, as well as surface water bodies. Living organisms are tested mostly for radionuclides and metals that accumulate in tissues (e.g., cesium and strontium isotopes, mercury). Ecological monitoring is performed by capturing …


TABLE 3.1 (CONTINUED)
Summary of Site Characterization, Environmental Monitoring, and Modeling Technologies Used at Selected DOE Sites

Row labels preserved from the table include the categories Ground Water Sampling, Soils Characterization, and Vadose Zone Water and Gas Monitoring (electronic leak detection systems, thermocouple psychrometers, frequency-domain capacitance probes, automatic VOC collection/gas chromatography), along with analytical technologies such as supercritical fluid chromatography, ion mobility spectrometry, laser-induced breakdown spectrometry, near-IR reflectance/transmission spectrometry, radiation detectors (e.g., Geiger counter, solid/liquid scintillator, semiconductor detector), and environmental test kits (color testing, titrimetric testing).

Notes: … penetrometer system; VOC = volatile organic compound; IR = infrared spectroscopy; ORP = oxidation reduction potential; TDS = total dissolved solids; DO = dissolved oxygen; RDX = royal demolition explosive; TNT = trinitrotoluene; PCBs = polychlorinated biphenyls.


… of contamination spreading through soil, water, air, and living organisms.

3.2.2.2 Numerical and Optimization Models

Environmental restoration has progressed from screening-level and definitive-level characterization to risk analysis, containment, abatement, and remediation. As a result, models have become flexible and useful tools for creating and analyzing a variety of scenarios in a cost-effective manner. For example, a mass transport numerical model can simulate the fate and transport of benzene in ground water that is being pumped, treated, and recharged according to a specific pump-and-treat scheme. Or a vadose zone model such as SESOIL can be implemented to assess the effect of soil capping on long-term metal vertical migration in the vadose zone.

Numerical, spatial, and statistical models are accepted and used for a wide range of applications at all three sites (Table 3.1). Modelers at DOE sites typically are part of the risk analysis groups at these sites. The risk analysis groups determine the likelihood of environmental harm caused by pollutants within DOE sites. By and large, they house most of the personnel qualified to work with simulation and optimization models.

The state of the art of optimization modeling at DOE sites consists of heuristic search techniques based on ground water flow and transport models. In this approach, the analyst implements ground water and transport models for a selected range of stress or remediation control variables (e.g., pumping rates, soil venting aeration, permeable treatment bed thickness). The measure of effectiveness of a particular control variable is then assessed. For example, the amount of a polar hydrocarbon retained in a permeable treatment bed is determined as a function of the bed's thickness. Or the concentration of a chlorinated hydrocarbon remaining in solution is assessed as a function of the pumping rate in a pump-and-treat system. The analyst applies his experience and professional judgment in constraining the feasible range of the decision variables, while noting other important factors such as the cost of containment, abatement, and remediation, the time required to achieve desired targets, and other regulatory constraints. Expert systems, also called decision support systems (DSS), have been developed to assist risk analysts in the search for the best environmental restoration alternatives in the heuristic approach (Loaiciga and Leipnik, 2000). The final result of the heuristic search is a series of values of the measure of pollution-control effectiveness and related parameters needed to achieve it. An assessment of the uncertainty associated with each of the entertained pollution-control options can be issued also.


The final pollution-control decision, which can be a mixture of alternative restoration technologies, is arrived at through a consensus-building approach that involves contractors, DOE personnel, and regulators (state and federal). The implementation of restoration strategies relies heavily on real-time monitoring to make adjustments as needed while the restoration work progresses. In this sense, the restoration work relies on feedback and corrections to achieve pollution-control targets.
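The heuristic approach described above amounts to sweeping a control variable through an analyst-selected range and tabulating a measure of effectiveness for each trial. The following sketch illustrates that workflow; the stand-in simulator, cost model, and all numerical values are illustrative assumptions, not DOE site data or any particular code.

```python
# Minimal sketch of the heuristic (trial-and-evaluate) search described above.
# The transport simulator is represented by a stand-in analytical expression;
# at a real site it would be a calibrated flow-and-transport model run.

import math

def simulated_residual_concentration(pumping_rate_m3_per_day: float) -> float:
    """Stand-in for a flow/transport simulation: residual chlorinated
    hydrocarbon concentration (mg/L) after a fixed operating period."""
    # Assumed exponential decline of residual concentration with pumping rate.
    return 5.0 * math.exp(-pumping_rate_m3_per_day / 400.0)

def annual_cost_usd(pumping_rate_m3_per_day: float) -> float:
    """Stand-in cost model: fixed O&M plus a rate-dependent treatment cost."""
    return 150_000.0 + 90.0 * pumping_rate_m3_per_day

# The analyst-selected feasible range of the decision variable.
candidate_rates = range(100, 1100, 100)  # m^3/day

print(f"{'rate (m3/d)':>12} {'residual (mg/L)':>16} {'annual cost ($)':>16}")
for rate in candidate_rates:
    conc = simulated_residual_concentration(rate)
    cost = annual_cost_usd(rate)
    print(f"{rate:>12d} {conc:>16.3f} {cost:>16,.0f}")
```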

The implementation of optimization modeling at DOE sites is a distant variation of the classical open-loop optimization prevalent in the research literature. Classical open-loop optimization refers to optimizing a system that has no feedback control and primarily employs linear, nonlinear, and dynamic programming algorithms. Contaminant processes of varying degrees of complexity are imbedded in the mathematical formulation of the search algorithm, which yields a set of decision variables that maximizes or minimizes a prespecified restoration benefit/cost (objective) function while satisfying a set of constraints imposed by the control, abatement, and remediation technologies; by resource and economic limitations; and by the intervening biological, chemical, and physical processes (Willis and Yeh, 1987). Because restoration strategies derived by the classical optimization approach have no feedback mechanism, they are best interpreted as plausible courses of action that need frequent updating to achieve desired goals. The greatest limitation of classical optimization is its ability to deal with the subtleties and complexities of real-world restoration problems at DOE sites. Another obstacle to its adoption by DOE is the high degree of specialization required of the users. These obstacles render classical optimization out of reach for DOE users and others.

3.2.2.3 Innovative Technologies and the Regulatory Process

One of the key issues raised at all surveyed DOE sites is the role that state and federal regulations play in the application of new environmental restoration technologies. According to input received during interviews, state and federal regulators are generally risk-averse when approving new characterization, monitoring, and modeling technologies. Technical procedures for sample collection and analysis approved at each site rely on traditional and presumably well-tested technologies. Thus, for example, a split-spoon sampler is preferred over a Geoprobe® soil corer at INEEL because the latter has not been proven to regulators to yield samples of at least equal representativeness to those obtained by the former. This preference exists in spite of the fact that the Geoprobe soil corer yields shallow and deep soil cores that preserve the integrity of volatile organic compounds (VOCs) in the soil matrix, a most difficult task with split-spoon samplers. On the other hand, some examples justify the risk aversion of regulators toward new technologies. One is the case of a polychlorinated biphenyl (PCB) in situ immunoassay test kit that was used at SRS in an attempt to separate PCB debris at an old, weathered landfill. About 50% of the in situ results were false positives. Debris separation ultimately relied on standard sample collection and laboratory analysis.

DOE sites have so-called technology demonstration programs that seem ideal for testing new environmental restoration technologies.


Such a program could be a natural framework under which to test a novel apparatus, technique, or model, and, if successful, approve it for field deployment or application. The reality is somewhat different. Contractors work under strict federal facility compliance agreements (FFCAs) that stipulate the environmental restoration milestones and deadlines to be met under agreed-upon budgets. As a result, the contractors have limited funding, time, and resources to develop, test, and permit new equipment and simulation models. Alternatively, the new technology research and development could be undertaken by universities or other research centers and then transferred to DOE if proven successful in test trials. The latter avenue seems a necessity for optimization modeling, which requires significant mathematical and computational skills rarely found outside university laboratories. Yet, a considerable gap remains between the capabilities currently offered by optimization modeling and the realities and complexities of DOE environmental restoration. It is in this respect that pilot test projects are most needed to determine the potential contribution of optimization techniques to environmental restoration at DOE sites.

3.2.2.4 Technology Needs

Finally, site-characterization technology users expressed consensus on the need for a few technologies that, if available, would greatly expedite environmental restoration efforts. First is a field-deployable probe for radionuclide speciation with adequate quantitative accuracy. Such a device would bypass arduous and hazardous sampling, handling, testing, and disposal of radioactive materials. The other technology on the users' wish lists is an accurate in situ analyzer for VOCs in soils and ground water. VOC loss during sampling is a major problem that biases analytical results, and VOCs represent the second most threatening contaminant after radionuclides at INEEL, ORNL, and SRS. Low priority was given to optimization model application, probably due to limited experience with applications at DOE sites and the lack of familiarity of DOE managers, regulators, and contractors with optimization models in general.

3.2.3 THE VIEW FROM THE U.S. DEPARTMENT OF DEFENSE (DoD)

Congress established the Defense Environmental Restoration Program (DERP) in 1984 to remediate contamination at DoD sites. Since then, the DoD has spent almost $20 billion on the DERP through two accounts. About $5 billion has been spent through the Base Realignment and Closure Act (BRAC) account to remediate bases being closed and transferred to civilian use. The rest of the funds have been spent through the DERP account at bases remaining active.

Funding limitations make it necessary to prioritize remedial activities and approaches. After all, some contamination poses less risk than others. Some remediation approaches cost less than others, but the cheaper alternative can take longer to achieve about the same result as the more expensive alternative. Selecting a remediation approach for a contaminated site can involve economic analysis and compromise between DoD and environmental regulatory agencies.

The DoD has attempted to improve the efficacy and reduce the cost of remediation.


Included actions have involved innovative technology demonstration projects, technology transfer, system operation evaluations, research, and development. In DoD parlance, optimization refers to any effort to improve a process. Optimization can involve reducing costs of construction, labor, energy, treatment, monitoring, analysis, reporting, documentation, data retrieval, or data archiving without endangering human health and safety or the environment. This section mentions some DoD optimization efforts in monitoring, analysis, and remediation. Most do not include formal mathematical optimization. All the major defense services and agencies support some efforts in optimization.

3.2.3.1 Optimization Efforts

DoD agencies share information and methods with each other and other organizations. The Technology Transfer Division of the Air Force Center for Environmental Excellence (AFCEE/ERT) organizes an Annual Technology Transfer Conference highlighting new developments and lessons learned by DoD services and agencies, the U.S. Geological Survey (USGS), and the USEPA.

Each military service has at least one center developing improvements in remediation technology, often in collaboration with other organizations. For example, the Naval Facilities Engineering Service Center (NAVFAC), in cooperation with the other services and the USEPA, currently leads a project demonstrating pump-and-treat optimization. The Army has several centers of expertise that apply and publish remediation guidance. This section does not mention all DoD centers or initiatives; however, it does discuss example demonstration and technology transfer initiatives promoting optimized methods.

3.2.3.1.1 AFCEE Pump-and-Treat Optimization

AFCEE/ERC conceived and awarded a project in 1993 to demonstrate applying formal optimization to pump-and-treat or pump, treat, and reinject operations. (Hereinafter, pump-and-treat is used to refer to both types of systems.) Resulting efforts demonstrated that significant cost reductions could result from applying simulation optimization modeling to pump-and-treat system design and pumping-strategy development. Additional simulation optimization applications at Air Force and DoD sites followed the ERC project.

3.2.3.1.2 DoD Pump-and-Treat Operation Evaluation

By 1996, DoD was operating 75 pump-and-treat systems as the primary remedy for sites having chlorinated solvent–contaminated ground water. Because of the large operation and maintenance (O&M) costs, the DoD Office of the Inspector General decided to evaluate the cost and effectiveness of these systems. Some of the findings are as follows:

• Annual pump-and-treat system costs reached $40 million by 1996
• Many of the pump-and-treat systems were designed before more suitable technologies were available
• Sometimes the achieved remediation using pump-and-treat was slow
• Some pump-and-treat systems were not going to achieve required cleanup goals within a reasonable period


• Many of the pump-and-treat systems had indefinite shutoff dates
• Continuing the operations and monitoring of pump-and-treat systems would consume an increasing portion of the DERA (DoD, 1998)

The DoD recommended that military services and agencies evaluate the existing systems to determine whether replacing pump-and-treat with other technologies might improve performance or reduce cost. The Inspector General recommended that the military cooperate with the public, scientists, and environmental regulators to determine more effective alternate remediation methods.

Respondents to the DoD evaluation in 1998 indicated that monitored natural attenuation was the preferred remediation approach and that this approach was being selected for new sites, if possible. Partially as a result of the DoD report, the military increased efforts to improve pump-and-treat management and use better and less costly approaches when appropriate.

3.2.3.1.3 Air Force/Defense Logistics Agency Remediation Process Optimization (RPO)

RPO is a program-management tool developed by the AFCEE/ERT to provide a systematic iterative approach to evaluate all phases of remedial actions and update and optimize the effectiveness and efficiency of efforts to achieve cleanup goals. RPO provides a mechanism to feed information back into the decision process so that goals can be updated (if necessary) and met. The objective of RPO is to utilize best-practice technical and management approaches to protect human health and the environment (AFCEE, 1999). The Air Force Base Conversion Agency (AFBCA) is applying the RPO process through AFCEE/ERT and is responsible for remediation programs of bases being closed and converted to civilian use. Contaminated property cannot be transferred to civilian ownership until a remediation method approved by regulators is in place. Because Congress wants property ownership to proceed as quickly as possible, AFBCA remediation projects have a strong temporal component. The AFBCA is eager to get approved remedies in place as quickly as possible, within funding limits, so it can transfer property ownership. With this in mind, the AFBCA has initiated a program to periodically (usually every 5 years) reevaluate pump-and-treat operations and the need for existing remediation systems.

One of the first RPO reports concerned Operable Unit 1 (OU1) of AFBCA's George Air Force Base. OU1 contains a pump-and-treat system to treat a trichloroethene (TCE) plume that originates in an upper unconfined stratum and reaches a lower stratum. Per the ROD, the pump-and-treat system must contain the plume of dissolved TCE and reduce concentrations to below 5 ppb. The pump-and-treat system began operation in 1991 and was augmented in 1996. Treated water is injected into the upper stratum upgradient of the plume. Regulators fear that the injection increases contamination migration to the lower stratum. The Air Force and environmental regulators have not yet reached agreement on the site conceptual model and on how the contamination reaches the lower stratum.

The RPO report states that the pump-and-treat system has been inefficient in reducing mass, and it is questionable whether this method will achieve cleanup goals within a reasonable period. RPO recommendations included the following:


• Ceasing pumping at 11 of the 18 extraction wells, reducing flow by up to 50%
• Evaluating water treatment and disposal alternatives
• Reducing sampling frequency to annual from semiannual
• Reducing the number of sampled monitoring wells from 47 to 34
• Pursuing alternative cleanup goals
• Fully evaluating other potential remediation measures (e.g., monitored natural attenuation, phytoremediation)

The RPO team considers that implementing short-term recommendations could reduce annual costs by more than $170,000, and long-term recommendations could save $5 million during the remaining 33-year project life.

3.2.3.2 Model Development Efforts

In response to the need for improved integrated software to aid ground water cleanup, the DoD, in partnership with the DOE, USEPA, Cray Research, and 20 academic partners, has developed the DoD Ground Water Modeling System (GMS) (http://chl.wes.army.mil/software/gms/). The GMS is comprehensive, integrated software for simulating subsurface flow and contaminant fate and transport. It includes many popular or public domain simulation models. GMS simplifies ground water flow and transport modeling by making it easy to use an assemblage of computational tools. GMS provides tools for simulation, site characterization, model conceptualization, mesh and grid generation, geostatistical evaluation, and visualization.

3.2.3.3 Monitoring Efforts

3.2.3.3.1 Passive Diffusion Bag (PDB) Samplers

Using PDB samplers (developed by Don Vroblesky of the USGS) can significantly reduce the cost of ground water sampling. PDB samplers can obtain representative VOC ground water concentrations from monitoring wells. A typical PDB sampler consists of a low-density polyethylene tube that is closed at both ends and lies flat when empty. The tube is filled with deionized water and is positioned at the target location in the aquifer by attachment to a weighted line. The PDB samplers equilibrate within approximately 48 h for TCE and tetrachloroethene (PCE). Vinyl chloride and some chloroethanes can require between 96 and 168 h to equilibrate. The samplers remain in a well at least 2 weeks to allow the well water to restabilize after the disruption and absorption caused by the sampler. Recovery consists of removing the samplers from the well and immediately transferring the enclosed water to 40-ml sampling vials for analysis. The samplers can help delineate contaminant stratification in wells having insignificant vertical flow, and multiple PDB samplers can be used to help identify chemically stratified wells or wells with flow pattern changes through the screen as a result of ground water pumping or seasonal fluctuations. However, PDB samplers are ineffective for inorganic ions or for highly soluble organics such as methyl tert-butyl ether (MTBE).


Three years of intensive testing at Air Force and Navy sites indicate that sampling with PDBs produces data as accurate as those obtained through other presently approved sampling techniques. Using PDB samplers can result in cost savings of 50 to 70%. The AFCEE/ERT and USGS are evaluating a new passive sampler exclusively for inorganic and natural attenuation parameters and analytes.

An interagency workgroup, including the Air Force, Army, Navy, Defense Logistics Agency (DLA), USEPA, Interstate Technology and Regulatory Cooperation (ITRC), and USGS, published a user's guide for PDB samplers (USGS, 2001). The AFCEE/ERT sponsored development of the guidance and is implementing its use at 20 DoD installations.

3.2.3.3.2 Pneumatic Well Logging (PneuLog®) of Soil Vapor Extraction (SVE) Wells

PneuLog (a product of Praxis Environmental Technologies) is an in-well instrument used to quickly define the vertical distribution of contamination and soil permeability in SVE wells. PneuLog provides much greater vertical profiling data than any other available technique for optimizing or supporting closure of SVE systems. Under active vapor extraction, the PneuLog device is lowered and raised along a well screen using an automated cable reel while simultaneously recording the flow rate and total vapor concentration. Flow can be attributed to specific soil intervals from the measured changes in cumulative flow. Additionally, this change in flow over a depth interval effectively defines its permeability. The contaminant vapor concentration is measured continuously through a Teflon® sampling tube located just above the flow sensor and conveyed to the surface, where it is analyzed using a photoionization detector (PID). A mass balance is used to determine a profile of the soil-gas contamination from the changes in cumulative flow and total concentration measured in the well. In addition, vapor samples can be collected at discrete depths for compound-specific analyses. Site conceptual models are improved by defining preferential flow paths that bypass contaminated intervals and by identifying mass transfer limited soils that extend cleanup times. The data allow SVE to focus on the most contaminated intervals and avoid stagnation zones. A more detailed soil-gas concentration profile allows more accurate contaminant transport modeling to assess the risk from residual contamination. More accurate risk evaluation allows remedial managers to know when the vadose zone is sufficiently clean to terminate SVE. This technology is applicable only to the screen intervals of active SVE wells.
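The mass balance mentioned above can be illustrated with a simple interval-by-interval calculation: the flow entering the well across a screened interval is the difference in cumulative flow over that interval, and the contaminant contribution is the corresponding difference in mass flux. The sketch below uses made-up profile values, not PneuLog measurements, and is only meant to show the arithmetic.

```python
# Minimal sketch of the interval-by-interval mass balance described above.
# Profile values are illustrative assumptions, not field data.

# (depth_m, cumulative_flow_m3_per_min, total_vapor_conc_mg_per_m3)
# recorded as the tool moves up the screen under active extraction.
profile = [
    (30.0, 0.2, 50.0),
    (25.0, 0.5, 120.0),
    (20.0, 0.9, 400.0),
    (15.0, 1.4, 380.0),
    (10.0, 1.5, 360.0),
]

print(f"{'interval (m)':>16} {'flow in (m3/min)':>18} {'interval conc (mg/m3)':>22}")
for (z_lo, q_lo, c_lo), (z_hi, q_hi, c_hi) in zip(profile, profile[1:]):
    dq = q_hi - q_lo                            # flow entering the well over this interval
    dm = q_hi * c_hi - q_lo * c_lo              # contaminant mass flow contributed (mg/min)
    conc = dm / dq if dq > 0 else float("nan")  # apparent soil-gas concentration
    print(f"{z_lo:>7.1f}-{z_hi:<8.1f} {dq:>18.2f} {conc:>22.1f}")
```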

PneuLog has been utilized at numerous BRAC and active Air Force bases to improve conceptual site models, enhance SVE operations, and support closure of SVE systems. The AFCEE funded use of the technology in the initial characterization of the vadose zone at three sites and supported efforts to optimize SVE operations at seven sites using PneuLog. The optimization effort at the seven sites has saved the Air Force an estimated $300,000 to $500,000 to date. Additional cost savings will be achieved over time as these sites achieve closure based on the detailed data set provided by PneuLog. Further details of these site-specific efforts are provided in the final report submitted by Praxis to the AFCEE (Praxis Environmental Technologies, 2000).


3.2.4 THE VIEW FROM INDUSTRY

Industry is primarily interested in reducing long-term cost liability. Therefore, industry generally is willing to spend money up front for an optimization analysis (plus any subsequent costs associated with system modifications) if it is considered likely that the total life-cycle cost will be reduced as a result. Making this assessment requires a site-specific cost-benefit analysis prior to a full optimization evaluation that accounts for the expected cost of the optimization analysis, expected costs of system modifications, and expected savings.

Industry generally performs cost evaluations in terms of net present value (NPV), using a discount rate that adjusts future expenditures to their present value. (Money not spent today can generally be invested by industry at a rate that exceeds inflation; therefore, current dollars are worth more than future dollars.) Consequently, optimization analyses performed for industry should be performed with respect to NPV.
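As a rough illustration of an NPV-based comparison, the sketch below discounts two assumed cost streams (a baseline pump-and-treat O&M schedule and an optimized alternative with an up-front analysis and modification cost) at an assumed discount rate; none of the figures come from the chapter.

```python
# Minimal sketch of an NPV comparison under an assumed discount rate.
# Costs and rates are illustrative only.

def npv(annual_costs, discount_rate):
    """Net present value of a stream of end-of-year costs."""
    return sum(c / (1.0 + discount_rate) ** t for t, c in enumerate(annual_costs, start=1))

rate = 0.05                                       # assumed discount rate
baseline = [400_000] * 20                         # 20 years of current O&M cost
optimized = [400_000 + 150_000] + [300_000] * 19  # year 1 adds analysis/modification cost,
                                                  # then reduced O&M thereafter

print(f"baseline NPV : ${npv(baseline, rate):,.0f}")
print(f"optimized NPV: ${npv(optimized, rate):,.0f}")
print(f"NPV savings  : ${npv(baseline, rate) - npv(optimized, rate):,.0f}")
```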

Industry must gain the approval of regulatory agencies to implement or modify remedial strategies. Strategies that are derived by using mathematical optimization techniques linked to ground water simulation models are no different in this regard than strategies derived solely on the basis of ground water modeling, because the mathematical optimization algorithms simply perform a series of simulations with the ground water model in an efficient order. Therefore, regulatory issues should focus on the validity of the ground water model predictions. Once that is established (i.e., the simulation model is accepted as a valid design tool by the regulators), the linkage of mathematical optimization algorithms with the simulation model should not create additional regulatory concerns.

Industry is keenly aware that new approaches to ground water remediation continue to evolve. In some cases, the evolution is because of new technology (e.g., in situ bioremediation, chemical oxidation, permeable reactive barriers), and in other cases, regulatory reform (e.g., monitored natural attenuation). The determination of whether to apply simulation optimization techniques must consider not only the potential benefits with respect to the current remediation strategy (e.g., pump and treat), but also whether resources are better spent pursuing alternative remedial approaches that might replace and/or augment the current remediation strategy. At many sites, long-term ground water monitoring costs can in fact be the greatest life-cycle cost component. At these sites, the optimization of ground water monitoring can represent the greatest opportunity for future cost savings.

3.3 STATE OF KNOWLEDGE AND PRACTICE

The contamination of ground water supplies poses widespread and significant environmental problems. In the past few decades, different remediation strategies have been applied and a great deal of research is in progress. The most common ground water remediation techniques are as follows:

• Pump and treat
• Bioremediation


In recent years, optimization management models have been developed to design ground water remediation strategies. These models combine mathematical optimization techniques with ground water flow and mass transport simulators to determine an optimal remedial design. Regarding the aforementioned remediation techniques, optimization management models have been presented for pump-and-treat (early 1970s) (Gorelick, 1983) and bioremediation (late 1990s).

Another important area of research and development over the past few years is long-term ground water monitoring design optimization. The long-term ground water monitoring issue is significant because of the duration of monitoring programs, the need to verify remedies, and the potential for remedy modifications if either the remedy or monitoring plan does not perform adequately. In addition, long-term ground water monitoring has received substantially less attention than remediation process design optimization, so a greater potential exists for significant impacts. Time-robust monitoring networks (i.e., monitoring networks that perform well for extended periods of time) have not been investigated extensively to date and are recommended as a research focus area.

3.3.1 THE SIMULATION OPTIMIZATION APPROACH

The development of ground water simulation models in the early 1970s provided planners with quantitative techniques for analyzing alternative management strategies. In recent years, simulation models have been combined with optimization models to identify the best management alternatives while considering management objectives and constraints. Typical ground water remediation problems involve the design of the well field, that is, the determination of the number, location, and pumping/recharge schedule of all pumping/recharge wells. Gorelick (1983), Yeh (1992), Ahlfeld and Heidari (1994), Wagner (1995), and Ahlfeld and Mulligan (2000) have provided extensive reviews on coupling simulation models with optimization models.

The mathematical formulation of a ground water management problem consists of an objective function that is related either to total remediation cost or to the total amount of pumped water, subject to a set of constraints that are based on hydraulic heads, flows, or concentrations at selected locations. Depending on the kind of constraints, whether hydraulic heads or concentrations, the following two basic approaches appear to be employed: ground water management models involving hydraulic constraints and those involving concentration constraints.
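One common way to write the management problem just described is sketched below; the notation is illustrative (the chapter does not fix these symbols), with pumping rates as decision variables, a cost or total-pumping objective, and head and concentration constraints evaluated by the simulator.

```latex
% Illustrative statement of the management problem described above.
\begin{align*}
\min_{Q_1,\dots,Q_n} \quad & \sum_{j=1}^{n} c_j\, Q_j
  && \text{(total pumping or pumping cost)} \\
\text{subject to} \quad
  & h_k(Q_1,\dots,Q_n) \ge h_k^{\min}, && k \in \text{head-constraint locations}, \\
  & C_i(Q_1,\dots,Q_n) \le C^{*},      && i \in \text{concentration-constraint locations}, \\
  & 0 \le Q_j \le Q_j^{\max},          && j = 1,\dots,n,
\end{align*}
```

Here $Q_j$ is the pumping rate at candidate well $j$, and the heads $h_k$ and concentrations $C_i$ are evaluated with the flow and transport simulator.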

3.3.1.1 Gradient Control Remediation Technology

The primary goal of many ground water remediation systems is to contain impacted ground water by preventing ground water flow beyond a specified boundary (i.e., horizontally or vertically). This containment can be accomplished by controlling hydraulic gradients. Most pump-and-treat systems have been designed using numerical simulation models for ground water flow, such as MODFLOW (Harbaugh and McDonald, 1996a, 1996b). Traditionally, the hydraulic simulation model is run repeatedly to simulate different pumping scenarios. Each scenario is typically evaluated with respect to the number of wells required and the total pumping rate necessary to achieve the required hydraulic containment while maintaining compliance with other design constraints (e.g., limits on water levels, drawdowns). These manually iterative simulations rely heavily on the experience and insight of the modeler, who must personally determine each successive trial. A limitation of this manually iterative approach is that there are an infinite number of well location and well rate combinations to consider, and only a small number of numerical simulations are practical.

The linkage of mathematical optimization techniques with the ground water flow simulator is an attractive alternative for gradient control problems. The most popular technique is the response matrix technique, which is described in detail in Gorelick et al. (1993) and Ahlfeld and Mulligan (2000). This approach capitalizes on the linear relationship between pumping rate and drawdown that applies to many ground water systems (i.e., the law of linear superposition) and easily extends to a linear relationship between pumping rates and hydraulic gradients. This linear relationship allows an optimization problem to be formulated as a linear (or mixed-integer linear) program, where the decision variables are the pumping locations and pumping rates. The optimization seeks to minimize an objective function (e.g., minimize total pumping rate) subject to a set of constraints that all must be satisfied according to the simulation model results, including limits on gradients that establish hydraulic control.
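A minimal sketch of such a response-matrix linear program is shown below, using scipy.optimize.linprog. The response coefficients, ambient gradients, and bounds are invented for illustration; in practice each column of the response matrix would come from a unit-rate run of the site's flow model, and codes such as MODMAN or MODOFC (referenced below) package this kind of analysis.

```python
# Minimal sketch of the response-matrix linear program described above.
# All coefficients are made-up illustrative numbers.

import numpy as np
from scipy.optimize import linprog

# R[k, j]: increase in the inward hydraulic gradient at control pair k
# per unit pumping rate at candidate well j (from unit-rate model runs).
R = np.array([
    [0.0020, 0.0008, 0.0003],
    [0.0009, 0.0018, 0.0007],
    [0.0002, 0.0010, 0.0021],
])
g_ambient = np.array([-0.0010, -0.0006, -0.0012])  # outward (negative) ambient gradients
g_required = 0.0005                                 # minimum inward gradient for capture
q_max = 500.0                                       # upper bound on each well's rate (m^3/day)

n_wells = R.shape[1]
c = np.ones(n_wells)  # objective: minimize total pumping rate

# Gradient constraints: g_ambient + R @ q >= g_required  ->  -R @ q <= g_ambient - g_required
A_ub = -R
b_ub = g_ambient - g_required

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, q_max)] * n_wells, method="highs")
print("optimal rates (m^3/day):", np.round(res.x, 1))
print("total pumping (m^3/day):", round(res.x.sum(), 1))
```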

Hydraulic optimization for gradient control problems is implemented easily because of the following: (1) most sites with ground water contamination have a site-specific ground water flow model, (2) the optimization approach is straightforward and easily understood, and (3) tools for performing the optimization are available as off-the-shelf technology. Applications of hydraulic optimization have appeared in the literature since the 1970s, and several codes for performing these evaluations are freely available, such as MODMAN (Greenwald, 1998) and MODOFC (Ahlfeld and Riefler, 1999). A detailed discussion of formulation options associated with gradient control applications and demonstrations of these techniques for three sites is provided in Greenwald (1999).

Gradient control techniques are limited by the predictive ability of the underlying simulation model, which is affected by uncertainty in parameter values, the conceptual hydrogeological model of the site, the experience of the modeler, input errors, and many other factors. Additional limitations of gradient control problems include the following: (1) contaminant concentrations cannot be included in the mathematical formulation; (2) cleanup time cannot be rigorously included in the mathematical formulation; and (3) for thin unconfined aquifers (and several other circumstances), linear superposition (which allows the use of linear programming techniques) can be violated. For sites where cleanup is the main objective and predictions of contaminant concentrations or cleanup time are central to evaluating the objective function and key constraints, the limitations of hydraulic optimization can be prohibitive. Transport modeling and transport optimization can be applied in such cases. However, developing a transport simulation model and performing a transport-based optimization analysis can require significant effort and cost, and transport model predictions are subject to additional uncertainties (relative to flow model predictions).

3.3.1.2 Concentration Constraints Remediation Technology

As mentioned previously, ground water management problems combine a ground water numerical simulator and an optimization model. In the past few decades, several ground water numerical simulators have been presented and employed (e.g., MODFLOW, MT3DMS, SUTRA, FEMWATER [3-D], PTC) to represent ground water flow and contaminant transport. The following optimization models can be categorized based on the theory that is used:

• Nonlinear models (using nonlinear programming)
• Dynamic models (using dynamic programming)
• Genetic algorithm models
• Simulated annealing models
• Artificial neural network (ANN) models
• Cutting plane technique models

The main characteristic of the ground water contaminant management problem is that the problem is nonconvex due to nonconvex behavior of the mass transport equation (constraints) and/or the objective function. Therefore, the majority of the above models have difficulty determining a globally optimal solution. The main characteristics of and a historical review of each category of the above models are as follows:

• Nonlinear Models
The majority of these models rely on gradient-based techniques. They require calculation of the derivative matrix for concentrations with respect to the decision variables, and a globally optimal solution is not guaranteed. In the past, a combination of ground water simulation with nonlinear programming techniques to solve ground water management problems has been presented by Gorelick et al. (1984), Willis and Yeh (1987), Ahlfeld et al. (1988), Charbi and Peralta (1994), McKinney and Lin (1995), Peralta et al. (1995), and Emech and Yeh (1998).

• Dynamic Models
These models are based on dynamic programming theory, where nonlinear and stochastic features of the ground water system can be translated into the formulation. Significant cost savings have been reported using these models (between 20 and 70%). In some cases, dynamic well strategies (i.e., the well locations are not fixed between different management periods) have been incorporated. These models are mathematically complicated, numerical difficulties have been reported, and a globally optimal solution is not guaranteed. Work on dynamic models has been presented by Jones et al. (1987), Chang et al. (1992), Culver and Shoemaker (1992, 1993, 1997), and Huang and Mayer (1997). In addition, some individuals have opted not to use dynamic programming theory directly but rather the multiperiod approach. This work has been presented by Ahlfeld (1990), Rizzo and Dougherty (1996), and Karatzas et al. (1998).

• Genetic Algorithm Models
One of the main characteristics of these models is that derivatives are not required. They are computationally intensive, but parallel computing can be applied. A globally optimal solution is not guaranteed. Their application to ground water problems began appearing in the early 1990s, and representative works have been presented by McKinney and Lin (1994), Rogers et al. (1995), Wang and Zheng (1997, 1998), and Aly and Peralta (1999a).

• Simulated Annealing Models
These models are devised to solve large combinatorial optimization problems and do not require derivative computation. They have shown flexibility in the selection of cost functions (convex or not convex) and, theoretically, they can find global optima (but not in practice). They are computationally intensive, but parallel computing can be applied. Some studies suggest that they are competitive with other optimization techniques. Dougherty and Marryott (1991), Kuo et al. (1992), Marryott et al. (1993), Rizzo and Dougherty (1996), and Wang and Zheng (1998) have presented work on simulated annealing models. A minimal sketch of this approach follows the list.

• Artificial Neural Network (ANN) Models
These models are computationally intensive due to the neural network training. Following the training, they are consistent with most of the nonlinear models, with fewer calls to the simulator. Parallel computing can also be applied. Work on ANN models for ground water management has been presented by Rogers and Dowla (1994), Rogers et al. (1995), Rizzo and Dougherty (1996), and Aly and Peralta (1999b).

• Cutting Plane Technique Models
In this category, the objective function is required to be concave; when it is, these optimization methods can be characterized as global optimization techniques. Only one function derivative (of the most violated constraint) is required at each iteration until the optimal solution is obtained. These models are based on cutting plane theory, where the feasible region is enclosed in a polytope and where the most extreme point of the feasible region is determined by eliminating parts of the infeasible region (using cutting hyperplanes) (Karatzas and Pinder, 1993, 1996).
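As referenced in the simulated annealing item above, the following is a minimal sketch of a simulated annealing search over well pumping rates. The stand-in transport response, penalty-based cost function, cooling schedule, and all numbers are illustrative assumptions and do not reproduce the formulations in the cited studies.

```python
# Minimal sketch of a simulated annealing search over well pumping rates.

import math
import random

random.seed(0)

N_WELLS = 4
Q_MAX = 500.0          # maximum rate per well (m^3/day)
C_LIMIT = 5.0          # compliance concentration (ug/L)

def predicted_concentration(rates):
    """Stand-in for a transport simulation: concentration at a compliance
    point decreases as more water is extracted."""
    return 80.0 * math.exp(-sum(rates) / 600.0)

def cost(rates):
    """Pumping cost plus a large penalty for violating the concentration limit."""
    penalty = 1e6 * max(0.0, predicted_concentration(rates) - C_LIMIT)
    return sum(rates) + penalty

def neighbor(rates):
    """Perturb one randomly chosen well rate, respecting the bounds."""
    j = random.randrange(N_WELLS)
    new = list(rates)
    new[j] = min(Q_MAX, max(0.0, new[j] + random.uniform(-50.0, 50.0)))
    return new

current = [Q_MAX] * N_WELLS
best = current
temperature = 1000.0
for step in range(5000):
    candidate = neighbor(current)
    delta = cost(candidate) - cost(current)
    # Accept improvements always, and worse moves with a temperature-dependent probability.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        current = candidate
        if cost(current) < cost(best):
            best = current
    temperature *= 0.999  # geometric cooling schedule

print("best rates (m^3/day):", [round(q, 1) for q in best])
print("total pumping:", round(sum(best), 1),
      "predicted concentration:", round(predicted_concentration(best), 2))
```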

In the past few years, work has been presented in which bioremediation simulators have been combined with optimization techniques, such as Minsker and Shoemaker (1998), Yoon and Shoemaker (1999), and Smalley et al. (2000). In addition, models of multiple contaminants have been presented that are related to pulsed or continuous pumping for removing contaminants subject to rate-limited mass transfer (Haggerty and Gorelick, 1994).

3.3.2 STOCHASTIC OPTIMIZATION TO ACCOMMODATE POTENTIAL DESIGN FAILURE

The uncertainties underlying ground water flow and transport models (e.g., associated with characterizing subsurface heterogeneities, contaminant sources and plumes, reaction pathways and rates) have a profound effect on the reliability with which cleanup system performance can be predicted. Consequently, simulation model uncertainty is viewed as the most important source of errors in the simulation optimization design model. Research to date has focused on incorporating simulation model uncertainty into the optimization framework to assess the tradeoffs between reliability (or probability of failure) and cost effectiveness. Significant work has been presented in the past decade by Wagner and Gorelick (1987), Andricevic and Kitanidis (1990), Lee and Kitanidis (1991), Wagner et al. (1992), Whiffen and Shoemaker (1993), Morgan et al. (1993), Reichard (1995), Aly and Peralta (1999b), and Freeze and Gorelick (1999). Gorelick (1990, 1997), Wagner (1995), and Freeze and Gorelick (1999) provide a detailed review of stochastic ground water optimization models.

The goal of ground water remedial design is to develop a remediation strategy that will lead ultimately to compliance with the ground water quality performance standards set forth by the controlling regulatory agencies. Therefore, failure of a remediation strategy is defined as any incident that violates the established performance criteria. As discussed previously, performance standards typically serve as constraints in the simulation optimization model. Therefore, the definition of failure can be further extended to be the violation of performance constraints in the optimization model.

Under ideal conditions, the design optimization model is a perfect representation of the remediation problem, and there is no possibility of failure. The objectives and constraints would reflect perfectly the goals and performance standards set forth by the regulatory agencies; the ground water simulation model(s) would reflect perfectly the geologic, hydrologic, and chemical conditions of the contamination site and would perfectly predict flow and transport under alternative remediation strategies. Further, the optimization model would identify, with complete certainty, the design that best meets the problem's objectives and constraints. Obviously, this does not occur in real-world applications. As a result, simulation optimization models will always be specified incorrectly to some degree. The goal then is to develop remediation design optimization models that provide assurance, albeit risk qualified, that remediation performance criteria will be met.

Traditionally, engineering design has relied on the use of standardized design codes that define deterministic safety factors to account for uncertainty in the design process. For ground water remediation, however, each problem is distinctly unique, with little or no precedent on which standardized safety factors can be established. Consequently, the trend in ground water remediation design has moved away from the traditional "one size fits all" safety factor approach and toward the use of more sophisticated stochastic analyses that account for site-specific uncertainties. As discussed in later sections, many of these approaches do not completely abandon the safety factor approach; instead, they refine that approach to develop safety factors on a site-by-site basis. It is important to note that with uncertainty included in the optimization model, there is no longer a single best solution as in the deterministic case. Rather, there is a spectrum of stochastic optima, with each optimal solution associated with a specified level of reliability (the complement of the probability of failure).

The majority of research to date has focused on the following two stochastic optimization methods: chance-constrained optimization and multiple realization (sometimes referred to as stacking) approaches. Both methods assume that some simulation model parameters are unknown, and both approaches begin by estimating the unknown parameters and quantifying their uncertainties. They then introduce the effects of the model parameter and prediction uncertainties into the optimization model. It is at this point that the chance-constrained and multiple realization approaches diverge, based largely on the manner in which the two approaches categorize simulation model uncertainty. The chance-constrained approach is linked with a conceptualization in which the simulation model parameters are viewed as uniform over large zones. In this case, model parameter and prediction uncertainties are included in the optimization model via first-order uncertainty analysis. The multiple realization approach, on the other hand, is not limited to the small variance assumption with respect to the aquifer properties, and parameter uncertainty is included in the optimization model using Monte Carlo methods that are not constrained by the limitations of first-order uncertainty analysis.
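A minimal sketch of the multiple realization (stacking) idea follows: the containment constraint is written once per Monte Carlo realization of the uncertain quantities, and all of the stacked constraints are imposed simultaneously in a single linear program. The distributions and coefficients below are invented for illustration.

```python
# Minimal sketch of the multiple realization ("stacking") approach described above.
# All distributions and numbers are illustrative assumptions.

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_wells, n_realizations = 3, 50
g_required = 0.0005          # required inward gradient at one control location
q_max = 500.0                # per-well pumping bound (m^3/day)

# One row of response coefficients per realization (uncertain aquifer properties).
R = rng.lognormal(mean=np.log(1.5e-3), sigma=0.5, size=(n_realizations, n_wells))
g_ambient = rng.normal(-0.0010, 0.0002, size=n_realizations)  # uncertain ambient gradient

# Stacked constraints: for every realization r, g_ambient[r] + R[r] @ q >= g_required.
A_ub = -R
b_ub = g_ambient - g_required

res = linprog(np.ones(n_wells), A_ub=A_ub, b_ub=b_ub,
              bounds=[(0.0, q_max)] * n_wells, method="highs")
print("rates meeting the constraint in all realizations (m^3/day):", np.round(res.x, 1))
```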

3.3.2.1 Chance-Constrained Ground Water Management Model

When predicting system performance under simulation model uncertainty, there is a probability that a constraint will not be met (i.e., a probability of system failure), and it is necessary to replace the deterministic constraint with a stochastic one:

Prob[C_i < C_i*] ≥ R    (3.2)

where R is the required reliability level.

As defined above, the probabilistic constraint cannot be solved directly in the optimization model; however, if it is assumed that the simulated concentrations, as a function of model parameter uncertainty, are (or are well approximated as) normally distributed, it can be reformulated as a deterministic equivalent known as a chance constraint (e.g., Tung, 1986; Wagner and Gorelick, 1987; Freeze and Gorelick, 1999):

E[C_i] + F_N^-1(R) · S[C_i] < C_i*    (3.3)

where E[C_i] and S[C_i] are the expected value and standard deviation of C_i, respectively, and F_N^-1(R) is the value of the inverse standard-normal cumulative distribution corresponding to reliability level R.

An inspection of Equation 3.3 shows that the chance constraint has the following two components: an expected value component (the first term on the left side) and a stochastic component (the second term on the left side). When the reliability is 0.5, F_N^-1(R) is zero, the stochastic component drops out, and the chance constraint reduces to the deterministic constraint (Equation 3.1). Thus, the deterministic optimization model that ignores uncertainty corresponds to the case where there is a 50% chance of failure. The stochastic component in Equation 3.3 is essentially a safety factor that controls the amount of overdesign needed to achieve the desired level of performance reliability. For a given level of model uncertainty, the stochastic component of Equation 3.3 increases with increasing reliability requirement. The effect of this can best be understood by moving the stochastic component to the right-hand side of Equation 3.3, which is equivalent to imposing a safety factor that redefines (i.e., reduces) the maximum permissible concentration. However, unlike the standardized safety factors used in many engineering disciplines, the magnitude of the stochastic component is a unique function of the simulation model uncertainty and the reliability level, and it is unknown prior to solving the chance-constrained optimization model (Tiedeman and Gorelick, 1993).
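To make the role of the stochastic component concrete, the short Python sketch below evaluates the chance constraint of Equation 3.3 at a single constraint location for several reliability levels. The numerical values of E[C_i], S[C_i], and C_i* are hypothetical; in an application they would come from the ground water simulation model and the applicable standard.

```python
from scipy.stats import norm

def chance_constraint_ok(mean_c, std_c, c_max, reliability):
    """Evaluate E[C] + F_N^-1(R) * S[C] <= C* (Equation 3.3).

    mean_c      -- expected simulated concentration, E[C_i]
    std_c       -- standard deviation of the simulated concentration, S[C_i]
    c_max       -- maximum permissible concentration, C_i*
    reliability -- required reliability level, R
    """
    safety_factor = norm.ppf(reliability) * std_c   # stochastic component
    return mean_c + safety_factor <= c_max, safety_factor

# Hypothetical values: E[C_i] = 4.0 mg/l, S[C_i] = 1.2 mg/l, C_i* = 5 mg/l
for r in (0.50, 0.90, 0.99):
    ok, sf = chance_constraint_ok(4.0, 1.2, 5.0, r)
    print(f"R = {r:.2f}: safety factor = {sf:.2f} mg/l, satisfied: {ok}")
```

At R = 0.5 the safety factor vanishes and the check reduces to the deterministic constraint; at higher reliabilities the same expected concentration may violate the chance constraint, which is what forces overdesign. Because the design variables (e.g., pumping rates) control both E[C_i] and S[C_i], the magnitude of the safety factor is not known until the chance-constrained model itself is solved.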

One advantage of the chance-constrained approach is that the reliability level is explicitly considered in the optimization model, allowing the development of a remediation strategy designed to meet the decision maker's reliability preference. It also allows the development of reliability–cost tradeoff curves (Wagner and Gorelick, 1987; Tiedeman and Gorelick, 1993; Freeze and Gorelick, 1999). Although it is natural to think that a decision maker will take a risk-averse stance and choose a design with a high degree of reliability, it is not realistic to assume that a reliable design will be implemented regardless of cost. The reliability–cost tradeoff curve allows the decision maker to evaluate the marginal cost associated with an increase or decrease in reliability and to select a design that provides an acceptable balance between reliability and cost.

To date, there have been two field applications of the chance-constrained ground water management model. One involves a nonlinear chance-constrained optimization model used to identify optimal pumping schemes for plume capture at a landfill site near Ottawa, Canada (Gailey and Gorelick, 1993). The other applies nonlinear chance-constrained optimization to identify minimum pumping strategies for plume containment at a site in southwest Michigan (Tiedeman and Gorelick, 1993). Both examples demonstrate the need for overdesign (pumping above that required in the deterministic case) to account for the performance uncertainties that arise from model parameter uncertainties. For the plume capture problem presented by Gailey and Gorelick, an overdesign of 27% was needed to achieve a reliability level of 0.90. For the plume containment problem presented by Tiedeman and Gorelick, a 40% overdesign was needed for a reliability level of 0.90. Tiedeman and Gorelick also provide an interesting analysis of the stochastic component of the chance constraints. Their analysis showed that not only could the safety factors not be defined a priori, they could also vary significantly from one constraint to another within a given problem.

3.3.2.2 Multiple Realization Ground Water Management Model

The multiple realization (or stacking) approach is an alternative in which multiple realizations of the uncertain parameters are included in the optimization model. Consider again the deterministic concentration constraint given in Equation 3.1. In the multiple realization model, this single constraint is replaced with the following series of constraints:

C_i1 < C_i*    for parameter realization 1    (3.4a)
C_i2 < C_i*    for parameter realization 2    (3.4b)
...
C_in < C_i*    for parameter realization n    (3.4c)

where C_i1, C_i2, ..., and C_in are the simulated concentrations at space-time location i for parameter realizations 1, 2, ..., and n. It is important to understand the structure of the multiple realization management model in order to understand how it identifies failure-averse designs. The multiple realization model solves the optimization problem simultaneously for all n parameter realizations. From the standpoint of failure, this means that the model provides a robust solution that is feasible (i.e., successful) for all parameter realizations included in the model. This simultaneous solution approach recognizes that, in general, no single realization can be used to identify a reliable remediation strategy. Rather, the unique design demands of each realization must be pooled in order to define the optimal reliable solution. For example, consider the problem of minimizing pumping for plume containment. Each parameter realization can dictate pumping in a different part of the aquifer in order to meet the design constraints. By considering the influence of pumping across all realizations, the multiple realization method identifies the scenario requiring the least pumping that meets the demands of every realization. This pooled solution requires pumping in excess of that which would be required for any single realization. As in the chance-constrained model, this overdesign can be thought of as a safety factor, and, as in the chance-constrained model, this safety factor cannot be defined prior to solving the multiple realization model. It is important to note that the multiple realization management model is different from Monte Carlo optimization, which solves a series of individual optimization problems, each with a different parameter realization. Monte Carlo optimization can provide information about the variability of the optimal solution from realization to realization, but, except for very limiting cases, it cannot identify reliability-based optimal designs. Wagner and Gorelick (1989) and Freeze and Gorelick (1999) provide a more detailed discussion of Monte Carlo optimization.
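The stacking structure lends itself to a compact illustration. The Python sketch below (using scipy) minimizes total pumping subject to one performance constraint written separately for each realization; the linear response assumption and the response coefficients are hypothetical stand-ins for values that would, in practice, be produced by simulator runs for each conductivity realization.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n_wells, n_real = 3, 25       # candidate wells and realizations in the stack

# Hypothetical linear response coefficients: a unit pumping rate at well j
# contributes A[k, j] to the controlled quantity (e.g., inward hydraulic
# gradient at a compliance point) under realization k. In practice each row
# comes from simulator runs for one conductivity realization.
A = rng.uniform(0.5, 1.5, size=(n_real, n_wells))
b = np.ones(n_real)           # performance target, one constraint per realization

# Minimize total pumping subject to A[k] @ q >= b[k] for every realization,
# i.e., all realizations' constraints are imposed simultaneously ("stacked").
res = linprog(c=np.ones(n_wells), A_ub=-A, b_ub=-b,
              bounds=[(0, None)] * n_wells, method="highs")
q = res.x
print("stacked design:", np.round(q, 3), " total pumping:", round(q.sum(), 3))

# Each realization solved alone needs no more pumping than the stacked design;
# the difference is the implicit safety factor (overdesign) of the stack.
singles = [linprog(np.ones(n_wells), A_ub=-A[[k]], b_ub=-b[[k]],
                   bounds=[(0, None)] * n_wells, method="highs").x.sum()
           for k in range(n_real)]
print("largest single-realization total:", round(max(singles), 3))
```

Because the stacked solution must satisfy every realization at once, its total pumping is never less than the largest single-realization requirement; the difference is the implicit, site-specific safety factor discussed above. With 25 realizations in the stack, the nonparametric reliability estimate discussed next would be 25/26, or about 0.96.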

Unlike the chance-constrained model, the multiple realization model does not explicitly contain reliability in its formulation. However, Chan (1993) presents theoretical analyses that define the design reliability as a nonparametric function of the number of realizations included in the management model, R = n/(n + 1). For example, if the model is formulated with 99 realizations, the estimated design reliability would be 0.99. For the case of optimal plume containment in the presence of a spatially varying and uncertain transmissivity, Chan (1993) evaluates the accuracy of the reliability estimator. Monte Carlo analyses show agreement between the reliability predicted by the nonparametric formula and the average reliability provided by the Monte Carlo results. (Analyses by Wagner and Gorelick [1989] similarly show agreement between the nonparametric reliability estimate and the design reliability obtained through Monte Carlo analysis.) Chan (1993) also presents a series of tests to gauge the sensitivity of the reliability prediction to changes in model parameters and structure. The results indicate that the nonparametric reliability estimate is robust with respect to a variety of changes (e.g., changes in the covariance and correlation structure of the transmissivity field, changes in the location and magnitude of velocity constraints).

The multiple realization approach described above has been modified in a number of ways. Wagner et al. (1992), Morgan et al. (1993), and Chan (1994) present multiple realization methods that allow for constraint violations within the stack of constraint sets. All of these works deal with designing reliable hydraulic containment strategies. Wagner et al. (1992) modify the objective function to include a penalty cost for constraint violations. Morgan et al. (1993) and Chan (1994) develop heuristic algorithms that generate solutions in which R·n of the n constraint sets are satisfied, where R is the design reliability level and n is the number of realizations. The assumption here is that if R·n of the constraint sets are satisfied, the management strategy will satisfy a fraction R of the constraint sets across the entire ensemble of realizations. Monte Carlo testing by Chan (1994) shows that the accuracy of this approach improves as the number of realizations increases. Ranjithan et al. (1993) used an ANN to reduce the number of realizations considered by the multiple realization model. The pattern recognition capabilities of the ANN were used to identify realizations that are likely to dictate the final design; the multiple realization model was then applied to a small subset of these critical realizations. This approach was compared with that presented by Morgan et al. (1993) and was found to closely reproduce the cost–reliability tradeoffs using fewer realizations and less computational time. Ritzel and Eheart (1994) use the multiple realization approach in a multiobjective model to evaluate the cost–reliability tradeoffs for optimal plume containment. Finally, Smalley et al. (2000) present a promising stacking model that is solved using a noisy genetic algorithm. They study the problem of risk-based design of in situ bioremediation, where uncertainty stems from a heterogeneous hydraulic conductivity field and unknown parameters of the exposure and risk model. For the test example, the noisy genetic algorithm was able to identify a reliable design from a relatively small number of parameter realizations.

3.3.2.3 Alternative Stochastic Optimization Methods

The focus on chance-constrained and multiple realization methods in the above discussion mirrors the focus of research on stochastic ground water optimization methods to date. A number of papers present alternatives or enhancements to these methods, such as those in the area of coupled ground water management and monitoring design. The above discussion highlights that the effect of failure-averse design is to introduce a cost of overdesign that increases with increasing model uncertainty or reliability. Additional data can potentially reduce model uncertainty and thereby reduce overdesign. The important issue in coupled ground water management and monitoring design is whether the reduction in management costs offsets the data collection costs. Among those that address this problem are Andricevic and Kitanidis (1990), Tucciarelli and Pinder (1991), and Wagner (1999).

This section would not be complete without a discussion of decision analysis, which has emerged as an alternative to stochastic optimization for reliable ground water management design. Like the stochastic optimization approach, the decision analysis approach seeks the least-cost, reliable design that accounts for ground water simulation model uncertainty. However, there are two important differences between the two decision-making frameworks. First, whereas stochastic optimization typically deals with minimizing costs, decision analysis involves a risk–cost minimization. Second, stochastic optimization normally seeks to identify the least-cost solution for only one technological strategy, whereas decision analysis considers a suite of technological strategies from which one (not necessarily optimal) strategy is selected. A detailed comparison of the stochastic optimization and decision analysis frameworks can be found in Freeze and Gorelick (1999).

3.3.3 Uncertainty

Typically, the limitations of subsurface remediation technologies are thought of in a process engineering sense. In engineered systems, the efficiency of a process can be improved through theoretical or experimental investigations. In the remediation of natural subsurface systems, however, there is far less control over the behavior of the process and a much greater degree of variability and uncertainty. Thus, the most significant technological limit in subsurface remediation is a soft limit: uncertainty.

Engineers and others involved in designing and making decisions on subsurface remediation systems need to be familiar with the sources of uncertainty and their significance with regard to system performance. An informed decision maker is one who identifies the most significant sources of uncertainty and either incorporates the uncertainty into the design or obtains the necessary information to reduce the uncertainty to a manageable level.

Using mathematical optimization to design subsurface remediation systems has shown promise in providing designs that are more cost effective than designs based on trial and error or intuition (e.g., Yager and Greenwald, 1999). A significant effort has been dedicated to developing advanced optimization tools for subsurface remediation system design. However, designs produced with even the most sophisticated optimization tools are doomed to failure if they are based on inaccurate or incomplete data or do not, in some way, take uncertainty into account.

This section presents the sources of uncertainty typically encountered in the optimal design of remediation systems, and three examples highlight a few of these sources. The section is not meant to be an exhaustive review of the approaches that have been developed for contending with uncertainty; rather, it gives an introduction to the role of uncertainty in optimal remediation design.

3.3.3.1 Sources

There are many technical difficulties associated with cleaning up contaminated ground water and soil. For example, the removal of nonaqueous phase liquids (NAPLs) or strongly sorbing organics and metals is inherently difficult to achieve at an acceptable cost. Not the least of the challenges is uncertainty about processes, parameters, and inputs. The biogeochemical processes are complex and inadequately understood. Geologic media are highly heterogeneous, and their hydrogeologic and biogeochemical parameters are known imperfectly, either from a limited number of direct measurements or from inverse analyses of observations of how the system responds to given stimuli. An additional source of uncertainty is the future inputs into the system, such as the intensity and quality of recharge or the regional flow. The subsections below categorize and describe the major sources of uncertainty in optimization as applied to decision making.

3.3.3.1.1 Hydrogeochemical

The following four factors play a role in hydrogeochemical uncertainty:

• Aquifer Physical Characteristics
Aquifer physical characteristics include the distribution of aquifer materials and their corresponding physical properties (e.g., hydraulic conductivity, porosity) as well as the transient nature of flow boundary conditions.
• Contaminant Characteristics and Aquifer–Contaminant Interactions
The lack of detail regarding the chemical and biological composition of contaminants can contribute to hydrogeochemical uncertainty. The largely undefined distribution of contaminants also plays a key role. The distribution of aquifer material properties that relate to chemical interactions with contaminants (e.g., sorptive properties), the distribution of aqueous chemistry (e.g., dissolved organic matter concentrations), and the microbial distribution and corresponding biochemical conditions (e.g., electron acceptor, carbon source, and energy source concentrations) are all considerations when evaluating hydrogeochemical uncertainty.
• Plume Characteristics
Three factors should be considered when evaluating hydrogeochemical uncertainty: the spatial extent of the plume, its chemical composition and concentrations, and the plume age.
• Source Characteristics
Source characteristics such as the composition and strength of the source, the source location, and the source history affect hydrogeochemical uncertainty.

3.3.3.1.2 Technology

Technology uncertainty can be defined as the uncertainty in predicting the response of the contaminant(s) to a technology or combination of technologies (especially for innovative technologies).

3.3.3.1.3 Cleanup Goals

Determining cleanup goals involves the consideration of many variables. First, the cleanup goal itself must be identified based on one of several methods (e.g., point-concentration based, mass based, risk based). Second, measuring the attainment of the cleanup goal has its own uncertainties (e.g., uncertainty in the relevance of a point measurement, sampling and analytical errors, uncertainty in the initial mass of contaminant). Third, there is uncertainty associated with the risk assessment (e.g., identification of the exposed population, determination of exposure, estimation of health effects). Last, there is decision-making uncertainty: remedial cleanups by their nature involve multiple stakeholders with conflicting views and preferences and varying degrees of risk aversion.

3.3.3.2 Examples

3.3.3.2.1 Aquifer Physical Characteristics Uncertainty

In this example, the performance of an optimal remediation design was examined for the situation in which the hydraulic conductivity distribution is uncertain. The optimal pumping schedule for a pump-and-treat system was obtained for the following two levels of uncertainty: (1) assuming that the aquifer is homogeneous in all three dimensions and (2) assuming that the aquifer is heterogeneous in the horizontal (areal) plane but homogeneous in the vertical direction. The pumping schedules obtained under these conditions of uncertainty were then applied to the real, three-dimensional (3-D), heterogeneous aquifer, and the ability of each pumping scheme to achieve a given cleanup level was assessed.

The aquifer–contaminant system was modeled with ground water flow and transport simulators (i.e., MODFLOW and MT3D). The aquifer was discretized into 31 × 20 finite-difference nodes per layer, with a total of four layers. The size of the aquifer is 4101 × 3000 × 98 ft (Figure 3.1). No-flow boundaries were imposed on the north, south, and bottom faces of the aquifer system. Constant-head boundaries were applied on the east and west sides to produce a west-to-east flow direction. The physical characteristics of the aquifer system are summarized in Table 3.2.

A stochastic, conditional simulation technique was applied to generate 30 three-dimensional, spatially correlated, random fields of isotropic hydraulic conductivity. The mean hydraulic conductivity and the variance of the log-hydraulic conductivity are 5.31 × 10⁻³ ft/s and 0.4, respectively. The correlation scales are 150 and 25 ft in the horizontal and vertical directions, respectively. The hydraulic conductivities in the domain range from 7.9 × 10⁻⁴ to 3.6 × 10⁻³ ft/s, which is indicative of a mixture of clean sand and gravel materials.
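For readers who wish to experiment with analyses of this kind, the sketch below generates spatially correlated, lognormal conductivity fields with an exponential covariance via Cholesky factorization. It is an unconditional, two-dimensional simplification of the three-dimensional, conditional simulation used in the example; it treats 5.31 × 10⁻³ ft/s as a geometric mean, and the grid spacing is approximate.

```python
import numpy as np

rng = np.random.default_rng(42)

# Plan-view grid only (the study used a four-layer, 3-D grid)
nx, ny = 31, 20
dx, dy = 132.0, 150.0                      # ft; roughly 4100 ft by 3000 ft in plan
x, y = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dy, indexing="ij")
pts = np.column_stack([x.ravel(), y.ravel()])

# Exponential covariance of ln K: variance 0.4, 150-ft horizontal correlation scale
var_lnk, corr_len = 0.4, 150.0
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
cov = var_lnk * np.exp(-dist / corr_len)

# A Cholesky factor maps uncorrelated standard normals to correlated ln K values
L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(pts)))

geo_mean_k = 5.31e-3                        # ft/s, treated here as a geometric mean
fields = []
for _ in range(30):                         # 30 realizations, as in the example
    lnk = np.log(geo_mean_k) + L @ rng.standard_normal(len(pts))
    fields.append(np.exp(lnk).reshape(nx, ny))
K = np.stack(fields)                        # shape (30, 31, 20)

print("realization 0: K in [%.1e, %.1e] ft/s" % (K[0].min(), K[0].max()))
```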

The initial plume was generated by introducing solute into the aquifer system at eight locations in the second and third layers. The plume was allowed to migrate until the concentrations at downgradient monitoring wells (Figure 3.1) reached a specified standard. The configuration of the plume is shown in Figure 3.1. An average of the 3-D hydraulic conductivity field was used in the plume simulation.

FIGURE 3.1 Aerial view of hypothetical ground water system used in examples.

TABLE 3.2
Aquifer Physical Characteristics

Longitudinal dispersivity (m)          3.3
Transverse dispersivity (m)            0.3
Diffusion coefficient (m²/s)           10⁻⁶
Freundlich parameters (K_ab, 1/n)      3.79, 0.89

[Figure 3.1 legend: contaminant source, no-flow boundaries, direction of flow (without pumping), monitoring wells, candidate pumping wells; scale bar 0–100 m; north arrow.]


The homogeneous version of the aquifer was described with a single hydraulic conductivity value equal to the mean hydraulic conductivity used to generate the random fields. A two-dimensional (2-D), horizontal hydraulic conductivity field was generated by vertically averaging each of the 30 3-D fields and then averaging the resulting 2-D fields at each horizontal location.

Optimal pumping schedules were determined for the homogeneous and vertically averaged 2-D versions of the aquifer. The objective was to determine an optimal remediation scheme to reduce the aqueous phase solute concentration in the aquifer to less than the drinking water standard (5 mg/l) in 5 years. The remediation planning horizon was divided into five 1-year management periods to allow for dynamic pumping rates. Up to 15 candidate pumping wells were utilized to remediate the plume. The objective function consists of the sum of a well installation cost term, a pumping lift cost term, and a ground water treatment cost term:

(3.5)

where N is the total number of management periods in the planning horizon; N_p(t) is the total number of extraction wells during the t-th management period; a_0, a_1, and a_2 are cost coefficients for well installation, pumping lift, and treatment, respectively; Q_i(t) is the ground water extraction rate of the i-th extraction well during the t-th management period; s_i(t) is the drawdown in the i-th extraction well during management period t; d_i is the distance between the static water table and the ground surface at extraction well i; C_si(t) is the flow-weighted solute concentration in the i-th well during the t-th period; C_t0 is the treatment objective (5 mg/l); and K_ab and 1/n are parameters related to the carbon adsorption treatment technology.

The optimization formulation was subjected to remediation goal constraints, resource protection constraints, and pumping capacity constraints, and it was solved with a genetic algorithm. For more details, refer to Huang and Mayer (1996).
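As an illustration of how the three cost terms combine, the sketch below evaluates an assumed form of the objective for a candidate pumping schedule. The cost coefficients and the specific lift and treatment expressions, including the use of the Freundlich capacity K_ab·C^(1/n) to represent carbon usage, are illustrative assumptions only and should not be read as the exact Equation 3.5, which is given by Huang and Mayer (1996).

```python
import numpy as np

def remediation_cost(Q, s, d, C, a0=1.0e4, a1=0.2, a2=50.0, K_ab=3.79, n_inv=0.89):
    """Illustrative three-term cost for a candidate pumping schedule.

    Q : (n_periods, n_wells) extraction rates Q_i(t)
    s : (n_periods, n_wells) drawdowns s_i(t)
    d : (n_wells,) depth from ground surface to the static water table, d_i
    C : (n_periods, n_wells) flow-weighted concentrations C_si(t)
    a0, a1, a2 : assumed cost coefficients (installation, lift, treatment)
    K_ab, n_inv : Freundlich parameters from Table 3.2

    The lift and treatment expressions below are assumed forms for
    illustration; the published objective (Equation 3.5) may differ.
    """
    install_cost = a0 * np.count_nonzero(Q.sum(axis=0) > 0)   # wells ever pumped
    lift_cost = a1 * np.sum(Q * (s + d))                      # pump against s_i(t) + d_i
    # Assumed treatment term: carbon usage grows with the extracted mass divided
    # by the Freundlich adsorption capacity K_ab * C^(1/n).
    capacity = K_ab * np.maximum(C, 1e-12) ** n_inv
    treatment_cost = a2 * np.sum(Q * C / capacity)
    return install_cost + lift_cost + treatment_cost

# Hypothetical 5-period, 15-well candidate schedule
rng = np.random.default_rng(1)
Q = rng.uniform(0.0, 1.0, size=(5, 15))
s = rng.uniform(1.0, 5.0, size=(5, 15))
d = np.full(15, 20.0)
C = rng.uniform(0.0, 50.0, size=(5, 15))
print("illustrative cost:", round(remediation_cost(Q, s, d, C), 1))
```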

Figure 3.2 shows the pumping schedules determined for the homogeneous and vertically averaged versions of the aquifer. These schedules were applied to each of the 30 3-D fields. Concentrations at the end of the 5-year remediation horizon were determined at the pumping and monitoring wells for each of the 3-D fields, and the maximum concentration found in the wells was used as an indication of the success or failure of the remediation design. Figure 3.3 shows the resulting frequency distributions of the maximum concentration at the pumping wells for the 30 3-D fields.

The results in Figure 3.3 indicate that, for the pumping schedule obtained with the homogeneous assumption, the remediation goal of 5 mg/l is exceeded for 80% of the 3-D fields. As expected, the pumping schedule obtained with the vertically averaged assumption performs better: the remediation goal of 5 mg/l is exceeded for 64% of the 3-D fields. In addition, the extreme values of concentration (>100 mg/l) are avoided when horizontal heterogeneity is considered.
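Summaries of the kind shown in Figure 3.3 reduce to simple exceedance counts across the realization ensemble. In the sketch below, the 30 maximum concentrations for each design are random stand-ins; in the actual analysis they come from the flow and transport simulations.

```python
import numpy as np

rng = np.random.default_rng(3)
goal = 5.0                                   # mg/l remediation goal

# Hypothetical stand-ins for the maximum concentrations (mg/l) observed in the
# wells for each of the 30 heterogeneous 3-D fields under the two designs.
max_c_homogeneous = rng.lognormal(mean=2.5, sigma=1.0, size=30)
max_c_vert_avg = rng.lognormal(mean=2.0, sigma=0.9, size=30)

for name, c in [("homogeneous design", max_c_homogeneous),
                ("vertically averaged design", max_c_vert_avg)]:
    frac_fail = np.mean(c > goal)            # fraction of realizations that fail
    print(f"{name}: goal exceeded in {100 * frac_fail:.0f}% of realizations, "
          f"worst case {c.max():.1f} mg/l")
```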


3.3.3.2.2 Decision-Making Uncertainty

In the previous example, the cleanup goal was fixed, and the most cost-effective solution that achieved the cleanup goal was found. An alternative approach for optimizing subsurface remediation systems is to allow both the cleanup goal and the cost to vary. With this multiobjective approach, there is an infinite, or at least very large, number of possible designs. To make a decision on the appropriate remediation design, the preferences of the decision makers with regard to tradeoffs between cleanup goals and costs must be known. In other words, a decision maker must know how to weigh the relative importance of the remediation system cost and the cleanup performance to be achieved by the remediation system. Once these weights are known, the two objective functions (i.e., minimize cost and maximize cleanup performance) can be combined into a single objective optimization problem. However, in most cases, a decision maker is not able to select these weights a priori.

FIGURE 3.2 Pumping schedules found for remediation systems based on homogeneous and vertically averaged versions of the aquifer (pumping rate in m³/min vs. year).

FIGURE 3.3 Maximum concentrations found for remediation systems based on homogeneous and vertically averaged versions of the aquifer.

One approach for dealing with uncertainty in the decision-making process is to generate a tradeoff curve that allows the decision maker to see the full range of alternatives for a particular site. Figure 3.4 shows a tradeoff curve for pump-and-treat remediation of the hypothetical, contaminated aquifer described in the previous example, where up to 15 extraction wells are to be used. The total cost indicated on the y-axis consists of capital and operating costs for the remediation system over the remediation period. Here, the percent of contaminant mass remaining in the aquifer at the end of the remediation period is used as a measure of cleanup performance. Other measures of cleanup performance can be substituted easily (e.g., ground water concentrations at monitoring points, human health risk remaining after remediation).

The design optimization problem is to find all the best combinations of the 15 pumping rates. In the case with 15 wells, there are on the order of 10¹⁸ potential combinations of pumping rates. The best designs are those that tend to minimize the cost and the mass remaining simultaneously. Each symbol on the graph in Figure 3.4 represents the best design found by the optimization algorithm for the corresponding position on the tradeoff curve. For example, the cheapest design able to achieve a cleanup goal of 10% mass remaining is estimated at about $125,000 (see dashed lines in Figure 3.4).

The optimization algorithm is an advanced evolutionary method called the Niched Pareto Genetic Algorithm (NPGA). The NPGA sorts among the potential designs and attempts to improve the optimality of the designs as it iterates toward the best solutions. The NPGA is built around a ground water flow and transport simulator, which provides information about the ground water pressures and contaminant concentrations; this information is used to determine the cost and cleanup performance for each candidate design. For more details on the algorithm and the example shown here, refer to Erickson et al. (2001).

FIGURE 3.4 Tradeoff curve for multiobjective optimization of pump-and-treat design (total cost in dollars vs. percent of contaminant mass remaining).
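At the heart of a multiobjective search of this kind is a nondominated (Pareto) sort of the candidate designs. The sketch below extracts the nondominated set from hypothetical (cost, mass remaining) pairs; it illustrates only the sorting step, not the niching, tournament selection, or simulator coupling of the NPGA itself.

```python
import numpy as np

def pareto_front(costs, mass_remaining):
    """Return indices of designs not dominated in (cost, mass remaining).

    Both objectives are minimized: a design is dominated if another design is
    no worse in both objectives and strictly better in at least one.
    """
    pts = np.column_stack([costs, mass_remaining])
    keep = []
    for i, p in enumerate(pts):
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical candidate pump-and-treat designs
rng = np.random.default_rng(7)
cost = rng.uniform(25_000, 250_000, size=200)          # dollars
mass = rng.uniform(1.0, 100.0, size=200)               # percent mass remaining

front = pareto_front(cost, mass)
print(f"{len(front)} nondominated designs out of {len(cost)}")
```

Plotting the nondominated designs in cost versus mass-remaining space yields a curve of the kind shown in Figure 3.4.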

3.3.3.2.3 Risk Assessment Uncertainty

Risk assessment is used to quantify the human health risks due to exposure to contaminated ground water and to develop cleanup goals. Risk assessment involves estimating the level of contamination at the point of exposure, the exposure of individuals to the contamination, and the resulting toxicological impact. Because the level of contamination at the exposure point depends on the properties of the aquifer and contaminant, risk assessment is subject to hydrogeochemical uncertainty. However, risk assessment is further complicated by uncertainty in exposure factors (e.g., uncertainty in the amount of contaminated ground water ingested) and toxicological factors (e.g., uncertainty in the sensitivity of the exposed population to the toxicological effect[s]). The impacts of these sources of uncertainty have been investigated by Pelmuder et al. (1996) and Maxwell et al. (1998), among others.

The following example illustrates risk assessment uncertainty. A linear form for estimating the risk of carcinogenesis due to ingestion of ground water contaminated with an organic chemical (McKone and Bogen, 1991) is used:

Risk = CDI × CSF = [C × I × ED × EF/(BW × AT)] × CSF    (3.6)

where CDI is the chronic daily intake in milligrams per kilogram per day (mg/kg/day), CSF is the cancer slope factor (kg day/mg), C is the concentration at the exposure point (mg/l), I is the ingestion rate for contaminated water from the exposure point (l/day), ED is the exposure duration (years), EF is the exposure frequency (days/year), BW is the body weight (kg), and AT is the averaging time (days). To represent variability among exposed individuals, the cancer slope factor was divided by the body weight, CSF/BW. Variability in CSF/BW represents the situation where the sensitivity and size of the exposed individuals vary, as one would expect in a realistic population.

Table 3.3 lists the values of the variables in Equation 3.6 used in the analysis, including the lognormal distribution used to describe the uncertainty in CSF/BW. The value of the geometric standard deviation used here results in a distribution that ranges over approximately two orders of magnitude. A Monte Carlo analysis was used to sample 300 values from the CSF/BW distribution. These CSF/BW values were then substituted into Equation 3.6 along with values from the concentration frequency distribution given in Figure 3.3.
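The Monte Carlo propagation described above can be sketched in a few lines of Python. The exposure-factor values and the lognormal parameters of CSF/BW below are hypothetical stand-ins for the Table 3.3 entries, chosen so that the sampled CSF/BW values span roughly two orders of magnitude.

```python
import numpy as np

rng = np.random.default_rng(11)

# Exposure factors (hypothetical stand-ins for the Table 3.3 values)
I, ED, EF, AT = 2.0, 30.0, 350.0, 70.0 * 365.0   # l/day, years, days/year, days

# Lognormal CSF/BW: geometric mean and geometric standard deviation (assumed);
# a GSD near 3 spreads the sampled values over about two orders of magnitude.
gm, gsd = 1.0e-3, 3.2
csf_bw = rng.lognormal(np.log(gm), np.log(gsd), size=300)

# Hypothetical stand-ins for the 30 exposure-point concentrations (mg/l)
# summarized in Figure 3.3.
conc = rng.lognormal(mean=1.5, sigma=1.0, size=30)

# Equation 3.6: Risk = [C*I*ED*EF/(BW*AT)] * CSF = [C*I*ED*EF/AT] * (CSF/BW)
cdi_times_bw = conc * I * ED * EF / AT            # CDI multiplied by BW (mg/day)
risk = np.outer(cdi_times_bw, csf_bw)             # all (C, CSF/BW) combinations
for threshold in (1e-4, 1e-5):
    print(f"P(risk > {threshold:g}) = {np.mean(risk > threshold):.2f}")
```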


Figure 3.5 shows the frequency distribution of risks obtained by using only the mean value of CSF/BW. This frequency distribution is simply a linear translation of the concentration frequency distribution given in Figure 3.3 and indicates that a 13% probability exists that the risk will exceed 10⁻⁴ but an 80% probability exists that the risk will exceed 10⁻⁵. When the uncertainty in CSF/BW is included, the resulting frequency distribution is extremely wide, as shown in Figure 3.6. Of course, the width of the distribution is directly related to the somewhat arbitrary choice of the standard deviation of the CSF/BW distribution. Some of the very high risks (>10⁻²) indicated in Figure 3.6 are a result of using perhaps unrealistically low values of BW or high values of CSF.

3.3.3.2.4 Approaches for Addressing Uncertainty

The question of how to account for uncertainty transcends the use of mathematical models or optimization programs: every strategy, no matter how it is derived, should be able to deal with the characteristics of the various uncertainties discussed in the previous sections. However, when a strategy is derived using mathematical simulation models and optimization techniques, the success of the methods should be judged on the basis of how well the schemes they generate address these characteristics. Some approaches follow:


• Feedback
A strategy should be adaptable to changing conditions and new information. This requirement appears easy to meet by using deterministic models that treat best estimates as if they were the right values for the quantities they represent: when the estimates change, the models are run anew and the remediation scheme is adjusted.
• Hedging
If there are many possible parameter or input values, the strategy should not be chosen to optimize for just one of them. Instead, it should be chosen to perform in a satisfactory way over the ensemble of possible values of the unknown quantities. It is of paramount importance to prevent high-cost outcomes (i.e., to avoid disastrous performance under a plausible scenario). Enforcing hedging in a rigorous fashion in an optimization scheme is computationally very challenging. Hence, applying optimization under uncertainty becomes an exercise in approximation. For example, Andricevic and Kitanidis (1990) and Lee and Kitanidis (1991, 1996) used a small perturbation approximation. Among more computationally intensive approaches, the multiple realization approach involves replacing the possible scenarios with a finite number generated through Monte Carlo techniques (Gorelick et al., 1984; Wagner and Gorelick, 1989; Chan, 1993). Then, the optimal scheme is the one that has the smallest average cost, performs well over the largest percentage of realizations, or a combination of the two.
• Sampling
The significance of sampling when addressing uncertainty is twofold. First is the anticipation that measurements will be collected in the future, thus requiring less hedging. For example, the multiple realization approach can be overly conservative or cautious by requiring a high pumping rate to ensure that 95% of the plume realizations, based on initially available knowledge, are captured. In practice, however, the ensemble of plausible plume realizations changes over time as new information is incorporated into the model, reducing uncertainty. McGrath et al. (1996) use this approach in the context of an adaptive characterization method. Second is that sampling can become part of the optimization. For example, the pumping rate and the collection of measurements can be manipulated to stimulate the system in a way that reduces uncertainty, thus reducing the total cost. This approach has been adopted by Andricevic and Kitanidis (1990) and Lee and Kitanidis (1991, 1996).

FIGURE 3.6 Frequency distribution of risks incorporating uncertainty in CSF/BW.

3.3.4 Design-Risk Cost Tradeoff

Risk has been defined in many different ways (Starr, 1969; Kaplan and Garrick, 1981), for example:

Risk = Probability × Consequence    (3.7)

Risk = Hazard / Safeguards    (3.8)

From these initial definitions, risk can be linked with probability, uncertainty, frequency, and cost–benefit. Risk in a cost–benefit framework is usually treated as a constraint or minimized. Freeze and Gorelick (1999) point out that there are often no economic benefits to environmental cleanup, just cost minimization. An alternative view, however, may be that the benefit takes the form of a reduction in human health risk. Schulze and Kneese (1981) and Morgan (2000) posit that the risk–cost–benefit framework is an important decision-making tool and should be used to equitably mitigate risks to a populace.

Risk, when used in the context of optimal subsurface contamination remediation, often carries the following two meanings: (1) that pertaining to remedial system reliability and (2) that signifying the potential for adverse impact on human health. Both are appropriate uses of the term, and either (or even both) can be used in a cost–benefit framework. In fact, it is common to see concentration goals used as a surrogate for health risk.

From a reliability perspective, risk is defined in terms of the probability of failure of the intended remedial design (or any engineering design). There is often a monetary or regulatory cost associated with failure. This stems from Equation 3.7, where the risk is defined as the probability of failure of the intended remedial design and the consequence of this failure is defined as either regulatory or monetary penalties (e.g., Freeze and Gorelick, 1999). A predominant complicating factor in this scenario is the uncertainty in the hydraulic conductivity.

One example of this reliability perspective is presented by Morgan et al. (1993). In this work, tradeoff curves of reliability vs. pumping rate were developed (Figure 3.7). The authors simulated many different, equally likely realizations of hydraulic conductivity for varying degrees of heterogeneity. Reliability was quantified as the number of realizations that did not violate the design constraints. The authors noted that a deterministic system has a reliability of either zero or one (i.e., the design either succeeds or fails) and demonstrated the change in reliability with the change in the standard deviation of hydraulic conductivity (Figure 3.8).

The standard framework for human health risk includes the processes of risk assessment and risk management (National Research Council [NRC], 1983, 1992). Risk assessment comprises the following four steps: hazard identification; exposure assessment; dose–response assessment; and risk characterization, including uncertainty and variability. In identifying hazards, one determines whether a physical insult can cause an increase in negative consequences for human health. Exposure is assessed by estimating the intensity, frequency, and duration of the insult as experienced by the at-risk population. The dose–response is the relationship between exposure to the physical insult and the incidence of the human health consequences, and risk characterization is the synthesis of these to evaluate the health consequences of the physical insult. Risk management is how one chooses to manage the risks (i.e., through mitigation strategies or neglect). This type of framework is often applied to complex systems (e.g., engineered, natural, human) for which relatively poor data are available.

Several simplifying assumptions are often adopted to make assessments tractable, standardize the approach, and accommodate some of the limitations of the available data and models (e.g., USEPA, 1989). These simplifications include linear, no-threshold dose–response functions for carcinogens; population-averaged values for dose–response, consumption patterns, and metabolic features; and an assessment of either a maximally exposed individual or a population-averaged exposure. It has

FIGURE 3.7 Trade-off curve for reliability (number of realizations, out of 100, not violating the design constraints) as a function of minimum pumping rate; the curve separates infeasible from inferior designs. (After Morgan et al., 1993.)
