Over the past two decades, custom-tailored technologies and theoretical models have become ubiquitous features of financial markets. Contemporary markets mean screens displaying an uninterrupted flow of prices in public places, financial products designed with the help of complex mathematical models, software programs for the instant display and analysis of financial data, and much more. Against the background of a global expansion, this massive presence, together with the growing dependence of financial transactions on both technology and formal modeling, raises the question
of the impact of science and technology on a fundamental institution of modern societies. The relevance of this question can be better understood if we take into account the historical dimension of the processes through which science and technology have penetrated financial transactions. Historians of economics and sociologists alike have recently acknowledged that this impact should be measured in centuries rather than decades (e.g., Sullivan & Weithers, 1991; Harrison, 1997; Jovanovic & Le Gall, 2001). How do they contribute, then, to the preeminent position occupied by financial institutions in developed societies? To what extent is finance shaped by science and technology?
Since the mid-1990s, scholars from STS have become increasingly aware of these questions. Working initially independently of each other, several scholars started research projects on the role of science and technology in financial markets. The output of these projects has materialized in books, journal articles, Ph.D. dissertations, conferences, informal exchange networks, coordinated projects, as well as national
associations (e.g., the Association d’études sociales de la finance in France). Research
hosted at several universities in Western Europe and North America has grown at a steady pace, attracting doctoral students, research funding, together with the interest
of academic publishers, and cross-fertilizing academic fields such as behavioral finance, economic sociology, economic anthropology, international political economy, and geography.
One question arising here is that of the background against which the interest of STS scholars was directed toward finance. Several developments frame this moment, independently of particular interests and motivations. (1) After the fall of the Iron Curtain and toward the mid-1990s, the acceleration of global financial expansion
highlighted the central position occupied by technology and by formal models of finance. (2) More or less celebratory media representations of the wave of financial expansion contrasted with several severe crises toward the end of the 1990s, crises in which formal models played an important role (e.g., the Long-Term Capital Management crisis of 1998). These events triggered renewed discussions about the capacity of financial markets to replace social policies and raised issues of trust, legitimacy, and market constitution, directly involving both technology and financial theories. (3) Since the mid-1980s, criticism of the central assumptions of neoclassical economics had increased its pace in economic sociology as well as in the history of economics. Insights and theoretical approaches developed in science and technology studies had been fruitfully transferred to the history of economics, especially in the work of Philip Mirowski (1989). Additional research in the history of financial economics (e.g., Mehrling, 2005; Bernstein, 1996) also highlighted the conceptual links between physics (especially thermodynamics) and financial theory.
Against this background, a transfer of research topics, concepts, and approaches from STS to the study of financial markets took place, to the effect that social studies
of finance (SSF) emerged as a new field of inquiry. Yet, SSF (which comprises different emerging paradigms) cannot be seen as a mere extension or as an application of science and technology studies to finance. First, there has been cross-fertilization with other disciplinary fields, most notably perhaps with economic sociology. Second, SSF did not simply take over already existing STS concepts but modified and enriched them, developing its own research agenda. In the following, I discuss some of the most important conceptual and topical links between STS and the social studies of finance, thus exploring the SSF research agenda. In the first step of the argument, I show how various SSF approaches conceptualize the relationship between knowledge and financial action, analogous to the STS conceptualization of the link between scientific knowledge and practical action. In a second step, I examine how SSF approaches the demarcation problem with regard to financial economics and to markets. I argue that the social studies of finance take over, reformulate, and expand the demarcation problem examined in science and technology studies. In the third step, I discuss the concept of agency developed in SSF and show its similarities and differences with concepts of agency present in science and technology studies as well as in economic theory. The conclusion reviews the research agenda of the social studies of finance and discusses potential cross-fertilization with the STS agenda.
FINANCIAL INFORMATION AND PRICE AS EPISTEMIC THEMES
Information became a crucial concept of economic theory in the 1970s as a result (and continuation) of efforts started during World War II in operations research (e.g., Klein, 2001: 131; Mirowski, 2002: 60), efforts aiming at optimizing action outcomes based on random, incomplete data (e.g., tracking airplanes with guns and message encryption). This required mathematical tools for transforming randomness into determined patterns, tools that were combined with the notion (formulated by
Friedrich von Hayek and the Austrian School of economics in the 1930s) that markets can be seen as gigantic distributors of information, similar to a telephone switchboard (Mirowski, 2002: 37). This fusion between a view of allocation processes as determined
by information on the one hand, and the formal processing of random signals in order
to identify determined patterns on the other, led to conceptualizing information as additive signals, independently of the cognitive properties of the receiver. The effect was to separate information from cognition; while the former was treated as a sort of telephone signal, triggering a reaction from the receiver, cognition was deemed to be irrelevant. Noise was equated with uncertainty (Knight, [1921]1985) and seen as a blurring of determined (or meaningful) patterns, analogously to an encryption machine that scrambles the message by inserting (apparently) random signals. This concept of information as signals, which has proved influential in economic sociology too (e.g., White, 2002: 100–101), is being contested by the game-theoretical notion of information as choice of actions relative to signals under a fixed decision rule (Mirowski, 2002: 380). This introduces the idea of rational expectations on the part of economic actors (Sent, 1998: 22); expectations contain deterministic patterns that filter the random signals. This second notion of information maintains the distinction from cognition, seen not as entirely irrelevant but as statistical inference. According to Mirowski (2002: 389; 2006), there is a third concept of information as symbolic computation, coming from artificial intelligence, which has proved less influential than the other two. Relevant in this context is the fact that “information,”
as it is used in neoclassical economic theory, is seen analogously to phone signals. Uncertainty (or noise) is understood as random signals, with no underlying meaningful pattern, while cognition is taken either as irrelevant or as reducible to statistical inferences.
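This signal-based view of information lends itself to a simple worked illustration. The sketch below is my own (not drawn from the literature discussed here): a deterministic pattern is buried in additive random noise and then recovered by a fixed filtering rule, the kind of mechanical operation in which the receiver's cognition plays no role.

```python
import numpy as np

rng = np.random.default_rng(0)

# A deterministic "meaningful pattern" (hypothetical): a slow sine wave.
t = np.linspace(0, 4 * np.pi, 400)
pattern = np.sin(t)

# "Noise" in the sense used above: additive random signals with no
# underlying pattern, blurring the message.
noise = rng.normal(scale=0.8, size=t.size)
observed = pattern + noise  # information conceived as additive signals

# A fixed decision rule that ignores the receiver's cognition:
# a moving-average filter that extracts the determined pattern.
window = 25
recovered = np.convolve(observed, np.ones(window) / window, mode="same")

rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
print("RMSE raw vs. pattern:      ", round(rmse(observed, pattern), 3))
print("RMSE filtered vs. pattern: ", round(rmse(recovered, pattern), 3))
```

On this conception, "information" is whatever survives such mechanical filtering; everything else counts as noise.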
Financial markets can thus be seen as information processors, sending out price signals (Paul, 1993: 1475) on the basis of which actors make their choices according
to (rational) decision rules. In this process, actors reciprocally anticipate their respective expectations and incorporate them into signals. In turn, these anticipations are accompanied by dispersion and volatility, understood as a measure of ignorance and uncertainty in the marketplace (e.g., Stigler, 1961: 214). Along with price observation (Biais, 1993: 157), networks of relationships (e.g., Baker, 1984; Abolafia, 1996) and specialization (Stigler, 1961: 220) contribute to reducing noise.
Price signals are regarded as fully reflecting all the information available to market actors (Stigler, 1961). This is also a key assumption of the efficient market hypothesis (EMH). The presence of a large number of actors in the market, acting independently
of each other, handling all the relevant information they can get, is a fundamental condition for market efficiency and liquidity (Fama, 1970, 1991; Jensen, 1978). These participants “compete freely and equally for the stocks, causing, because of such competition and the full information available to the participants, full reflection of the worth of stocks in their prevailing prices” (Woelfel, 1994: 328).
EMH is related to the random walk hypothesis (RWH), which can be traced back
to Louis Bachelier’s treatment of stock price movements as a Brownian motion
([1900]1964) and to Jules Regnault, a mid-nineteenth-century French broker (Jovanovic & Le Gall, 2001). Prices are conceived of as similar to gas molecules, moving independently of each other, with future movements being independent of past movements. This tenet grounds models for computing the probability of future price movements, such as the Black-Merton-Scholes formula (Mehrling, 2005; MacKenzie, 2006). The EMH tenet was contested early by Benoit Mandelbrot, who noticed that price fluctuations are inconsistent with a Gaussian distribution of securities prices (they generate “fat tails”) and that prices are scale-invariant (Mirowski, 2004: 235, 239; Mehrling, 2005: 97–98).
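For orientation, the standard textbook formulation of these models (not reproduced in the chapter itself) treats the securities price as a geometric Brownian motion and derives the European call price from it:

```latex
% Random walk / geometric Brownian motion assumption for the price S_t:
%   dS_t = \mu S_t \, dt + \sigma S_t \, dW_t ,
% so that log-returns are independent and normally distributed.
%
% Black-Scholes-Merton price of a European call (strike K, maturity T,
% risk-free rate r, constant volatility \sigma):
\begin{align}
  C(S, T) &= S\, N(d_1) - K e^{-rT} N(d_2), \\
  d_1 &= \frac{\ln(S/K) + \left(r + \tfrac{\sigma^2}{2}\right) T}{\sigma \sqrt{T}},
  \qquad d_2 = d_1 - \sigma \sqrt{T},
\end{align}
% with N(.) the standard normal distribution function. Mandelbrot's
% objection targets the normality assumption: empirical returns show
% excess kurtosis ("fat tails") that the Gaussian model cannot capture.
```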
The assumption of market efficiency presupposes that at any given time economic agents can distinguish between (meaningful) signals and noise, between the relevant
and the irrelevant, without recourse to issues of cognition. Several epistemological
problems arise here. (1) The distinction between prices and price data: prices as signals cannot be separated from price data, which are not neutral with respect to production and recording processes, as well as to their material support. Recording data implies the use of technology; therefore, the question arises about how price-recording technologies shape price data and financial transactions with them (see the Social and Cultural Boundaries of Financial Economics section). (2) The generation and recording of data are not independent of formal and informal theoretical assumptions about veridicality, consistency, homogeneity, reproducibility, comparability, and memorization, assumptions that are incorporated into recording procedures and technologies and reflected in analysis and interpretation. How are these assumptions produced, and which social forces are involved in this process? (3) The use of price data
by financial actors implies observation, monitoring, and representation. These processes, in their turn, require interpretation (provided by financial theories), skills, and tacit knowledge.
Seen in this perspective, price data neither appear as given, natural, or determined
by the inherent rationality of financial actors, nor do they appear as analogous to phone signals that trigger the recipients’ reactions. Rather, these data appear as praxeological structures (Lynch, 1993: 261), that is, as routine, accountable sequences of social action. In this perspective, information, the key concept of financial economics (Shleifer, 2000: 1–3), is not treated as the natural starting point of investigation but as a practical problem for financial actors. When using price data, academic economists share a set of epistemic assumptions with nonacademic financial actors: assumptions about veridicality, consistency, homogeneity, reproducibility, and so on. The scientific work of financial modeling or experimenting does not appear as embedded in a type of understanding or rationality radically different from (and superior to) the lay one. At the same time, since theoretical models are used in financial transactions, they have not only a representational but also an instrumental quality. How do they affect, then, the very assumptions they rely on? A first task on the research agenda
is, therefore, to investigate these price-related epistemic themes.
I begin with price observation: what does it mean to observe securities prices as objective and given? Karin Knorr Cetina and Urs Bruegger have studied how dispersed
traders observe prices in trading rooms with the help of computer screens. They argue that price observation is above all a collective work (2002: 923–24) of reciprocal coordination, which takes place over considerable geographical distances and does not require spatial co-presence. What it requires is temporal co-presence: the observation
of the same price data at the same moment in time. Temporal co-presence, in its turn,
is achieved in a form of interaction which Knorr Cetina and Bruegger call the face-to-screen situation (2002: 940), in opposition to Erving Goffman’s face-to-face situation (1982): personal interaction mediated and determined by the flow of prices on the computer screen. Reciprocal coordination determines that price data can be accounted for as objective and reproducible while being continuously generated in conversational interactions. Whereas in the scientific laboratory spatial coordination (Gieryn, 2002: 128) plays an important role in the observation of scientific objects, in the trading room it is temporal coordination that appears as crucial.
The laboratory appears as an “‘enhanced’ environment that ‘improves upon’ the natural order as experienced in everyday life in relation to the social order” (Knorr Cetina, 1995: 145). The trading room, by contrast, does not work as a system that modifies and integrates an external (natural) order into the social order. Rather, the trading room constitutes a reflexive system of data observation and projection (Knorr Cetina, 2005: 40) that brackets out the outside world: the price data it operates with are generated in the system’s own conversational interactions. In the process of reciprocal coordination, however, the data become objectified and treated as external with respect to the system’s operations. A key role in this process is played by the computer screen, on which financial actors project the outcomes of their interactions (i.e., the price data). At the same time, similar to the scientific lab, trading rooms constitute heterogeneous frameworks of distributed cognition (Beunza & Stark, 2004: 92), where instruments and actors with different properties and skills, respectively, produce and categorize the objects (i.e., financial products) of action.
This raises the question of the role played by price-recording and -displaying technologies with respect to epistemic themes such as veridicality and homogeneity. Veridicality of price data implies that participants ascribe them a referential quality while investing them with trust at the same time. Homogeneity implies that price data are accessible in the same form to every participant (i.e., standardized), a requirement derived from the condition of actors’ mutual coordination based on data observation. The relationship between trust, standardization, and technology has been a central STS issue during the last two decades (e.g., MacKenzie & Wajcman, 1985; MacKenzie, 2001a; Porter, 1995): technology disentangles data from the particular skills of individual persons and invests it with abstract authority. Trust is displaced from personal relationships and individual reputations and put on a mix of abstract competences and iterable rules, incorporated in technology. With respect to price data, historical studies of competing price-recording technologies show how their introduction to financial markets in the late 1860s changed the veridicality of price data (Preda, 2003). While one technology (the pantelegraph) attempted to confer veridicality on price data by reproducing the signature of transaction partners, its competition (the stock
ticker) disentangled price data from individuals and tied them to each other. Data thus appeared as self-sufficient, abstract representations of a flow of transactions. Their veridicality was grounded in the technology’s set of simple, iterable rules, which could reproduce these data across various contexts.
Standardization of financial information involves calculative agencies (Callon, 1998: 6–12; 1999: 183)—that is, procedures and techniques through which the “economic”
is disentangled from the “social.” These procedures, provided by theoretical models, are instruments through which a certain type of economic rationality is enacted. In
a study of standardized cotton prices in world markets, Koray Çalışkan (forthcoming) investigates the social processes through which different stages of standardization are attained. These stages, which Çalışkan, following Callon, calls “prosthetic prices,” involve (1) the reciprocal fine-tuning of the traders’ pricing models and expectations, (2) the projection of future prices based on commonly acknowledged calculations, and (3) the narrative framing of pricing formulas.
A complementary aspect of standardization is how price data—made abstract and taken out of the concrete contexts of their generation—are used by financial actors to calculate and thus construct paths of collective action. A central dimension of financial calculation is that discursive sense-making procedures frame the data and make
them accountable—that is, practically intelligible—to financial actors. Several case studies have examined the practices of accountants, who are confronted with the task of meeting formal rationality criteria when dealing with financial information. These studies show that accountants do not treat financial data as abstract, disembedded, and universal but rather as depending on local procedures through which they are made practically intelligible; these include negotiation, storytelling, and tinkering, among others (e.g., Kalthoff, 2004: 168; 2005). Since the accountants’ criteria of formal rationality depend on the generation of intelligible data, and the latter depend on local sense-making procedures, it follows that in practice there can be no clear-cut distinction between formal, abstract rationality, on the one hand, and practical intelligibility, on the other. Several authors have stressed the need for studies of
“ethnoaccountancy” (e.g., Heatherly et al., forthcoming; Vollmer, 2003), which should focus on the practical methods through which financial data are generated and invested with formal qualities. Examples are profit and costs as historical categories of financial knowledge, local methods of accounting for financial data, and practical rules for the classification of these data.
Observation, representation, and calculation of financial data are approached as epistemic themes, in a manner that is both directly and indirectly influenced by science and technology studies. One of the contributions of SSF is to show that price data—regarded as unproblematic both by financial economics and by economic sociology—are constituted in a web of interactions involving both human actors and technological artifacts. While economic sociology has focused mainly on the study of social-structural embeddedness of economic transactions, social studies of finance show that information is the outcome of complex, multilayered interaction processes and indistinguishable from cognition. At the same time, rationality criteria do not
merely build a normative horizon for financial action but are actually generated and used as practical tools in the actors’ transactions. This link between local practices and theoretical horizons questions the relationship between financial theory—understood both as prescription and as representation—and practical action. I turn now to this aspect.
SOCIAL AND CULTURAL BOUNDARIES OF FINANCIAL ECONOMICS
As an established academic discipline, financial economics claims to build a theoretical horizon for concrete actors and practices by enunciating the ideal conditions of rationality under which efficient action becomes possible. As shown in the previous section, a cornerstone of financial economics is the EMH, with the assumption that all action-relevant information quickly becomes fully incorporated into securities prices, and therefore actors can make transaction-relevant decisions based on data about price variations. This incorporation mechanism is public; sufficiently large numbers of actors have access to data about price variations so that no single person
or group can consistently control transactions. The probability of gaps between future and actual prices can be computed according to a formal model and tested against empirical data. In this account, the EMH, which has known several varieties, can be seen as a deductive theoretical model of price behavior.
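What “testing against empirical data” can look like in practice may be illustrated by the simplest weak-form check: if prices follow a random walk, successive returns should be serially uncorrelated. The sketch below is my own illustration (the returns are simulated rather than observed); it computes a lag-one autocorrelation and the approximate band within which it should fall under the random-walk null.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated daily log-returns under the random-walk null (i.i.d. normal).
# In an actual test these would be observed returns of a security.
returns = rng.normal(loc=0.0, scale=0.01, size=2500)

# Lag-1 autocorrelation: under weak-form efficiency it should be close to 0,
# with a standard error of roughly 1/sqrt(n).
r = returns - returns.mean()
lag1_autocorr = np.sum(r[1:] * r[:-1]) / np.sum(r ** 2)
std_error = 1.0 / np.sqrt(len(returns))

print(f"lag-1 autocorrelation: {lag1_autocorr:+.4f}")
print(f"approx. 95% band under the null: +/- {1.96 * std_error:.4f}")
```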
At this point, several questions arise: (1) about financial theory as the product of a historical development and about the social and cultural factors playing a role here, (2) about how the boundaries of this model were drawn, and (3) about the relationship between the theoretical model and the empirical data against which it is tested. The historiography of economics has presented modern financial theory as the result
of a straightforward development beginning with Louis Bachelier (and Jules Regnault earlier) and continuing in the 1960s and the 1970s with the work of Eugene Fama and Paul Samuelson, among others (e.g., Dimson & Moussavian, 1998: 93). Yet, a more illuminating approach would be to follow the history of financial theory not as a string
of disembodied, asocial thoughts but as a series of social and cultural processes through which its language, concepts, and objects of investigation take shape. Starting from this premise, Alex Preda (2004a) has investigated the nineteenth-century prehistory
of financial theory and shown how a vernacular “science of financial investments” reconfigured investor behavior as rational, grounded in attention and observation, while linking the concept of price to those of news and information. This “science” disentangled financial securities from gambling and prepared the field for a formal treatment of price movements. At the same time, brokers like Jules Regnault applied physical principles to the study of price variations (Jovanovic & Le Gall, 2001). Formal models like Bachelier’s shifted from investor to price behavior, represented in an algebraic, not a geometrical, fashion. We are confronted here with the emergence of several cultural boundaries (between rational and nonrational behavior, gambling and investing, human actors and prices) that lay the ground for the formal theory of efficient markets.
Although the prehistory of financial theory traced these cultural and conceptual boundaries, the theory’s growth into a full-blown deductive, formal model took place between the 1950s and the early 1970s. The more general intellectual background of this process was a sustained program of economic research into information and optimization algorithms, initiated during World War II at several U.S. research institutes. Whereas neoclassical economics operated until then with a concept of utility modeled
on classical mechanics’ notion of energy, this research program had at its core the concept of information, understood as patterns of signals similar to phone codes (Mirowski, 2002: 7, 21). The growth of financial theory into the dominant academic model, however, required further boundary work, concerning (1) theorists and practitioners of formal pricing models and (2) financial theorists and the nonfinancial economists in the academic world. The setting in which this second boundary was traced was provided by U.S. business schools, which underwent a rapid “academicization” in the 1960s, providing a home for financial economics, which otherwise was sometimes marginalized in the more established economics departments.
As to the first boundary—although in the beginning practitioners were hostile to pricing models and to the general assumptions of the EMH, some of them enrolled this theoretical apparatus as a handy tool in their controversies and feuds with other practitioners (Mehrling, 2005; MacKenzie, 2006). A central case studied by Donald MacKenzie and Yuval Millo (2003) is that of the option pricing formula developed by Fischer Black, Myron Scholes, and Robert C. Merton in the early 1970s. In the early stages of its use, empirical data did not fit the predictions of the Black-Scholes-Merton formula. Yet, traders on the Chicago Board Options Exchange (CBOE) used it because
of its cognitive simplicity, academic reputation, and free availability. The Black-Scholes-Merton pricing formula offered traders a tool for coordinating their actions and a guide to trading and hedging. The use of the Black-Scholes-Merton formula, together with innovations in financial products, led to an increasing fit between empirical data and theoretical predictions and thus ultimately to the academic and practical success of this model.
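A brief sketch may clarify what “using the formula as a tool” amounts to in practice: given a handful of inputs, the trader computes a model price and compares it with the quoted one. The implementation below is the textbook formula; the numerical inputs and the quoted price are invented for illustration.

```python
from math import erf, exp, log, sqrt


def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution function via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))


def black_scholes_call(S, K, T, r, sigma):
    """Textbook Black-Scholes-Merton price of a European call option."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)


# Hypothetical inputs: spot 100, strike 100, 3 months, 5% rate, 20% volatility.
model_price = black_scholes_call(S=100.0, K=100.0, T=0.25, r=0.05, sigma=0.20)
quoted_price = 4.60  # a made-up market quote

print(f"model price:  {model_price:.2f}")
print(f"quoted price: {quoted_price:.2f}")
print("quote looks rich relative to the model"
      if quoted_price > model_price
      else "quote looks cheap relative to the model")
```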
The establishment of financial economics as a successful academic discipline and, with it, of the EMH as a dominant theoretical model was the outcome of complex social processes that traced the boundaries of finance as a domain of legitimate theoretical conceptualization and empirical investigation. This was accompanied by jurisdictional claims of practitioners, conflicts of interest among academic and nonacademic groups, and a reconceptualization of market exchanges as optimization algorithms. The boundaries between academic financial theory and practice, between academic and other forms of expertise, appear as porous and shifting; vernacular concepts of price as information have played a role in preparing the conceptual foundations of financial theory, while the interests, practices, and institutions of nonacademic groups have contributed in an essential fashion to the overall success of formal pricing models.
Although a central tenet of EMH is that securities prices move in a random fashion and cannot be predicted, technical analysis (or chartism) maintains that prices move
according to predictable patterns. In spite of this inconsistency with (and of attacks from) academic theory, chartism has been successful with financial practitioners for a century. How can a vernacular form of expertise coexist with an established academic theory asserting the opposite? How can it maintain success with practitioners over long periods of time? The investigation of these issues, pertaining to studies of demarcation and expertise (Evans, 2005; Collins & Evans, 2003), has recently begun (e.g., Preda, 2004b). At the same time, the impact of financial theory on markets, together with the prominent role played by technology, raises the issue of agency: how are the structures of financial action changed by formal pricing models, by price-recording and data-processing technologies? How is the organization of markets affected by them?
IMPACT OF THEORETICAL MODELS AND TECHNOLOGY:
AGENCY IN FINANCIAL MARKETS
The “technologization” of stock exchanges started in the late 1860s with the stock ticker, followed by cinema screens in the 1920s, teletypewriters in the 1930s, and computers in the early 1960s. In the 1950s, the New York Stock Exchange (NYSE) drafted plans for computer recording of trading data, and in 1962 it formulated the aim of developing a “complete data processing system” that “will mechanize virtually all present manual operations in the Exchange’s stock ticker and quotation services” (NYSE, 1963: 48–49). In 1963, a special study of the Securities and Exchange Commission (SEC) recommended to the U.S. Congress the automation of financial markets.
In foreign exchange markets, Reuters introduced the first monitor screen and keyboard in 1967 and the Monitor Dealing Service (a system of computerized transactions) in 1970. In the early 1980s, the PC won over proprietary systems in brokerage offices, a process that facilitated the automation of major financial exchanges such as Euronext (formed in 2000 by the merger of the Paris, Brussels, and Amsterdam stock exchanges). The coexistence of automated and nonautomated financial exchanges has highlighted technology-induced differences in price and volatility patterns (Franke & Hess, 2000: 472), raising the question of the role of technology in the constitution of securities prices. In the late 1990s, the first electronic communication networks (ECNs) were approved by the SEC as platforms for financial transactions. In 2006, ECNs like Archipelago merged with the NYSE.
Neoclassical economic theory, for its part, has conceived agents as isolated individuals, endowed with calculative capacities, desires, and preferences, which remain unaffected by their relationship with other human beings or with artifacts (Davis, 2003: 167). Combined with the prevailing notion of information as signal, this has led to conceiving economic agents as atomistic calculators who process external signals and take decisions (Mirowski, 2002: 389). Nevertheless, studies of market microstructure question these agential assumptions (e.g., O’Hara, 1995: 5, 11).
One of the lasting theoretical and empirical contributions of social studies of science has been to stress the irreducibility of agency to human intentionality or will and to
show that scientific theories and technological artifacts shape future paths of collective action. At least two concepts mark the STS contribution: (1) theoretical (or disciplinary) agency, concerning the ways in which conceptual artifacts (like scientific models or mathematical formalisms) change cultural and social structures (e.g., Pickering, 1995: 145), and (2) sociotechnical agency, concerned with the role of material arrangements and of technological artifacts (e.g., Bijker, 1995: 192, 262; Bijker et al., 1987). The STS conceptualization of agency differs from technological determinism in that technology (1) is not seen as preconfiguring paths of action, (2) implies not only constraints but also social resistance, and (3) is not seen as distinct from but
as a form of social action. Consequently, the computerization of financial exchanges
is not seen as inevitable but as the result of specific social interests, conflicts, and group mobilization.
Studies of theoretical and sociotechnical agency have investigated (1) how the production of formal models and technologies shapes future paths of action (the producer side) and (2) how the use of theories and technological artifacts affects collective action and transforms communities (the user side). It has been argued that user groups act as market intermediaries, thereby playing a special role with respect to social diffusion and agency (Pinch, 2003). With respect to the field of finance, it becomes relevant to examine how theoretical models and technologies are produced and adopted in financial markets and how their use affects financial transactions and changes the markets’ organizational patterns.
Financial models (like the Black-Scholes-Merton formula) do not merely formulate
a set of rules that, when applied, will ensure that these transactions meet efficiency and rationality criteria. If we take these models as normative, we risk a determinist position, according to which financial agents simply follow theoretical prescriptions.
If we accept the representational character of formal models, we take financial transactions as an isolated, asocial domain of investigation and assume a naturalist stance (MacKenzie, 2001b).
To avoid these conceptual difficulties while preserving a notion of theoretical
agency, Michel Callon (1998) has suggested the concept of performativity. According
to Callon, economic theory shapes the way in which transactions are conducted and markets are organized; it has a performative character. A program of research on performativity should involve an investigation of the social forces, groups, interests, and mechanisms through which successful theoretical intervention is performed. An example in this respect (Callon, 1998) is the reshaping of an agricultural produce market by economic consultants, a reshaping that enacts a normative model of rationality. This enactment, however, is not automatic but involves conflicts between interest groups, persuasion, and the mobilization of organizational structures and artifacts.
Theoretical agency (performativity) consists of two opposite yet closely intertwined processes. The first is the demarcation of the boundaries between the economic and the social, or disentanglement (Callon, 1999: 186)—a process through which ethical and social aspects are redefined as outside the sphere of transactions. The second
process is the social entanglement between producer and user groups, through which they reciprocally tune their interests and enroll heterogeneous resources to realize these interests. In the language of actor-network theory, performativity then implies the creation of a heterogeneous network that defines its interests and mobilizes adequate resources while tracing conceptual and cultural boundaries in such a manner that the outcome of this process (e.g., empirical data, results) appears to reinforce the resources (e.g., confirm the abstract model). However, since the outcome of boundary-marking (data) is neither independent of the resources used (model) nor interest-neutral, it follows that model and data circularly reinforce each other, in a way similar to the bond existing between theory producers and users.
Theoretical agency (or performativity) combines, then, the normative aspect of economic theories with the reflexive character of economic knowledge: normative models
of economic processes are developed by academic researchers and at the same time monitored by market actors, who adopt and adapt these models to their own interests, practices, and situations.
Empirical studies such as MacKenzie and Millo’s (2003) historical analysis of the Black-Scholes-Merton option pricing formula have highlighted how traders on the Chicago Board Options Exchange (the user community) imposed Black-Scholes prices
on those who believed them to be too low. In using the formula as a tool in hedging and trading, traders started pushing down options prices; in doing so, they generated prices that fitted those predicted by the theoretical model. This, in turn, acted as an empirical confirmation of the theoretical model. The use of the formula by options traders, together with the introduction of new financial products, narrowed the gap between theoretical predictions and actual prices. At the same time, the use of the option pricing formula changed the organization of derivative markets, their legal definition, and the structure of financial products. On these grounds, MacKenzie
(2004; 2006: 17) distinguishes generic performativity, Barnesian performativity, and counter-performativity. While generic performativity designates the use of the model as
a tool by practitioners, Barnesian performativity means that users will generate such data as to confirm the model’s predictions, without the data being directly derived from the model. Counter-performativity, by contrast, designates the situation where the use of a theoretical model engenders counter-productive imitation: price data generated by imitative trades no longer match the model’s predictions (e.g., “fat tail” distributions).
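The mechanism behind Barnesian performativity can be caricatured in a toy simulation (my own, not MacKenzie's): if a growing share of order flow is guided by a model price, quoted prices drift toward that price, and the model ends up “confirmed” by data that its own use has generated. All numbers below are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

model_price = 4.62   # the theoretical benchmark (e.g., a formula-derived price)
quoted = 6.00        # initial market quote, well above the model price
adoption = 0.2       # share of order flow guided by the model in each round

history = [quoted]
for _ in range(40):
    # Model users quote toward the model price; the rest adds unrelated noise.
    quoted += adoption * (model_price - quoted) + rng.normal(scale=0.05)
    history.append(quoted)

print(f"initial gap to the model price: {abs(history[0] - model_price):.2f}")
print(f"final gap to the model price:   {abs(history[-1] - model_price):.2f}")
# As use of the model persists, observed prices come to fit its predictions:
# the data "confirming" the model are partly produced by its own use.
```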
Other studies, such as Philip Mirowski and Edward Nik-Khah’s (2007) investigation
of wavelength auctions, have argued that the boundary between the economic and the social is never perfect, since group interests and structures play a dominant role. Moreover, this boundary is marked by conflicts of interests between competing user and producer groups, who form alliances. Mirowski and Nik-Khah show how in the auctions of mobile phone wavelengths competing alliances between phone companies (user groups) and experimental economists (theory producers) were formed, with objectives and agendas that fused theoretical and political aspects. They also argue that the complexity and heterogeneity of financial expertise (ranging from academics
to securities analysts, accountants, and merger lawyers), together with the prominent role of group interests, require a more nuanced approach to theoretical agency than that provided by the concept of performativity. The overall argument resonates with the requirement for a more intense analysis of various, even contradictory forms of financial expertise needed for a better understanding of how boundaries are produced and maintained in finance (see also Miller, 2002).
The second aspect of agency is related to the massive reliance of global financial markets on technological systems for data processing and transactions. Enmeshed with this aspect are issues such as (1) the social forces that advance the technologization
of financial transactions, both in a historical and in a contemporary perspective; (2) the assumptions that underlie the design of trading programs; (3) the effects of technology on the organization of financial exchanges and the perception of financial data; and (4) the ties between technology and forms of financial expertise like securities analysis.
With respect to the first issue, recent historical studies have shown that, when the first price-recording technology (the stock ticker) was introduced on the NYSE, user and producer groups (stock brokers and telegraph companies, respectively) formed alliances to promote their monopoly and control price data. This technology displaced bodily recording techniques, standardized data, and disentangled authority and credibility from individual actors (Preda, 2006). In her investigation of the Chicago Board of Trade, Caitlin Zaloom (2003) has confronted the question of the CBOT’s bitter resistance to automation, in contrast to the latter’s enthusiastic adoption by the Paris Bourse in the late 1980s, as studied by Fabian Muniesa (2000, unpublished). Zaloom’s argument is that trading technologies are multilayered and embedded in local settings, being represented not only by software programs but also by the body techniques and spatial arrangements traders use to communicate and gather relevant information. Lack of trading automation does not mean the absence of any technique; traders rely on a set of distributed, heterogeneous techniques for solving informational problems. The body techniques employed by traders have developed into change-resistant routines, intertwined with networks of personal relationships and with a social hierarchy on the trading floor. This constellation of specific routines, spatial arrangements, and social relationships is perceived by participants as proprietary and
as inaccessible to outsiders. Automation is resisted by traders and perceived as a menace to their privileges, to the existing networks of relationships, and, above all perhaps, to established ways of gathering and processing information. Instead of being perceived as reducing informational uncertainties, automated trading is seen as increasing social uncertainties. Resisting it does not mean that the CBOT traders resist any kind of technology. On the contrary: they mobilize the existing techniques as a unique resource in fighting off attempts to change the ways in which they gather and process information.
In contrast to the CBOT’s resistance to computerized trading, Fabian Muniesa shows how automation was successfully introduced to the Paris Bourse. In the 1980s, the
problem of the Paris Bourse was to attract customers by offering distinct features that other stock exchanges did not possess. This competitive pressure had been heightened
by the relatively marginal position of the Bourse with respect to other major exchanges (London and New York), by the deregulation of the London Stock Exchange in 1986, and by the latter’s subsequent technological upgrading. The management of the Paris Bourse adopted (and adapted) a system of computerized trading (CATS, or computer-assisted trading system) introduced on the Toronto Stock Exchange in 1975 with
limited success. Yet, in modifying the CATS system (which became CAC, or Cotation assistée en continu), the Paris Bourse was confronted with the problem of the assumptions
(among others, about equilibrium and fairness) that should underlie the trading algorithms. These assumptions determine the design of the trading algorithm software and, consequently, the processes through which securities prices are formed (or what market participants call “price discovery”).
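The chapter does not reproduce the CAC algorithm itself; the following generic sketch of a continuous double auction with price-time priority (a hypothetical, minimal matching rule of my own, not the Bourse's actual code) shows how design assumptions of this kind directly determine which prices get “discovered.”

```python
def match_order(side, price, qty, bids, asks):
    """Match an incoming limit order against a book with price-time priority.

    bids and asks are lists of [price, qty], each sorted best-first (bids by
    descending price, asks by ascending price). Returns executed trades as
    (price, quantity) tuples.
    """
    book = asks if side == "buy" else bids
    crosses = (lambda p: price >= p) if side == "buy" else (lambda p: price <= p)
    trades = []
    while qty > 0 and book and crosses(book[0][0]):
        best_price, best_qty = book[0]
        traded = min(qty, best_qty)
        # Design choice embedded in the algorithm: execute at the price of
        # the resting order, in arrival (time-priority) order.
        trades.append((best_price, traded))
        qty -= traded
        book[0][1] -= traded
        if book[0][1] == 0:
            book.pop(0)
    if qty > 0:
        # The unmatched remainder rests in the book on its own side.
        own = bids if side == "buy" else asks
        own.append([price, qty])
        own.sort(key=(lambda o: -o[0]) if side == "buy" else (lambda o: o[0]))
    return trades


bids = [[99.0, 50]]
asks = [[101.0, 40], [102.0, 60]]
print(match_order("buy", 102.0, 70, bids, asks))  # -> [(101.0, 40), (102.0, 30)]
```

A different matching rule (say, periodic call auctions clearing at a single price) would "discover" different prices from the same order flow, which is precisely why the underlying assumptions matter.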
From the beginning, it becomes clear that pricing is not a natural process involving the identification (or discovery) of an already existing “ideal” or “objective” price. Rather, pricing appears as a complex social process of negotiation involving interest groups, software, economic theories, and computer networks, among others. The absence of human intermediaries (i.e., brokers) does not imply the absence of any negotiation process but rather its displacement and distribution among heterogeneous actors. While working in the tradition of the actor-network theory characteristic of the Paris school of STS, Muniesa highlights both producer- and user-related aspects of sociotechnical agency. On the producer side, he shows that the successful introduction of automated trading on the Paris Bourse was brought about by an alliance of managers, brokers, and software engineers who reciprocally tuned their positions and adapted existing technologies to local constellations of interests. An outcome of this reciprocal tuning was the presentation of trading software as embodying “a vision of the market” (Muniesa, 2000: 303)—that is, a “perfect” theory of market equilibrium that was not imported as a given into this alliance but produced by it. The agential character of formal equilibrium models has less to do with their normative character than with their role as a resource in such a heterogeneous alliance.
On the user side, the trading software enables participating actors to compute prices, which they afterward project as “true,” “real,” and “discovered.” This mode of computing differs from previous ones (which used statistical means) and implies standardization, without being reducible to it. As performed by the software, the calculation of prices is standardized and displayed to actors from a central source. Traders appear as anonymous participants in transactions, known only to a central, data-providing authority (the computer). Yet, exactly because participation in trading
is anonymous and routed via a technological authority, actors need to reciprocally coordinate their expectations by inferring personal or categorical identities from the computerized display of price data. Coordination of expectations, in turn, allows traders to project future courses of action and to construct the market as a collective movement of human and nonhuman agents, a movement that grounds evaluations
of market fairness and justice. Personal agency and technical agency combine to
configure the market as an entity sui generis, with a life of its own.
FINANCIAL MODELS, TECHNOLOGY, AND RISK
The starting point of my argument (presented in the Financial Information and Price
as Epistemic Themes section) has been the centrality of the concept of information
in financial economics. Acknowledging this position means investigating the epistemic premises of this concept, its cultural trajectory in the history of economics, as well as its links with technology. A significant link is that between information and risk: a standard argument of financial economics (taken over by economic sociology
as well) is that economic actors gather and distribute information to process uncertainties into risks (e.g., Stinchcombe, 1990: 5), thereby enabling economic decisions. Yet, if information cannot be separated from (tacit and explicit) forms of knowledge and expertise, depending on heterogeneous constellations of human actors and artifacts, it follows that the said forms of knowledge, together with group relations and concrete technologies, will have an impact on how financial risks are produced and managed. Since financial risk constitutes a major problem in a global world (as repeatedly illustrated by the crises of the late 1980s and 1990s), investigation of this area offers a potential for practical contributions as well.
On a first, micro-interaction level, financial risk appears as a discursive device that, combined with body technique and with price-recording technologies, is employed
in managing the “trading self” (Zaloom, 2004: 379) While more general economicdiscourse ascribes a negative connotation to risk, the practice of financial actors is toapproach it as something that is not entirely manageable through calculations andformulas but requires narrative framings and classifications (see also Mars, unpub-lished; Kalthoff, 2005)
On a different, organizational level, financial risk is made sense of with the help of technologies like software programs and formal models, which saw a rapid, worldwide expansion in the 1990s. Tracing the sources of this expansion, Michael Power (2004) argues that technologies such as enterprise risk management (ERM) originated in a cultural shift that put emphasis on shareholder value and on the increased performance of company stock prices in the market. ERM was implemented in banks all over the world to control financial exposure and to prevent overengagement in financial trades. Yet, since such technologies are based on algorithms that automatically overrule human actors’ decisions, a reciprocal tuning of traders and software is no longer possible. The introduction of standardized risk-measurement technologies, managed from outside the trading floor, blocks out the local skills and personal knowledge of human actors, which play an important role in avoiding financial loss. Risk-measurement technologies are not instruments that measure an externally given reality (“risk”) but tools of financial action (Holzer & Millo, 2004: 16). These models change the very phenomena they are supposed to represent; consequently, their use does not automatically diminish financial risks and volatility (see also MacKenzie, 2005: 78). While traders use models to calculate option prices and exposures, they also observe and imitate each other, to the effect that “superportfolios” emerge. In situations of financial instability, the use of the same pricing formulas in the same way, with the same trades, can have destructive effects.
CONCLUSION
I have argued that a distinctive feature of social studies of finance is the investigation of scientific models, technology, and forms of expert knowledge in financial institutions. Is SSF then to be regarded as a subfield of STS? Are financial institutions complex enough to support an emerging discipline over longer periods of time? What would the SSF research program look like?
Undoubtedly, the majority of SSF studies has been done by academics trained in the sociology of science and technology, or who had an established reputation in STS. Many of them continue to conduct parallel research projects in both fields. The major themes of investigation—such as observation, representation, boundary marking, agency, and risk, to name but a few—had already been successfully investigated with respect to science and technology. Yet, in spite of the clear affinities and influences, SSF does not appear as a mere subdomain of STS. There are several reasons: the first is that SSF combines epistemic topics with the study of problems relevant in areas like economic sociology and behavioral finance, bringing a genuine contribution to the study of financial institutions. One of these problems is the pricing mechanism: while financial economics has noticed the impact of technology, it has been the role of SSF studies to show how price data, theoretical assumptions, trading software, and computer networks influence the constitution of securities prices. Another genuine contribution is related to the analysis of information as the cornerstone of financial markets. While financial economics and economic sociology have understood information as signal processing and treated it as a black box, SSF has highlighted the social and institutional origins of this concept as well as the epistemic and cultural assumptions on which financial information is constituted.
A second reason for the growing disciplinary autonomy of SSF is that it has made conceptual contributions, acknowledged as such, in disciplines such as sociology, behavioral finance, and the history of economics, an example being the concept of performativity, which can be seen as an extension and modification of the notion of agency developed in the sociology of science and technology. Another example is the concept of markets as a reflexive system, built on an analogy with the concept of the laboratory. This indicates growing disciplinary autonomy, without affecting the ties between STS and SSF. Owing to the close personal and intellectual ties between these fields, I expect them to stay in a lively dialogue.
A further question with respect to the possibility of disciplinary autonomy is whether the field of inquiry is deep enough to support continuous SSF research in the long run. I can confidently venture the following: the research done since the mid-1990s is a mere scratch on the surface of the field. There is a wealth of uninvestigated or under-investigated topics, both historical and contemporary. A short list would include the social and epistemic history of competing price-recording technologies, the development of trading software and the interface between the software industry and financial markets, trading robots, the social history of financial information as a commodity, the emergence of epistemic intermediaries like financial analysts, the growing role of financial expertise, the relationship between formal financial models and vernacular economics, the relationship between academic theories and nonacademic ones, and vernacular forms of financial knowledge and theories. The field shows enough depth and relevance to support research in the long run.
While there is neither a formal research program, comparable, for instance, with the strong program in the sociology of scientific knowledge (but see Preda, 2001), nor a single school (comparable to the Edinburgh, Paris, or Bath/Cardiff schools in STS), this can be seen rather as an advantage, since it allows the inclusion of various research interests and approaches. Nevertheless, the possibility cannot be excluded that formal research programs will emerge and that we will witness more internal differentiation after the initial growth period. Several distinct approaches are already taking shape: one centered on the concept of performativity and influenced by (but not limited to) the actor-network theory perspective, and another grounded in the tradition of laboratory studies and centered on fieldwork in the trading room. I expect that further empirical studies and theoretical contributions will deepen the differentiation process.
In any case, the prominence of financial institutions in our world, together with the growing role of financial theories, expertise, and technologies, makes this one of the most exciting developments to have emerged from STS.
References
Abolafia, Mitchel (1996) Making Markets: Opportunism and Restraint on Wall Street (Cambridge, MA:
Harvard University Press).
Bachelier, Louis ([1900]1964) “Theory of Speculation,” in P H Cootner (ed), The Random Character of Stock Market Prices (Cambridge, MA: MIT Press): 17–78.
Baker, Wayne (1984) “The Social Structure of a National Securities Market,” American Journal of Sociology 89: 775–811.
Bernstein, Peter L (1996) Against the Gods: The Remarkable Story of Risk (New York: Wiley).
Beunza, Daniel & David Stark (2004) “How to Recognize Opportunities: Heterarchical Search in a
Trading Room,” in K Knorr Cetina & A Preda (eds), The Sociology of Financial Markets (Oxford: Oxford
University Press): 84–101.
Biais, Bruno (1993) “Price Formation and Equilibrium Liquidity in Fragmented and Centralized
Markets,” Journal of Finance 48(1): 157–185.
Bijker, Wiebe E (1995) Of Bicycles, Bakelites, and Bulbs: Toward a Theory of Sociotechnical Change
(Cambridge, MA: MIT Press).
Bijker, Wiebe E., Thomas P Hughes, & Trevor Pinch (1987) The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology (Cambridge, MA: MIT Press).
Çalışkan, Koray (forthcoming) “Markets’ Multiple Boundaries: Price Rehearsal and Trading Performance
in Cotton Trading at Izmir Mercantile Exchange,” in M Callon, Y Millo, & F Muniesa (eds), Market Devices: Sociological Review Monograph Series (Oxford: Blackwell).
Callon, Michel (1998) “Introduction,” in M Callon (ed), The Laws of the Markets (Oxford: Blackwell):
1–57.
Callon, Michel (1999) “Actor-Network Theory: The Market Test,” in J Law & J Hassard (eds), Actor-Network Theory and After (Oxford: Blackwell): 181–95.
Collins, Harry M & Robert Evans (2003) “The Third Wave of Science Studies: Studies of Expertise and
Experience,” Social Studies of Science 32(2): 235–96.
Davis, John B (2003) The Theory of the Individual in Economics: Identity and Value (London: Routledge).
Dimson, Elroy & Massoud Moussavian (1998) “A Brief History of Market Efficiency,” European Financial Management 4(1): 91–103.
Evans, Robert (2005) “Demarcation Socialized: Constructing Boundaries and Recognizing Difference,”
Science, Technology & Human Values 30(1): 3–16.
Fama, Eugene (1970) “Efficient Capital Markets: A Review of Theory and Empirical Work,” Journal of Finance 25: 383–417.
Fama, Eugene (1991) “Efficient Capital Markets: II,” Journal of Finance 46: 1575–617.
Franke, Günter & Dieter Hess (2000) “Information Diffusion in Electronic and Floor Trading,” Journal
of Empirical Finance 7: 455–78.
Gieryn, Thomas (2002) “Three Truth-Spots,” Journal of History of the Behavioral Sciences 38(2): 113–32.
Goffman, Erving (1982) Interaction Ritual: Essays on Face-to-Face Behavior (New York: Pantheon).
Harrison, Paul (1997) “A History of an Intellectual Arbitrage: The Evolution of Financial Economics,”
in J B Davis (ed), New Economics and Its History, annual supplement to History of Political Economy
29(suppl.): 172–87.
Heatherly, David, David Leung, & Donald MacKenzie (forthcoming) “The Finitist Accountant: Classifications, Rules, and the Construction of Profits,” in T Pinch & R Swedberg (eds), Living in a Material World: On Technology, Economy, and Society (Cambridge, MA: MIT Press).
Holzer, Boris & Yuval Millo (2004) “From Risks to Second-Order Dangers in Financial Markets: Unintended Consequences of Risk Management Systems,” Discussion Paper 29 (London: CARR/LSE).
Jensen, Michael (1978) “Some Anomalous Evidence Regarding Market Efficiency,” Journal of Economic Literature 6: 95–101.
Jovanovic, Franck & Philippe Le Gall (2001) “Does God Practice a Random Walk? The ‘Financial Physics’
of a Nineteenth-Century Forerunner, Jules Regnault,” European Journal of the History of Economic Thought
8(3): 332–62.
Kalthoff, Herbert (2004) “Financial Practices and Economic Theory: Outline of a Sociology of Economic
Knowledge,” Zeitschrift für Soziologie 33(2): 154–75.
Kalthoff, Herbert (2005) “Practices of Calculation: Economic Representation and Risk Management,”
Theory, Culture and Society 22(2): 69–97.
Klein, Judy L (2001) “Reflections from the Age of Economic Measurement,” in J L Klein & M S.
Morgan (eds), The Age of Economic Measurement, annual supplement to History of Political Economy
33(suppl.): 111–36.
Knight, Frank ([1921]1985) Risk, Uncertainty, and Profit (Chicago: University of Chicago Press).
Knorr Cetina, Karin (1995) “Laboratory Studies: The Cultural Approach to the Study of Science,” in
S Jasanoff, G E Markle, J C Petersen, & T Pinch (eds), Handbook of Science and Technology Studies
(Thousand Oaks, CA: Sage): 140–66.
Knorr Cetina, Karin (2005) “How Are Global Markets Global? The Architecture of a Flow World,” in K.
Knorr Cetina & A Preda (eds), The Sociology of Financial Markets (Oxford: Oxford University Press):
38–61.
Knorr Cetina, Karin & Urs Bruegger (2002) “Global Microstructures: The Virtual Societies of Financial
Markets,” American Journal of Sociology 107(4): 905–50.
Lynch, Michael (1993) Scientific Practice and Ordinary Action: Ethnomethodology and Social Studies of Science
(Cambridge: Cambridge University Press).
MacKenzie, Donald (2001a) Mechanizing Proof: Computing, Risk, and Trust (Cambridge, MA: MIT Press).
MacKenzie, Donald (2001b) “Physics and Finance: S-Terms and Modern Finance as a Topic for Science
Studies,” Science, Technology & Human Values 26: 115–44.
MacKenzie, Donald (2004) “Is Economics Performative? Option Theory and the Construction of Derivatives Markets,” paper presented at the Harvard-MIT Economic Sociology Seminar, November 16.
MacKenzie, Donald (2005) “How a Superportfolio Emerges: Long-Term Capital Management and the
Sociology of Arbitrage,” in K Knorr Cetina & A Preda (eds), The Sociology of Financial Markets (Oxford:
Oxford University Press): 62–83.
MacKenzie, Donald (2006) An Engine, Not a Camera: Finance Theory and the Making of Markets
(Cambridge, MA: MIT Press).
MacKenzie, Donald & Yuval Millo (2003) “Constructing a Market, Performing a Theory: The Historical
Sociology of a Financial Derivatives Exchange,” American Journal of Sociology 109: 107–45.
MacKenzie, Donald & Judy Wajcman (eds) (1985) The Social Shaping of Technology: How the Refrigerator Got Its Hum (Philadelphia: Open University Press).
Mars, Frank (unpublished) Wir sind alle Seher: Die Praxis der Aktienanalyse, Ph.D diss., Bielefeld,
Germany.
Mehrling, Perry (2005) Fischer Black and the Revolutionary Idea of Finance (Hoboken, NJ: Wiley).
Miller, Daniel (2002) “Turning Callon the Right Way Up,” Economy and Society 31(2): 218–33.
Mirowski, Philip (1989) More Heat Than Light: Economics as Social Physics, Physics as Nature’s Economics
(Cambridge: Cambridge University Press).
Mirowski, Philip (2002) Machine Dreams: Economics Becomes a Cyborg Science (Cambridge:
Cambridge University Press).
Mirowski, Philip (2004) The Effortless Economy of Science? (Durham, NC: Duke University
Press).
Mirowski, Philip (2006) “Twelve Theses on the History of Demand Theory in America,” in W Hands
& P Mirowski (eds), Agreement of Demand, supplement to vol 38 of History of Political Economy: 343–79.
Mirowski, Philip & Edward Nik-Khah (2007) “Markets Made Flesh: Performativity, and a Problem in Science Studies, Augmented with Consideration of the FCC Auctions,” in D MacKenzie, F Muniesa,
& L Siu (eds), Do Economists Make Markets? On the Performativity of Economics (Princeton, NJ:
Princeton University Press): 190–224.
Muniesa, Fabian (2000) “Performing Prices: The Case of Price Discovery Automation in the Financial Markets,” in H Kalthoff, R Rottenburg, & H.-J Wagener (eds), Facts and Figures: Economic Representations and Practices (Marburg, Germany: Metropolis): 289–312.
Muniesa, Fabian (unpublished) “Des marchés comme algorithmes: Sociologie de la cotation électronique
à la Bourse de Paris,” Ph.D diss., Ecole des Mines, Paris.
NYSE (1963) “The Stock Market Under Stress: The Events of May 28, 29, and 31, 1962: A Research Report by the New York Stock Exchange” (New York: New York Stock Exchange).
O’Hara, Maureen (1995) Market Microstructure Theory (Oxford: Blackwell).
Paul, Jonathan M (1993) “Crowding Out and the Informativeness of Securities Prices,” Journal of Finance
48(4): 1475–96.
Pickering, Andrew (1995) The Mangle of Practice: Time, Agency, and Science (Chicago: University of
Chicago Press).
Pinch, Trevor (2003) “Giving Birth to New Users: How the Minimoog Was Sold to Rock and Roll,”
in N Oudshoorn & T Pinch (eds), How Users Matter: The Co-construction of Users and Technologies
(Cambridge, MA: MIT Press): 247–70.
Porter, Theodore M (1995) Trust in Numbers: The Pursuit of Objectivity in Science and Public Life
(Princeton, NJ: Princeton University Press).
Power, Michael (2004) “Enterprise Risk Management and the Organization of Uncertainty in Financial
Institutions,” in K Knorr Cetina & A Preda (eds), The Sociology of Financial Markets (Oxford: Oxford
University Press): 250–68.
Preda, Alex (2001) “Sense and Sensibility: Or, How Should Social Studies of Finance Be(have)? A Manifesto,” Economic Sociology: European Electronic Newsletter 2(2): 15–18.
Preda, Alex (2003) “Les hommes de la Bourse et leurs instruments merveilleux: Technologies de
transmission des cours et origines de l’organisation des marchés modernes,” Réseaux 21(122): 137–66.
Preda, Alex (2004a) “Informative Prices, Rational Investors: The Emergence of the Random Walk
Hypothesis and the Nineteenth-Century ‘Science of Financial Investments,’” History of Political Economy
Sullivan, Edward J & Timothy M Weithers (1991) “Louis Bachelier: The Father of Modern Option
Pricing Theory,” Journal of Economic Education 22(2): 165–71.
Vollmer, Hendrik (2003) “Bookkeeping, Accounting, Calculative Practice: The Sociological Suspense of
Calculation,” Critical Perspectives on Accounting 3: 353–81.
White, Harrison (2002) Markets from Networks: Socioeconomic Models of Production (Princeton, NJ:
Princeton University Press).
Woelfel, Charles (1994) Encyclopedia of Banking and Finance (Chicago: Irwin).
Zaloom, Caitlin (2003) “Ambiguous Numbers: Trading Technologies and Interpretation in Financial
Markets,” American Ethnologist 30(2): 258–72.
Zaloom, Caitlin (2004) “The Productive Life of Risk,” Cultural Anthropology 19(3): 365–91.
KNOWING NATURE
Steven Yearley
In the decade since the first STS handbook to include a chapter on the environment (Yearley, 1995), the significance of environmental topics to the science and technology studies community has grown with startling rapidity. In part this is because there has been an increasing number of detailed studies on topics such as environmental controversies (Carolan & Bell, 2004; Krimsky, 2000), the relationship between research and environmental policy (Bocking, 2004; Sundqvist et al., 2002), environmental modeling (Shackley, 1997a; Sismondo, 1999), ecosystem management practices (Helford, 1999), citizen participation in environmental understanding and decision-making (Bush et al., 2001; Petts, 2001; Yearley et al., 2001), the shaping of environmental research (Jamison, 2001; Zehr, 2004), and the development of innovative institutions for the production of certified environmental knowledge (most famously the Intergovernmental Panel on Climate Change, discussed below). STS authors have also contributed to theoretical and conceptual analyses of environmental themes and of ideas about environmentalism (e.g., Latour’s [2004] on political ecology and Yearley’s [1996, 2005a: 41–53] on the globalization of environmentalism). These two considerations alone would merit a fresh discussion, but such discussion is now pressingly needed for two additional reasons.
First, it has become clear that the earlier framing of this issue as “STS studies of the environment and environmental science” is too narrow. It is now evident that the environment is critical to STS, not just as one more site to study but because studying it affords key insight into the status of “the natural” in advanced modernity. At the simplest level, scientific knowledge is indispensable to contemporary environmental policies because science offers to tell us how nature is. Plants and animals, let alone the climate, cannot speak for themselves; ecologists, oceanographers, and meteorologists have become their proxies. This idea is institutionalized in such things as “environmental impact assessments,” in which professional advisers are employed to figure out the impact of a new development (like a freeway or harbor) on the surrounding environment. But such practices inevitably construct “nature” as a baseline condition at the same time as they disclose the presumed impacts of the new development. Even on a small scale, such construction is far from straightforward. At the planetary level—in a dynamic ecosystem where even the heat radiated from the sun is believed to vary and where the climate has undergone large fluctuations within recorded history—one cannot build the idea of humanly induced climate change without constructing what the “natural” climate would, counterfactually, have been. In a sense, the larger the environmental impact, the more counterfactual must the natural baseline be. Though this is not how he meant it, McKibben (1989; see also Yearley, 2005b) implicitly recognized this point in his celebrated announcement of the “end of nature.” For McKibben, humanly caused global climate change meant that one can no longer find a purely natural environment anywhere on Earth. For the STS scholar, a question of at least equal interest is how the natural is constructed in the very course of advancing such claims.1
Commonsensically, for most environmental issues the “natural” condition is fit, healthy, and desirable. Evolution by natural selection ensures that nature is finely tuned. But this comforting observation rapidly runs into problems. For one thing, the contemporary countryside—perhaps most acutely in Europe—is more or less wholly unnatural. It is a managed landscape, run for the cultivation of plants and animals that otherwise would never have existed in such profusion. In Britain even the officially designated “Areas of Outstanding Natural Beauty” (AONBs) are, with unremarked irony, thoroughly unnatural. Worse still for the commonsense, benign view of nature, many things that are taken as “bads” and routinely combated are also natural: diseases, pests, and earthquakes. Nature is thus neither unproblematically good nor desirable. Accordingly, STS work around environmental topics has had the opportunity, perhaps not taken as forcefully as we might wish, to face up to “nature” and “the natural.” But the environment is not the sole arena for contests over nature; parallel disputes are under way in relation to the new biology and genetic engineering.
In a shorthand way, in the case of the environment the problem is that humans are conducting an unheeding experiment on external nature while, in the case of our species’ biological nature, humans are wrestling with how to regulate increasing control over our species being. Natural variation was typically assumed to govern human life and reproduction, but once such matters are understood as being under conscious human control, the notions of luck, fortune, and fairness that more or less worked for centuries can no longer function in the same ways. Accordingly, I need to devote a little time to both these realms of “nature.”
The second additional reason why a review is required is a substantive one. Though there has been STS investigation of a very wide range of issues within nature and the environment, it is clear that a great deal of recent work has clustered around three substantive topics: humanly induced climate change, genetically modified crops and foodstuffs, and genomics and human reproduction. All three focus attention on “the natural,” although only the first two would typically be classed as environmental issues. All three are important for any STS conception of nature and environment, however, not only because of the amount of attention paid to them in the field but additionally because they are at the frontier of STS engagement with policy, social theory, and social change.
It is also worth pointing out that nature and environment cannot be discussed exclusively in terms of STS publications. In part that is because some highly influential authors (McKibben, 1989; Fukuyama, 2002; Beck, 1992, 1995) come almost wholly from outside STS. But it is also because STS ideas influence the work of many authors who see themselves more as environmental sociologists (McCright & Dunlap, 2000, 2003), geographers (Castree & Braun, 2001; Demeritt, 2002), or policy analysts (Hajer, 1995). It is also because STS authors have drawn on work from other traditions, for example, the literature on globalization or anthropological work on kinship and natural relations (Strathern, 1992; J Edwards, 2000).
In brief, in this chapter I argue that the conceptual key to recent STS work on the environment is the matter of knowing nature. Science and technology are valuable for environmental management precisely because they offer authoritative, far-ranging, and powerful ways of comprehending the natural world. The distinctive contribution of STS research is to see that the very business of “knowing nature” shapes the knowledge that results; this decisively influences how effective or not such knowledge is in other public contexts.
GLOBAL WARMING AND HUMANLY INDUCED CLIMATE CHANGE
At first sight, the issue of climate change resembles numerous other environmental controversies that STS scholars have studied. A claim about a putative environmental problem is raised by scientists and taken up and amplified by the media and environmental groups; in time, a policy response follows. As is well known, meteorologists—already aware that the climate had undergone numerous dramatic fluctuations in the past—began in the second half of the twentieth century to offer ideas and advice about the possibility of climate changes affecting our civilization in the longer term (Boehmer-Christiansen, 1994a; P Edwards, 2001; Jäger & O’Riordan, 1996; Miller & Edwards, 2001; Kim, 2005). Though skeptics like to point out that initial warnings also included the possibility that we might be heading out of an interglacial warm period into the cold, as early as the 1950s there was a focus on atmospheric warming (P Edwards, 2000). As such climate research was refined, largely thanks to the growth in computer power in the 1970s and 1980s, the majority opinion endorsed the earlier suggestion that enhanced warming driven by the build-up of atmospheric carbon dioxide was the likely problem. Environmental groups are reported to have been initially wary of this claim (F Pearce, 1991: 284) since it seemed such a long shot and with such high stakes. With acid rain on the agenda and many governments active in denying scientific claims about this effect, it seemed hubristic to warn that emissions might be sending the whole climate out of control.
Worse still, at a time when environmentalists were looking for concrete successes, the issue seemed almost designed to provoke and sustain controversy. The records of past temperatures and particularly of past atmospheric compositions were often not good, and there was the danger that rising trends in urban air-temperature measurements were simply an artifact; cities had simply become warmer as they grew in size. The heat radiating from the sun is known to fluctuate, so there was no guarantee that any warming was a terrestrial phenomenon due to “pollution” or other human activities. Others doubted that additional carbon dioxide releases would lead to a build-up of the gas in the atmosphere, since the great majority of carbon is in soils, trees, and oceans, so sea creatures and plants might simply sequester more carbon. Even if the scientific community was correct about the build-up of carbon dioxide in the atmosphere, it was fiendishly difficult to work out what the implications would be.
Hart and Victor track the interaction between climate science and U.S. climate policy from the 1950s up to the mid-1970s, by which time greenhouse emissions had “been positioned as an issue of pollution” (1993: 668); the climate, “scientific leaders discovered, could be portrayed as a natural resource that needed to be defended from the onslaught of industrialism” (1993: 667). Subsequently, according to Bodansky (1994: 48), the topic’s rise to policy prominence was assisted by other considerations. There was, for example, the announcement of the discovery of the “ozone hole” in 1987; this lent credibility to the idea that the atmosphere was vulnerable to environmental degradation and that humans could unwittingly cause harm at a global level. Also important was the coincidence in 1988 between Senate hearings into the issue and a very hot and dry summer in the United States. In his election campaign, George Herbert Walker Bush even spoke of combating the greenhouse effect with the “White House effect,” denying that politicians were powerless to act in the face of this newly identified threat. Still, most politicians responded to the warnings in the 1980s with a call for more research. Although environmental campaigners countered that there was no need for more research before taking measures to increase energy efficiency and use more renewables, most spokespersons concurred with the view that further knowledge would be important, particularly if some warming had already been set in train by emissions to date. One significant outcome of this support for research was the setting up in 1988 of a new form of scientific organization, the Intergovernmental Panel on Climate Change (IPCC), under the aegis of the World Meteorological Organization and the United Nations Environment Program (Agrawala, 1998a,b). The aim of the IPCC was to collect together the leading figures in all aspects of climate change with a view to establishing in an authoritative way the nature and scale of the problem. This initiative was highly important and a novel phenomenon as far as the STS community was concerned: “While by no means the first to involve scientists in an advisory role at the international level, the IPCC process has been the most extensive and influential effort so far” (Boehmer-Christiansen, 1994b: 195; see also Shackley, 1997b; Miller, 2001b).
STS interest matched the wealth and diversity of issues available here. The first issue to attract attention was the novel conjunction between this form of scientific organization and its dependence on super-fast computing facilities required to do the climate modeling; this dependence ensured, for most of the period, that key work could only be done at a handful of centers worldwide. In a series of papers, Shackley and Wynne (1995, 1996) examined how modeled knowledge was produced, made credible, and rendered serviceable for the policy community (see also Shackley et al., 1998, 1999). Thus, writing with two Dutch colleagues (van der Sluijs et al., 1998), they investigated the strikingly consistent nature of estimates of climate sensitivity over a series of models and policy reviews. Their puzzle was that “[t]he estimated range of the climate sensitivity to CO2-doubling of 1.5°C–4.5°C has remained remarkably stable over two decades, despite the huge growth of climate science” (1998: 315). Their interpretation was that factors within the sociology of this community tended to make changes in the policy prescriptions much less likely than continuity. In any case, the estimate was broad enough to admit of numerous different interpretations with little friction among the scientific contributors, even if the estimate tacitly excluded more catastrophic scenarios. Sociological factors specific to this community seemed to influence the knowledge it produced. Lahsen carried out ethnographic work on the climate modeling community, examining how the models (known as GCMs, or general circulation models) gained credibility (2005b; see also Sundberg, 2005: 166–84). By their nature, such models cannot be tested against the future. Nor can they really be adequately tested against data about past climates, since they are constructed precisely in the light of information about the past (P Edwards, 2000: 232). Accordingly, the models are inevitably to some extent conjectural, and one form of test consists of running them against each other; Lahsen investigates the way the unreality and circularity of these procedures is managed by practicing modelers. Modeling remains very time-consuming and expensive: “Despite vast increases in computer power, full runs of today’s state-of-the-art GCMs still require hundreds of supercomputer hours, since modelers add complexity to the models even more rapidly than computers improve” (P Edwards, 2000: 232). Given that the climate science community is not homogeneous, Shackley (2001) argues for the existence of contrasting “epistemic lifestyles” within the modeling community. Some modelers are concerned with developing the most comprehensive model they can, arguing that this is a necessary route to meaningful climate prediction. Others are concerned to establish as quickly as possible models capable of addressing long-term trends so that projections can be made and fed into the policy process (see also Sundberg, 2005: 136–37). The latter group tends to be dominated by thermodynamicists, who argue that the climate system can be treated as a black box exchanging energy with the rest of the universe. Shackley goes on to point out that the existence of these differences interacts with the research-funding system (see P Edwards, 1996; see also Bloomfield, 1986). In the United States there are many centers with different disciplinary focuses—they give conflicting advice roughly along disciplinary lines. In the United Kingdom, where there is only one center, scientists are forced to be more cooperative and consensus-oriented.
Other STS work focused on the shaping of the negotiations within the IPCC. Given the huge scale of the IPCC and its novelty both as an institution and in terms of the phenomena it was trying to assess, a key issue was how it would reach judgments distilled from all the detail. One specific, if not typical, case was the question of the economic valuation of lives threatened by climatic changes. In terms of policy responses, there appear to be two broad possibilities: either we try to limit the build-up of greenhouse gases (by reducing emissions or boosting sequestration and so on), or we take steps to adapt to a changed climate by building better sea defenses, relocating housing, increasing provision for cooling buildings, and associated measures. To work out a reasonable balance somewhere between “all abatement” or “all adaptation,” one needs to know the relative pros and cons. Both strategies had costs and benefits, and economists working on the 1995 assessment argued that the various policy paths could not be evaluated without a worldwide analysis of these advantages and costs. After such an analysis had been completed, the equations could then be solved to get a mix of policies that provided the greatest net benefit at the lowest cost (Fankhauser, 1995). In short, they wanted to work out both the economic costs associated with greenhouse-gas abatements and those associated with people becoming victims of the adaptation route. Among other things, this entailed putting a price on the typical life income of people from the various countries, and it turned out, for example, that each South Asian (many of whom are likely to suffer from sea-level rises) was calculated to “cost” their country much less than each Westerner whose income might be lost. The economists argued that they were not evaluating the worth of people’s lives, only putting a price on the forgone earnings of typical individuals, but the procedure appeared to value the life of a South Asian at about one fifteenth the worth of a Northern citizen. The valuations were critical, since the relative cheapness of South Asians meant that the “rational” global policy orientation was for relatively little abatement (since abatement was costly, as it tended to impact high-earning Northerners) and a good deal of adaptation (mostly in the developing world); the adaptation appeared relatively inexpensive because it tended to impact people with low incomes. This line of reasoning, though retained in chapter 6 of volume III of the 1995 Assessment Report (D Pearce et al., 1996), was widely criticized among NGOs (notably the Global Commons Institute, which was founded precisely around this issue). In the end, the economistic argument was largely disavowed in the summary for policy makers with which the volume began. In the section on the social costs of humanly caused climate change, the summary asserted that:
The literature on the subject of this section is controversial. There is no consensus about how to value statistical lives or how to aggregate statistical lives across countries. Monetary valuation should not obscure the human consequences of anthropogenic climate change damages, because the value of life has meaning beyond monetary considerations. (Bruce et al., 1996: 9–10)
While this revealed deep philosophical divisions over the very conceptualization of the scientific climate change issues (O’Riordan & Jordan, 1999), this kind of approach was less in evidence in later assessments. This prompted economics-enthusiast Lomborg (2001: 301) to lament, “it is regrettable that [such economic issues are] not rationally assessed in the latest [i.e., subsequent] report.”
Though this point is made particularly prominent by the use of an example from economics, it highlights a more general issue. The IPCC has to arrive at summary judgments, and these judgments (again as van der Sluijs et al.’s 1998 study indicates; see also van der Sluijs, 1997) are not narrowly determined by the vast array of scientific results in the reports. It is clear that sociological and social psychological considerations enter into the formulation of these judgments (for the related case of the UNFCCC [see below], see Miller, 2001a). Moreover, the IPCC reports are characterized by a further level of judgment, since each report volume is introduced with a summary for policy-makers (e.g., Bruce et al., 1996) that has to be approved in detail by the countries’ representatives; it is “thus an intergovernmentally negotiated text,” as the Preface makes clear (1996: x, emphasis added).
A third leading interest of STS scholars has been the relationship between the IPCC—indeed, the whole climate-change regulation community—and its critics (Lahsen, 2005a). Critics were quick to point to the supposed vested interests of this community. Its access to money depends on the severity of the potential harms that it warns about; hence—or so it was argued—it inevitably has a structural temptation to exaggerate harms. This highlights one of the outstanding features of the IPCC: though there have been other mass scientific projects (including the Human Genome project [see chapter 32 in this Handbook]), the IPCC is unusual in that the science with which it had to deal was more controversial and more complex than the obvious comparators (Nolin, 1999). Admittedly, the human genome was enormously complicated, and there were sharply diverging views on how the sequencing should be done, but there was a high level of agreement within the profession about what the answer should look like and no organized lobby denying its basic premises. By contrast, the IPCC was trying to offer policy-relevant analyses that many other policy advisers, including some respected scientists, were explicitly trying to junk. As it was working in such a multidisciplinary area, the IPCC attempted to extend its network widely enough to include all the relevant scientific authorities. But this meant that the IPCC ran into problems with peer reviewing and perceived impartiality; there were virtually no “peers” who were not already within the IPCC (for an analysis of the accusations that could be leveled, see P Edwards & Schneider, 2001). In line with the classic script of “science for policy,” the IPCC legitimated itself in terms of the scientific objectivity and impartiality of its members. But critics were able to point out that the scientific careers of the whole climate change “orthodoxy” depended on the correctness of the underlying assumptions. Worse, the IPCC itself selected who was in the club of the qualified experts and thus threatened to be a self-perpetuating community with a vested interest in continuing to find evidence for the importance of the phenomenon to which its members’ careers were shackled (see Boehmer-Christiansen, 1994b: 198). When just one chapter in the 2001 Third Assessment Report has ten lead authors and over 140 contributing authors,2 then it is clear that this departs from the standard notion of scientific knowledge production. This was of course on top of all the well-recognized problems of science for policy, which Weinberg (1972: 209) had referred to as “trans-science” and which Collingridge and Reeve (1986) came to describe in their “over-critical” model of science advising (on this issue in relation to climate change, see Yearley, 2005c: 160–73). And it was on top of the peculiar difficulty of trying to model future climates in a system of unknown (though enormous) complexity.
The range of critics has been enormous. At one end there have been scholars and moderate critics who have concerns that the IPCC procedure tends to marginalize dissenting voices and that particular policy proposals (such as the Kyoto Protocol) are maybe not so wise or so cost-effective as proponents suggest (e.g., Boehmer-Christiansen, 2003; Boehmer-Christiansen & Kellow, 2002). There are also many consultants backed by the fossil-fuel industry who are employed to throw doubt on claims about climate change (see Freudenburg, 2000, for a discussion of the social construction of “non-problems”); these claims-makers have entered into alliance with right-leaning politicians and commentators to combat particular regulatory moves (McCright & Dunlap, 2000, 2003). Informal networks, often Web-based, have been set up to allow “climate-change skeptics” to exchange information, and they have welcomed all manner of contributors, whether direct enemies of the Kyoto Protocol or more distant allies such as opponents of wind farms (Haggett & Toke, 2006) or anti-nuclear conspiracy theorists. Gifted cultural players including Rush Limbaugh and Michael Crichton have waded into this controversy, with Crichton’s 2004 novel State of Fear having a “technical appendix” on the errors in climate science. At the same time, mainstream environmental NGOs have tended to argue simply that one should take the scientists’ word for the reality of climate change, a strategy about which they have clearly been less enthusiastic in other cases (Yearley, 1993: 68–69, 1992).
There is a second major way in which the IPCC was distinctive: its commitment to include economic, social scientific, and policy aspects of the issues. Correspondingly, other STS work has focused on the role of the social sciences in analyzing climate change and, to some extent, on the IPCC’s own social science. Though, according to the self-understanding of the IPCC, these disciplines could not have the precision and exactitude to which the physical sciences aspired, it was clear that global climate change could not be studied in the absence of societal analyses for two reasons. On the one hand, the things that worry us about climate change are chiefly the implications for people, commerce, cities, and to some extent wildlife. The actual impacts that will arise clearly depend on how people respond. Without expert advice on these policy matters, there could be no sensible modeling of the “output” side of the climatologists’ work. On the other hand, possible policy responses to climate change (if it is happening) again depend on people’s willingness to accept the policy prescriptions—to forgo air travel or to put up with climate risks and so on. The IPCC handled this issue by dividing its procedures into three parallel tracks dealing with the physical sciences, the socioeconomic impacts, and possible policy responses. In a four-volume work, edited by Rayner and Malone (1998), STS and social science scholars were invited to turn the question around and to focus, so to speak, on the climate impacts of global human change. This innovative enterprise was clearly aimed to mirror the IPCC’s work and to highlight the disciplinary orientations overlooked by the IPCC. Alterations in greenhouse gas concentrations are largely due to emissions from people and from their activities, and thus the rate of such atmospheric change depends on the speed and nature of economic growth, the size of future populations, the technologies chosen by people, the cultures of consumption and leisure they develop, and so on. The institutional assumption of the IPCC is that the only relevant social science is economics; many of the contributors to Rayner and Malone’s volumes focus on the role of culture, often from the standpoint of Mary Douglas’s cultural theory (Douglas et al., 1998).
Social science engagement with climate issues has also taken the form of studies of public participation in policy responses to global warming. If scientific understanding about environmental issues is uncertain, as it admittedly is with significant aspects of climate change, then—so the argument goes—policy decisions cannot simply be led by expert advice. Rather, decisions will inevitably be matters of political judgment, and in democratic societies such decisions should be democratic and transparent. In the set of studies summarized in Kasemir et al. (2003), participatory techniques are proposed as one powerful means for democratizing the handling of such topics. This work was primarily based on a large-scale European project known as Ulysses (for Urban lifestyles, sustainability, and integrated environmental assessment). This project was based on seven European cities (Athens, Barcelona, Frankfurt, Manchester, Stockholm, Venice, and Zurich), and much of its innovative character derived from its use of extensive focus-group type workshops to get citizens to reflect on the ways in which urban lifestyles could change to address climate change and sustainable living. These group meetings commonly acquainted participants with computer-based models of such issues as greenhouse gas emissions so that citizens could use the models to investigate the likely consequences of their proposed lifestyle changes (Guimarães Pereira et al., 1999). The chief drawback of this study was that the models participants employed had been devised for the purposes of the research and were not used by local governments or environmental authorities, so that the study’s practical payoff was necessarily limited (contrast the modeling study reported in Yearley et al., 2003; see also Yearley, 1999, 2006).
Finally, STS scholars have been interested in the scientific community’s—and specifically the IPCC’s—role in the wider policy process (see also Skodvin, 2000; Demeritt, 2001). In the late 1980s and early 1990s it seemed that getting the science right would be enormously important to the policy process, as was commonly thought to have happened in the ozone case (Benedick, 1991; Christie, 2000; Grundmann, 1998, 2006). STS attention typically focused on how the IPCC and others generated this knowledge. But from the outset, climate scientists advised that states needed to act rapidly if greenhouse gas concentrations were to be regulated; pressure grew for the introduction of some form of international treaty, and in 1990 the United Nations took the initiative in setting up an intergovernmental negotiating committee (INC) for a Framework Convention on Climate Change (FCCC) (Bodansky, 1994: 60).3 The FCCC, set up in 1992, eventually gave rise to the Kyoto Protocol of 1997, which set out a process for introducing a binding treaty committing participating nations to greenhouse gas emission targets. The irony of this development was that the negotiating apparatus was in place even before the IPCC had finished its second assessment report,