Structure and Interpretation
of Computer Programs
Harold Abelson and Gerald Jay Sussman with Julie Sussman, foreword by Alan J. Perlis
Unofficial Texinfo Format 2.andresraba5.3
second edition
©1996 by The Massachusetts Institute of Technology
Structure and Interpretation of Computer Programs, second edition
Harold Abelson and Gerald Jay Sussman
with Julie Sussman, foreword by Alan J. Perlis
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License (CC BY-SA 3.0). Based on a work at mitpress.mit.edu.
The MIT Press
Cambridge, Massachusetts
London, England
McGraw-Hill Book Company
New York, St. Louis, San Francisco,
Montreal, Toronto
Unofficial Texinfo Format 2.andresraba5.3 (April 6, 2014), based on 2.neilvandyke4 (January 10, 2007).
Contents

1 Building Abstractions with Procedures
1.1 The Elements of Programming
1.1.1 Expressions
1.1.2 Naming and the Environment
1.1.3 Evaluating Combinations
1.1.4 Compound Procedures
1.1.5 The Substitution Model for Procedure Application
1.1.6 Conditional Expressions and Predicates
1.1.7 Example: Square Roots by Newton's Method
1.1.8 Procedures as Black-Box Abstractions
1.2 Procedures and the Processes They Generate
1.2.1 Linear Recursion and Iteration
1.2.2 Tree Recursion
1.2.3 Orders of Growth
1.2.4 Exponentiation
1.2.5 Greatest Common Divisors
1.2.6 Example: Testing for Primality
1.3 Formulating Abstractions with Higher-Order Procedures
1.3.1 Procedures as Arguments
1.3.2 Constructing Procedures Using Lambda
1.3.3 Procedures as General Methods
1.3.4 Procedures as Returned Values
2 Building Abstractions with Data
2.1 Introduction to Data Abstraction
2.1.1 Example: Arithmetic Operations for Rational Numbers
2.1.2 Abstraction Barriers
2.1.3 What Is Meant by Data?
2.1.4 Extended Exercise: Interval Arithmetic
2.2 Hierarchical Data and the Closure Property
2.2.1 Representing Sequences
2.2.2 Hierarchical Structures
2.2.3 Sequences as Conventional Interfaces
2.2.4 Example: A Picture Language
2.3 Symbolic Data
2.3.1 Quotation
2.3.2 Example: Symbolic Differentiation
2.3.3 Example: Representing Sets
2.3.4 Example: Huffman Encoding Trees
2.4 Multiple Representations for Abstract Data
2.4.1 Representations for Complex Numbers
2.4.2 Tagged data
2.4.3 Data-Directed Programming and Additivity
2.5 Systems with Generic Operations
2.5.1 Generic Arithmetic Operations
2.5.2 Combining Data of Different Types
2.5.3 Example: Symbolic Algebra
3 Modularity, Objects, and State
3.1 Assignment and Local State
3.1.1 Local State Variables
3.1.2 The Benefits of Introducing Assignment
3.1.3 The Costs of Introducing Assignment
3.2 The Environment Model of Evaluation
3.2.1 The Rules for Evaluation
3.2.2 Applying Simple Procedures
3.2.3 Frames as the Repository of Local State
3.2.4 Internal Definitions
3.3 Modeling with Mutable Data
3.3.1 Mutable List Structure
3.3.2 Representing Queues
3.3.3 Representing Tables
3.3.4 A Simulator for Digital Circuits
3.3.5 Propagation of Constraints
3.4 Concurrency: Time Is of the Essence
3.4.1 The Nature of Time in Concurrent Systems
3.4.2 Mechanisms for Controlling Concurrency
3.5 Streams
3.5.1 Streams Are Delayed Lists
3.5.2 Infinite Streams
3.5.3 Exploiting the Stream Paradigm
3.5.4 Streams and Delayed Evaluation
3.5.5 Modularity of Functional Programs and Modularity of Objects
4 Metalinguistic Abstraction
4.1 The Metacircular Evaluator
4.1.1 The Core of the Evaluator
4.1.2 Representing Expressions
4.1.3 Evaluator Data Structures
4.1.4 Running the Evaluator as a Program
4.1.5 Data as Programs
4.1.6 Internal Definitions
4.1.7 Separating Syntactic Analysis from Execution
4.2 Variations on a Scheme — Lazy Evaluation
4.2.1 Normal Order and Applicative Order
4.2.2 An Interpreter with Lazy Evaluation
4.2.3 Streams as Lazy Lists
4.3 Variations on a Scheme — Nondeterministic Computing
4.3.1 Amb and Search
4.3.2 Examples of Nondeterministic Programs
4.3.3 Implementing the Amb Evaluator
4.4 Logic Programming
4.4.1 Deductive Information Retrieval
4.4.2 How the Query System Works
4.4.3 Is Logic Programming Mathematical Logic?
4.4.4 Implementing the Query System
4.4.4.1 The Driver Loop and Instantiation
4.4.4.2 The Evaluator
4.4.4.3 Finding Assertions by Pattern Matching
4.4.4.4 Rules and Unification
4.4.4.5 Maintaining the Data Base
4.4.4.6 Stream Operations
4.4.4.7 Query Syntax Procedures
4.4.4.8 Frames and Bindings
5 Computing with Register Machines
5.1 Designing Register Machines
5.1.1 A Language for Describing Register Machines
5.1.2 Abstraction in Machine Design
5.1.3 Subroutines
5.1.4 Using a Stack to Implement Recursion
5.1.5 Instruction Summary
5.2 A Register-Machine Simulator
5.2.1 The Machine Model
5.2.2 The Assembler
5.2.3 Generating Execution Procedures for Instructions
5.2.4 Monitoring Machine Performance
5.3 Storage Allocation and Garbage Collection
5.3.1 Memory as Vectors
5.3.2 Maintaining the Illusion of Infinite Memory
5.4 The Explicit-Control Evaluator
5.4.1 The Core of the Explicit-Control Evaluator
5.4.2 Sequence Evaluation and Tail Recursion
5.4.3 Conditionals, Assignments, and Definitions
5.4.4 Running the Evaluator
5.5 Compilation
5.5.1 Structure of the Compiler
5.5.2 Compiling Expressions
5.5.3 Compiling Combinations
5.5.4 Combining Instruction Sequences
5.5.5 An Example of Compiled Code
5.5.6 Lexical Addressing
5.5.7 Interfacing Compiled Code to the Evaluator
Unofficial Texinfo Format

This is the second edition SICP book, from Unofficial Texinfo Format. You are probably reading it in an Info hypertext browser, such as the Info mode of Emacs. You might alternatively be reading it TeX-formatted on your screen or printer, though that would be silly. And, if printed, expensive.

The freely-distributed official HTML-and-GIF format was first converted personally to Unofficial Texinfo Format (UTF) version 1 by Lytha Ayth during a long Emacs lovefest weekend in April, 2001.

The UTF is easier to search than the HTML format. It is also much more accessible to people running on modest computers, such as donated '386-based PCs. A 386 can, in theory, run Linux, Emacs, and a Scheme interpreter simultaneously, but most 386s probably can't also run both Netscape and the necessary X Window System without prematurely introducing budding young underfunded hackers to the concept of thrashing. UTF can also fit uncompressed on a 1.44MB floppy diskette, which may come in handy for installing UTF on PCs that do not have Internet or LAN access.
The Texinfo conversion has been a straight transliteration, to the extent possible. Like the TeX-to-HTML conversion, this was not without some introduction of breakage. In the case of Unofficial Texinfo Format, figures have suffered an amateurish resurrection of the lost art of ASCII. Also, it's quite possible that some errors of ambiguity were introduced during the conversion of some of the copious superscripts ('^') and subscripts ('_'). Divining which has been left as an exercise to the reader. But at least we don't put our brave astronauts at risk by encoding the greater-than-or-equal symbol as <u>></u>.
If you modify sicp.texi to correct errors or improve the ASCII art, then update the "@set utfversion" line to reflect your delta. For example, if you started with Lytha's version 1, and your name is Bob, then you could name your successive versions 1.bob1, 1.bob2, … 1.bobn. Also update utfversiondate. If you want to distribute your version on the Web, then embedding the string "sicp.texi" somewhere in the file or Web page will make it easier for people to find with Web search engines.
It is believed that the Unofficial Texinfo Format is in keeping with the spirit of the graciously freely-distributed HTML version. But you never know when someone's armada of lawyers might need something to do, and get their shorts all in a knot over some benign little thing, so think twice before you use your full name or distribute Info, HTML, PostScript, or PDF formats that might embed your account or machine name.

Peath, Lytha Ayth
Addendum: See also the video lectures by Abelson and Sussman, at MIT OpenCourseWare or Google Video.
Second Addendum: Above is the original introduction to the UTF from 2001. Ten years later, UTF has been transformed: mathematical symbols and formulas are properly typeset, and figures drawn in vector graphics. The original text formulas and ASCII art figures are still there in the Texinfo source, but will display only when compiled to Info output. At the dawn of e-book readers and tablets, reading a PDF on screen is officially not silly anymore. Enjoy!

A.R, May, 2011
—Alan J. Perlis (April 1, 1922 – February 7, 1990)
Foreword

Educators, generals, dieticians, psychologists, and parents program. Armies, students, and some societies are programmed. An assault on large problems employs a succession of programs, most of which spring into existence en route. These programs are rife with issues that appear to be particular to the problem at hand. To appreciate programming as an intellectual activity in its own right you must turn to computer programming; you must read and write computer programs—many of them. It doesn't matter much what the programs are about or what applications they serve. What does matter is how well they perform and how smoothly they fit with other programs in the creation of still greater programs. The programmer must seek both perfection of part and adequacy of collection. In this book the use of "program" is focused on the creation, execution, and study of programs written in a dialect of Lisp for execution on a digital computer. Using Lisp we restrict or limit not what we may program, but only the notation for our program descriptions.
Our traffic with the subject matter of this book involves us with three foci of phenomena: the human mind, collections of computer programs, and the computer. Every computer program is a model, hatched in the mind, of a real or mental process. These processes, arising from human experience and thought, are huge in number, intricate in detail, and at any time only partially understood. They are modeled to our permanent satisfaction rarely by our computer programs. Thus even though our programs are carefully handcrafted discrete collections of symbols, mosaics of interlocking functions, they continually evolve: we change them as our perception of the model deepens, enlarges, generalizes until the model ultimately attains a metastable place within still another model with which we struggle. The source of the exhilaration associated with computer programming is the continual unfolding within the mind and on the computer of mechanisms expressed as programs and the explosion of perception they generate. If art interprets our dreams, the computer executes them in the guise of programs!

For all its power, the computer is a harsh taskmaster. Its programs must be correct, and what we wish to say must be said accurately in every detail. As in every other symbolic activity, we become convinced of program truth through argument. Lisp itself can be assigned a semantics (another model, by the way), and if a program's function can be specified, say, in the predicate calculus, the proof methods of logic can be used to make an acceptable correctness argument. Unfortunately, as programs get large and complicated, as they almost always do, the adequacy, consistency, and correctness of the specifications themselves become open to doubt, so that complete formal arguments of correctness seldom accompany large programs. Since large programs grow from small ones, it is crucial that we develop an arsenal of standard program structures of whose correctness we have become sure—we call them idioms—and learn to combine them into larger structures using organizational techniques of proven value. These techniques are treated at length in this book, and understanding them is essential to participation in the Promethean enterprise called programming. More than anything else, the uncovering and mastery of powerful organizational techniques accelerates our ability to create large, significant programs. Conversely, since writing large programs is very taxing, we are stimulated to invent new methods of reducing the mass of function and detail to be fitted into large programs.
Unlike programs, computers must obey the laws of physics. If they wish to perform rapidly—a few nanoseconds per state change—they must transmit electrons only small distances (at most 1½ feet). The heat generated by the huge number of devices so concentrated in space has to be removed. An exquisite engineering art has been developed balancing between multiplicity of function and density of devices. In any event, hardware always operates at a level more primitive than that at which we care to program. The processes that transform our Lisp programs to "machine" programs are themselves abstract models which we program. Their study and creation give a great deal of insight into the organizational programs associated with programming arbitrary models. Of course the computer itself can be so modeled. Think of it: the behavior of the smallest physical switching element is modeled by quantum mechanics described by differential equations whose detailed behavior is captured by numerical approximations represented in computer programs executing on computers composed of …!

It is not merely a matter of tactical convenience to separately identify the three foci. Even though, as they say, it's all in the head, this logical separation induces an acceleration of symbolic traffic between these foci whose richness, vitality, and potential is exceeded in human experience only by the evolution of life itself. At best, relationships between the foci are metastable. The computers are never large enough or fast enough. Each breakthrough in hardware technology leads to more massive programming enterprises, new organizational principles, and an enrichment of abstract models. Every reader should ask himself periodically "Toward what end, toward what end?"—but do not ask it too often lest you pass up the fun of programming for the constipation of bittersweet philosophy.

Among the programs we write, some (but never enough) perform a precise mathematical function such as sorting or finding the maximum of a sequence of numbers, determining primality, or finding the square root. We call such programs algorithms, and a great deal is known of their optimal behavior, particularly with respect to the two important parameters of execution time and data storage requirements. A programmer should acquire good algorithms and idioms. Even though some programs resist precise specifications, it is the responsibility of the programmer to estimate, and always to attempt to improve, their performance.
Lisp is a survivor, having been in use for about a quarter of a century. Among the active programming languages only Fortran has had a longer life. Both languages have supported the programming needs of important areas of application, Fortran for scientific and engineering computation and Lisp for artificial intelligence. These two areas continue to be important, and their programmers are so devoted to these two languages that Lisp and Fortran may well continue in active use for at least another quarter-century.

Lisp changes. The Scheme dialect used in this text has evolved from the original Lisp and differs from the latter in several important ways, including static scoping for variable binding and permitting functions to yield functions as values. In its semantic structure Scheme is as closely akin to Algol 60 as to early Lisps. Algol 60, never to be an active language again, lives on in the genes of Scheme and Pascal. It would be difficult to find two languages that are the communicating coin of two more different cultures than those gathered around these two languages. Pascal is for building pyramids—imposing, breathtaking, static structures built by armies pushing heavy blocks into place. Lisp is for building organisms—imposing, breathtaking, dynamic structures built by squads fitting fluctuating myriads of simpler organisms into place. The organizing principles used are the same in both cases, except for one extraordinarily important difference: The discretionary exportable functionality entrusted to the individual Lisp programmer is more than an order of magnitude greater than that to be found within Pascal enterprises. Lisp programs inflate libraries with functions whose utility transcends the application that produced them. The list, Lisp's native data structure, is largely responsible for such growth of utility. The simple structure and natural applicability of lists are reflected in functions that are amazingly nonidiosyncratic. In Pascal the plethora of declarable data structures induces a specialization within functions that inhibits and penalizes casual cooperation. It is better to have 100 functions operate on one data structure than to have 10 functions operate on 10 data structures. As a result the pyramid must stand unchanged for a millennium; the organism must evolve or perish.

To illustrate this difference, compare the treatment of material and exercises within this book with that in any first-course text using Pascal. Do not labor under the illusion that this is a text digestible at MIT only, peculiar to the breed found there. It is precisely what a serious book on programming Lisp must be, no matter who the student is or where it is used.

Note that this is a text about programming, unlike most Lisp books, which are used as a preparation for work in artificial intelligence. After all, the critical programming concerns of software engineering and artificial intelligence tend to coalesce as the systems under investigation become larger. This explains why there is such growing interest in Lisp outside of artificial intelligence.
As one would expect from its goals, artificial intelligence research generates many significant programming problems. In other programming cultures this spate of problems spawns new languages. Indeed, in any very large programming task a useful organizing principle is to control and isolate traffic within the task modules via the invention of language. These languages tend to become less primitive as one approaches the boundaries of the system where we humans interact most often. As a result, such systems contain complex language-processing functions replicated many times. Lisp has such a simple syntax and semantics that parsing can be treated as an elementary task. Thus parsing technology plays almost no role in Lisp programs, and the construction of language processors is rarely an impediment to the rate of growth and change of large Lisp systems. Finally, it is this very simplicity of syntax and semantics that is responsible for the burden and freedom borne by all Lisp programmers. No Lisp program of any size beyond a few lines can be written without being saturated with discretionary functions. Invent and fit; have fits and reinvent! We toast the Lisp programmer who pens his thoughts within nests of parentheses.

Alan J. Perlis
New Haven, Connecticut
Preface to the Second Edition

Is it possible that software is not like anything else, that it is meant to be discarded: that the whole point is to always see it as a soap bubble?

—Alan J. Perlis
The material in this book has been the basis of MIT's entry-level computer science subject since 1980. We had been teaching this material for four years when the first edition was published, and twelve more years have elapsed until the appearance of this second edition. We are pleased that our work has been widely adopted and incorporated into other texts. We have seen our students take the ideas and programs in this book and build them in as the core of new computer systems and languages. In literal realization of an ancient Talmudic pun, our students have become our builders. We are lucky to have such capable students and such accomplished builders.

In preparing this edition, we have incorporated hundreds of clarifications suggested by our own teaching experience and the comments of colleagues at MIT and elsewhere. We have redesigned most of the major programming systems in the book, including the generic-arithmetic system, the interpreters, the register-machine simulator, and the compiler; and we have rewritten all the program examples to ensure that any Scheme implementation conforming to the IEEE Scheme standard (IEEE 1990) will be able to run the code.
This edition emphasizes several new themes. The most important of these is the central role played by different approaches to dealing with time in computational models: objects with state, concurrent programming, functional programming, lazy evaluation, and nondeterministic programming. We have included new sections on concurrency and nondeterminism, and we have tried to integrate this theme throughout the book.

The first edition of the book closely followed the syllabus of our MIT one-semester subject. With all the new material in the second edition, it will not be possible to cover everything in a single semester, so the instructor will have to pick and choose. In our own teaching, we sometimes skip the section on logic programming (Section 4.4), we have students use the register-machine simulator but we do not cover its implementation (Section 5.2), and we give only a cursory overview of the compiler (Section 5.5). Even so, this is still an intense course. Some instructors may wish to cover only the first three or four chapters, leaving the other material for subsequent courses.

The World-Wide-Web site http://mitpress.mit.edu/sicp provides support for users of this book. This includes programs from the book, sample programming assignments, supplementary materials, and downloadable implementations of the Scheme dialect of Lisp.

Preface to the First Edition

A computer is like a violin. You can imagine a novice trying first a phonograph and then a violin. The latter, he says, sounds terrible. That is the argument we have heard from our humanists and most of our computer scientists. Computer programs are good, they say, for particular purposes, but they aren't flexible. Neither is a violin, or a typewriter, until you learn how to use it.

—Marvin Minsky, "Why Programming Is a Good Medium for Expressing Poorly-Understood and Sloppily-Formulated Ideas"
"The Structure and Interpretation of Computer Programs" is the entry-level subject in computer science at the Massachusetts Institute of Technology. It is required of all students at MIT who major in electrical engineering or in computer science, as one-fourth of the "common core curriculum," which also includes two subjects on circuits and linear systems and a subject on the design of digital systems. We have been involved in the development of this subject since 1978, and we have taught this material in its present form since the fall of 1980 to between 600 and 700 students each year. Most of these students have had little or no prior formal training in computation, although many have played with computers a bit and a few have had extensive programming or hardware-design experience.
Our design of this introductory computer-science subject reflects two major concerns. First, we want to establish the idea that a computer language is not just a way of getting a computer to perform operations but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only incidentally for machines to execute. Second, we believe that the essential material to be addressed by a subject at this level is not the syntax of particular programming-language constructs, nor clever algorithms for computing particular functions efficiently, nor even the mathematical analysis of algorithms and the foundations of computing, but rather the techniques used to control the intellectual complexity of large software systems.

Our goal is that students who complete this subject should have a good feel for the elements of style and the aesthetics of programming. They should have command of the major techniques for controlling complexity in a large system. They should be capable of reading a 50-page-long program, if it is written in an exemplary style. They should know what not to read, and what they need not understand at any moment. They should feel secure about modifying a program, retaining the spirit and style of the original author.
These skills are by no means unique to computer programming. The techniques we teach and draw upon are common to all of engineering design. We control complexity by building abstractions that hide details when appropriate. We control complexity by establishing conventional interfaces that enable us to construct systems by combining standard, well-understood pieces in a "mix and match" way. We control complexity by establishing new languages for describing a design, each of which emphasizes particular aspects of the design and deemphasizes others.

Underlying our approach to this subject is our conviction that "computer science" is not a science and that its significance has little to do with computers. The computer revolution is a revolution in the way we think and in the way we express what we think. The essence of this change is the emergence of what might best be called procedural epistemology—the study of the structure of knowledge from an imperative point of view, as opposed to the more declarative point of view taken by classical mathematical subjects. Mathematics provides a framework for dealing precisely with notions of "what is." Computation provides a framework for dealing precisely with notions of "how to."
In teaching our material we use a dialect of the programming language Lisp. We never formally teach the language, because we don't have to. We just use it, and students pick it up in a few days. This is one great advantage of Lisp-like languages: They have very few ways of forming compound expressions, and almost no syntactic structure. All of the formal properties can be covered in an hour, like the rules of chess. After a short time we forget about syntactic details of the language (because there are none) and get on with the real issues—figuring out what we want to compute, how we will decompose problems into manageable parts, and how we will work on the parts. Another advantage of Lisp is that it supports (but does not enforce) more of the large-scale strategies for modular decomposition of programs than any other language we know. We can make procedural and data abstractions, we can use higher-order functions to capture common patterns of usage, we can model local state using assignment and data mutation, we can link parts of a program with streams and delayed evaluation, and we can easily implement embedded languages. All of this is embedded in an interactive environment with excellent support for incremental program design, construction, testing, and debugging. We thank all the generations of Lisp wizards, starting with John McCarthy, who have fashioned a fine tool of unprecedented power and elegance.
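The claim that Lisp-like languages have very few ways of forming compound expressions is easy to credit from a small sample. The sketch below is our illustration, not from the book: a definition, a conditional, and an anonymous procedure all share the same parenthesized prefix shape.

```scheme
; A procedure definition: prefix notation, no operator precedence to learn.
(define (square x) (* x x))

; A conditional and a recursive call have the same parenthesized shape.
(define (factorial n)
  (if (= n 1)
      1
      (* n (factorial (- n 1)))))

; An anonymous procedure, built with lambda and applied immediately.
((lambda (x y) (+ (square x) (square y))) 3 4)   ; evaluates to 25
```

Every form above is either a procedure application or a special form written as a parenthesized list; this is essentially all the syntax a beginner meets in those first few days.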
Scheme, the dialect of Lisp that we use, is an attempt to bring together the power and elegance of Lisp and Algol. From Lisp we take the metalinguistic power that derives from the simple syntax, the uniform representation of programs as data objects, and the garbage-collected heap-allocated data. From Algol we take lexical scoping and block structure, which are gifts from the pioneers of programming-language design who were on the Algol committee. We wish to cite John Reynolds and Peter Landin for their insights into the relationship of Church's λ-calculus to the structure of programming languages. We also recognize our debt to the mathematicians who scouted out this territory decades before computers appeared on the scene. These pioneers include Alonzo Church, Barkley Rosser, Stephen Kleene, and Haskell Curry.
Acknowledgments

We would like to thank the many people who have helped us develop this book and this curriculum.

Our subject is a clear intellectual descendant of "6.231," a wonderful subject on programming linguistics and the λ-calculus taught at MIT in the late 1960s by Jack Wozencraft and Arthur Evans, Jr.

We owe a great debt to Robert Fano, who reorganized MIT's introductory curriculum in electrical engineering and computer science to emphasize the principles of engineering design. He led us in starting out on this enterprise and wrote the first set of subject notes from which this book evolved.

Much of the style and aesthetics of programming that we try to teach were developed in conjunction with Guy Lewis Steele Jr., who collaborated with Gerald Jay Sussman in the initial development of the Scheme language. In addition, David Turner, Peter Henderson, Dan Friedman, David Wise, and Will Clinger have taught us many of the techniques of the functional programming community that appear in this book.

Joel Moses taught us about structuring large systems. His experience with the Macsyma system for symbolic computation provided the insight that one should avoid complexities of control and concentrate on organizing the data to reflect the real structure of the world being modeled.
Marvin Minsky and Seymour Papert formed many of our attitudes about programming and its place in our intellectual lives. To them we owe the understanding that computation provides a means of expression for exploring ideas that would otherwise be too complex to deal with precisely. They emphasize that a student's ability to write and modify programs provides a powerful medium in which exploring becomes a natural activity.

We also strongly agree with Alan Perlis that programming is lots of fun and we had better be careful to support the joy of programming. Part of this joy derives from observing great masters at work. We are fortunate to have been apprentice programmers at the feet of Bill Gosper and Richard Greenblatt.

It is difficult to identify all the people who have contributed to the development of our curriculum. We thank all the lecturers, recitation instructors, and tutors who have worked with us over the past fifteen years and put in many extra hours on our subject, especially Bill Siebert, Albert Meyer, Joe Stoy, Randy Davis, Louis Braida, Eric Grimson, Rod Brooks, Lynn Stein and Peter Szolovits. We would like to specially acknowledge the outstanding teaching contributions of Franklyn Turbak, now at Wellesley; his work in undergraduate instruction set a standard that we can all aspire to. We are grateful to Jerry Saltzer and Jim Miller for helping us grapple with the mysteries of concurrency, and to Peter Szolovits and David McAllester for their contributions to the exposition of nondeterministic evaluation in Chapter 4.
Many people have put in significant effort presenting this material
at other universities Some of the people we have worked closely withare Jacob Katzenelson at the Technion, Hardy Mayer at the University
Trang 27of California at Irvine, Joe Stoy at Oxford, Elisha Sacks at Purdue, andJan Komorowski at the Norwegian University of Science and Technol-ogy We are exceptionally proud of our colleagues who have receivedmajor teaching awards for their adaptations of this subject at other uni-versities, including Kenneth Yip at Yale, Brian Harvey at the University
of California at Berkeley, and Dan Huenlocher at Cornell
Al Moyé arranged for us to teach this material to engineers at Hewlett-Packard, and for the production of videotapes of these lectures. We would like to thank the talented instructors—in particular Jim Miller, Bill Siebert, and Mike Eisenberg—who have designed continuing education courses incorporating these tapes and taught them at universities and industry all over the world.

Many educators in other countries have put in significant work translating the first edition. Michel Briand, Pierre Chamard, and André Pic produced a French edition; Susanne Daniels-Herold produced a German edition; and Fumio Motoyoshi produced a Japanese edition.
We do not know who produced the Chinese edition, but we consider it an honor to have been selected as the subject of an “unauthorized” translation.
It is hard to enumerate all the people who have made technical contributions to the development of the Scheme systems we use for instructional purposes. In addition to Guy Steele, principal wizards have included Chris Hanson, Joe Bowbeer, Jim Miller, Guillermo Rozas, and Stephen Adams. Others who have put in significant time are Richard Stallman, Alan Bawden, Kent Pitman, Jon Taft, Neil Mayle, John Lamping, Gwyn Osnos, Tracy Larrabee, George Carrette, Soma Chaudhuri, Bill Chiarchiaro, Steven Kirsch, Leigh Klotz, Wayne Noss, Todd Cass, Patrick O’Donnell, Kevin Theobald, Daniel Weise, Kenneth Sinclair, Anthony Courtemanche, Henry M. Wu, Andrew Berlin, and Ruth Shyu.

Beyond the implementation, we would like to thank the many people who worked on the Scheme standard, including William Clinger and Jonathan Rees, who edited the R4RS, and Chris Haynes, David Bartley, Chris Hanson, and Jim Miller, who prepared the IEEE standard.
Dan Friedman has been a long-time leader of the Scheme community. The community’s broader work goes beyond issues of language design to encompass significant educational innovations, such as the high-school curriculum based on EdScheme by Schemer’s Inc., and the wonderful books by Mike Eisenberg and by Brian Harvey and Matthew Wright.

We appreciate the work of those who contributed to making this a real book, especially Terry Ehling, Larry Cohen, and Paul Bethge at The MIT Press. Ella Mazel found the wonderful cover image. For the second edition we are particularly grateful to Bernard and Ella Mazel for help with the book design, and to David Jones, TEX wizard extraordinaire.
We also are indebted to those readers who made penetrating comments on the new draft: Jacob Katzenelson, Hardy Mayer, Jim Miller, and especially Brian Harvey, who did unto this book as Julie did unto his book Simply Scheme.
Finally, we would like to acknowledge the support of the organizations that have encouraged this work over the years, including support from Hewlett-Packard, made possible by Ira Goldstein and Joel Birnbaum, and support from DARPA, made possible by Bob Kahn.
1 Building Abstractions with Procedures
The acts of the mind, wherein it exerts its power over simple ideas, are chiefly these three: 1. Combining several simple ideas into one compound one, and thus all complex ideas are made. 2. The second is bringing two ideas, whether simple or complex, together, and setting them by one another so as to take a view of them at once, without uniting them into one, by which it gets all its ideas of relations. 3. The third is separating them from all other ideas that accompany them in their real existence: this is called abstraction, and thus all its general ideas are made.

—John Locke, An Essay Concerning Human Understanding (1690)
We are about to study the idea of a computational process. Computational processes are abstract beings that inhabit computers. As they evolve, processes manipulate other abstract things called data. The evolution of a process is directed by a pattern of rules called a program. People create programs to direct processes. In effect, we conjure the spirits of the computer with our spells.

A computational process is indeed much like a sorcerer’s idea of a spirit. It cannot be seen or touched. It is not composed of matter at all. However, it is very real. It can perform intellectual work. It can answer questions. It can affect the world by disbursing money at a bank or by controlling a robot arm in a factory. The programs we use to conjure processes are like a sorcerer’s spells. They are carefully composed from symbolic expressions in arcane and esoteric programming languages that prescribe the tasks we want our processes to perform.
A computational process, in a correctly working computer, executes programs precisely and accurately. Thus, like the sorcerer’s apprentice, novice programmers must learn to understand and to anticipate the consequences of their conjuring. Even small errors (usually called bugs or glitches) in programs can have complex and unanticipated consequences.

Fortunately, learning to program is considerably less dangerous than learning sorcery, because the spirits we deal with are conveniently contained in a secure way. Real-world programming, however, requires care, expertise, and wisdom. A small bug in a computer-aided design program, for example, can lead to the catastrophic collapse of an airplane or a dam or the self-destruction of an industrial robot.

Master software engineers have the ability to organize programs so that they can be reasonably sure that the resulting processes will perform the tasks intended. They can visualize the behavior of their systems in advance. They know how to structure programs so that unanticipated problems do not lead to catastrophic consequences, and when problems do arise, they can debug their programs. Well-designed computational systems, like well-designed automobiles or nuclear reactors, are designed in a modular manner, so that the parts can be constructed, replaced, and debugged separately.
Programming in Lisp
We need an appropriate language for describing processes, and we will use for this purpose the programming language Lisp. Just as our everyday thoughts are usually expressed in our natural language (such as English, French, or Japanese), and descriptions of quantitative phenomena are expressed with mathematical notations, our procedural thoughts will be expressed in Lisp. Lisp was invented in the late 1950s as a formalism for reasoning about the use of certain kinds of logical expressions, called recursion equations, as a model for computation. The language was conceived by John McCarthy and is based on his paper “Recursive Functions of Symbolic Expressions and Their Computation by Machine” (McCarthy 1960).

Despite its inception as a mathematical formalism, Lisp is a practical programming language. A Lisp interpreter is a machine that carries out processes described in the Lisp language. The first Lisp interpreter was implemented by McCarthy with the help of colleagues and students in the Artificial Intelligence Group of the MIT Research Laboratory of Electronics and in the MIT Computation Center.1 Lisp, whose name is an acronym for LISt Processing, was designed to provide symbol-manipulating capabilities for attacking programming problems such as the symbolic differentiation and integration of algebraic expressions. It included for this purpose new data objects known as atoms and lists,
1 The Lisp 1 Programmer’s Manual appeared in 1960, and the Lisp 1.5 Programmer’s Manual (McCarthy et al. 1965) was published in 1962. The early history of Lisp is described in McCarthy 1978.
which most strikingly set it apart from all other languages of the period. Lisp was not the product of a concerted design effort. Instead, it evolved informally in an experimental manner in response to users’ needs and to pragmatic implementation considerations. Lisp’s informal evolution has continued through the years, and the community of Lisp users has traditionally resisted attempts to promulgate any “official” definition of the language. This evolution, together with the flexibility and elegance of the initial conception, has enabled Lisp, which is the second oldest language in widespread use today (only Fortran is older), to continually adapt to encompass the most modern ideas about program design. Thus, Lisp is by now a family of dialects, which, while sharing most of the original features, may differ from one another in significant ways. The dialect of Lisp used in this book is called Scheme.2

Because of its experimental character and its emphasis on symbol manipulation, Lisp was at first very inefficient for numerical computations, at least in comparison with Fortran. Over the years, however,
2 The two dialects in which most major Lisp programs of the 1970s were written are MacLisp (Moon 1978; Pitman 1983), developed at the MIT Project MAC, and Interlisp (Teitelman 1974), developed at Bolt Beranek and Newman Inc. and the Xerox Palo Alto Research Center. Portable Standard Lisp (Hearn 1969; Griss 1981) was a Lisp dialect designed to be easily portable between different machines. MacLisp spawned a number of subdialects, such as Franz Lisp, which was developed at the University of California at Berkeley, and Zetalisp (Moon and Weinreb 1981), which was based on a special-purpose processor designed at the MIT Artificial Intelligence Laboratory to run Lisp very efficiently. The Lisp dialect used in this book, called Scheme (Steele and Sussman 1975), was invented in 1975 by Guy Lewis Steele Jr. and Gerald Jay Sussman of the MIT Artificial Intelligence Laboratory and later reimplemented for instructional use at MIT. Scheme became an IEEE standard in 1990 (IEEE 1990). The Common Lisp dialect (Steele 1982, Steele 1990) was developed by the Lisp community to combine features from the earlier Lisp dialects to make an industrial standard for Lisp. Common Lisp became an ANSI standard in 1994 (ANSI 1994).
Lisp compilers have been developed that translate programs into machine code that can perform numerical computations reasonably efficiently. And for special applications, Lisp has been used with great effectiveness.3 Although Lisp has not yet overcome its old reputation as hopelessly inefficient, Lisp is now used in many applications where efficiency is not the central concern. For example, Lisp has become a language of choice for operating-system shell languages and for extension languages for editors and computer-aided design systems.

If Lisp is not a mainstream language, why are we using it as the framework for our discussion of programming? Because the language possesses unique features that make it an excellent medium for studying important programming constructs and data structures and for relating them to the linguistic features that support them. The most significant of these features is the fact that Lisp descriptions of processes, called procedures, can themselves be represented and manipulated as Lisp data. The importance of this is that there are powerful program-design techniques that rely on the ability to blur the traditional distinction between “passive” data and “active” processes. As we shall discover, Lisp’s flexibility in handling procedures as data makes it one of the most convenient languages in existence for exploring these techniques. The ability to represent procedures as data also makes Lisp an excellent language for writing programs that must manipulate other programs as data, such as the interpreters and compilers that support computer languages. Above and beyond these considerations, programming in Lisp is great fun.
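As a small foretaste of this blurring of data and process (it uses the define notation explained in Section 1.1.2, so treat it as a preview rather than part of the present discussion), a procedure can be handed to another procedure exactly as a number can:

```scheme
;; `twice` takes a procedure f as an argument, just as it takes
;; the number x: procedures are ordinary values.
(define (twice f x)
  (f (f x)))

(define (add-one n) (+ n 1))

(twice add-one 5)   ; evaluates to 7
```

Chapter 1 returns to this idea in earnest in Section 1.3, where procedures appear both as arguments and as returned values.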
3 One such special application was a breakthrough computation of scientific importance—an integration of the motion of the Solar System that extended previous results by nearly two orders of magnitude, and demonstrated that the dynamics of the Solar System is chaotic. This computation was made possible by new integration algorithms, a special-purpose compiler, and a special-purpose computer all implemented with the aid of software tools written in Lisp (Abelson et al. 1992; Sussman and Wisdom 1992).
1.1 The Elements of Programming
A powerful programming language is more than just a means for instructing a computer to perform tasks. The language also serves as a framework within which we organize our ideas about processes. Thus, when we describe a language, we should pay particular attention to the means that the language provides for combining simple ideas to form more complex ideas. Every powerful language has three mechanisms for accomplishing this:

• primitive expressions, which represent the simplest entities the language is concerned with,

• means of combination, by which compound elements are built from simpler ones, and

• means of abstraction, by which compound elements can be named and manipulated as units.
In programming, we deal with two kinds of elements: procedures and data. (Later we will discover that they are really not so distinct.) Informally, data is “stuff” that we want to manipulate, and procedures are descriptions of the rules for manipulating the data. Thus, any powerful programming language should be able to describe primitive data and primitive procedures and should have methods for combining and abstracting procedures and data.
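In Scheme, the dialect of Lisp used in this book, all three mechanisms can be glimpsed in a few lines (the define notation is explained in Section 1.1.2; this is only a preview):

```scheme
3                       ; a primitive expression
(+ 3 4)                 ; a means of combination, applying + to 3 and 4
(define seven (+ 3 4))  ; a means of abstraction: naming the result
seven                   ; the named compound element, with value 7
```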
In this chapter we will deal only with simple numerical data so that we can focus on the rules for building procedures.4 In later chapters we
will see that these same rules allow us to build procedures to manipulate compound data as well.

1.1.1 Expressions

One easy way to get started at programming is to examine some typical interactions with an interpreter for the Scheme dialect of Lisp. You type an expression, and the interpreter responds by displaying the result of its evaluating that expression. One kind of primitive expression you might type is a number. If you present Lisp with a number

486

the interpreter will respond by printing5

486

4 The characterization of numbers as “simple data” is a barefaced bluff. In fact, the treatment of numbers is one of the trickiest and most confusing aspects of any programming language. Some typical issues involved are these: Some computer systems distinguish integers, such as 2, from real numbers, such as 2.71. Is the real number 2.00 different from the integer 2? Are the arithmetic operations used for integers the same as the operations used for real numbers? Does 6 divided by 2 produce 3, or 3.0? How large a number can we represent? How many decimal places of accuracy can we represent? Is the range of integers the same as the range of real numbers? Above and beyond these questions, of course, lies a collection of issues concerning roundoff and truncation errors—the entire science of numerical analysis. Since our focus in this book is on large-scale program design rather than on numerical techniques, we are going to ignore these problems. The numerical examples in this chapter will exhibit the usual roundoff behavior that one observes when using arithmetic operations that preserve a limited number of decimal places of accuracy in noninteger operations.

5 Throughout this book, when we wish to emphasize the distinction between the input typed by the user and the response printed by the interpreter, we will show the latter in slanted characters.
Expressions representing numbers may be combined with an expression representing a primitive procedure (such as + or *) to form a compound expression that represents the application of the procedure to those numbers. For example:

(+ 137 349)
486

(- 1000 334)
666

(* 5 99)
495

(/ 10 5)
2

(+ 2.7 10)
12.7

Expressions such as these, formed by delimiting a list of expressions within parentheses in order to denote procedure application, are called combinations. The leftmost element in the list is called the operator, and the other elements are called operands. The value of a combination is obtained by applying the procedure specified by the operator to the arguments that are the values of the operands.

The convention of placing the operator to the left of the operands is known as prefix notation, and it may be somewhat confusing at first because it departs significantly from the customary mathematical convention. Prefix notation has several advantages, however. One of them is that it can accommodate procedures that may take an arbitrary number of arguments, as in the following examples:

(+ 21 35 12 7)
75

(* 25 4 12)
1200

No ambiguity can arise, because the operator is always the leftmost element and the entire combination is delimited by the parentheses.

A second advantage of prefix notation is that it extends in a straightforward way to allow combinations to be nested, that is, to have combinations whose elements are themselves combinations:
(+ (* 3 5) (- 10 6))
19
There is no limit (in principle) to the depth of such nesting and to the overall complexity of the expressions that the Lisp interpreter can evaluate. It is we humans who get confused by still relatively simple expressions such as

(+ (* 3 (+ (* 2 4) (+ 3 5))) (+ (- 10 7) 6))

which the interpreter would readily evaluate to be 57. We can help ourselves by writing such an expression in the form

(+ (* 3
      (+ (* 2 4)
         (+ 3 5)))
   (+ (- 10 7)
      6))

following a formatting convention known as pretty-printing, in which each long combination is written so that the operands are aligned vertically. The resulting indentations display clearly the structure of the expression.6
Even with complex expressions, the interpreter always operates in the same basic cycle: It reads an expression from the terminal, evaluates the expression, and prints the result. This mode of operation is often expressed by saying that the interpreter runs in a read-eval-print loop. Observe in particular that it is not necessary to explicitly instruct the interpreter to print the value of the expression.7
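The cycle can itself be sketched as a Scheme procedure. This sketch assumes the host system supplies eval and interaction-environment (as in R7RS Scheme); it illustrates the idea and is not the actual implementation of any interpreter discussed in this book:

```scheme
;; A minimal read-eval-print loop: read an expression, evaluate
;; it, print the resulting value, and repeat.
(define (read-eval-print-loop)
  (display "> ")
  (let ((expression (read)))
    (write (eval expression (interaction-environment)))
    (newline)
    (read-eval-print-loop)))
```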
1.1.2 Naming and the Environment
A critical aspect of a programming language is the means it provides for using names to refer to computational objects. We say that the name identifies a variable whose value is the object.
In the Scheme dialect of Lisp, we name things with define. Typing

(define size 2)

causes the interpreter to associate the value 2 with the name size.8 Once the name size has been associated with the number 2, we can refer to the value 2 by name:
size
2
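The book’s running example continues in the same style, associating names with numbers and then using those names in further combinations (the circumference defined here is the one referred to in the paragraph below):

```scheme
(define pi 3.14159)
(define radius 10)

(* pi (* radius radius))
; 314.159

(define circumference (* 2 pi radius))

circumference
; 62.8318
```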
6 Lisp systems typically provide features to aid the user in formatting expressions. Two especially useful features are one that automatically indents to the proper pretty-print position whenever a new line is started and one that highlights the matching left parenthesis whenever a right parenthesis is typed.
7 Lisp obeys the convention that every expression has a value. This convention, together with the old reputation of Lisp as an inefficient language, is the source of the quip by Alan Perlis (paraphrasing Oscar Wilde) that “Lisp programmers know the value of everything but the cost of nothing.”
8 In this book, we do not show the interpreter’s response to evaluating definitions, since this is highly implementation-dependent.
Define is our language’s simplest means of abstraction, for it allows us to use simple names to refer to the results of compound operations, such as the circumference computed above. In general, computational objects may have very complex structures, and it would be extremely inconvenient to have to remember and repeat their details each time we want to use them. Indeed, complex programs are constructed by building, step by step, computational objects of increasing complexity. The interpreter makes this step-by-step program construction particularly convenient because name-object associations can be created incrementally in successive interactions. This feature encourages the incremental development and testing of programs and is largely responsible for the fact that a Lisp program usually consists of a large number of relatively simple procedures.
It should be clear that the possibility of associating values with symbols and later retrieving them means that the interpreter must maintain some sort of memory that keeps track of the name-object pairs. This memory is called the environment (more precisely the global environment, since we will see later that a computation may involve a number of different environments).9
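One way to picture this memory is as a table of name-object pairs. The toy model below represents such a table as a Scheme association list; it is purely illustrative, and the names define-variable! and lookup-variable are inventions of this sketch, not part of the book’s code (Chapter 4 builds a richer environment structure):

```scheme
;; A toy global environment: a list of (name . value) pairs.
(define global-environment '())

;; Record a new name-object association.
(define (define-variable! name value)
  (set! global-environment
        (cons (cons name value) global-environment)))

;; Retrieve the object associated with a name.
(define (lookup-variable name)
  (let ((binding (assq name global-environment)))
    (if binding
        (cdr binding)
        (error "Unbound variable" name))))

(define-variable! 'size 2)
(lookup-variable 'size)   ; evaluates to 2
```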
1.1.3 Evaluating Combinations
One of our goals in this chapter is to isolate issues about thinking procedurally. As a case in point, let us consider that, in evaluating combinations, the interpreter is itself following a procedure.

To evaluate a combination, do the following:

1. Evaluate the subexpressions of the combination.

2. Apply the procedure that is the value of the leftmost subexpression (the operator) to the arguments that are the values of the other subexpressions (the operands).

Even this simple rule illustrates some important points about processes in general. First, observe that the first step dictates that in order to accomplish the evaluation process for a combination we must first perform the evaluation process on each element of the combination. Thus, the evaluation rule is recursive in nature; that is, it includes, as one of its steps, the need to invoke the rule itself.10
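The two steps of the rule can be rendered schematically in Scheme. The names evaluate and apply-primitive below are placeholders invented for this sketch, not part of the book’s code (a complete evaluator is the subject of Chapter 4):

```scheme
;; Schematic evaluation rule for combinations.
(define (evaluate expression)
  (if (number? expression)
      expression                            ; a primitive expression is its own value
      (apply-primitive
       (car expression)                     ; step 2: apply the operator's procedure
       (map evaluate (cdr expression)))))   ; step 1: evaluate the subexpressions

;; A stand-in dispatch for a few built-in primitive procedures.
(define (apply-primitive operator arguments)
  (apply (case operator
           ((+) +) ((-) -) ((*) *) ((/) /)
           (else (error "Unknown operator" operator)))
         arguments))

(evaluate '(+ (* 3 5) (- 10 6)))   ; evaluates to 19
```

Note how the recursion of the rule appears directly as the recursive call to evaluate over the operands.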
Notice how succinctly the idea of recursion can be used to express what, in the case of a deeply nested combination, would otherwise be viewed as a rather complicated process. For example, evaluating

(* (+ 2 (* 4 6)) (+ 3 5 7))

requires that the evaluation rule be applied to four different combinations.
9 Chapter 3 will show that this notion of environment is crucial, both for understanding how the interpreter works and for implementing interpreters.

10 It may seem strange that the evaluation rule says, as part of the first step, that we should evaluate the leftmost element of a combination, since at this point that can only be an operator such as + or * representing a built-in primitive procedure such as addition or multiplication. We will see later that it is useful to be able to work with combinations whose operators are themselves compound expressions.