Slides for the book
Introduction to Artificial Intelligence
mff.cuni.cz/~bartak/constraints, 1998 5.8
learning Discrete Event Systems, Special issue on reinforcement learning, 13
Berrondo, M.: Fallgruben für Kopffüssler. Fischer Taschenbuch Nr. 8703,
Bibel, W.: Deduktion: Automatisierung der Logik. Volume 6.2, Handbuch der
Bratko, I.: PROLOG: Programmierung für Künstliche Intelligenz
Burges, C.J.: A Tutorial on Support Vector Machines for Pattern Recognition
Diaz, D.: GNU PROLOG. Universität Paris, 2004, Aufl. 1.7, für GNU Prolog
~mlearn/MLRepository.html, 1998 8.8
Duda, R.O./Hart, P.E.: Pattern Classification and Scene Analysis. Wiley, 1973,
Österreichische Artificial-Intelligence-Tagung. Berlin, Heidelberg:
of ATP to Software Reuse. In Conference on Automated Deduction (CADE
publications/papers/cade97-reuse.html, 65–68 3.8
Görz, G./Rollinger, C.-R./Schneeberger, J., editors: Handbuch der
7.5, 7.6, 8.4, 9
kaelbling96a.pdf 10.2
www-unix.mcs.anl.gov/AR/otter/index.html 3.6
lauer.riedml.fgml02.ps.gz, 100–107 2
Letz, R. et al.: SETHEO: A High-Performance Theorem Prover. Journal of
de/~letz/setheo 3.6
for Graph Drawing. Dutch Research Center for Mathematical and Computer
CWIreports/INS/INS-R0005.pdf 10
~tom/mlbook.html 2, 8, 8.8, 10.2
ac.uk/Research/HVG/Isabelle 3.6, 4.1
Pearl, J.: Probabilistic Reasoning in Intelligent Systems: Networks of Plausible
Schulz, S.: E – A Brainiac Theorem Prover. Journal of AI Communications, 15
~schulz/WORK/eprover.html 3.6, 3.7
Schumann, J.: Automated Theorem Proving in Software Engineering. Springer
to read aloud. The Johns Hopkins University Electrical Engineering and Computer Science Technical Report, 1986 (JHU/EECS-86/01) – Technical report,
~omega, 3–28 4.1
Stone, P./Sutton, R.S./Kuhlmann, G.: Reinforcement Learning for
www.cs.utexas.edu/~pstone/Papers/bib2html-links/AB05.pdf 3
www.cs.ualberta.ca/~sutton/book/the-book.html 10.1, 1, 10.2
Turing, A.M.: Computing Machinery and Intelligence. Mind, 59, 1950, 433–
Zimmerli/Wolf 11
Wiedemann, U.: PhilLex, Lexikon der Philosophie. www.phillex.de/paradoxa.htm 2
swi-prolog.org 5.8
The data mining program library WEKA, developed by the authors in Java (www.cs.waikato.ac.nz/~ml/weka) 8, 16
uni-tuebingen.de/SNNS 6, 9.8
Chapter 1
Introduction
What is Artificial Intelligence (AI)?
John McCarthy (1955):
The aim of AI is to develop machines that behave as if they were intelligent
Two simple Braitenberg vehicles and their reaction to a light source
Encyclopedia Britannica:
AI is the ability of a digital computer or computer-controlled robot to
perform tasks commonly associated with intelligent beings
According to this definition, every computer is an AI system
Elaine Rich:
Artificial Intelligence is the study of how to make computers do things at
which, at the moment, people are better
ability)!
Rich
Brain Research and Problem Solving
Different approaches:
The Turing Test and Chatterbots
Alan Turing:
The machine passes the test if it can mislead Alice in 30% of the cases.
Joseph Weizenbaum (computer critic): the program Eliza talks to his secretary
History of AI
1931 Gödel shows that in first-order predicate logic all true statements are derivable. In higher-order logics, on the other hand, there are true statements that are unprovable.
1937 Alan Turing points out the limits of intelligent machines with the halting
problem
1943 McCulloch and Pitts model neural networks and make the connection to propositional logic.
1950 In Computing Machinery and Intelligence, Alan Turing writes about learning machines and genetic algorithms.
1951 Marvin Minsky develops a neural network machine. With 3000 vacuum tubes he simulates 40 neurons.
1955 Arthur Samuel (IBM) builds a learning checkers program that plays better than its developer.
1956 McCarthy organizes a conference at Dartmouth College. Here the name Artificial Intelligence was first introduced.
Newell and Simon present the Logic Theorist, the first symbol-processing computer program.
1958 McCarthy invents at MIT (Massachusetts Institute of Technology) the high-level language LISP. He writes programs that are capable of modifying themselves.
1959 Gelernter (IBM) builds the Geometry Theorem Prover
1961 The General Problem Solver (GPS) by Newell and Simon imitates human thought.
1963 McCarthy founds the AI Lab at Stanford University
1966 Weizenbaum's program Eliza carries out dialogue with people in natural language.
1969 Minsky and Papert show in their book Perceptrons that the perceptron, a very simple neural network, can only represent linearly separable functions.
1972 French scientist Alain Colmerauer invents the logic programming language
PROLOG (5)
The British physician de Dombal develops an expert system for diagnosis of acute abdominal pain. It goes unnoticed in the mainstream AI community.
1976 Shortliffe and Buchanan develop MYCIN, an expert system for diagnosis
of infectious diseases, which is capable of dealing with uncertainty
1981 Japan begins, at great expense, the “Fifth Generation Project” with the
goal of building a powerful PROLOG machine
1982 R1, the expert system for configuring computers, saves Digital Equipment
Corporation 40 million dollars per year
1986 Renaissance of neural networks through, among others, Rumelhart, Hinton and Sejnowski.
1990 Pearl, Cheeseman, Whittaker, and Spiegelhalter bring probability theory into AI with Bayesian networks.
Multi-agent systems become popular
1992 Tesauro's TD-Gammon program demonstrates the advantages of reinforcement learning.
1993 Worldwide RoboCup initiative to build soccer-playing autonomous robots.
1995 From statistical learning theory, Vapnik develops support vector machines,
which are very important today
1997 First international RoboCup competition in Japan
2003 The robots in RoboCup demonstrate impressively what AI and robotics are
capable of achieving
2006 Service robotics becomes a major AI research area.
2010 Autonomous robots start learning their policies
Phases of AI history
[Figure: timeline of AI phases from 1930 to 2000 — propositional logic and first-order logic (Turing, Davis/Putnam, GPS, PROLOG), neural networks and neuro-hardware (Minsky/Papert book), probabilistic reasoning, fuzzy logic, decision tree learning, hybrid systems]
Hardware agent (autonomous robot)
[Figure: a hardware agent or software agent in its environment — perception through sensors 1…n, manipulation through actuators 1…m]
Reflex agent: a function from the set of all inputs to the set of all outputs (see the sketch after this list).
Agent with a memory: is not a function. Why?
Agent capable of learning
Distributed agents
Markov decision process: only the current state is needed for the determination of the optimal action
Goal-oriented agent
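The difference between the first two agent types can be made concrete in a few lines of Python. This is a minimal sketch of my own (the threshold and action names are made up), not code from the slides: the reflex agent is a pure function of the current percept, while the agent with memory also depends on the percept history — which is why its behaviour is no longer a function of the current input alone.

    from typing import List

    def reflex_agent(percept: float) -> str:
        """A reflex agent: a pure function from the current percept to an action."""
        return "brake" if percept < 1.0 else "drive"

    class MemoryAgent:
        """An agent with memory: the chosen action depends on the percept history,
        so the input-output behaviour is not a function of the current percept alone."""
        def __init__(self) -> None:
            self.history: List[float] = []

        def act(self, percept: float) -> str:
            self.history.append(percept)
            # Brake only if the obstacle has been close for two consecutive steps.
            if len(self.history) >= 2 and all(p < 1.0 for p in self.history[-2:]):
                return "brake"
            return "drive"

    if __name__ == "__main__":
        agent = MemoryAgent()
        # The same percept 0.5 leads to different actions depending on what came before.
        print(reflex_agent(0.5), agent.act(0.5), agent.act(0.5))  # brake drive brake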
Example: A spam filter aims at assigning emails to their correct classes.

                        Spam filter decides
    correct class       desired      SPAM
    desired                 189         1
    SPAM                     11       799
Definition 1.1 The goal of a cost-oriented agent is to minimize the long-term cost (i.e. the average cost) caused by wrong decisions. The sum of all weighted errors gives the total cost.
Analogously, the goal of a utility-oriented agent is to maximize the long-term benefit (i.e. the average benefit) gained through correct decisions.
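As a small illustration of the cost-oriented agent, the following Python sketch computes the total and average cost for the spam-filter confusion matrix above. The two error costs are assumed values chosen only for illustration; they are not from the slides.

    # Confusion matrix from the slide: keys are (correct class, filter decision).
    confusion = {
        ("desired", "desired"): 189, ("desired", "SPAM"): 1,
        ("SPAM", "desired"): 11,     ("SPAM", "SPAM"): 799,
    }

    # Hypothetical error costs (assumed values): deleting a desired email is far
    # more expensive than letting one spam email through.
    cost = {("desired", "SPAM"): 10.0,   # desired email wrongly classified as spam
            ("SPAM", "desired"): 0.1}    # spam email wrongly classified as desired

    total_cost = sum(n * cost.get(key, 0.0) for key, n in confusion.items())
    n_emails = sum(confusion.values())
    print(f"total cost = {total_cost}, average cost per email = {total_cost / n_emails:.4f}")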
Environment
Separation of knowledge and inference has advantages:
Representation of knowledge with a formal language:
Chapter 2
Propositional Logic
If it is raining, the street is wet.
Written more formally:
it is raining ⇒ the street is wet
Definition 2.1 Let Op be the set of logical operators and Σ a set of symbols. The sets Op, Σ and {t, f} are pairwise disjoint.
The set of propositional logic formulas is now recursively defined:
• t and f are (atomic) formulas.
• All proposition variables, that is all elements from Σ, are (atomic) formulas.
• If A and B are formulas, then ¬A, (A), A ∧ B, A ∨ B, A ⇒ B and A ⇔ B are also formulas.
Definition 2.2 We read the symbols and operators in the following way: t "true", f "false", ¬A "not A", A ∧ B "A and B", A ∨ B "A or B", A ⇒ B "if A then B", A ⇔ B "A if and only if B".
With Σ = {A, B, C}, for example:
The formulas defined in this way are so far purely syntactic constructions without meaning. We are still missing the semantics.
Is the formula … true?
Definition 2.3 A mapping I : Σ → {t, f}, which assigns a truth value to every proposition variable, is called an interpretation. Every interpretation describes a possible world. A formula with n different proposition variables therefore has 2^n different interpretations.
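To make the syntax and semantics concrete, here is a minimal Python sketch of my own (not from the slides) that represents formulas as nested tuples and evaluates them under an interpretation I : Σ → {t, f}, using the standard semantics of the operators.

    # Formulas as nested tuples: a proposition variable is a string,
    # ("not", A), ("and", A, B), ("or", A, B), ("implies", A, B), ("iff", A, B).
    def evaluate(formula, interpretation):
        """Evaluate a propositional formula under an interpretation (dict: variable -> bool)."""
        if formula is True or formula is False:           # the atomic formulas t and f
            return formula
        if isinstance(formula, str):                       # a proposition variable
            return interpretation[formula]
        op, *args = formula
        vals = [evaluate(a, interpretation) for a in args]
        if op == "not":     return not vals[0]
        if op == "and":     return vals[0] and vals[1]
        if op == "or":      return vals[0] or vals[1]
        if op == "implies": return (not vals[0]) or vals[1]
        if op == "iff":     return vals[0] == vals[1]
        raise ValueError(f"unknown operator {op}")

    # (A ∧ B) ⇒ A is true in this world:
    print(evaluate(("implies", ("and", "A", "B"), "A"), {"A": True, "B": False}))  # True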
Definition 2.4 Two formulas F and G are called semantically equivalent if they take on the same truth value for all interpretations. We write F ≡ G.
Meta language: natural language, e.g “A ≡ B”
Object language: logic, e.g “A ⇔ B”
Theorem 2.1 The operations ∧, ∨ are commutative and associative, and the following equivalences are generally valid:
Proof: only the first equivalence:
The proofs for the other equivalences are similar and are recommended as exercises
□
Variants of truth
Depending on how many interpretations a formula is true in, we can divide formulas into the following classes:
• satisfiable if it is true for at least one interpretation
• logically valid or simply valid if it is true for all interpretations. True formulas are also called tautologies
• unsatisfiable if it is not true for any interpretation
Clearly the negation of every generally valid formula is unsatisfiable. The negation of a satisfiable, but not generally valid formula F is satisfiable.
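Building on the evaluate sketch above, a formula can be classified as valid, unsatisfiable, or merely satisfiable by enumerating all 2^n interpretations of its variables — feasible only for small n, but it makes the three classes tangible.

    from itertools import product

    def variables(formula, acc=None):
        """Collect the proposition variables occurring in a formula."""
        acc = set() if acc is None else acc
        if isinstance(formula, str):
            acc.add(formula)
        elif isinstance(formula, tuple):
            for arg in formula[1:]:
                variables(arg, acc)
        return acc

    def classify(formula):
        """Classify a formula by truth-table enumeration (uses evaluate() from the sketch above)."""
        vs = sorted(variables(formula))
        results = [evaluate(formula, dict(zip(vs, vals)))
                   for vals in product([True, False], repeat=len(vs))]
        if all(results):     return "valid"
        if not any(results): return "unsatisfiable"
        return "satisfiable (but not valid)"

    print(classify(("or", "A", ("not", "A"))))    # valid (a tautology)
    print(classify(("and", "A", ("not", "A"))))   # unsatisfiable
    print(classify("A"))                          # satisfiable (but not valid)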
Proof systems
Tautologies do not restrict the set of satisfying interpretations, because their proposition is empty.
Truth table for implication:
A |= B holds if, for every interpretation that makes A true, B is also true. The critical second row of the truth table, in which A is true and B is false, then cannot occur, as the table shows.
Theorem 2.2 (Deduction theorem) A |= B if and only if |= A ⇒ B.
• The truth table method is a proof system for propositional logic!
¬(WB ⇒ Q) ≡ ¬(¬WB ∨ Q) ≡ WB ∧ ¬Q
To show that the query Q follows from the knowledge base WB, we can also add ¬Q to the knowledge base and show that WB ∧ ¬Q is unsatisfiable (proof by contradiction).
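The proof-by-contradiction idea can be phrased directly with the sketches above: Q follows from WB exactly when WB ∧ ¬Q is unsatisfiable. This is my own illustration, not code from the slides.

    def entails(wb, q):
        """WB |= Q  iff  WB ∧ ¬Q is unsatisfiable (uses classify() from the sketch above)."""
        return classify(("and", wb, ("not", q))) == "unsatisfiable"

    # Modus ponens as an entailment check: {A, A ⇒ B} |= B
    wb = ("and", "A", ("implies", "A", "B"))
    print(entails(wb, "B"))            # True
    print(entails(wb, ("not", "A")))   # False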
Fields of application:
Derivation: syntactic manipulation of the formulas WB and Q by application of inference rules with the goal of greatly simplifying them, such that in the end one can immediately see that WB |= Q.
Calculus: syntactic proof system (derivation)
Soundness and completeness
A calculus is called sound if every derived proposition also holds semantically. That is, if it holds for formulas WB and Q that
if WB ⊢ Q then WB |= Q.
A calculus is called complete if everything that holds semantically can also be derived. That is, if it holds for formulas WB and Q that
if WB |= Q then WB ⊢ Q.
Syntactic derivation and semantic entailment
To keep automatic proof systems as simple as possible, these are usually made to operate on formulas in conjunctive normal form.
The conjunctive normal form does not place a restriction on the set of formulas because:
Theorem 2.4 Every propositional logic formula can be transformed into an equivalent conjunctive normal form.
Example:
(A ∨ B) ⇒ (C ∧ D)
≡ ¬(A ∨ B) ∨ (C ∧ D) (implication)
≡ (¬A ∧ ¬B) ∨ (C ∧ D) (de Morgan)
≡ (¬A ∨ (C ∧ D)) ∧ (¬B ∨ (C ∧ D)) (distributive law)
≡ ((¬A ∨ C) ∧ (¬A ∨ D)) ∧ ((¬B ∨ C) ∧ (¬B ∨ D)) (distributive law)
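The same transformation can be reproduced with the sympy library, assuming it is installed; this is only a cross-check of the example above, not part of the original slides.

    from sympy import symbols
    from sympy.logic.boolalg import Implies, to_cnf

    A, B, C, D = symbols("A B C D")
    formula = Implies(A | B, C & D)
    print(to_cnf(formula))
    # prints an equivalent CNF, e.g. (C | ~A) & (C | ~B) & (D | ~A) & (D | ~B)
    # (the ordering of clauses and literals may differ from the derivation above)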
Proof calculus: modus ponens — from A and A ⇒ B we may conclude B.
Modus ponens is sound but not complete
General resolution rule: from the two clauses (A1 ∨ … ∨ Am ∨ B) and (¬B ∨ C1 ∨ … ∨ Cn) we derive the resolvent (A1 ∨ … ∨ Am ∨ C1 ∨ … ∨ Cn).
We call the literals B and ¬B complementary.
The resolution calculus for the proof of unsatisfiability of formulas in conjunctive normal form is sound and complete.
The knowledge base WB must be consistent!
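The resolution calculus can be sketched in a few lines of Python (my own sketch, not the book's implementation). A clause is a frozenset of literals, a literal is a string with a leading '-' marking negation, and the procedure saturates the clause set and reports whether the empty clause is derivable.

    def negate(literal):
        """Negation of a literal: 'A' <-> '-A'."""
        return literal[1:] if literal.startswith("-") else "-" + literal

    def resolvents(c1, c2):
        """All clauses obtainable by resolving c1 and c2 on a pair of complementary literals."""
        for lit in c1:
            if negate(lit) in c2:
                yield frozenset((c1 - {lit}) | (c2 - {negate(lit)}))

    def resolution(clauses):
        """Return True iff the empty clause is derivable, i.e. the clause set is unsatisfiable.
        Naive saturation: exponential in the worst case, fine for small examples."""
        clauses = set(clauses)
        while True:
            new = set()
            pairs = [(c1, c2) for c1 in clauses for c2 in clauses if c1 != c2]
            for c1, c2 in pairs:
                for r in resolvents(c1, c2):
                    if not r:            # empty clause derived: contradiction
                        return True
                    new.add(r)
            if new <= clauses:           # no new clauses: saturation reached
                return False
            clauses |= new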
Example: Logic puzzle number 7, entitled A charming English family, from Berrondo reads:
Despite studying English for seven long years with brilliant success, I must admit that
when I hear English people speaking English I’m totally perplexed Recently, moved
by noble feelings, I picked up three hitchhikers, a father, mother, and daughter, who
I quickly realized were English and only spoke English At each of the sentences that
follow I wavered between two possible interpretations They told me the following (the
second possible meaning is in parentheses): The father: “We are going to Spain (we are
from Newcastle).” The mother: “We are not going to Spain and are from Newcastle (we
stopped in Paris and are not going to Spain).” The daughter: “We are not from Newcastle
(we stopped in Paris).” What about this charming English family?
Three steps: formalization, transformation into normal form, and proof by resolution.
Empty clause not derivable, thus KB is non-contradictory.
query
The "charming English family" evidently comes from Newcastle, stopped in Paris, but is not going to Spain.
Example: Logic puzzle number 28 from Berrondo, entitled The High Jump, reads:
Three girls practice high jump for their physical education final exam The bar is set to
1.20 meters “I bet”, says the first girl to the second, “that I will make it over if, and only
if, you don’t” If the second girl said the same to the third, who in turn said the same to
the first, would it be possible for all three to win their bets?
We show through proof by resolution that not all three can win their bets
Formalization:
The first girl’s jump succeeds: A,
the second girl’s jump succeeds: B,
the third girl’s jump succeeds: C
Claim: the three cannot all win their bets:
Transformation into CNF: First girl's bet: (A ⇔ ¬B) ≡ (¬A ∨ ¬B) ∧ (A ∨ B)
The bets of the other two girls undergo analogous transformations, and we obtain the negated claim as the clause set
(¬A ∨ ¬B) ∧ (A ∨ B) ∧ (¬B ∨ ¬C) ∧ (B ∨ C) ∧ (¬C ∨ ¬A) ∧ (C ∨ A),
from which resolution derives the empty clause, proving the claim.
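Running the resolution sketch from above on this clause set does derive the empty clause, confirming that the three bets cannot all be won.

    # Each bet X <=> not Y becomes the two clauses (-X | -Y) and (X | Y).
    high_jump = [
        frozenset({"-A", "-B"}), frozenset({"A", "B"}),   # first girl's bet:  A <=> not B
        frozenset({"-B", "-C"}), frozenset({"B", "C"}),   # second girl's bet: B <=> not C
        frozenset({"-C", "-A"}), frozenset({"C", "A"}),   # third girl's bet:  C <=> not A
    ]
    print(resolution(high_jump))   # True: the empty clause is derivable, the bets are contradictory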
Horn clauses
A clause in conjunctive normal form contains positive and negative literals and can be represented in the form (¬A1 ∨ … ∨ ¬Am ∨ B1 ∨ … ∨ Bn).
This clause can be transformed into the equivalent form A1 ∧ … ∧ Am ⇒ B1 ∨ … ∨ Bn.
Examples:
“If the weather is nice and there is snow on the ground, I will go skiing
or I will work.” (non-definite clause)
“If the weather is nice and there is snow on the ground, I will go skiing.”
(definite clause)
Definition 2.9 Clauses with at most one positive literal, of the form (¬A1 ∨ … ∨ ¬Am ∨ B)
or (equivalently) A1 ∧ … ∧ Am ⇒ B,
are called Horn clauses.
To better understand the representation of Horn clauses, the reader may derive them from the definitions of the equivalences we have been using (exercise).
Example: Knowledge base:
Does skiing hold?
Inference rule (generalized modus ponens): from A1, …, Am and A1 ∧ … ∧ Am ⇒ B we may conclude B.
forward chaining: starts with the facts and finally derives the query
backward chaining: starts with the query and works backwards until the facts are reached
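Forward chaining for definite clauses can be sketched as follows. The small knowledge base (nice weather and snowfall as facts, snowfall implies snow, nice weather and snow imply skiing) is only my guess at the skiing example above and should be read as an assumption.

    def forward_chaining(facts, rules, query):
        """Forward chaining for definite clauses: rules are (premises, conclusion) pairs.
        Derives new facts until the query is found or nothing new can be derived."""
        known = set(facts)
        changed = True
        while changed:
            changed = False
            for premises, conclusion in rules:
                if conclusion not in known and all(p in known for p in premises):
                    known.add(conclusion)     # one application of generalized modus ponens
                    changed = True
        return query in known

    # Hypothetical knowledge base inspired by the skiing example (an assumption, not the original):
    facts = {"nice_weather", "snowfall"}
    rules = [({"snowfall"}, "snow"),
             ({"nice_weather", "snow"}, "skiing")]
    print(forward_chaining(facts, rules, "skiing"))   # True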
SLD resolution
“Selection rule driven linear resolution for definite clauses”
• linear resolution: further processing is always done on the currently derived clause.
Inference rule:
SLD resolution and PROLOG
If no clause can be found whose clause head matches the current subgoal, the proof terminates and no contradiction can be found.
Computability and Complexity
For resolution, the number of derived clauses can grow exponentially with the number of initial clauses in the worst case.
For Horn clauses, the computation time for SLD resolution grows only linearly as the number of literals in the formula increases.
Applications and Limitations
Propositional logic has no variables.
Chapter 3
First-order Predicate Logic
Statement:
Robot 7 is situated at the xy position (35,79)
propositional logic variable:
Robot 7 is situated at xy position (35,79)
⇒ 100 · 100 · 100 = 1 000 000 = 10^6 variables
Relation
Robot A is to the right of robot B
Robot 7 is to the right of robot 12 ⇔ Robot 7 is situated at xy position (35,79)
∧ Robot 12 is situated at xy position (10,93)
∨ …
First-order predicate logic:
position(number, xPosition, yPosition)
∀u ∀v is_further_right(u, v) ⇔
∃xu ∃yu ∃xv ∃yv position(u, xu, yu) ∧ position(v, xv, yv) ∧ xu > xv
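Over a finite world the quantifier structure of this definition can be checked directly. The robot positions below are partly from the slide (robots 7 and 12) and partly made up (robot 3); the sketch is mine, not the book's.

    # Hypothetical finite world: the relation position(robot, x, y) as a set of tuples.
    position = {(7, 35, 79), (12, 10, 93), (3, 50, 4)}   # robot 3 is an invented example
    robots = {u for (u, _, _) in position}

    def is_further_right(u, v):
        """∃xu ∃yu ∃xv ∃yv: position(u, xu, yu) ∧ position(v, xv, yv) ∧ xu > xv"""
        return any(xu > xv
                   for (u_, xu, _yu) in position if u_ == u
                   for (v_, xv, _yv) in position if v_ == v)

    # The ∀u ∀v part of the definition is checked by iterating over all robots in the finite world:
    print([(u, v) for u in robots for v in robots if is_further_right(u, v)])
    # e.g. [(7, 12), (3, 12), (3, 7)] -- robot 7 (x=35) is further right than robot 12 (x=10), etc.
    # (the order of the pairs may vary, since sets are unordered)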
Terms, e.g.: f(sin(ln(3)), exp(x))
Definition 3.1 Let V be a set of variables, K a set of constants and F a set of function symbols. The sets V, K and F are pairwise disjoint. We define the set of terms recursively:
• All variables and constants are (atomic) terms.
• If t1, …, tn are terms and f an n-place function symbol, then f(t1, …, tn) is also a term.
Definition 3.2 Let P be a set of predicate symbols. Predicate logic formulas are built as follows:
• If t1, …, tn are terms and p an n-place predicate symbol, then p(t1, …, tn) is an (atomic) formula.
• If A and B are formulas, then ¬A, (A), A ∧ B, A ∨ B, A ⇒ B and A ⇔ B are also formulas.
• If x is a variable and A a formula, then ∀x A and ∃x A are also formulas. ∀ is the universal quantifier and ∃ the existential quantifier.
• Formulas in which every variable is in the scope of a quantifier are called first-order sentences or closed formulas. Variables which are not in the scope of a quantifier are called free variables.
Predicate logic literals and clauses are defined analogously to propositional logic.
Examples:
∃x (baker(x) ∧ ∀y (customer(y) ⇒ likes(x, y)))   There is a baker who likes all of his customers.
An interpretation is defined by:
• a mapping from the set of constants and variables K ∪ V to a set W of names of objects in the world;
• a mapping from the set of function symbols to the set of functions in the world. Every n-place function symbol is assigned an n-place function;
• a mapping from the set of predicate symbols to the set of relations in the world. Every n-place predicate symbol is assigned an n-place relation.
Example: Constants c1, c2, c3, a two-place function symbol "plus" and a two-place predicate symbol.
Choose interpretation:
Thus the formula is mapped to
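To illustrate what such an interpretation looks like operationally, here is a sketch in which the constants c1, c2, c3 are mapped to natural numbers, "plus" to addition, and the two-place predicate (called "gr" here, an assumed name) to "greater than". The concrete mappings are assumptions, since the slide's own interpretation is not reproduced in this extract.

    # One possible interpretation (assumed, for illustration): the world is the natural numbers.
    constants  = {"c1": 1, "c2": 2, "c3": 3}              # mapping of constants to objects
    functions  = {"plus": lambda a, b: a + b}             # each n-place function symbol -> n-place function
    predicates = {"gr": lambda a, b: a > b}               # each n-place predicate symbol -> n-place relation

    def eval_term(term):
        """Evaluate a ground term such as ('plus', 'c3', ('plus', 'c1', 'c2')) in this interpretation."""
        if isinstance(term, str):
            return constants[term]
        f, *args = term
        return functions[f](*map(eval_term, args))

    def eval_atom(atom):
        """Evaluate a ground atomic formula such as ('gr', ('plus', 'c2', 'c3'), 'c1')."""
        p, *args = atom
        return predicates[p](*map(eval_term, args))

    print(eval_atom(("gr", ("plus", "c2", "c3"), "c1")))   # True: 2 + 3 > 1 in this interpretation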