
4. Problem Solving and Search
4.1. Intelligent Search

Lecture: Intelligente Informationssysteme (Intelligent Information Systems)

Winter semester 2004/2005

11 February 2005

Prof. Dr. Bernhard Thalheim
Information Systems Engineering Group

Computer Science Institute, Kiel University, Germany


Lecture Programme

Search is the fundamental problem

Search spaces: find good algorithms based on appropriate data structures

Reduction: for higher feasibility

The search problem in general: towards a unified theory

Techniques: classical techniques of searching

Optimization: intelligent search is informed heuristic search

Summary


Search Space

To solve a problem, we must formalize it in a precise way.

States - descriptions of the "world"

Operators - actions that transform one state into another

Given: an initial state S and a set of goals G = {G}
Goal: a sequence of operator applications leading from S to some G ∈ G
Different conceptualizations of states, goals and operators are possible.

Example: water jug problem
Given: two jugs (4 l and 3 l) and a pump
Question: how do you get exactly two liters into the four-liter jug?

Step 1: choose a representation of the state space: (x, y) with 0 ≤ x ≤ 4, 0 ≤ y ≤ 3
Step 2: represent the operators, i.e. the transition rules between states:
1. (x, y), x < 4 → (4, y)                         fill x
2. (x, y), y < 3 → (x, 3)                         fill y
3. (x, y), x > 0 → (0, y)                         dump x
4. (x, y), y > 0 → (x, 0)                         dump y
5. (x, y), x + y ≥ 4, y > 0 → (4, y − (4 − x))    pour from y into x until x is full
6. (x, y), x + y ≥ 3, x > 0 → (x − (3 − y), 3)    pour from x into y until y is full
7. (x, y), x + y ≤ 4, y > 0 → (x + y, 0)          pour all water from y into x
...

One solution: fill y; pour all of y into x; fill y; pour from y into x until x is full; dump x; pour all of y into x

(0, 0) →2 (0, 3) →7 (3, 0) →2 (3, 3) →5 (4, 2) →3 (0, 2) →7 (2, 0)
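A minimal sketch of this formalization in Python (an illustration, not part of the original slides; the eighth, symmetric "pour all of x into y" rule is omitted here just as above), finding a shortest operator sequence by breadth-first search over the (x, y) state space:

from collections import deque

def successors(state):
    # Apply the seven transition rules; x is the 4-litre jug, y the 3-litre jug.
    x, y = state
    result = set()
    if x < 4: result.add((4, y))                              # 1. fill x
    if y < 3: result.add((x, 3))                              # 2. fill y
    if x > 0: result.add((0, y))                              # 3. dump x
    if y > 0: result.add((x, 0))                              # 4. dump y
    if x + y >= 4 and y > 0: result.add((4, y - (4 - x)))     # 5. pour y -> x until x is full
    if x + y >= 3 and x > 0: result.add((x - (3 - y), 3))     # 6. pour x -> y until y is full
    if x + y <= 4 and y > 0: result.add((x + y, 0))           # 7. pour all of y into x
    return result

def solve(start=(0, 0), is_goal=lambda s: s[0] == 2):
    # Breadth-first search; returns a shortest sequence of states from start to a goal.
    frontier, seen = deque([[start]]), {start}
    while frontier:
        path = frontier.popleft()
        if is_goal(path[-1]):
            return path
        for nxt in successors(path[-1]):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None

print(solve())   # e.g. [(0, 0), (0, 3), (3, 0), (3, 3), (4, 2), (0, 2), (2, 0)]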


Solution via a production system

(1) a set of rules, each with an applicability condition and
a description of the action to be taken,
i.e. how to move from one state to another

(2) a database with the appropriate information

(3) a control strategy
that decides which rules to apply in a given situation;
it should cause motion and be systematic;
often realized by heuristic control strategies

Rich: "A heuristic is a technique that improves the efficiency of a search process,
possibly by sacrificing claims of completeness."

Often the solution is obtained by Horn formulas over a database consisting of facts (atomic formulas).

Problem characteristics:

(1) Is the problem decomposable?

(2) Can solution steps be ignored or undone?

(3) Is the universe predictable?

(4) Is a good solution immediately recognizable, or
only relatively good compared with other solutions?

(5) Is the information base internally consistent?
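A small sketch of the three components in Python (an illustration, not from the slides), again on the water jug problem: the rules are (condition, action) pairs, the current state plays the role of the database, and the control strategy keeps the search systematic by preferring states that have not been visited yet:

RULES = [
    (lambda s: s[0] < 4, lambda s: (4, s[1])),                                     # fill x
    (lambda s: s[1] < 3, lambda s: (s[0], 3)),                                     # fill y
    (lambda s: s[0] > 0, lambda s: (0, s[1])),                                     # dump x
    (lambda s: s[1] > 0, lambda s: (s[0], 0)),                                     # dump y
    (lambda s: s[0] + s[1] >= 4 and s[1] > 0, lambda s: (4, s[1] - (4 - s[0]))),   # pour y -> x until x full
    (lambda s: s[0] + s[1] >= 3 and s[0] > 0, lambda s: (s[0] - (3 - s[1]), 3)),   # pour x -> y until y full
    (lambda s: s[0] + s[1] <= 4 and s[1] > 0, lambda s: (s[0] + s[1], 0)),         # pour all of y into x
]

def production_system(state, rules, is_goal, max_steps=100):
    # Production-system loop: match the rules against the database (the state),
    # let the control strategy pick one applicable rule, apply it, repeat.
    seen, trace = {state}, [state]
    for _ in range(max_steps):
        if is_goal(state):
            return trace
        successors = [act(state) for cond, act in rules if cond(state)]   # applicable rules
        new = [s for s in successors if s not in seen]                    # control strategy:
        state = new[0] if new else successors[0]                          # prefer unseen states
        seen.add(state)
        trace.append(state)
    return None

print(production_system((0, 0), RULES, lambda s: s[0] == 2))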


Types of production systems

(1) monotonic

if a and b could be applied, applying a doesn’t keep b from being applied

(2) partially commutative, Church-Rosser, or confluent

if a; b; c transforms X to Y then b; a; c also transforms X to Y (if b; a; c is ’legal’)

(3) commutative: monotonic and partially commutative

monotonic: good for ignorable problems, e.g., theorem proving

non-monotonic, reversible: the order of steps doesn't matter, e.g., the 8-puzzle

non-monotonic, irreversible: e.g., chemical synthesis

# Problems can be represented by states and transitions that move between states.

A solution to the problem is a transition sequence between initial and goal states.

Water jug problem - search tree fragment: the root (0, 0) has the successors (4, 0) and (0, 3);
the successors of (4, 0) are (4, 3), (0, 0), (1, 3), those of (0, 3) are (4, 3), (0, 0), (3, 0), ...


Reduction of the search space for resolution strategies

Deletion strategies (1): eliminate clauses before they are ever used

Pure literal elimination: remove any clause containing a 'pure' literal
(a literal whose complement occurs nowhere in the clause set)

Instance in the database: {¬P ∨ ¬Q ∨ R, ¬P ∨ S, ¬Q ∨ S, P, Q, ¬R}
S is a pure literal; for the purposes of refutation the S-clauses won't help.

Tautology elimination: eliminate clauses that contain complementary literals - tautologies

{P(G(a)) ∨ ¬P(G(a)), P(x) ∨ Q(y) ∨ R(z) ∨ ¬Q(y)} ⇒ ∅

Subsumption elimination: a clause α subsumes a clause β if there
is a substitution σ such that σ(α) ⊆ β (so that σ(α) implies β)

{P(x) ∨ Q(y)} subsumes {P(A) ∨ Q(v) ∨ R(w)} for σ = {x/A, y/v};
delete β (for a refutation, keeping α suffices)
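A small propositional illustration in Python (an assumed encoding, not from the slides: clauses are sets of signed integers with P = 1, Q = 2, R = 3, S = 4, and a negative sign for ¬):

def pure_literal_elimination(clauses):
    # Remove every clause containing a pure literal, i.e. one whose complement
    # never occurs anywhere in the clause set.
    literals = {lit for c in clauses for lit in c}
    pure = {lit for lit in literals if -lit not in literals}
    return [c for c in clauses if not (c & pure)]

def tautology_elimination(clauses):
    # Remove every clause that contains a pair of complementary literals.
    return [c for c in clauses if not any(-lit in c for lit in c)]

clauses = [{-1, -2, 3}, {-1, 4}, {-2, 4}, {1}, {2}, {-3}]
print(pure_literal_elimination(clauses))                  # S (= 4) is pure, both S-clauses are dropped
print(tautology_elimination([{1, -1}, {1, 2, 3, -2}]))    # both clauses are tautologies: []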


Reduction of the search space for resolution strategies

Deletion strategies (2)

Unit resolution: at least one of the clauses being resolved at every step is a
'unit clause', i.e. one containing a single literal;
the resolvent then contains one disjunct less than the non-unit parent clause.

Unit resolution is not refutation complete:
counterexample {p ∨ q, ¬p ∨ q, p ∨ ¬q, ¬p ∨ ¬q}

! A Horn clause is a clause with at most one positive literal.
There is a unit-resolution refutation of a set of Horn clauses if and only if
that set is unsatisfiable.

Linear resolution: at least one of the clauses being resolved at every step
is either in the initial database or is an ancestor of the other clause,
e.g. {p ∨ q, ¬p} |= q and then {q, ¬q} |= 0.
Linear resolution is refutation complete.

Input resolution: at least one of the clauses being resolved at every step
is a member of the initial (i.e., input) database.
Input refutation is complete for Horn clauses but incomplete in general.
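A toy propositional version in Python (literals as signed integers; an illustration, not from the slides) that makes the completeness remark concrete: the Horn set below is refuted, while the non-Horn counterexample above contains no unit clause at all, so unit resolution cannot even start.

def resolvents(c1, c2):
    # All resolvents of two propositional clauses given as frozensets of signed ints.
    return [frozenset((c1 - {lit}) | (c2 - {-lit})) for lit in c1 if -lit in c2]

def unit_refutable(clauses):
    # Saturate under unit resolution only (one parent must be a unit clause);
    # returns True iff the empty clause is derived.
    clauses = {frozenset(c) for c in clauses}
    while True:
        units = [c for c in clauses if len(c) == 1]
        new = {r for u in units for c in clauses for r in resolvents(u, c)} - clauses
        if frozenset() in new:
            return True
        if not new:
            return False
        clauses |= new

# Horn set {¬p ∨ ¬q ∨ r, p, q, ¬r}: unsatisfiable, and unit resolution finds the refutation.
print(unit_refutable([{-1, -2, 3}, {1}, {2}, {-3}]))         # True
# The counterexample {p ∨ q, ¬p ∨ q, p ∨ ¬q, ¬p ∨ ¬q}: unsatisfiable, but no unit clause.
print(unit_refutable([{1, 2}, {-1, 2}, {1, -2}, {-1, -2}]))  # False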


Reduction of the search space for resolution strategies

Ordering & reduction strategies (1): restrict derivations to a certain extent

Background: from α → β and β → δ infer α → δ;
¬α ↔ (α → 0),  α ↔ (1 → α)

Ordering of clauses: complete;
an extension that is also complete: linear resolution with ordered clauses

Ordered resolution: each clause is treated as a linearly ordered set.
Resolution is permitted only on the first literal of each clause.
Literals in the resolvent preserve the parent clauses' order - the 'positive parent'
clause's literals followed by the 'negative parent' clause's literals.

Refutation by ordered resolution is complete for Horn clauses but
incomplete in general (just like unit resolution and input resolution).


Reduction of the search space for resolution strategies

Ordering & reduction strategies (2): restrict derivations to a certain extent

Directed resolution: use of ordered resolution on a database of
'direction' clauses - i.e., Horn clauses with the positive literal (if any) either
at the beginning or at the end of the clause.

¬α1 ∨ . . . ∨ ¬αn ∨ β  ←→  α1, . . . , αn → β           (forward)
β ∨ ¬α1 ∨ . . . ∨ ¬αn  ←→  β ← α1, . . . , αn           (backward)
(¬α1 ∨ . . . ∨ ¬αn)  ←→  (α1, . . . , αn →)  ←→  (← α1, . . . , αn)

Example (forward-directed resolution):
1. ¬M(x) ∨ P(x)                    M(x) → P(x)
2. M(a)                            M(a)
3. ¬P(z)                           P(z) →
4. P(a)      from 1. + 2.          P(a)      (from M(a) and M(x) → P(x))
5. 0         from 3. + 4.

The same problem in backward-directed resolution:
1. P(x) ∨ ¬M(x)                    P(x) ← M(x)
2. M(a)                            M(a)
3. ¬P(z)                           ← P(z)
4. ¬M(z)     from 1. + 3.          ← M(z)    (from ← P(z) and P(x) ← M(x))
5. 0         from 2. + 4.          (from ← M(z) and M(a))


Reduction of the search space for resolution strategies

Ordering & reduction strategies (3): restrict derivations to a certain extent

Lock resolution: an arbitrary number (index) is associated with each literal
in a clause; any application of resolution must be upon the literals of
least index in each clause; literals in resolvents inherit their indices
from the parents; among duplicates, the occurrence with the higher index is removed.

Example: {p ∨ q, ¬p ∨ ¬q, ¬p ∨ q, ¬q ∨ p}
⇒ indexed clauses {1:p ∨ 2:q, 2:¬p ∨ 1:¬q, 1:¬p ∨ 2:q, 1:¬q ∨ 2:p}

1. 1:p ∨ 2:q
2. 2:¬p ∨ 1:¬q
3. 1:¬p ∨ 2:q
4. 1:¬q ∨ 2:p
5. 2:q       from 1. + 3.   (but 1. + 4. is not allowed)
Now there are two cuts possible: 2. + 5. and 4. + 5. (and 2. + 4.):
6. 2:¬p      from 2. + 5.
7. 2:p       from 4. + 5.
8. 0         from 6. + 7.

Lock resolution is refutation complete.


Reduction of the search space for resolution strategies

Semantical concepts (1)

Idea: divide and conquer;
divide the set of clauses into a 'positive' and a 'negative' part so
that resolution is only allowed between opposite parts.

Semantical clash: instead of deriving formula after formula sequentially,
we can derive the result in one step: {p, q, ¬p ∨ ¬q ∨ α} |= α

Semantical clash + ordered resolution = hyperresolution


Reduction of the search space for resolution strategies

Semantical concepts (2)

Set-of-support resolution: a subset M of a set N is called a set of
support for N if N \ M is satisfiable;
at least one of the clauses being resolved at every step is selected
from the set of support M.

{p, ¬p ∨ q, ¬p ∨ ¬q ∨ r, ¬r}: the first three clauses are satisfiable → M = {¬r}
Resolution:
from (3) and (4): ¬p ∨ ¬q   (5), added to M
from (2) and (5): ¬p        (6), added to M
from (1) and (6): 0

Set-of-support resolution is refutation complete.

Often M is chosen to be the clauses derived from a negated goal (if the initial
database is satisfiable).
This can be seen as working backwards from the goal.


Reduction of the search space for resolution strategies

Connection methods (1)

Connection-graph procedure: represent a set of clauses as a graph
(nodes are clauses; an edge joins two clauses whose literals can be resolved
after unification; edges are labelled with the most general unifier)

(1) delete all edges whose resolvents are tautologies

(2) delete all nodes that have no remaining edge

(3) choose an edge and change the graph by replacing the two nodes by their
resolvent, adjusting the other mgu labels accordingly

Given {Q(b), ¬Q(z) ∨ R(z), Q(a), ¬P(x, f(x)), ¬R(y) ∨ P(y, f(y))}
edges: Q(b) - z/b - ¬Q(z) ∨ R(z);  ¬Q(z) ∨ R(z) - z/y - ¬R(y) ∨ P(y, f(y));  ¬Q(z) ∨ R(z) - z/a - Q(a)
reduction graph for the set: {Q(b), ¬Q(z) ∨ R(z), Q(a), ¬R(x)}
reduction graph for the set: {R(b), ¬R(x)}
The clause Q(z) → R(z) has two edges to unit Q-clauses (Q(b) and Q(a)); one of these could be deleted.
In some cases a subgraph is derived together with the empty clause 0.

The connection-graph method is refutation complete.


Reduction of the search space for resolution strategies

Connection methods (2)

Connection-matrix method: rewrite the set of clauses as a matrix
L1,1  L1,2  . . .  L1,m1
L2,1  L2,2  . . .  L2,m2
. . .
Ln,1  Ln,2  . . .  Ln,mn
(one row per clause)

Then construct vertical paths through the matrix: if there are two opposite
literals on a path, then this path is complementary;
if all vertical paths are complementary, then the matrix is complementary;
α is unsatisfiable if Mα is complementary.

Extension to FOPL formulas:
a path is potentially complementary if there is at least one pair of potentially
complementary literals on it;
the matrix is refutable if each path is potentially complementary and
the mgu of at least one pair of potentially complementary literals
on each path is compatible with the substitutions computed for the other paths;
Herbrand universes are used for replacing original rows by instances of them.

There is a variant for proving tautologies that uses a similar method on
disjunctive normal forms.


The Search Problem

Parameters of the search problem

(1) direction

(2) topology

(3) node representation

(4) selecting rules

(5) heuristic functions

Direction: search forward from I to G or backward from G to I.

The same rules are used in either direction, but the way we use them changes.

Sometimes it is easier to search forward, and sometimes it is easier to search

backward.

(1) More start or goal states? We want to move towards the larger set.

(2) Branching factor? We want to move in the direction with lower branching factor.

(3) Need to interact with human and explain reasoning? One direction may be more

natural than the other.

Bi-directional search is also possible....


The Search Problem

Parameters of the search problem

Topology: graphs versus trees

Representation of nodes: 1. How are the objects and facts represented?

(ontology)

2. How is a complete state represented?

3. How are sequences of states represented?

Frame problem: when moving between states, which facts change, and which
facts do not change?

Selecting rules: 1. Efficient hashing (indexing) schemes

2. Matching with variables

3. Building “too much” into the rules, so matches are automatic

4. Resolving conflicts (e.g., choose most specific rule)

Heuristic functions: assign to each state a "measure of desirability"; they help
guide the search, but are not used in every search technique.


∧/∨ Graphs

OR-tree: a tree whose nodes represent the objects (states) and whose branches
are interpreted as OR-choices

Example: traveling salesman

Representation by a tree through full unfolding

∧/∨ graphs: for a better representation of some problems;
a solution to an AND/OR graph is a subgraph all of whose leaf
nodes are in the goal set;
can be used to represent grammars (CF-grammar productions); the resulting
'parse tree' is an AND/OR graph

# Problem: lack of independence. In AND/OR graphs, individual paths from
node to node cannot be considered in isolation from other paths through other
nodes (connected to the first by an AND arc).
Longer paths may be better (in OR graphs this is never true).


Search techniques: Depth-first search

Dive into the search tree as far as you can, backing up only when you
reach the end of the tree

see OR-tree

(1) Form a one-element queue consisting of the root node

(2) Until queue is empty or goal has been reached, determine if first

element in queue is the goal

• If it is, stop.

• If not, remove it and add its children (if any)

to the front of the queue

(3) If goal is found, announce success; if not, failure.

Depth-first search can be very inefficient, e.g., goal = second child of the root

O(b^d) time complexity and O(d) space complexity
for search spaces represented by trees of depth d and branching factor b;
the bidirectional variant still needs O(b^(d/2)) time
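A direct transcription of the queue-based formulation into Python (a sketch; the example tree, children function and goal test are made-up illustrations):

def depth_first_search(root, children, is_goal):
    # Children are added to the FRONT of the queue, as in step (2) above.
    queue = [[root]]                                   # (1) one-element queue with the root
    while queue:                                       # (2) until queue empty or goal reached
        path = queue.pop(0)
        if is_goal(path[-1]):
            return path                                # success
        queue = [path + [c] for c in children(path[-1])] + queue
    return None                                        # (3) failure

tree = {'A': ['B', 'C'], 'B': ['D', 'E'], 'C': ['F', 'G']}
print(depth_first_search('A', lambda n: tree.get(n, []), lambda n: n == 'G'))
# explores A, B, D, E before C - and returns ['A', 'C', 'G']

Adding the children to the back of the queue instead turns the same skeleton into breadth-first search (see below), and sorting them by a heuristic before adding them to the front gives hill climbing.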


Search techniques: Hill climbing

Using a heuristic function, make the next best local move

(OR-tree above: any function or an 'enriched' distance function; below: an arbitrary function)

! A heuristic is a rule of thumb or judgemental technique that leads to a solution
some of the time but provides no guarantee of success; it may in fact end in failure.
Heuristics nevertheless play an important role in search strategies because of the exponential
nature of most problems; they help to reduce the number of alternatives from an
exponential number to a polynomial number and, thereby, obtain a solution in a
tolerable amount of time.

Example: traveling salesman problem - 'take the nearest unvisited neighbor' requires only
O(n^2) time.

Heuristic search functions - informed search


Search techniques: Hill climbing

(1) Form a one-element queue consisting of the root node

(2) Until queue is empty or goal has been reached, determine if first

element in queue is the goal

• (a) If it is, stop.

• (b) If not, remove it, sort its children (if any) by estimated
remaining distance, and add them to the front of the queue.

(3) If goal is found, announce success; if not, failure.

Problems with hill climbing:

• Foothills (local maxima or peaks): the children have less promising goal distances
than the parent node; no indication of the goal direction

• Plateaus (flat areas): all neighboring nodes have the same value

• Ridges (e.g. snowflake trees): several adjoining nodes have higher values than
the surrounding nodes
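The same queue skeleton as for depth-first search, with the children sorted by the heuristic before they go to the front of the queue (a sketch; the demo search space and heuristic are made up):

def hill_climbing(root, children, is_goal, h):
    # Like depth-first search, but the children of the expanded node are sorted by the
    # heuristic h (estimated remaining distance) before being put at the front of the queue.
    queue = [[root]]
    while queue:
        path = queue.pop(0)
        if is_goal(path[-1]):
            return path
        kids = sorted(children(path[-1]), key=h)       # best-looking child first
        queue = [path + [c] for c in kids] + queue
    return None

# Hypothetical demo: states are numbers, the goal is 7, h = distance to 7.
succ = lambda n: [2 * n, 2 * n + 1] if n < 7 else []
print(hill_climbing(1, succ, lambda n: n == 7, h=lambda n: abs(7 - n)))   # [1, 3, 7]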


Search techniques: Breadth-first search

(1) Form a one-element queue consisting of the root node

(2) Until queue is empty or goal has been reached, determine if first

element in queue is the goal

• (a) If it is, stop.

• (b) If not, remove it, and add its children

to the back of the queue

(3) If goal is found, announce success; if not, failure.

Breadth-first search can be expensive (in time and space).

O(b^d) time and space complexity for search spaces represented
by trees of depth d and branching factor b


Search techniques: Beam search

Like breadth-first search, but use a heuristic function.

Proceed level by level, but only expand best k nodes at each level,

ignoring the rest.
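A compact sketch (Python; the level-by-level structure mirrors breadth-first search, only the k heuristically best nodes of each level are kept; the demo values are made up):

import heapq

def beam_search(root, children, is_goal, h, k):
    # Proceed level by level; at each level keep only the k paths whose end nodes
    # look best according to the heuristic h, ignoring the rest.
    level = [[root]]
    while level:
        for path in level:
            if is_goal(path[-1]):
                return path
        successors = [path + [c] for path in level for c in children(path[-1])]
        level = heapq.nsmallest(k, successors, key=lambda p: h(p[-1]))
    return None

succ = lambda n: [2 * n, 2 * n + 1] if n < 7 else []
print(beam_search(1, succ, lambda n: n == 7, h=lambda n: abs(7 - n), k=1))   # [1, 3, 7]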


Search techniques: Best-first search

Press forward from the best unexpanded node so far, regardless of
where it is in the tree.

(1) Form a one-element queue consisting of the root node

(2) Until queue is empty or goal has been reached, determine if first

element in queue is the goal

• (a) If it is, stop.

• (b) If not, remove it and add its children (if any) to the queue.

Sort the entire queue by estimated remaining distance.

(3) If goal is found, announce success; if not, failure.

Problem: depends heavily on the heuristic function

Best-first searches will always find good paths to a goal, even when

local anomalies are encountered. All that is required is that a good

measure of goal distance be used.
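A sketch with a priority queue instead of re-sorting the whole queue after every expansion (Python; the example search space and heuristic are made up):

import heapq
from itertools import count

def best_first_search(root, children, is_goal, h):
    # Always expand the unexpanded node with the smallest estimated remaining
    # distance h, wherever it sits in the tree.
    tie = count()                                   # tie-breaker so paths never get compared
    queue = [(h(root), next(tie), [root])]
    while queue:
        _, _, path = heapq.heappop(queue)
        if is_goal(path[-1]):
            return path
        for c in children(path[-1]):
            heapq.heappush(queue, (h(c), next(tie), path + [c]))
    return None

succ = lambda n: [2 * n, 2 * n + 1] if n < 7 else []
print(best_first_search(1, succ, lambda n: n == 7, h=lambda n: abs(7 - n)))   # [1, 3, 7]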


Some informal criteria for search techniques

Depth-first: when blind alleys aren't too deep

Breadth-first: when the branching factor isn't too big

Hill-climbing: when a good heuristic function exists, and local choices
lead to the final goal

Beam search: when a good heuristic function exists, and good-looking
partial paths at each level lead to a goal

Best-first: when a good heuristic function exists, and a good path
may look bad at shallow levels

Some search methods give us an optimal path to a goal, others give
us just some path.

"British Museum" procedure: exhaustive search


Optimal search techniques: Branch and bound

Trick: paying attention to path length

always expand the shortest path so far

no heuristic is used, just the actual elapsed distance

still optimal

A more interesting heuristic function:
g - distance traveled so far
h' - estimated distance remaining
new heuristic function: f' = g + h'
Bad estimates of the remaining distance can cause extra work;
overestimates can cause us to fail to find an optimal path,
but with an underestimating h' we are guaranteed to find an optimal path,
because, with underestimates, the actual distance from any unexpanded node
to the goal will be at least what the heuristic function guesses.

The heuristic function must be developed from prior knowledge of the domain,
not by searching to a solution.

Functions used for the 8-puzzle:

1. number of tiles in the wrong place (example value: 7)

2. sum of the 'distances' each tile is from its home position (example value: 12)

...
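The two 8-puzzle heuristics as code (Python; the goal layout and the sample configuration are assumptions for illustration, so the printed values need not match the slide's 7 and 12, which refer to its own pictured board):

GOAL = (1, 2, 3, 8, 0, 4, 7, 6, 5)      # assumed home positions, 0 = blank

def misplaced_tiles(state):
    # h'1: number of tiles in the wrong place (the blank is not counted).
    return sum(1 for tile, home in zip(state, GOAL) if tile != 0 and tile != home)

def manhattan_distance(state):
    # h'2: sum of the horizontal + vertical distances of each tile from its home position.
    home = {tile: divmod(i, 3) for i, tile in enumerate(GOAL)}
    return sum(abs(r - home[tile][0]) + abs(c - home[tile][1])
               for i, tile in enumerate(state) if tile != 0
               for r, c in [divmod(i, 3)])

state = (2, 8, 3, 1, 6, 4, 7, 0, 5)
print(misplaced_tiles(state), manhattan_distance(state))   # 4 5 for this sample board

Both functions underestimate the true number of moves, so branch and bound with f' = g + h' (the A* algorithm below) stays optimal with either of them.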


Optimal search techniques: Dynamic Programming

In graph searches, there are sometimes redundant paths.

Discarding redundant paths (dynamic programming) can improve

efficiency ....

When looking for an optimal path from S to G, all paths from S

to I , other than the minimum length path from S to I, can be ignored.

Branch and bound + dynamic programming: h′ = 0

Expand the shortest path so far, but discard redundant paths that are

too long.


Optimal search techniques: A∗ algorithm

branch and bound + estimate of remaining distance + dynamic programming

(1) Form a queue of partial paths, initially consisting of the zero length,

zero-step path from the root node to nowhere

(2) Until the queue is empty or the goal has been reached, determine
if the first path reaches the goal.

• (a) If so, stop.

• (b)If not, then:

(i) remove the queue’s first path

(ii) Form new paths by extending it one step

(iii) Add the new paths to the queue

(iv) Sort the queue by f' = g + h' with the least-cost path in front

(v) If 2 or more paths reach a common node, delete all those paths except

for one that reaches the common node with minimum cost

(3) If goal is found, success; if not, failure.

Termination condition (to guarantee that the path it finds is optimal):

Even after a goal is found (at cost c) expand unexplored paths until

each of their g + h′ values is greater than or equal to c
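A sketch of the whole procedure in Python (the weighted graph and the heuristic values are made-up illustrations; with an underestimating, consistent h' the first goal taken from the queue already satisfies the termination condition above):

import heapq
from itertools import count

def a_star(start, children, is_goal, h):
    # Branch and bound with an estimate of the remaining distance plus dynamic
    # programming: only the cheapest known path to each node is kept.
    # children(n) yields (successor, step_cost) pairs.
    tie = count()
    queue = [(h(start), next(tie), 0, [start])]         # entries (f' = g + h', tie, g, path)
    best_g = {start: 0}
    while queue:
        f, _, g, path = heapq.heappop(queue)
        node = path[-1]
        if is_goal(node):
            return path, g
        for succ, cost in children(node):
            new_g = g + cost
            if new_g < best_g.get(succ, float('inf')):  # discard redundant, longer paths
                best_g[succ] = new_g
                heapq.heappush(queue, (new_g + h(succ), next(tie), new_g, path + [succ]))
    return None

graph = {'S': [('A', 1), ('B', 4)], 'A': [('B', 2), ('G', 6)], 'B': [('G', 1)], 'G': []}
h = {'S': 4, 'A': 3, 'B': 1, 'G': 0}
print(a_star('S', lambda n: graph[n], lambda n: n == 'G', lambda n: h[n]))
# (['S', 'A', 'B', 'G'], 4) - the optimal path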


Optimal search techniques: A∗ algorithm

Which optimal technique to use when:

small search tree                                    → "British Museum" algorithm
bad paths turn distinctly bad quickly                → branch and bound
good lower-bound estimate of the remaining distance  → branch and bound with a guess (A∗ algorithm)
many paths converge on the same place                → dynamic programming (A∗ algorithm)
large search tree                                    → A∗ algorithm

Applications of the A∗ algorithm

Other applications:

(A) robot path planning (collision-free path)
1. redescribe the problem in another, simpler representation
2. build a fence for one point of the robot (configuration-space transformation)
3. if the fences are not dense, then there exists a path
4. point representation (full graph; edge (e, e') if directly adjacent - visibility graph)
5. guess function: direct distance to the final point
6. A*

(B) configuration-space generalizations
the moving object may rotate; several configuration spaces, other movement trajectories


Optimal search techniques: AO∗ algorithm

branch and bound + estimate of remaining distance + dynamic programming

(1) Traverse the graph (from the initial node) following the best current

path.

(2) Pick one of the unexpanded nodes on that path and expand it. Add

its successors to the graph and compute f’ for each of them (using

only h’ , not g).

(3) Change the expanded node's f' value to reflect its successors. Propagate
the change up the graph. Reconsider the current best path.


Optimal search techniques: Trees and adversarial search

How to play board games (checkers & chess): one choice produces another tree of choices,

but here now: interdigitated choices of two adversaries

new issue: competition!

min-max search with alpha-beta pruning: reduce the search by stopping
work on guaranteed losers, or by progressive deepening or heuristic deepening

nodes represent board positions;
a play of the game is a path through the game tree; level number of the tree = depth + 1 (root: depth 1)

move: one player's choice plus the other player's reaction, or a single choice;
e.g. Deep Blue searches ≈ 14 moves ahead, Kasparov ≈ 7 moves

Exhaustive search is impossible: e.g. branching factor 16 and depth 100 (chess)
give 16^100 nodes - more than the number of picoseconds since the big bang (if any).

restricted exhaustive search: local search


Min-Max Procedure

e.g. games: node evaluation  . . . −3 −2 −1 0 1 2 . . .
negative numbers favor the second player (the minimizer), positive numbers favor the first (the maximizer)

node evaluation: static, with an evaluation score

Game tree: a semantic tree
nodes denote board configurations
branches denote moves
written at a node: evaluation score + board configuration (description table)
read from a node: whose move is next (minimizer or maximizer), the board configuration's description

Alpha-beta principle:

if you have an idea that is surely bad, do not take time to see

how truly awful it is


Procedure Min-Max Search

Min-Max Search:
1. search limit reached? # report the (static) result
2. if level = minimizing # apply min to the children's values
3. if level = maximizing # apply max to the children's values

Example (leaf values, combined bottom-up):
leaves 2, 4: maximizing = 4;   leaves 2, 1: maximizing = 2;   minimizing over (4, 2) = 2
leaves 1, 8: maximizing = 8;   leaves 2, 7: maximizing = 7;   minimizing over (8, 7) = 7
root: maximizing over (2, 7) = 7
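The same procedure as a recursive Python function (a sketch; the tree below encodes exactly the example above):

def minimax(node, maximizing, depth, evaluate, children):
    # 1. search limit reached or leaf: report the static evaluation
    # 2./3. otherwise apply min or max to the values of the children.
    succ = children(node)
    if depth == 0 or not succ:
        return evaluate(node)
    values = [minimax(c, not maximizing, depth - 1, evaluate, children) for c in succ]
    return max(values) if maximizing else min(values)

tree = {'root': ['L', 'R'], 'L': ['L1', 'L2'], 'R': ['R1', 'R2'],
        'L1': [2, 4], 'L2': [2, 1], 'R1': [1, 8], 'R2': [2, 7]}
kids = lambda n: tree.get(n, []) if isinstance(n, str) else []
print(minimax('root', True, 3, evaluate=lambda n: n, children=kids))   # 7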


Procedure Alpha-beta Principle

• root: (α, β) := (−∞, ∞)

• repeat

• if minimizing level then
    begin repeat
      get next child
      if β(next child) < β then β := β(next child)
    until all children examined or α ≥ β
    report β
    end

• if maximizing level then
    begin repeat
      get next child
      if α(next child) > α then α := α(next child)
    until all children examined or α ≥ β
    report α
    end

• until search limit reached or α(root) = β(root)
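The same idea as a recursive sketch (Python; the example tree is the one from the min-max example, and with that particular child ordering no cut-off fires, but the mechanism is visible in the alpha >= beta tests):

import math

def alphabeta(node, maximizing, depth, evaluate, children, alpha=-math.inf, beta=math.inf):
    # Min-max search that abandons a subtree as soon as alpha >= beta,
    # i.e. as soon as the subtree can no longer influence the value at the root.
    succ = children(node)
    if depth == 0 or not succ:
        return evaluate(node)
    if maximizing:
        value = -math.inf
        for child in succ:
            value = max(value, alphabeta(child, False, depth - 1, evaluate, children, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:                 # cut-off: the remaining children cannot matter
                break
        return value
    value = math.inf
    for child in succ:
        value = min(value, alphabeta(child, True, depth - 1, evaluate, children, alpha, beta))
        beta = min(beta, value)
        if alpha >= beta:
            break
    return value

tree = {'root': ['L', 'R'], 'L': ['L1', 'L2'], 'R': ['R1', 'R2'],
        'L1': [2, 4], 'L2': [2, 1], 'R1': [1, 8], 'R2': [2, 7]}
kids = lambda n: tree.get(n, []) if isinstance(n, str) else []
print(alphabeta('root', True, 3, lambda n: n, kids))   # 7, the same value as plain min-max

How much alpha-beta can save in the best case is quantified by the node counts given below.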


Alpha-beta Principle

(α, β) pruning is better, but does not prevent exponential growth:

s = 2·b^(d/2) − 1                      if d is even
s = b^((d+1)/2) + b^((d−1)/2) − 1      if d is odd

(s - number of static evaluations, b - branching factor, d - depth)

But it does much better than plain exponential growth; with branching factor 10:

depth    1     2     3     4     5      6      7
usual    10^1  10^2  10^3  10^4  10^5   10^6   10^7
(α, β)   10    19    109   199   1099   1999   10999
         (e.g. for depth 3: 10^2 + 10 − 1 = 109)

Homework: tic-tac-toe using symmetries

MAX plays crosses, MIN plays circles;  e(p) = ∞ for a win of MAX
e(p) = #(complete rows, columns, diagonals still open for MAX)
     − #(complete rows, columns, diagonals still open for MIN)

(board examples on the slide: one position with e(p) = 6 − 4 = 2, another with e(p) = 5 − 4 = 1)


Escaping Local Optima

So far: either optimal solutions, but expensive -
or inexpensive, local methods that get stuck in local optima.

2 ideas (! besides iterated hill-climbing):

an additional parameter (temperature) changing the probability of moving from one point
of the search space to another → simulated annealing

a memory which forces the algorithm to explore new areas of the search space → tabu search

Analogical reasoning:

physical system      optimization/search problem
state                feasible solution
energy               evaluation function
ground state         optimal solution
rapid quenching      local search
temperature          control parameter
careful annealing    simulated annealing


From Local Search to Simulated Annealing

procedure local search
begin
  x = some initial starting point in S
  while improve(x) ≠ 'no' do          (improve(x) returns a y ∈ N(x) that is better than x, if one exists)
    x = improve(x)
  return (x)
end

procedure simulated annealing          (T - temperature)
begin
  x = some initial starting point in S
  while not termination-condition do
    x = improve?(x, T)                 (might return a better point in N(x))
    update(T)
  return (x)
end


Simulated Annealing

Problem-specific questions:
What is a solution?
What are the neighbors of a solution?
What is the cost of a solution?
How do we determine the initial solution?
(neighborhood, evaluation function, initial starting point)

Additional questions:
How do we determine the initial "temperature" T?
How do we determine the cooling ratio g(T, t)?
How do we determine the termination condition?
How do we determine the halting criterion?

Tmax - 'boiling', Tmin - 'freezing'
T := Tmax
select vc at random
repeat
  pick a point vn at random from the neighborhood of vc
  if eval(vn) is better than eval(vc)
    then accept it: vc := vn
    else accept it with probability e^((eval(vn) − eval(vc)) / T)
  T := r · T      (r - cooling ratio)
  if T ≥ Tmin then repeat this step k_T times
until halting condition
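A runnable sketch of this scheme (Python, written for minimization, so the acceptance probability becomes e^(−Δ/T); Tmax, Tmin, the cooling ratio r, the repetition count k and the demo function are all assumed values):

import math, random

def simulated_annealing(initial, neighbor, evaluate, t_max=10.0, t_min=0.01, r=0.9, k=20):
    vc, best, t = initial, initial, t_max
    while t >= t_min:                               # cooling schedule
        for _ in range(k):                          # k repetitions per temperature
            vn = neighbor(vc)                       # random point from the neighborhood of vc
            delta = evaluate(vn) - evaluate(vc)
            if delta < 0 or random.random() < math.exp(-delta / t):
                vc = vn                             # accept improvements always, worse points
            if evaluate(vc) < evaluate(best):       # with probability e^(-delta/T)
                best = vc
        t = r * t                                   # T := r * T
    return best, evaluate(best)

f = lambda x: (x - 3) ** 2 + 3 * math.sin(2 * x)    # bumpy demo function with local minima
step = lambda x: x + random.choice([-1, 1])
print(simulated_annealing(0, step, f))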


From Local Search to Tabu Search

procedure local search
begin
  x = some initial starting point in S
  while improve(x) ≠ 'no' do          (improve(x) returns a y ∈ N(x) that is better than x, if one exists)
    x = improve(x)
  return (x)
end

procedure tabu search                  (H - history)
begin
  x = some initial starting point in S
  while not termination-condition do
    x = improve?(x, H)                 (returns an accepted solution y ∈ N(x);
                                        acceptance is based on the search history)
    update(H)
  return (x)
end


Example Tabu Search: SAT

Given: a formula F(x1, . . . , x8); search for a truth assignment for which F evaluates to true (T)

e.g. x = (0, 1, 1, 1, 0, 0, 0, 1) as starting point

eval(x) - weighted sum of the satisfied clauses

weight: e.g. the number of variables in the clause

N(x): all neighbors of x obtained by flipping one bit (here 8 of them); which is the best: x = best over y ∈ N(x)?

memory M = (χ1, . . . , χ8) - for how many more steps flipping xi is not allowed

e.g. after flipping bit 3: M = (0, 0, 5, 0, 0, 0, 0, 0)

next selection step: improvement only over the allowed slots

e.g. ⇒ new M = (3, 0, 1, 5, 0, 4, 2, 0)

available for flipping: x2, x5, x8

x = (0, 1, 1, 1, 0, 0, 0, 1) ⇒ current point x′ = (1, 1, 0, 0, 0, 1, 1, 1) with eval = 33

neighbors of x′ reachable by an allowed flip: 10000111, 11001111, 11000110

all of them are evaluated:

1) if there is an outstanding one, then forget about the principles,

or 2) take the best of the allowed ones

e.g. flip x5 ⇒ eval = 32 ⇒ M = (2, 0, 0, 4, 5, 3, 1, 0)

other overriding rules: aspiration criteria

other uses of the memory: M(i) = j,

e.g. during the last h iterations of the algorithm the i-th bit was changed j times

searching into under-represented directions

adding penalties for bad choices

with long-term memory, e.g. eval(x) − ratio × penalty(x)
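A compact sketch of this scheme in Python (clauses as sets of signed integers; the tabu tenure, the unweighted evaluation function and the tiny demo formula are simplifying assumptions - the slide weights clauses, e.g. by their length):

import random

def tabu_search_sat(clauses, n_vars, tenure=5, max_iters=1000):
    def satisfied(x, clause):
        return any((lit > 0) == x[abs(lit) - 1] for lit in clause)

    def eval_(x):
        return sum(1 for c in clauses if satisfied(x, c))    # number of satisfied clauses

    x = [random.random() < 0.5 for _ in range(n_vars)]
    memory = [0] * n_vars                                    # remaining tabu time per bit
    best, best_val = x[:], eval_(x)
    for _ in range(max_iters):
        if best_val == len(clauses):
            break                                            # all clauses satisfied
        flips = []
        for i in range(n_vars):
            y = x[:]
            y[i] = not y[i]
            flips.append((eval_(y), i, y))
        # allowed = non-tabu flips, plus any flip that beats the best so far (aspiration)
        allowed = [f for f in flips if memory[f[1]] == 0 or f[0] > best_val]
        val, i, x = max(allowed or flips)                    # best allowed flip (fallback: best flip)
        memory = [max(0, m - 1) for m in memory]
        memory[i] = tenure                                   # flipping bit i is now tabu
        if val > best_val:
            best, best_val = x[:], val
    return best, best_val

# Hypothetical tiny instance: (x1 ∨ ¬x2) ∧ (x2 ∨ x3) ∧ (¬x1 ∨ ¬x3)
print(tabu_search_sat([{1, -2}, {2, 3}, {-1, -3}], n_vars=3))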


Evolutionary approaches

also called: genetic algorithms, evolutionary programming

so far:

• deterministic or probabilistic

• a single "current best" solution

idea:

• working with a population of solutions, and

• competition among the solutions

example: a population of foxes and rabbits;
fast/slow and intelligent/stupid rabbits inherit their properties;
stupid or slow rabbits are eaten; offspring are random permutations of the parents;
the same holds for the foxes

example: ants' paths

SAT problem (F(x1, . . . , x100): search for x with F(x) = TRUE)
initial population: p1, . . . , p20, none of them a solution
step: evaluate p1, . . . , p20, e.g. by the number of clauses that evaluate to TRUE

fitness for pairing & surviving: weights = roulette-wheel slices for the individuals

20 random selections (1, j1), . . . , (20, j20);
if (i, ji) with ji = i, then pi survives unchanged;
if (i, ji) with ji ≠ i, then pi will not pass unchanged into the next
generation: with probability t, e.g. 0.9, it is broken apart (cut & splice)

check p′1, . . . , p′20: if none of them is a solution, then repeat the step


Designing evolutionary algorithms

improvement: if the population becomes too uniform, then flip a single bit at random

applications: classification, testing

Crossover with several cut points (see the sketch after this list):

• several cut points, e.g. 2: P1 = (1, 3, |5, 6, 2|, 4, 7, 8) with a segment of length 3

• offspring:

(1) take the part of P1 between the cut points

(2) use P2 = (3, 6, 2, 1, 5, 4, 8, 7) for the rest, as far as its elements are not already taken

i.e. use (|5, 6, 2|), reduce P2 by (5, 6, 2): P′2 = (3, 1, 4, 8, 7);
offspring: (5, 6, 2, 3, 1, 4, 8, 7)

• permutations:

(1) choose cut points, e.g. 3 of them in P2: P2 = (3, |6|, 2, |1|, 5, 4, |8|, 7)

(2) the genes at the cut points: (6, 1, 8)

(3) take the rest from P1 into the offspring, as far as not already taken

(4) offspring: (6, 1, 8, 3, 5, 2, 4, 7)
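Both offspring constructions from above as small functions (Python; cut positions are 0-based indices, chosen here to reproduce the slide's numbers):

def cut_point_crossover(p1, p2, cut1, cut2):
    # First variant: copy the segment of P1 between the cut points, then append
    # the remaining genes in the order in which they occur in P2.
    segment = p1[cut1:cut2]
    rest = [g for g in p2 if g not in segment]
    return segment + rest

def position_crossover(p1, p2, positions):
    # Second variant: take the genes of P2 at the chosen cut positions, then
    # fill up with the remaining genes in the order of P1.
    chosen = [p2[i] for i in positions]
    return chosen + [g for g in p1 if g not in chosen]

P1 = [1, 3, 5, 6, 2, 4, 7, 8]
P2 = [3, 6, 2, 1, 5, 4, 8, 7]
print(cut_point_crossover(P1, P2, 2, 5))      # [5, 6, 2, 3, 1, 4, 8, 7]
print(position_crossover(P1, P2, [1, 3, 6]))  # [6, 1, 8, 3, 5, 2, 4, 7]

If the parents are permutations, both constructions yield a permutation again, which is why such operators are used instead of naive bit-string crossover for ordering problems such as the traveling salesman.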


Application of evolutionary algorithms: Moving a robot

Directions for moves: n, ne, e, se, s, sw, w, nw

Additionally: grids, walls

Robots which can follow different directions

Try to evolve a wall-following robot

Example

(IF (AND (OR(north)(north east)) (NOT(east))) (east)

(IF (AND (OR(east)(southeast)) (NOT(south))) (south)

(IF (AND (OR(south)(southwest)) (NOT(west))) (west)

(north))))

(Figure: the same program drawn as an expression tree - three nested IF nodes whose conditions are AND / OR / NOT combinations of the direction sensors and whose branches are the move actions east, south, west and north.)


Development of evolutionary algorithms

(1) Start with a population of random programs, e.g. 5000.
Evaluation function: x runs - the number of cells next to the wall that are visited during 60 steps;
10 runs with random starting points
⇒ fitness of the program between 0 and 320 for 32 cells along the wall

(2) generation i ↦ i + 1:

(A) 10 % of the programs are copied to the next generation;
tournament (for selection): choose 7 at random, take the fittest among those

(B) 90 % are produced by mother & father crossover;
tournament for mother selection and for father selection

(C) mutation, applied to 10 % of the 90 %:
select a single parent by tournament,
choose a subtree, delete it, and replace it by a newly grown random subtree


Example of randomly chosen crossover

(Figure: a mother program tree and a father program tree; a randomly chosen subtree of the mother is cut out and replaced by a randomly chosen subtree of the father, yielding the child program tree.)
