Declarative Programming (Prolog)

Section 14: Operators used for comparing *terms*

@< : less than
@=< : less than or equal to
@> : greater than
@>= : greater than or equal to
These compare terms in the standard order of terms and are best used on ground terms.
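A few illustrative queries (assuming nothing beyond the standard order of terms):
    ?- apple @< banana.     % true: atoms are compared alphabetically
    ?- 7 @< apple.          % true: numbers precede atoms in the standard order
    ?- foo(1) @< foo(2).    % true: same name and arity, so compared by arguments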

Section 12: SLD Resolution

A deduction strategy for determining the consequences of a logic program. To show that a goal is true, it is sufficient to show that the body of some clause whose head matches the goal is true. This creates a tree: we branch whenever several clauses match the goal. When there are several goals to resolve, we must also choose which goal to resolve first. Prolog builds this tree and searches it.

Section 12: How is a ground atomic formula a logical consequence of a program.

A ground atomic formula 'a' is a logical consequence of a program 'P' if 'P' makes 'a' true. Under the closed world assumption, ¬a is a logical consequence of P if 'a' is not a logical consequence of P (i.e. something is false if the program does not say that it is true).

Section 12: interpretation

A mapping from the symbols in the program to the world of the program.

Section 10: substitution

A mapping from variables to terms. (Substitutions only replace variables, not atomic or compound terms.)

Section 12: How to represent the meaning of the predicate

A predicate defined by a finite number of clauses has as its meaning a single implication whose condition is the disjunction of the clauses' bodies:
∀A∀B : parent(A, B) ←
    (A = queen elizabeth ∧ B = prince charles) ∨
    (A = prince philip ∧ B = prince charles) ∨
    (A = prince charles ∧ B = prince william) ∨
    (A = prince charles ∧ B = prince harry) ∨
    (A = princess diana ∧ B = prince william) ∨
    (A = princess diana ∧ B = prince harry)

Section 10: Proper List

A proper list is either the empty list or a list whose tail is a proper list. properlist/1 is a predicate that recognises a proper list:
    properlist([]).
    properlist([_|Tail]) :- properlist(Tail).

Section 9: Conjunction

A query may use multiple goals. A comma is read as "and", and the resulting query asks for all possible bindings that satisfy all of the goals, for example: parent(queen_elizabeth, X), parent(X, Y).

Section 13: Tail recursive

A recursive predicate is tail recursive if the last call in the body is a recursive call to itself.
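A minimal sketch (the predicate name and accumulator argument are illustrative); the recursive call is the last goal in the body, so the predicate is tail recursive:
    sum_list_acc([], Sum, Sum).
    sum_list_acc([X|Xs], Acc0, Sum) :-
        Acc1 is Acc0 + X,
        sum_list_acc(Xs, Acc1, Sum).   % last call: tail recursive
For example, ?- sum_list_acc([1, 2, 3], 0, Sum). gives Sum = 6.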

Section 15: A constraint problem

A set of constraints that any solution must satisfy over a set of variables that you specify. You solve (find a binding for) the set of variables so that they satisfy all the constraints. An objective function selects which of the possible solutions is preferred.

Section 9: Fact

A statement of a relation that's fully saturated. Written into the source file and loaded into Prolog, it's treated as a true statement. A collection (possibly infinite) of facts defines a relation.

Section 15: Boolean Satisfiability (SAT systems)

A subclass of finite domain problems in which all of the variables are Boolean. You have a propositional formula and must find a satisfying assignment (a binding for each of the Boolean variables).

Section 10: Unifier

A substitution that unifies two terms

Section 9: Rule

A way of defining a predicate, by saying the predicate is true if a condition is met.
    Head :- Body
Head has the form of a fact; Body has the form of a query. It is read as "Head is true if Body is true".

Section 13: Accumulating lists

Adding an accumulator that holds a list. With each recursive call, we pop the head off of the input list and use list construction to add that element to the head of the accumulator. This avoids the use of append which is linear time at each step, whereas list construction is constant time at each step.

Section 10: Term

All data structures in Prolog are called terms.
atomic term: integers, floats, atoms, etc.
compound term: what we would usually consider data structures
variable: also a kind of term, and itself considered a data structure in Prolog

Section 15: Normalized definition of a program.

All nested expressions have been replaced by let-bound temporary variables, and deconstructions are replaced by case constructs. For example:
    map :: (a -> b) -> [a] -> [b]
    map _ [] = []
    map f (x:xs) = (f x) : (map f xs)
becomes
    map f l =
      case l of
        []     -> []
        (e:es) -> let t = map f es
                      h = f e
                  in h:t

Section 10: atom

An atom is a symbol that stands for something. It must start with a lowercase letter in Prolog (e.g. queen_elizabeth).

Section 9: And is not always commutative.

And is usually commutative, but not always in Prolog, *particularly when one of your goals is a negated goal.* Prolog executes goals from left to right. Ensure all variables in a negated goal are bound before executing that goal. This means you have to put the goals that bind those variables before the negation, as in the sketch below.
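A minimal sketch using the parent/2 facts from other cards (the predicate names are illustrative):
    % Correct: parent/2 binds X before the negated goal is executed.
    child_not_of_diana(X) :-
        parent(queen_elizabeth, X),
        \+ parent(princess_diana, X).

    % Wrong order: \+ parent(princess_diana, X) runs with X unbound, so it fails
    % as soon as princess_diana has any child, and no solutions are returned.
    bad_child_not_of_diana(X) :-
        \+ parent(princess_diana, X),
        parent(queen_elizabeth, X).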

Section 13: Tail Recursion Optimisation

Applying the Last Call Optimisation technique to a tail recursive predicate. Every time the predicate calls itself, it frees its own frame and builds a new one for the new call. This saves time and a lot of space (constant stack space, essentially equivalent to running a while loop).

Section 12: SLD Resolution in Prolog

At each resolution step, we must make two decisions: 1. which goal to resolve; 2. which clause matching the selected goal to pursue. Prolog resolves all goals eventually, but always chooses the first option when there is a choice (the leftmost goal, and the textually first matching clause). This gives the programmer control over execution.

Section 15: Sudoku

Classic finite constraint satisfaction problem. Think of each square represented as r1c1, r1c2, ... (row-column pairs). There are 27 all_distinct constraints (one for each row, one for each column, and one for each grid box). If you fix a variable, you reduce the domain of each square in that variable's row, column and box. This idea of systematically reducing the domain of each variable is how Sudoku problems are solved.

Section 15: Constraint Logic Programming strategy for variable binding

Constrain and generate. More sophisticated and efficient than just binding to a Herbrand term. Prolog can mimic this strategy using attributed variables.

Section 9: Mode

Different ways of interpreting a relation: the different types of queries you can ask based on which arguments are bound (inputs, non-variable) and which are unbound (outputs, variable). For example, parent(X, prince_william), parent(prince_charles, X) and parent(X, Y) are three different modes of the same predicate.

Section 10: Compound Term

Equivalent to a "data constructor". The name of a compound term is called its functor. Example: node(leaf, 1, node(leaf, 2, leaf)). Since Prolog is dynamically typed, there is no restriction on what terms you construct: functors and their argument types are not declared in advance.

Section 12: Procedural Interpretation

Example: the grandparent clause. Procedural reading: "To show that X is a grandparent of Z, it is sufficient to show that X is a parent of Y and Y is a parent of Z." Logical reading: "For all X, Y, Z, if X is a parent of Y and Y is a parent of Z, then X is a grandparent of Z."
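The clause being described, written as Prolog:
    grandparent(X, Z) :- parent(X, Y), parent(Y, Z).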

Section 13: Accumulator

Making predicates tail recursive often requires adding another argument. This additional argument is called the accumulator.

Section 11: How to deal with unwanted nondeterminism

Option 1: include another constraint.
    fact(0, 1).
    fact(N, F) :-
        N > 0,
        N1 is N - 1,
        fact(N1, F1),
        F is N * F1.
This still leaves a choice point, which inhibits last-call optimisation.
Option 2: use if-then-else.
    fact(N, F) :-
        ( N =:= 0 ->
            F = 1
        ;   N > 0,
            N1 is N - 1,
            fact(N1, F1),
            F is F1 * N
        ).

Section 9: goals

Predicate Applications (called atoms in predicate logic)

Section 9: Variables in Prolog

Predicate arguments in a query can be variables, in which case the query asks what value(s) for the variable(s) make the statement true. Variables start with a capital letter or an underscore.

Section 14: What are all solution predicates?

Predicates that collect all of the solutions to a goal.

Section 12: Indexing

Prolog automatically makes an index for a predicate (usually only for the first argument). This means, if the first argument is bound, Prolog jumps to the first clause it matches. Then when backtracking occurs, Prolog jumps straight to the next clause that matches. It also means that Prolog can quickly identify when there are no more matching clauses, in which case it can remove the choice point (or avoid making the choice point altogether).

Section 12: Backtracking

Prolog drops a "choicepoint" at the point where multiple clauses match a goal, and then chooses one of those clauses to pursue. When Prolog fails, it backtracks to the most recent choice point, removing all variable bindings made after that point, and then begins resolution on the next matching clause. Prolog continues this process until there are no more matching clauses, at which point it removes that choice point.

Section 14: I/O being impure

Prolog executes I/O when the I/O operation has been reached. I/O is not undone on backtracking.

Section 15: The steps for solving finite domain constraints.

1. Propagation
2. Labelling

Section 15: Linear inequality constraints

Constraints over real numbers: inequalities involving constants and variables.

Section 15: Propagation

Reduce the domain of each variable as much as possible. For each constraint, check whether the constraint reduces the domain of any variable in it; if it does, remove those values from that variable's domain and re-run the constraints that involve that variable. Propagation ends when: every variable has a domain of size one, some variable has an empty domain, or there are no more constraints to consider.
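A small illustration using SWI-Prolog's clpfd library (posting a constraint propagates immediately, before any labelling):
    ?- use_module(library(clpfd)).
    ?- X in 1..9, X #> 5.
    X in 6..9.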

Section 10: applying a substitution to a term

Replacing all occurrences of each variable with the term it is mapped to.

Section 9: Clause

Rules and Facts. Predicates can be defined with any number of clauses.

Section 9: Prolog can't solve for the negative

Solving for the negative means asking for the possible values of a variable in a statement written as "not predicate", i.e. \+ parent(queen_elizabeth, X), which asks who queen_elizabeth is not a parent of. This fails because Prolog can't compute it: Prolog won't bind variables in a goal that failed. For \+ parent(queen_elizabeth, X) to succeed, Prolog looks for the case where parent(queen_elizabeth, X) does not succeed, but when parent(queen_elizabeth, X) does not succeed, X does not get bound. *Negated goals will never give you values for the variables.*

Section 11: Problems with nondeterminism

Sometimes we want clauses to be treated as mutually exclusive. For example:
    fact(0, 1).
    fact(N, F) :-
        N1 is N - 1,
        fact(N1, F1),
        F is N * F1.
This causes an infinite loop: Prolog leaves a choice point at fact(0, 1), returns one answer, and then on backtracking tries to compute fact for a negative N.

Section 14: msort/2

Sorts a list without removing duplicates. msort(List1, List2).

Section 14: keysort/2

Sorts a list of X-Y pairs based solely on the X value.

Section 14: sort/2

Sorts a list with duplicates removed. sort(List1, List2).
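For example (contrast with msort/2 above):
    ?- sort([b, a, b], L).    % L = [a, b]     (sorted, duplicates removed)
    ?- msort([b, a, b], L).   % L = [a, b, b]  (sorted, duplicates kept)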

Section 15: Constraint Logic Programming

Specify your program purely as the set of constraints on the result you want, possibly together with an objective function that specifies which of many possible solutions is preferred.

Section 14: exclude/3

Takes a Predicate, a list1 and a list2. Like include but the opposite: list2 is the list of list1 elements for which the predicate fails.

Section 14: include/3

Takes a Predicate, a list1 and a list2, where list2 contains the elements of list1 (in the same order) for which the predicate succeeds. Haskell equivalent: filter :: (a -> Bool) -> [a] -> [a]
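A small usage sketch for include/3 and exclude/3 (the helper predicate positive/1 is illustrative):
    positive(X) :- X > 0.

    ?- include(positive, [1, -2, 3], L).   % L = [1, 3]
    ?- exclude(positive, [1, -2, 3], L).   % L = [-2]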

Section 15: append/2

Takes a list of lists and flattens it to be a single list.

Section 14: maplist/2

Takes a predicate and a list. Succeeds if the predicate succeeds on every element of the list. Haskell equivalent: all :: (a -> Bool) -> [a] -> Bool

Section 14: maplist/3

Takes a predicate and list1 and list2. Succeeds if the predicate succeeds when it is applied to the ith value of the first list and the ith value of the second list (for all i). Haskell equivalent map :: (a -> b) -> [a] -> [b]

Section 14: maplist/4

Takes a predicate, list1, list2, and list3. Succeeds if, for every i, the predicate succeeds when applied to the ith element of the first list, the ith element of the second list and the ith element of the third list. Haskell equivalent: zipWith :: (a -> b -> c) -> [a] -> [b] -> [c]
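Usage sketches (succ/2 and plus/3 are SWI-Prolog built-ins):
    ?- maplist(integer, [1, 2, 3]).           % maplist/2: true, integer/1 holds for every element
    ?- maplist(succ, [1, 2, 3], Ys).          % maplist/3: Ys = [2, 3, 4]
    ?- maplist(plus, [1, 2], [10, 20], Zs).   % maplist/4: Zs = [11, 22]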

Section 12: Clark completion of a program

Taking the meaning of the predicate and applying the closed world assumption. To do this, we only need to change the implication into a bi-implication:
∀A∀B : parent(A, B) ↔
    (A = queen elizabeth ∧ B = prince charles) ∨
    (A = prince philip ∧ B = prince charles) ∨
    (A = prince charles ∧ B = prince william) ∨
    (A = prince charles ∧ B = prince harry) ∨
    (A = princess diana ∧ B = prince william) ∨
    (A = princess diana ∧ B = prince harry)
This means that A is not a parent of B unless (A, B) is one of the listed cases.

Section 10: Unification

Taking two terms and binding variables so that the two terms become identical. Term u is an instance of term t if there exists a substitution 𝜃 such that u = t𝜃. We say that 𝜃 unifies the two terms if t𝜃 = u𝜃.

Section 9: Closed World Assumption

The assumption that all true things can be derived from the program. This is what Prolog assumes, and it's often not true, so you have to use negation with care when dealing with incomplete predicates: something might be true but simply not written into the predicate (and asking for the negation is really just checking whether the goal is provable).

Section 13: Difference pairs

The method of defining predicates that produce a list with an accumulator. Provide two arguments: the list produced and what comes after it.
    generate_whole(X, L, L0) :-
        generate_first_part(X, L, L1),
        generate_second_part(X, L1, L0).
This is like generating the first part of the list and then the second part, and concatenating the two, without actually using the concatenation (append) predicate.
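A minimal sketch: flattening a binary tree (using the node/3 and leaf representation from the Compound Term card) with a difference pair; tree_list/3 is an illustrative name.
    tree_list(leaf, L, L).
    tree_list(node(Left, X, Right), L, L0) :-
        tree_list(Left, L, [X|L1]),
        tree_list(Right, L1, L0).

    % ?- tree_list(node(leaf, 1, node(leaf, 2, leaf)), List, []).
    % List = [1, 2].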

Section 9: Predicate

The name of a relation. There's no directionality, and there's not necessarily one unique answer.

Section 15: Type inference

The process of inferring the types of variables inside functions. This is an example of a Herbrand constraint problem, because it's a matter of solving a set of constraints over variable types. The approach: 1. represent the type of each variable in the program as a Herbrand term; 2. apply rules to produce equality constraints for each subexpression in the function definition.

Section 10: instance

The result of a term after you apply a substitution to it. Ground terms only have one instance whereas non-ground terms have many (infinite) instances.

Section 12: Meaning of a logic program.

The set of the program's logical consequences, as ground unit clauses (atomic formulas).

Section 9: Datalog

The subset of Prolog that doesn't require data structures: the basic machinery of facts, rules, clauses and predicates. You can form complex queries using just Datalog.

Section 13: How to make a non tail recursive predicate tail recursive: Approach three

The transformation method:
Step 1: Define the new predicate as the old predicate followed by the work that comes after the recursive call in the old predicate.
Step 2: Unfold.
Step 3: Move the last call closer to the code that comes after the recursive call, by pushing it into both arms of the conditional at the bottom.
Step 4: Simplify (in particular, combine the two calls that come after the recursive call).
Step 5: Re-associate.
Step 6: Un-combine the calls we just combined, but differently.
Step 7: Fold (replace a definition of a predicate with a call to that predicate).

Section 11: Byrd Box Model

Think of goal execution as a box with a port for each way to enter and exit. Prolog has 4 ports (2 for entering, 2 for exiting):
call: initial entry
exit: successful completion
redo: backtrack into the goal
fail: final failure

Section 15: Herbrand constraint systems

This is what Prolog can handle natively: a set of equality constraints on terms, where the basic constraint is unification.

Section 12: The immediate consequence operator

Tp takes a set of ground unit clauses C and a program P, and produces the set of everything that can be deduced from C using the program P in one inference step. For example:
    P = {q(X, Z) :- p(X, Y), p(Y, Z)}
    C = {p(a, b), p(b, c), p(c, d)}
If you know C is true, this is what you can deduce after one inference step:
    Tp(C) = {p(a, b), p(b, c), p(c, d), q(a, c), q(b, d)}

Section 12: How to interpret "predicate"

Two equivalent ways: 1. a function from all possible combinations of n terms to a truth value; 2. a set of tuples of n terms, where every tuple in the set is implicitly mapped to true and every tuple not in the set is implicitly mapped to false. A predicate definition either (1) defines the mapping or (2) defines the set of tuples.

Section 10: Unifiable

Two terms are unifiable if there exists a unifier

Section 15: Linear Inequality Constraints

A type of problem where you have limited materials and want to find the best solution given the tradeoffs. To do this, you set up the system of constraints (from the limitations in the problem) and provide an objective function that you want to maximise.

Section 14: setof with an existential quantifier

Use an existential quantifier if you want a list of solutions for a template regardless of the bindings of variables not in the template (for example, P). setof(C, P^parent(P, C), List) can be read as: get the sorted set of children C such that for each C there exists a P satisfying parent(P, C). This returns one solution, whereas without the existential quantifier it would return many solutions (one per binding of P).

Section 14: how to implement len to work in different modes

Use if-then-else, where len2 works when N is known and len1 works when the list is known:
    len(L, N) :-
        ( integer(N) ->
            len2(L, N)
        ; nonvar(N) ->
            throw(error(type_error....
        ;   len1(L, 0, N)
        ).
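One possible shape of the helpers, which the card does not define (a sketch, not necessarily the course's version):
    % len1/3: counts when the list is known, using an accumulator.
    len1([], N, N).
    len1([_|Xs], N0, N) :-
        N1 is N0 + 1,
        len1(Xs, N1, N).

    % len2/2: builds a list of the given length when N is known.
    len2([], 0).
    len2([_|Xs], N) :-
        N > 0,
        N1 is N - 1,
        len2(Xs, N1).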

Section 14: The order of classes

Variables < Numbers < Atoms < Compound terms. Within each class, terms are ordered as follows:
Numbers: numerically
Atoms: alphabetically
Compound terms: (1) by arity, (2) alphabetically by functor, (3) by arguments (left to right)
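For example:
    ?- msort([foo, 2.5, bar(1), 10], S).
    S = [2.5, 10, foo, bar(1)].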

Section 10: Singleton variables

Variables that we don't use in our code. If there's a singleton variable in your code, it's better form to prefix its name with an underscore, or to just use an underscore instead of the name you provided.

Section 12: How to find the semantics of a logical program.

We build the meaning from the bottom up: apply the immediate consequence operator starting with C = ∅, let the clauses we can deduce be the new set of clauses, and apply the immediate consequence operator to that new set. Repeat infinitely many times. In other words, the semantics of the program is Tp applied infinitely many times (starting with the empty set).

Section 11: Infinite Backtracking Loop

When goal execution never reaches the fail port. This occurs when the goal is a compound query where one of the goals takes arguments that are (mostly) non-ground and not constrained by earlier goals. Example:
    rev1([], []).
    rev1([A|BC], CBA) :-
        rev1(BC, CB),
        append(CB, [A], CBA).
This finds itself in an infinite backtracking loop when BC and CB are nonground (i.e. when [A|BC] is nonground).

Section 15: Symmetry

When two different solutions aren't actually different. Symmetry breaking is trying to get rid of the symmetry to reduce the size of the search space.

Section 13: How to make a non tail recursive predicate tail recursive: Approach two

Work out how you would write the predicate as a loop and "recursive-fy" it.

Section 13: How to make a non tail recursive predicate tail recursive: Approach one

Work out what the predicate is supposed to be and implement it.

Section 14: Comparing terms (order matters!)

X @< 7, X = foo.
This returns true because, at the time of the comparison X @< 7, X is still a variable, and variables precede numbers in the standard order; X is only bound to an atom after the comparison.
X = foo, X @< 7.
This returns false because, at the time of the comparison, X stands for the atom foo, and atoms are not less than numbers.

Section 9: Recursion

You can define rules recursively. You need a base case and a recursive case; it's a bit like pattern matching in Haskell.
    ancestor(A, D) :- parent(A, D).
    ancestor(A, D) :- parent(P, D), ancestor(A, P).

Section 15: Finite Domain Constraints

You have a number of variables that are each constrained to hold some value from a finite set of possible values. Then you have additional constraints on what values they can take. These constraints could be on individual values or on relationships among values.
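A tiny finite domain problem using SWI-Prolog's clpfd library (the domains and the constraints X + Y #= 6 and X #< Y are illustrative):
    :- use_module(library(clpfd)).

    ?- [X, Y] ins 1..5, X + Y #= 6, X #< Y, label([X, Y]).
    X = 1, Y = 5 ;
    X = 2, Y = 4.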

Section 10: List Syntax

[] : the empty list
[1, 2, 3] : representation of a list
[X|Xs] : as opposed to Haskell's x:xs
[X1, X2|Xs] : as opposed to x1:x2:xs

Section 9: Negation

\+ means 'not' (not provable). *Side note: \= means not equal to (this is different from \+).* Prolog executes \+ G by first trying to prove G. If G is true, then \+ G is false, and vice versa. This construct for negation is referred to as "negation as failure".

Section 12: Meaning of constants, functors and predicates.

constant: stands for an entity in the world of the program
functor: stands for a function from n entities to one entity in the world of the program
predicate: stands for a particular relationship between n entities in the world of the program

Section 14: How to use higher order functions (write filter)

filter(_, [], []).
filter(P, [X|Xs], F) :-
    ( call(P, X) ->
        F = [X|F1]
    ;   F = F1
    ),
    filter(P, Xs, F1).
This is a common pattern: an if-then-else where both arms use a term (F1) that is defined by the goal after the if-then-else.

Section 14: How to use higher order functions (write map)

map(_, [], []).
map(P, [X|Xs], [Y|Ys]) :-
    call(P, X, Y),
    map(P, Xs, Ys).
This is equivalent to maplist/3.

Section 15: Sudoku using Prolog clpfd library

sudoku(Rows) :-
    length(Rows, 9),
    maplist(same_length(Rows), Rows),
    append(Rows, Vs),
    Vs ins 1..9,
    maplist(all_distinct, Rows),
    transpose(Rows, Columns),
    maplist(all_distinct, Columns),
    Rows = [A, B, C, D, E, F, G, H, I],
    blocks(A, B, C),
    blocks(D, E, F),
    blocks(G, H, I).
Built-in Prolog: length, maplist, same_length, append. User-defined: blocks. clpfd: ins, all_distinct, transpose.

Section 10: How would you define "take"

take(N, List, Front) :- length(Front, N), append(Front, _, List).
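For example:
    ?- take(2, [a, b, c, d], Front).
    Front = [a, b].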

Section 12: Mathematical notation of a clause

∀A∀C : grandparent(A,C) ← ∃B : parent(A, B) ∧ parent(B, C) The variables appearing in the head are universally quantified over the entire clause, while those appearing only in the body are existentially quantified over the body.

Section 13: Two things to think about for efficiency

1. Determinism (no choice points)
2. Tail recursion

Section 11: Overcoming the infinite backtracking loop

1. Make sure you only use questionable queries in the right modes (not ideal).
2. Add a further constraint (not as efficient):
    rev3(ABC, CBA) :- samelength(ABC, CBA), rev1(ABC, CBA).

Section 10: Less than, equal to, not equal to and greater than

< : less than
=< : less than or equal to
> : greater than
>= : greater than or equal to
=:= : equal to (numbers only)
=\= : not equal to (numbers only)
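For example:
    ?- 2 + 1 =:= 3.   % true: both sides are evaluated as arithmetic
    ?- X =:= 3.       % instantiation error: the arguments must be ground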

Section 9: Equality

= is an infix operator for equality. It is used both for binding variables and for checking equality. *Note: Prolog is a single-assignment language.*

Section 10: =

= is only a unifier. X = 6/7 makes X stand for the data structure 6/7 (it is not evaluated). *Equals does not evaluate an arithmetic expression.*
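A small illustration of the difference between = and is:
    ?- X = 6/7, Y is 6/7.
    % X is bound to the compound term 6/7 (i.e. /(6, 7)), unevaluated;
    % Y is bound to the numeric result of evaluating 6/7.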

Section 13: Choice points in the stack

Choice points live in the same call stack. A choice point freezes the stack (because we need that information later), which inhibits popping off frames. When you get to the last alternative, the choice point can be popped off and the stack treated as normal. Fewer choice points are therefore more efficient (and hence why indexing is important).

Section 9: Query

Looks like a fact, but is treated as a question asking whether the statement is true. Queries are made at the Prolog Prompt

Section 15: Prolog strategy for variable binding.

Generate and test. Nondeterministic goals generate potential solutions, later goals test those solutions.

Section 9: Disjunction

Goals can be combined to form a compound query with disjunction using the ';' symbol. Conjunction (,) has higher precedence than disjunction (;).

Section 10: Boolean Function in Haskell vs. Predicate in Prolog

A Haskell Boolean function must say when it should return False; a Prolog predicate only tries to succeed, and if it exhausts all of the options it fails on its own (so you don't need a base case for 'failure').

Section 14: var/1

Holds if the term is an unbound variable.

Section 14: nonvar/1

Holds if the term is any term other than an unbound variable.

Section 14: ground/1

Holds if the term is ground.

Section 11: if then else caveats

If <condition> then <expression 1> else <expression 2>. If a variable in the condition is nonground, we can have problems: if the condition CAN be true, Prolog commits and binds that variable. The other alternatives are never considered, so not all of the correct solutions are returned.

Section 13: Last Call Optimisation

If you've got a stack where a calls b and b calls c, but by the time b calls c, b has done all of its computation, then b is taking up stack space when it's only going to take c's return value and pass it back to a. Instead of keeping b on the stack, free up b's frame before calling c; that way, when c finishes it returns directly to a.

Section 11: Nondeterminism in Prolog

In Haskell, when you call a function it runs the first equation the call pattern-matches, and that's it. In Prolog, if a goal matches multiple clauses, Prolog drops a choice point, runs one clause, and then on redo backtracks to the choice point and runs the next one.

Section 14: Currying in Prolog

To curry in Prolog, you omit some of the final arguments of a higher-order goal. call can be used at higher arities, where the first argument is a partially completed goal and the remaining arguments are those needed to complete it: X = append([1, 2], [3]), call(X, L).

Section 15: Labelling

Labelling generates a search tree. Pick an unfixed variable and partition its domain into k parts (usually k = 2, but k can be anything up to the size of the variable's domain). Then recursively invoke the constraint-solving algorithm for each part, restricting the selected variable's domain to that part. Each recursive call starts with propagation steps. The root of the tree represents the computation for solving the whole constraint problem; every other node is a choice within that computation.

Section 14: bagof/3

bagof(Template, Goal, List) similar to setof except solutions are collected in the order they are produced (i.e. not sorted) and duplicates are not removed.

Section 10: append

Built-in predicate.
    append([], C, C).
    append([A|B], C, [A|BC]) :- append(B, C, BC).

Section 10: length

Built-in predicate. length(List, Len).

Section 10: member

Built-in predicate. member(Elt, List) determines whether Elt is a member of List (like Haskell's elem).
    member(Elt, [Elt|_]).
    member(Elt, [_|Rest]) :- member(Elt, Rest).
An alternative (less efficient) definition:
    member(Elt, List) :- append(_, [Elt|_], List).
This is not as good because it constructs the front of the list, which isn't necessary.

Section 10: is

is is a built-in (infix) predicate that evaluates arithmetic. *It only works if the second argument (the term on the right) is ground.*

Section 14: call/1

call/1 is the most basic higher-order predicate. It takes a term and executes it as a goal. X = append([1, 2], [3], L), call(X). This binds X to the goal, then calls X, which executes that goal. The answer gives the value of X (now fully bound) and the value of L.

Section 10: How would you define "drop"

drop(N, List, Back) :- length(Front, N), append(Front, Back, List).

Section 10: ground vs. nonground

ground: term w/ no variables nonground: term with variables

Section 14: predicates that determine type

integer/1, float/1, atom/1, compound/1. These succeed if the term is of the specified type. They fail for all variables.

Section 14: I/O operations

read(X). write(X).

Section 13: reverse using an accumulator

rev([], A, A).
rev([B|CD], A, DCBA) :- rev(CD, [B|A], DCBA).
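A wrapper that hides the accumulator (the name rev/2 is illustrative):
    rev(List, Reversed) :- rev(List, [], Reversed).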

Section 14: setof/3

setof(Template, Goal, List) List is a *sorted* list of all the *distinct* instances of Template satisfying the Goal. setof(P-C, parent(P,C), List) List is a sorted (lexicographically) list of all P-C pairs that satisfy the parent(P,C) predicate, with duplicates removed. The template usually contains variables that appear in Goal.

Section 14: setof/3 where Goal contains variables not appearing in Template i.e.: setof(C, parent(P,C), List)

setof/3 will backtrack over each distinct binding for that variable (in the example, P), and List will be a list of template instances for that binding (i.e. all the children of P). Prolog drops a choice point when it needs to bind P, finds a solution for that (now bound) P, and backtracks to the choice point if another solution is asked for.

Section 10: '_' (underscore)

specifies a different variable each time you use it.

Section 9: relation

specifies a relationship among entities

