Chapter 2


difference between "reflex" agents and "goal-based" agents?

"Reflex" agents map directly from percepts to actions, while "goal-based" agents have knowledge that can be modified to affect their decision making

for a "taxi learning" agent, describe: 1.) performance element 2.) critic 3.) learning element 4.) problem generator

1.) Performance element: uses percepts to determine that left turn across 3 lanes of traffic is needed 2.) Critic: observes angry drivers after cutting across lanes of traffic 3.) Learning element: determines that cutting across 3 lanes for a turn is bad 4.) Problem generator: try cutting across 2 lanes for a turn
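The four components above can be sketched as a toy class — a minimal, hypothetical illustration (the class name, the numeric scores, and the threshold are all made up for this example):

```python
# Toy sketch of the four learning-agent components for the taxi example.
# All numbers and names here are illustrative, not from the text.
class LearningTaxi:
    def __init__(self):
        self.lanes_to_cut = 3  # current policy used by the performance element

    def performance_element(self):
        # decides on an action from the current policy
        return f"cut across {self.lanes_to_cut} lanes"

    def critic(self, action):
        # feedback: more lanes cut -> angrier drivers -> lower score
        return -10 * int(action.split()[2])

    def learning_element(self, score):
        # uses the critic's feedback to modify the performance element
        if score < -20:  # "cutting across 3 lanes is bad"
            self.lanes_to_cut -= 1

    def problem_generator(self):
        # suggests a new, informative experiment to try
        return self.performance_element()

taxi = LearningTaxi()
score = taxi.critic(taxi.performance_element())  # observe angry drivers
taxi.learning_element(score)                     # adjust the policy
print(taxi.problem_generator())                  # -> cut across 2 lanes
```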

for "utility-based" agents, in what 2 situations are goals alone inadequate (making a utility function useful)?

1.) When there are conflicting goals, only some of which can be achieved 2.) When there are several goals, none of which can be achieved with certainty

1.) Rationality maximizes what? 2.) perfection maximizes what?

1.) rationality --> expected performance 2.) perfection --> actual performance

"discrete" versus "continuous" applies to what 3 things?

1.) the state of the environment 2.) the way time is handled, and to 3.) the percepts and actions of the agent

what is a "task environment" specification? (acronym: PEAS)

1.) Performance measure: Safe, fast, legal, comfortable, maximize profits 2.) Environment: Roads, traffic, pedestrians, customers 3.) Actuators: Steering, accelerator, brake, signal, horn, display 4.) Sensors: Cameras, sonar, speedometer, GPS, odometer, accelerometer, engine sensors, keyboard
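A PEAS description is just four lists; a minimal sketch of the taxi example as a data structure (the `PEAS` class name is an assumption for this example, not standard code):

```python
from dataclasses import dataclass

@dataclass
class PEAS:
    """Task environment specification: Performance, Environment, Actuators, Sensors."""
    performance: list
    environment: list
    actuators: list
    sensors: list

# PEAS description for the automated taxi from the card above
taxi = PEAS(
    performance=["safe", "fast", "legal", "comfortable", "maximize profits"],
    environment=["roads", "traffic", "pedestrians", "customers"],
    actuators=["steering", "accelerator", "brake", "signal", "horn", "display"],
    sensors=["cameras", "sonar", "speedometer", "GPS", "odometer"],
)
```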

what is an agent?

An agent is anything that can be viewed as perceiving its environment through sensors and acting upon that environment through actuators

DISCRETE example 1.) Chess •the state of the environment: discrete •the way time is handled: discrete (without a clock) •the percepts and actions of the agent: discrete

CONTINUOUS example 2.) Taxi driving •the state of the environment: continuous •the way time is handled: continuous •the percepts and actions of the agent: mostly continuous (speed, locations, steering, etc)

what is "information gathering"? what is the importance of it?

Doing actions in order to modify future percepts. It is an important part of rationality, e.g. looking both ways before crossing the road

what does it mean when the task environment is "fully observable"?

If an agent's sensors give it access to the complete state of the environment at each point in time. •A task environment is effectively fully observable if the sensors detect all aspects that are relevant to the choice of action •In a fully observable environment, an agent need not maintain any internal state to keep track of the world

what does it mean when the environment is "unobservable"?

If the agent has no sensors at all, then the environment is unobservable

what does it mean when we say that the agent lacks "autonomy"?

To the extent that an agent relies on the prior knowledge of its designer rather than on its own percepts

in a "competitive multiagent" environment, what is the agent trying to do?

the agent is trying to maximize its performance measure while minimizing the performance measures of other agents

in a "cooperative multiagent" environment, what is the agent trying to do?

an increase in one agent's performance measure increases the performance measures of other agents as well

what is a "percept sequence"?

complete history of everything the agent has ever perceived

what is "good behavior"?

good behavior is captured by the concept of rationality

what is a "nondeterministic" environment?

an environment in which actions are characterized by their possible outcomes, but no probabilities are attached to them

simple reflex agents operate using a collection of "condition-action rules", what does this mean?

essentially a series of if statements. •They are appropriate in environments where the correct decision can be made on the basis of only the current percept (i.e. the environment is fully-observable)
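A minimal sketch of condition-action rules, using the classic two-square vacuum world (squares "A" and "B"; the rule set here is illustrative):

```python
def simple_reflex_vacuum_agent(percept):
    """Condition-action rules for a two-square vacuum world.

    `percept` is a (location, status) pair, e.g. ("A", "Dirty").
    Only the CURRENT percept is used; no history is kept.
    """
    location, status = percept
    if status == "Dirty":   # rule: dirty square -> suck
        return "Suck"
    elif location == "A":   # rule: clean and at A -> move right
        return "Right"
    else:                   # rule: clean and at B -> move left
        return "Left"

print(simple_reflex_vacuum_agent(("A", "Dirty")))  # -> Suck
```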

what is "performance measure"?

evaluates the behavior of an agent in an environment

What is a "learning" agent?

improves the performance element

what is a general rule for designing performance measures?

it is better to design performance measures according to what one actually wants in the environment, rather than according to how one thinks the agent should behave

what is an "omniscient" agent?

knows the actual outcome of its actions and can act accordingly; but omniscience is impossible

what is an "agent function"?

maps any given "percept sequence" to an action
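One way to realize an agent function concretely is a table-driven agent: a lookup table keyed on the entire percept sequence observed so far. A minimal sketch (the table entries are a hypothetical fragment of vacuum-world behavior):

```python
# Hypothetical lookup table: percept sequence (a tuple of percepts) -> action
table = {
    (("A", "Dirty"),): "Suck",
    (("A", "Clean"),): "Right",
    (("A", "Clean"), ("B", "Dirty")): "Suck",
}

def make_table_driven_agent(table):
    percepts = []  # the percept sequence observed to date
    def agent(percept):
        percepts.append(percept)
        # the agent function: map the whole sequence to an action
        return table.get(tuple(percepts))
    return agent

agent = make_table_driven_agent(table)
print(agent(("A", "Clean")))  # -> Right
print(agent(("B", "Dirty")))  # -> Suck
```

Such tables grow impossibly large for real environments, which is why agent programs implement the agent function more compactly.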

can an agent's choice of action be any action it chooses?

no, it must depend on the entire "percept sequence" observed to date (everything previously perceived)

what is a "rational agent"?

one that does the right thing - conceptually speaking, every entry in the table for the agent function is filled out correctly

rationality is not the same as?

perfection

what is "critic"? note: important to look at the diagram as well

provides feedback on how well the agent is doing with respect to a fixed performance standard; the learning element uses this feedback to determine how the performance element should be modified to do better in the future

what is a "Problem generator"?

responsible for suggesting actions that will lead to new and informative experiences

For each possible percept sequence, a rational agent should do what?

should select an action that is expected to maximize its performance measure, given the evidence provided by the percept sequence and whatever built-in knowledge the agent has
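"Expected to maximize" means weighting each possible outcome by its probability. A small worked sketch with illustrative numbers (tying back to the look-both-ways example):

```python
# Expected performance of an action under outcome probabilities.
# All probabilities and scores below are made up for illustration.
def expected_value(outcomes):
    """outcomes: list of (probability, score) pairs."""
    return sum(p * score for p, score in outcomes)

cross_now  = [(0.99, 10), (0.01, -1000)]  # fast, but a small chance of disaster
look_first = [(1.00, 9)]                  # slightly slower, but certain

# 0.99*10 + 0.01*(-1000) = -0.1, versus 9.0: the rational
# choice maximizes EXPECTED performance, not best-case performance.
print(expected_value(cross_now) < expected_value(look_first))  # -> True
```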

what is a "percept"?

something perceived or sensed by an agent (see slides to view diagram)

what is the "performance" element?

takes in percepts and decides on actions

what must the agent do in an "unknown" environment?

the agent will have to learn how it works in order to make good decisions

what is an "agent program"?

the implementation of an agent's function

what is a "stochastic" environment?

the next state of the environment is affected by forces outside the control of the agent, and the uncertainty is quantified in terms of probabilities. •Taxi driving is stochastic

what is a "deterministic" environment?

the next state of the environment is completely determined by the current state and the action executed by the agent •Vacuum world is deterministic

what are the outcomes in a "known" environment?

the outcomes (or outcome probabilities) for all actions are understood

what are "simple reflex" agents? (check diagrams on slides)

these agents select actions on the basis of the current percept, ignoring the rest of the percept history

what is a "sequential" task environment?

when the current decision could affect all future decisions •Examples •An agent that spots defective parts on an assembly line is episodic •Chess is sequential

what does it mean when the environment is "dynamic" for an agent?

when the environment can change while an agent is deliberating, otherwise it's considered "static"

what does it mean when the environment is "semidynamic"?

when the environment itself does not change with the passage of time but the agent's performance score does. •Examples •Solving crossword puzzles is static •Taxi driving is dynamic •Chess, when played with a clock, is semidynamic

what is an "episodic" task environment?

when the next episode does not depend on the actions taken in previous episodes

what does it mean when the environment is "partially observable"?

when the sensors are noisy and inaccurate or because parts of the state are simply missing from the sensor data •A vacuum agent with only a local dirt sensor cannot tell whether there is dirt in other squares

what is a "multiagent" environment?

when there are multiple agents

what is a "single agent" environment?

when there is only one agent

what are "model-based reflex" agents?

•These agents keep track of the part of the world they can't currently see. •They model their environments by maintaining an internal state that depends on the percept history
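A minimal sketch of maintaining internal state, again in the two-square vacuum world (the `model` dictionary and stopping rule are illustrative assumptions):

```python
# Model-based reflex vacuum agent: remembers the believed status of each
# square so it can act on parts of the world it can't currently see.
def make_model_based_vacuum_agent():
    model = {"A": None, "B": None}  # internal state: believed status per square
    def agent(percept):
        location, status = percept
        model[location] = status        # update state from the current percept
        if status == "Dirty":
            model[location] = "Clean"   # model predicts the effect of sucking
            return "Suck"
        if all(s == "Clean" for s in model.values()):
            return "NoOp"               # model says the whole world is clean
        return "Right" if location == "A" else "Left"
    return agent

agent = make_model_based_vacuum_agent()
print(agent(("A", "Dirty")))  # -> Suck  (model now believes A is clean)
print(agent(("A", "Clean")))  # -> Right (B's status is still unknown)
```

A simple reflex agent at a clean square would wander forever; the internal model lets this agent stop once it believes everything is clean.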

what are "utility-based" agents?

•While goals provide a crude binary distinction between "goal met" and "goal not met", utility agents allow a more general performance measure. •An agent's utility function is an internalization of the performance measure that can indicate the degree to which a goal is met
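A minimal sketch of the difference: both routes below "meet the goal" of reaching the destination, but a utility function ranks them by degree of desirability (route names and utilities are made up for illustration):

```python
# A utility-based agent picks the action whose utility is highest,
# rather than treating every goal-achieving action as equally good.
def best_action(utilities):
    """utilities: dict mapping action -> utility (degree of desirability)."""
    return max(utilities, key=utilities.get)

route_utilities = {
    "highway": 0.9,     # fast, safe, comfortable
    "back_roads": 0.6,  # also reaches the destination, but slower
}
print(best_action(route_utilities))  # -> highway
```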

how are "models" and "states" represented?

•The way models and states are represented varies widely depending on the environment and technology •It is seldom possible for the agent to determine the current state of a partially observable environment exactly

what does it mean when the environment is "uncertain"?

•not fully observable or not deterministic

what are "goal-based" agents?

•uses some sort of goal information that describes situations that are desirable •This type of decision making involves consideration of the future, both "What will happen if I do x?" and "Will that make me happy?"

