Task Environments
Performance measure
A criterion used to evaluate how well an agent's behavior accomplishes its goals.
PEAS
An acronym that stands for Performance measure, Environment, Actuators, and Sensors. It is used to specify the task environment of an intelligent agent.
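To make the PEAS breakdown concrete, the sketch below gives a PEAS description for the automated taxi example referenced in the entries that follow. It is a minimal illustration in Python; the PEASDescription class and the specific list entries are assumptions made for this example, not a standard API.

    from dataclasses import dataclass

    # Minimal sketch of a PEAS description (illustrative only).
    @dataclass
    class PEASDescription:
        performance_measure: list  # how the agent's success is judged
        environment: list          # what the agent operates in
        actuators: list            # how the agent acts on the environment
        sensors: list              # how the agent perceives the environment

    automated_taxi = PEASDescription(
        performance_measure=["safe", "fast", "legal", "comfortable trip"],
        environment=["roads", "other traffic", "pedestrians", "customers"],
        actuators=["steering", "accelerator", "brake", "signal", "horn"],
        sensors=["cameras", "radar", "speedometer", "GPS"],
    )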
Multiagent
An environment that involves multiple intelligent agents interacting with each other and the task environment.
Single-agent
An environment that involves only one intelligent agent interacting with the task environment.
Actuators
The controls or mechanisms through which an agent acts on its environment; for an automated taxi, these include the steering and braking systems used to control the vehicle.
Sensors
The devices through which an agent perceives its environment; an automated taxi is equipped with sensors such as cameras and radar to gather information about its surroundings.
Task environment
The specific problem or task that a rational agent is designed to solve; it can be specified with a PEAS description.
Fully observable
When an agent has complete and accurate information about the state of the environment through its sensors.
Episodic
When an agent's experience is divided into atomic episodes or individual interactions with the environment, with no influence from previous episodes.
Partially observable
When an agent's sensors provide incomplete or limited information about the state of the environment, requiring the agent to make assumptions or predictions.
Unknown
When the agent lacks knowledge of the outcomes (or outcome probabilities) of its actions, so it must learn how the environment works before it can make good decisions.
Known
When the agent knows the outcomes (or outcome probabilities) of all its actions, so it does not need to learn how the environment works in order to decide.
Sequential
When the current decision made by an agent can impact future decisions and outcomes, requiring a consideration of long-term consequences.
Dynamic
When the environment can change or evolve while the agent is making decisions or taking actions, requiring adaptability and real-time decision-making.
Stochastic
When uncertainty about the outcomes of actions is quantified explicitly in terms of probabilities, so outcomes are not certain but have associated likelihoods.
Continuous
When the environment has a continuous range of possible states or actions, often involving real-valued variables or parameters.
Discrete
When the environment has a finite and distinct set of possible states or actions.
Static
When the environment remains unchanged while the agent is making decisions or taking actions.
Deterministic
When the next state of the environment is completely determined by the current state and the actions of the agent.
Nondeterministic
When the next state of the environment is not completely determined by the current state and the actions of the agent; an action can have several possible outcomes, but, unlike the stochastic case, no probabilities are attached to them.
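To contrast the deterministic, stochastic, and nondeterministic cases, the sketch below shows the kind of transition model each implies. The toy state space and function names are assumptions made purely for illustration.

    import random

    # Toy state space {"dry", "wet"} with a single action "drive" (illustrative).

    def deterministic_step(state, action):
        # Deterministic: the next state is fully determined by state and action.
        return "wet" if (state, action) == ("dry", "drive") else state

    def stochastic_step(state, action):
        # Stochastic: each outcome has an explicit probability (here 0.8 / 0.2).
        return random.choices(["dry", "wet"], weights=[0.8, 0.2])[0]

    def nondeterministic_outcomes(state, action):
        # Nondeterministic: several outcomes are possible, no probabilities attached.
        return {"dry", "wet"}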