Intelligent Agents (Algorithms for Intelligent Systems)
Unit II: Intelligent Agents
Intelligent Agents: Agents and Environments, The Concept of Rationality, The Nature of Environments (PEAS), The Structure of Agents: Simple Reflex Agents, Model-Based Reflex Agents, Goal-Based Agents, Utility-Based Agents, Learning Agents.
Agents and Environments
Agent
An agent is an entity that perceives its environment through sensors and acts upon that environment through actuators. Agents can be anything from simple programs to complex systems like robots or humans. The key characteristic of an agent is its ability to take actions based on its perceptions to achieve specific goals.
Environment
The environment is the external context or world in which the agent operates. It includes everything the agent can perceive and interact with. The environment provides the agent with inputs (percepts) and receives outputs (actions) from the agent.
Example
Consider a self-driving car:
- Agent: The self-driving car itself, equipped with sensors (cameras, radar, etc.) and actuators (steering, acceleration, brakes).
- Environment: The road, other vehicles, pedestrians, traffic signals, weather conditions, etc.
In this scenario:
- The car (agent) perceives the environment through its sensors (e.g., detecting a red traffic light).
- Based on its programming or learning, it takes actions (e.g., slowing down and stopping).
- The environment changes in response to the car's actions (e.g., the traffic light turns green, and the car proceeds).
This interaction between the agent and the environment is fundamental to fields like artificial intelligence, robotics, and control systems.
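The same percept-action loop can be written down in a few lines of code. The sketch below is a minimal illustration only, assuming a toy TrafficLightEnvironment and SimpleCarAgent invented for this example (they are not part of any library): the environment supplies a percept, the agent maps it to an action, and the action in turn changes the environment.

```python
class TrafficLightEnvironment:
    """Toy environment: a single traffic light the car can observe."""
    def __init__(self):
        self.light = "red"

    def percept(self):
        # What the car's sensors would report about the world.
        return {"light": self.light}

    def apply(self, action):
        # The environment changes in response to the car's action
        # (here the light simply turns green after the car stops).
        if action == "stop":
            self.light = "green"


class SimpleCarAgent:
    """Maps percepts to actions: stop on red, drive on green."""
    def act(self, percept):
        return "stop" if percept["light"] == "red" else "drive"


# The fundamental agent-environment loop: perceive, decide, act, repeat.
env, agent = TrafficLightEnvironment(), SimpleCarAgent()
for step in range(3):
    p = env.percept()   # sensors produce a percept
    a = agent.act(p)    # agent program chooses an action
    env.apply(a)        # actuators change the environment
    print(f"step {step}: percept={p}, action={a}")
```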
Example: Agent and Environment in the Context of a Robot Vacuum Cleaner
Agent
The agent in this example is the robot vacuum cleaner. It is an autonomous entity designed to perform the task of cleaning floors. The robot vacuum cleaner is equipped with:
· Sensors: To perceive its environment (e.g., dirt, obstacles, walls).
· Actuators: To take actions (e.g., moving around, sucking up dirt).
Environment
The environment is the space in which the robot vacuum cleaner operates. This includes:
· Floor Surfaces: Carpets, tiles, wood, etc.
· Obstacles: Furniture, walls, toys, etc.
· Dirt: Dust, crumbs, pet hair, etc.
Example Scenario
· Initial State: The robot vacuum cleaner starts in the living room. Its sensors detect a patch of dirt near the sofa.
· Perception: The robot perceives the dirt and also detects the sofa as an obstacle.
· Decision Making: The robot decides to move towards the dirt while avoiding the sofa.
· Action: The robot moves to the location of the dirt, avoiding the sofa, and activates its vacuum mechanism to clean the spot.
· New State: The dirt is removed, and the robot moves on to detect and clean the next area.
This continuous loop of perception, decision making, and action allows the robot vacuum cleaner to effectively clean the environment while navigating around obstacles.
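That perception-decision-action loop can be sketched directly in code. The following is a minimal sketch assuming a toy one-dimensional floor and a purely reflex decision rule; a real robot vacuum would use far richer sensing and path planning.

```python
# A toy 1-D "floor" with dirt at some positions; the robot cleans as it goes.
floor = {0: "clean", 1: "dirty", 2: "clean", 3: "dirty"}
position = 0

def perceive(pos):
    """Sensor reading: the robot's current position and whether it is dirty."""
    return pos, floor[pos]

def decide(pos, status):
    """Decision making: suck if the current square is dirty, otherwise move on."""
    if status == "dirty":
        return "suck"
    return "move_right" if pos < max(floor) else "stop"

def act(pos, action):
    """Actuators: update the environment and the robot's position."""
    if action == "suck":
        floor[pos] = "clean"
        return pos
    if action == "move_right":
        return pos + 1
    return pos

# Continuous loop of perception, decision making, and action.
while True:
    pos, status = perceive(position)
    action = decide(pos, status)
    print(f"at {pos} ({status}) -> {action}")
    if action == "stop":
        break
    position = act(pos, action)
```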
Concept of Rationality
Rationality, in the context of agents, refers to the ability of an agent to make decisions that maximize its performance measure, given the available information and its knowledge of the environment. A rational agent selects actions that are expected to achieve its goals most effectively, based on what it knows and perceives.
Key Components of Rationality
1. Performance Measure: A criterion that evaluates how well the agent is doing in terms of achieving its goals.
2. Prior Knowledge: The information the agent has about the environment before it starts acting.
3. Percepts: The inputs the agent receives from the environment through its sensors.
4. Actions: The possible moves or steps the agent can take to influence the environment.
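In code, rationality boils down to choosing the action whose expected outcome scores highest under the performance measure, given the percepts and prior knowledge. The sketch below is an illustrative assumption only; the candidate actions, the estimated outcomes, and the scoring function are all hypothetical.

```python
def rational_choice(actions, expected_outcome, performance_measure):
    """Pick the action whose expected outcome scores highest.

    actions             -- the moves available to the agent
    expected_outcome    -- the agent's estimate of what each action leads to,
                           based on its percepts and prior knowledge
    performance_measure -- a function scoring how good an outcome is
    """
    return max(actions, key=lambda a: performance_measure(expected_outcome(a)))

# Hypothetical outcomes: (minutes taken, collision risk) for each action.
outcomes = {"main_route": (10, 0.05), "detour": (12, 0.01), "wait": (25, 0.0)}
score = lambda o: -o[0] - 100 * o[1]   # prefer fast, low-risk outcomes

print(rational_choice(outcomes.keys(), outcomes.get, score))  # -> detour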
Example: Autonomous Delivery Robot
Consider an Autonomous Delivery Robot designed to deliver packages within an office building.
Performance Measure
· Goal: Deliver packages to the correct recipients as quickly and efficiently as possible.
· Performance Criteria: Minimize delivery time, avoid collisions, and ensure packages are delivered to the right recipients.
Prior Knowledge
· The robot knows the layout of the office building, including the locations of rooms, doors, and elevators.
· It has a list of packages to deliver, including the recipient's name and location.
Percepts
· The robot uses sensors to perceive its environment, such as detecting obstacles (e.g., people, furniture), recognizing room numbers, and identifying recipients.
Actions
· The robot can move forward, move backward, turn, stop, and use elevators.
· It can also interact with recipients to hand over packages.
Rational Decision-Making Process
1. Perception: The robot perceives its current location and detects an obstacle (e.g., a closed door) in its path.
2. Decision Making: Based on its prior knowledge, the robot knows there is an alternative route through another corridor. It calculates that taking this route will still allow it to deliver the package on time.
3. Action: The robot decides to take the alternative route, avoiding the closed door.
4. Feedback: The robot successfully navigates the alternative route, delivers the package to the correct recipient, and receives a confirmation that the package has been delivered.
Rationality in Action
· Optimal Decision: The robot's decision to take the alternative route is rational because it maximizes the performance measure by ensuring timely delivery while avoiding collisions.
· Adaptability: If the robot encounters an unexpected obstacle (e.g., a new piece of furniture), it reassesses the situation and makes a new rational decision, such as finding another path or waiting for the obstacle to move.
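As a rough sketch of the decision described above (the route names, travel times, and blocked-door flag are assumptions made up for this example), the robot can rank the routes it knows about by estimated delivery time, discarding any route its percepts show to be blocked:

```python
# Hypothetical prior knowledge: known routes with estimated travel times (minutes).
routes = {
    "main_corridor": {"time": 4, "blocked": True},    # percept: closed door
    "side_corridor": {"time": 6, "blocked": False},
}
deadline = 10  # minutes left to deliver the package on time

def choose_route(routes, deadline):
    """Rational choice: drop blocked routes, then take the fastest route
    that still meets the deadline; return None if no route qualifies."""
    feasible = {name: r["time"] for name, r in routes.items()
                if not r["blocked"] and r["time"] <= deadline}
    if not feasible:
        return None  # wait, re-plan, or ask for help
    return min(feasible, key=feasible.get)

print(choose_route(routes, deadline))  # -> side_corridor
```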
The Nature of Environments (PEAS)
PEAS stands for Performance measure, Environment, Actuators, and Sensors. It is a framework used to define the task environment of an intelligent agent. By specifying these components, we can clearly understand the nature of the environment in which the agent operates and how it should perform its tasks.
Components of PEAS
1. Performance Measure:
o Definition: Criteria that evaluate how well the agent is performing its task.
o Example: For a self-driving car, the performance measure could include safety, fuel efficiency, and passenger comfort.
2. Environment:
o Definition: The external context or world in which the agent operates.
o Example: For a self-driving car, the environment includes roads, traffic, pedestrians, weather conditions, and traffic laws.
3. Actuators:
o Definition: The mechanisms through which the agent acts upon the environment.
o Example: For a self-driving car, actuators include the steering wheel, accelerator, brake, and turn signals.
4. Sensors:
o Definition: The devices through which the agent perceives the environment.
o Example: For a self-driving car, sensors include cameras, radar, LIDAR, and GPS.
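Because PEAS is just a structured description, it maps naturally onto a small record. The dataclass below is a hypothetical convenience for writing a PEAS description in code (it is not a standard API), filled in here with the self-driving car values used in the examples above.

```python
from dataclasses import dataclass, field

@dataclass
class PEAS:
    """A task-environment description: Performance measure, Environment,
    Actuators, Sensors."""
    performance_measure: list = field(default_factory=list)
    environment: list = field(default_factory=list)
    actuators: list = field(default_factory=list)
    sensors: list = field(default_factory=list)

self_driving_car = PEAS(
    performance_measure=["safety", "fuel efficiency", "passenger comfort"],
    environment=["roads", "traffic", "pedestrians", "weather conditions", "traffic laws"],
    actuators=["steering wheel", "accelerator", "brake", "turn signals"],
    sensors=["cameras", "radar", "LIDAR", "GPS"],
)
print(self_driving_car)
```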
Examples of PEAS for Different Agents
1. Autonomous Vacuum Cleaner
· Performance Measure: Cleanliness of the floor, battery life, time taken to clean.
· Environment: Rooms, furniture, dirt, and obstacles.
· Actuators: Wheels, brushes, vacuum mechanism.
· Sensors: Dirt detection sensors, obstacle detection sensors, cliff sensors.
2. Medical Diagnosis System
· Performance Measure: Accuracy of diagnosis, speed of diagnosis, patient outcomes.
· Environment: Patient data, medical history, symptoms, test results.
· Actuators: Display recommendations, generate reports, alert medical staff.
· Sensors: Input devices for patient data, interfaces for test results.
3. Online Shopping Recommender System
· Performance Measure: Customer satisfaction, sales conversion rate, relevance of recommendations.
· Environment: User profiles, browsing history, product database, current trends.
· Actuators: Display recommendations, send notifications, update user profiles.
· Sensors: Clickstream data, purchase history, user feedback.
Detailed Example: Self-Driving Car
Performance Measure
· Safety: Minimize accidents and ensure passenger safety.
· Efficiency: Optimize fuel consumption and travel time.
· Comfort: Provide a smooth and comfortable ride.
Environment
· Roads: Highways, city streets, rural roads.
· Traffic: Other vehicles, pedestrians, cyclists.
· Conditions: Weather (rain, snow, fog), road conditions (potholes, construction).
· Regulations: Traffic laws, speed limits, traffic signals.
Actuators
· Steering Wheel: Control the direction of the car.
· Accelerator: Control the speed of the car.
· Brake: Slow down or stop the car.
· Turn Signals: Indicate turning or lane changing.
Sensors
· Cameras: Capture visual information about the surroundings.
· Radar: Detect the distance and speed of nearby objects.
· LIDAR: Create a 3D map of the environment.
· GPS: Determine the car's location and navigate routes.