PHIL 241 Exam 3
Describe Brooks' approach to artificial intelligence.
"Explicit representations and models of the world simply get in the way. It turns out to be better to use the world as its own model." 1. Start with simple, whole intelligent systems. 2. Build them up incrementally. 3. At each step, have real, physical things that sense and act in the real world.
Fodor's Transducer Argument
"There is no non-inferential way of deducing shirtness." Inference = could be worn by a human, is shirt shaped, has holes for arms, etc. 1. A transducer takes sensory input and converts it into a different form or signal for processing. 2. There is no simple transducer for NON-NOMIC properties like shirtness. 3. There is a difference between tracking transducible features and performing complex inference over them.
How would cricket phonotaxis be replicated in a robot?
-Listen for a range of sounds -Identify sounds from crickets -Identify species -Identify particular cricket -Locate source of sound -Move towards sound -Repeat to ensure correct movement
Describe Barbara Webb's multimodal integration proposal as it pertains to cricket phonotaxis.
-Copy the motor command from phonotaxis -Predict how future visual inputs will be altered -Block the optomotor reflex with respect to these alterations -Allow optomotor reflex for all other visual changes -This works!
How did walking robots evolve?
-Start with passive walkers using gravity -Make use of physiological build -Add minimal power source to preserve passive dynamics
What is Brooks' approach to building realistic AI?
1. Break intelligent systems into activities, not functions, so there is realistic sense-act interaction with the real world. 2. Use multiple, overlapping layers of activities that are independent and tested in the real world.
What is the benefit of linguistic labeling?
1. Cheap: don't have to actually move objects in the physical world 2. Virtual: adds non-physical problem spaces that brains can work with 3. Arbitrary: context-free but works with context-sensitive systems 4. Might explain metacognition: self-evaluation, revising a plan, reflecting. 5. Allows for increasingly complex thoughts, plans, etc.
What are Brooks' requirements for AI creatures?
1. Cope with changes in environment 2. Robust in environmental interaction 3. Multiple competing goals in a flexible, responsive way 4. Do something in the world; contain some purpose in being
Describe Eileen Crist's response to earlier doubts about animal intelligence.
1. Doubt that animals can behave intentionally: we shouldn't assume this from the start 2. Doubt that science could examine: if we take the possibility of intentional behaviors seriously, we can learn 3. Doubt that consciousness isn't biased anthropomorphizing: it is a worse sin to be anthropocentric and assume that there is an unbridgeable gap between humans and animals.
Describe Brooks' two replies to the primary objection against his AI development views.
1. Each species has its own sensors, and reasoning strategies designed for humans might not work for other creatures. 2. The programmed world relies on human perception, but this is different from the creature's world; our introspection about internal representations might be misleading.
Principle of Ecological Balance
1. Given a certain task environment, there must be a match between the agent's sensory, motor and neural systems. 2. The physical layout of robot body and materials used to build allows for problem-solving to be distributed between body, brain and world. 3. Embodied agents can learn and generate computationally potent patterns of sensory stimulation by acting on the world.
Discuss the response to the EMH objection regarding Otto's belief.
1. If this works, we have to say the same for Inga; she wanted to go to MOMA, believed her memory had the address, consulted it and went. 2. This explanation of Inga's belief adds unnecessary steps, so the same goes for Otto; beliefs involving consulting biological memory are bad explanations of beliefs.
Describe Brooks' approach to building a mobile robot.
1. Independent layers with implicit purposes that extract only relevant information; a collection of competing behaviors 2. No unified perception; sensor data handled independently and in parallel; not connectionist. 3. No explicit representation of the world or the intentions of the system; the creature is responding to its world Example: First robot layer allows the creature to avoid hitting objects, second layer gives locomotion to visible places.
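The layered design above can be sketched in code. This is a toy illustration, not Brooks' actual implementation: two independent layers in which the lower obstacle-avoidance layer suppresses the higher locomotion layer; the 30 cm threshold and command strings are invented.

```python
# Toy sketch of two subsumption-style layers; thresholds and command
# strings are invented for illustration.

def avoid_layer(sonar_cm):
    """Layer 1: turn away whenever an obstacle is too close."""
    if sonar_cm < 30:            # obstacle within 30 cm
        return "turn_away"
    return None                  # no opinion; defer to other layers

def wander_layer(open_direction):
    """Layer 2: move toward a visible open place."""
    return "move_" + open_direction

def arbitrate(sonar_cm, open_direction):
    """Avoidance suppresses locomotion; no central world model is built."""
    command = avoid_layer(sonar_cm)
    if command is not None:
        return command
    return wander_layer(open_direction)

print(arbitrate(20, "left"))     # obstacle close: turn_away
print(arbitrate(120, "left"))    # path clear: move_left
```

Each layer works directly from its own sensor reading; there is no shared representation of the world that both consult.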
Describe Charles Darwin's earthworm experiments and relevance to key findings.
1. Perception and Judgement: Earthworms plug the openings of their burrows, but not randomly. They pull leaves in by stem or tip depending on leaf type, matching the grip to the leaf's shape, and can handle novel leaves as well. 2. Intent and Experience: Construction of burrows seems purposeful, lined with glass beads and pebbles to protect from cold soil in "winter quarters". His observations suggested intelligence and subjective experience; isolated actions might be instinctive, but their execution was not.
What does Brooks see as problems with traditional approaches to AI development?
1. Perceptual and action modules are symbolic and rely on central system 2. Different researchers work on different pieces and rarely unite in central intelligence; not constrained realistically: -Knowledge -Planning -Reasoning 3. Leads to unrealistic, unintegrated, hypothetical components
Name 3 examples of advanced behaviors and reasoning that might require inner representations.
1. Physical disconnection 2. Imaginary circumstances 3. Counterfactual circumstances
According to Brooks, what is the goal of artificial intelligence?
1. Replicate human-level intelligence in a machine 2. Then shift to replicating pieces of human intelligence: representing knowledge, understanding natural language, vision.
How do linguistic labeling and physical grouping improve cognition?
1. Simplify sensory information 2. Allow for selective attention 3. Make comparisons between complex relationships
Describe two examples of meshing of body and environment.
1. Tuna: by themselves they are too weak to swim, but they exploit eddies and vortices to gain speed and navigate 2. Archer fish: shoots high velocity water at bugs that exceeds mechanical capacity, but uses large drop of water propelled by water behind it as an external mechanism to amplify muscular power
How does work on artificial life and robotics emphasize different aspects of the mind than other traditional research methods?
1. Unexpected ways brain, body and environment converge in problem-solving 2. Support robust response without centralized planning or control 3. Power of simple rules + behavioral routines
Why did the focus shift to complete, low-level systems?
1. We are trying to understand what biological creatures are doing 2. Biological solutions seem to involve existing solutions to more basic problems 3. Might not be able to neatly separate sensing, planning and action
Representationalist Response to Artificial Life Objection
1. We can explain SOME intelligent behavior without representationalism. 2. Inner representations are needed to explain more advanced behaviors and reasoning 3. The minimal-representation examples given by artificial life are NOT good examples of cognitive phenomena and therefore 4. This is NOT a problem for representationalism Example: A car moves downhill without an engine, but an engine is needed for more complex maneuvers.
What is the problem identified by Nagel?
A good analysis of something can't leave a crucial element of that thing out. The proposed analyses of the mental in terms of the physical leave out consciousness.
Describe the evolutionary stance for the genesis of cognitive technology.
A small difference in brain structure and power facilitated a cycle: 1. Brain made cognitive technologies 2. Yielded more adapted brains 3. Brain made better technologies 4. Yielded more adapted brains 5. Repeat
Describe an objection against Brooks.
Abstraction and simplification are normal in science. Abstraction reduces input data so that the system experiences the same perceptual world as humans, and further research can fill in the details that were abstracted out, so there is NO PROBLEM.
The Parity Principle
According to Clark and Chalmers: if, as we confront some task, a part of the world functions as a process which, were it done in the head, we would have no hesitation in recognizing as part of the cognitive process, then that part of the world is part of the cognitive process. Ignore prejudices about bodies and focus on the computational and problem-solving whole.
Emergence as Unprogrammed Functionality
Adaptively valuable behaviors that don't arise from explicit programming. They arise from repeated sequences of agent-world interactions, with no internal state encoding goals, and can only be controlled indirectly. Example: cricket phonotaxis and wall-following robots. Drawback: struggles with collective self-organization.
Artificial Life Objection
An argument against representationalism: 1. We can explain intelligent behavior without the use of inner representations 2. We shouldn't add extra theories of cognition that aren't necessary 3. Intelligence and cognition don't require or involve inner representations
Complete Low-Level Systems
Artificial organisms that are: 1. Whole 2. Autonomous 3. Sense and act in realistic environments
Describe an example of the principle of ecological balance.
Babybot: learns object boundaries by shoving objects around.
Describe Nagel's bat comparison.
Bats have conscious experience; there is something it is like to be a bat. But bats have different sensory experiences: echolocation. We can objectively imagine what it is like for us to behave like a bat but we can't imagine what it is like to BE a bat - the SUBJECTIVE character. Even if gradually metamorphosed into a bat, nothing in our present constitution would enable understanding of the experience of metamorphosed self.
Emergent and Collective Effects
Behavior that seems complex might result from simple, dumb agents following simple rules. Complex general planning or centralized storage is not necessary.
Content vs. Vehicles of Content
Content: what a mental state is about; involves the thinking being + its environment. Vehicles of content: physical material within a system that plays a special role in enabling the system to possess a certain mental state. Example: External traces such as notes or files can be physical vehicles of dispositional beliefs. They do NOT work the same way as biological memory, but they are integrated into strategies and share the same broad functional role.
Counterfactual Circumstances
Coordinating activity and choice using hypothetical, non-factual circumstances. Example: Imagining what you would have done, or would do, if your car had a flat tire.
Imaginary Circumstances
Coordinating activity and choice using imagined circumstances. Example: Imagining a zombie apocalypse and what you would do to survive.
Physical Disconnection
Coordinating activity and choice when physically disconnected from environment. Example: Using mental imagery to count the windows of your parent's house while sitting in the classroom.
Name an example of complete low-level systems.
Cricket phonotaxis: a technique female crickets use to locate their mates in a noisy environment. -Male crickets chirp with species-specific frequency, patterning and volume -Female crickets hear the chirps, identify their own species, locate the source and move toward it
What differentiates Barbara Webb's method from earlier methods?
Eliminates: 1. INTERNAL representations 2. CENTRALIZED processing 3. Identifying all sounds first 4. GENERALIZED mechanisms Introduces: 1. SPECIALIZED mechanisms 2. Problem solving distributed between brain, body and world 3. Biological accuracy
Describe the theories of embedded cognition vs. extended mind.
Embedded cognition: minds reside inside creatures that are embedded in their environments. Extended cognition: minds are extended systems that go beyond bodies.
Wideware
External or artificial cognitive aids. Example: artist sketches translate mental imagery into physical form, evaluate, re-evaluate, and repeat
Describe the importance of external linguistic labeling in cognition.
External tags and labels enable the brain to solve more complex problems. Labeling simplifies complex sensory patterns and allows the brain to find patterns. We can then use new words and labels for these patterns, and so on, indefinitely. Example: making comparisons between all green objects is really difficult without labeling the objects as green.
Name an example of agent-agent emergent and collective effects.
Flocking results from agents following 3 basic rules: 1. Stay near a mass of neighbors 2. Match the speed of neighbors 3. Don't get too close to any neighbor
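The three rules above can be turned into a minimal simulation. This is a toy sketch, not any particular published model: agents are points on a line for brevity, and the gain constants (0.05, 0.1) and the closeness cutoff are invented.

```python
# Toy 1-D flocking step: cohesion, velocity matching, separation.
# Assumes at least two agents; all constants are illustrative only.

def flock_step(positions, velocities, too_close=1.0):
    new_vel = []
    for i, (p, v) in enumerate(zip(positions, velocities)):
        others = [(q, w) for j, (q, w) in
                  enumerate(zip(positions, velocities)) if j != i]
        center = sum(q for q, _ in others) / len(others)
        mean_v = sum(w for _, w in others) / len(others)
        v += 0.05 * (center - p)      # rule 1: stay near the mass of neighbors
        v += 0.1 * (mean_v - v)       # rule 2: match neighbors' speed
        for q, _ in others:           # rule 3: don't get too close to anyone
            if abs(p - q) < too_close:
                v += 0.1 * (p - q)
        new_vel.append(v)
    positions = [p + v for p, v in zip(positions, new_vel)]
    return positions, new_vel
```

Iterating this step pulls the agents together while the separation term keeps them from colliding; no agent anywhere encodes the goal "form a flock".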
Discuss the response to the EMH objection regarding gestures.
Gesture might be a result of wanting to express, independent of cognitive benefits. BUT it could still provide benefits. We need to explore this further in people who have arms and can feel them when gesturing.
What is Nagel's take on good reductions and how is this a problem for consciousness?
Good reductions shift from the subjective to the objective, but that does not work for consciousness because we can't get rid of the subjective experience. We can't eliminate the one particular viewpoint of conscious experience. Example: subjective human senses lead to objective, reductionist explanations in terms of properties
Describe Nagel's comparison of physicalism compared to the hypothesis that matter is energy.
He is not arguing that physicalism is false, rather that we don't have the means to understand it if it is true. The status of physicalism is similar to the status the hypothesis of matter as energy would have had if uttered by a pre-Socratic philosopher. We do not have the beginnings of a conception of how it might be true.
What is Clark's view on cognitive technology?
His proposed answer to the question of how we should link AI and situated cognition with higher cognition: 1. Basic capacity for online, adaptive responses is applied to wideware 2. Brain is ADAPTED to use these external structures to help structure itself; we shape them and they shape us 3. There is an interaction between brain, body, world and external aids
Describe the use of labels in training chimps.
In an experiment, chimps were trained to label pairs of objects with symbols to represent first-order sameness. Then the chimps made higher order sameness comparisons between those symbols. Chimps without such training cannot do so.
What does Thomas Nagel claim about consciousness?
In his 1974 challenge for reductionist accounts of consciousness and cognition, he claims that if a being has conscious experience at all, then there is something IT IS LIKE TO BE THAT THING; subjective character of experience. Every subjective phenomenon is connected with a single point of view and an objective, physical theory abandons that point of view. Thus... What is it like to be a bat?
Why do representationalists argue that advanced reasoning requires inner representations?
In such cases, the agent must respond to stimuli not actually present. Coordinating this kind of activity seems to require some kind of inner item to stand in for the absent stimuli. The stand-in is inner representation.
What was Charles Darwins' discovery regarding earthworms?
Initially doubted that invertebrates could have inner lives, but conducted experiments that showed worms have: -Perception -Judgement -Intent -Experience
Describe the advent of multimodal integration as it pertains to cricket phonotaxis.
Integration of phonotaxis with optomotor reflex: correct for changes to visual field and orient body. Example: changes to currents, terrain and body angle
Limits of Biology
Language gives humans cognitive tools but not all animals can use them; some essential brain components must be present.
Peter Godfrey-Smith's Strong Continuity Hypothesis
Life and mind have a common abstract pattern or set of basic organizational properties. The functional properties of mind are an enriched version of the functional properties of life in general. Mind is literally life-like. Not that they are equivalent, but central features overlap. Example: Understanding life requires self-organization, collective dynamics, circular causal processes; understanding the mind requires these too.
Tierra Virtual Ecosystem
Life instantiated, not just modeled, on a computer "in silico", including: -Digital organisms compete for CPU time -Code copies itself with mutations -Dominant survival strategies change over time -Code exploitation, aka parasites
Bedau's Definition of Life
Life is supple adaptation; being capable of responding appropriately in a variety of ways to unpredictable circumstances. Virtual organisms would count as alive.
Provide an example of an AI system that is too limited in scope and fragile.
MYCIN, an AI that is good at diagnosing bacterial infections but bad at recognizing anything outside that narrow domain, such as that a patient bleeding out needs more urgent attention than the infection.
Describe Barbara Webb's proposed method of replicating phonotaxis.
Make use of the physiological systems of crickets and build a model: -Ears are located on the knees -The tracheal tube only transmits sounds of the desired frequency -Sound waves arrive out-of-phase on the side closer to the sound source -The cricket hears these louder than in-phase sounds -Interneurons fire when sound reaches a critical level at the side closest to the source -The cricket turns toward that side -At the start of each sound burst, the cricket reorients -The physical features of the tracheal tube amplify only the relevant frequencies of the correct species
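The resulting steering rule can be sketched roughly in code. This is a hedged toy model: the 0.5 directional gain and 0.6 firing threshold are invented, and the tracheal phase cancellation is collapsed into a simple loudness bias toward the nearer ear rather than modeled as waves.

```python
# Toy robot-cricket steering; all numeric parameters are invented.
import math

def ear_levels(source_angle_deg, loudness):
    """Return (left, right) effective levels; positive angle = source on right."""
    bias = math.sin(math.radians(source_angle_deg))
    left = loudness * (1 - 0.5 * bias)    # farther ear: partial cancellation
    right = loudness * (1 + 0.5 * bias)   # nearer ear: reinforced
    return left, right

def steer(source_angle_deg, loudness, threshold=0.6):
    """Turn toward whichever side's interneuron reaches threshold."""
    left, right = ear_levels(source_angle_deg, loudness)
    if max(left, right) < threshold:      # too quiet: no interneuron fires
        return "stay"
    return "turn_right" if right > left else "turn_left"
```

Note how little is needed: no sound classification, no map, no internal representation of "a male of my species over there", just a comparison between two levels repeated at each sound burst.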
Challenges of Abstraction
Many AI problems are solved by abstracting away perception and motor skills and using limited problem spaces, which constrains solutions. The hard parts of intelligence often involve figuring out what is relevant. There is no clean division between perception aka abstraction and reasoning in the real world.
Challenges for the Strong Continuity Hypothesis
Mind and life might share features, but the mind still involves reason-based transitions and abstraction; where are the parallels in life?
What is Brooks' view of representationalism?
Much of human level activity is merely a reflection of the world through very simple mechanisms without detailed representations.
Explain the MOMA example of the Extended Mind Hypothesis.
Navigating to MOMA: 1. Inga hears of an exhibit, thinks and recalls where MOMA is located and goes. 2. Otto has Alzheimer's and writes down new information in his notebook. He hears about the exhibit, retrieves the address from his notebook and goes. The physical vehicle is OUTSIDE of Otto's brain and thus, his belief is as well.
Describe the neural constructivism stance for the genesis of cognitive technology.
Neural growth is experience-dependent and involves constructing new circuitry. The circuitry changes because of agent-environment interactions.
Describe the cognitive dovetailing stance for the genesis of cognitive development.
Neural resources become structured to factor reliable resources and operations into the core parts of problem-solving routines.
Discuss the EMH objection regarding INFORMATION TRANSFER.
No cognitive processing actually happens outside the head. Information transfer happens, but that is not real cognitive processing; it is inert.
What is the classic view on animal intelligence?
Non-human animals are not intelligent and have no inner lives. Animal intelligence is problematic because: 1. Doubtful that animals can behave intentionally 2. Doubtful that conscious action could be a reasonable subject of science 3. Doubtful that conscious behavior isn't just some biased, anthropomorphic way of viewing phenomena
Occurrent vs Dispositional Beliefs
Occurrent: beliefs you are thinking about right now Dispositional: beliefs you are NOT thinking about right now
Describe the focus of older robotics and AI.
Older robotics and AI focused on isolated pieces of advanced cognition -Example: checkmate pattern in chess
What are the proposed criteria for non-biological inclusion in the dispositional belief system?
Only counts if it is: 1. Reliably available and typically invoked 2. Automatically endorsed 3. Easily accessible 4. Consciously endorsed in the past and as a result, is present Example: blind person's cane, artist's sketchpad and cochlear implants.
Discuss the EMH objection regarding Otto's BELIEF.
Otto only BELIEVES the address is in his notebook; he does not believe the address is located on 53rd street. The belief leads him to check his notebook and then go to 53rd street. There is no extended cognizing happening.
Discuss the EMH objection regarding GESTURES.
People born without arms still gesture with relevant circuitry firing, so extra body circuits might not be doing any cognizing.
Non-Nomic Properties
Properties that don't have to do with physical laws: -Being a genuine dollar bill -Being an admired author -Being a game-winning catch -Being a shirt
Brooks' Radical Hypothesis
Representation is the wrong unit of abstraction in building the bulkiest parts of intelligent systems. The evolutionary timeline supports this: 3.5-3.7 billion years ago, first single-celled organisms; 200 thousand years ago, early humans; 5 thousand years ago, writing. Most of evolution was spent on basic sensing and acting, not on explicit representation.
Describe how gestures can be seen as instances of the extended mind.
Research has shown that: -We gesture in the dark -Gesturing increases with harder tasks -Blind people gesture -Gesturing continuously informs and alters verbal thinking, and vice versa, in a cyclic manner Gestures are like writing thoughts on paper and can serve as extended cognition.
Explain the Tetris example of the Extended Mind Hypothesis.
Scenario for rotating pieces in Tetris: 1. Mental rotation 2. External rotation by pressing key 3. Retinal implant linked to thought 4. Martian with naturally-evolved circuitry of retinal implant Vehicles of content: Using the parity principle and given that scenario 4 counts as mental processing, circuitry outside of agent in scenario 3 should also count. Not vehicles of content: Scenario 2 only works in the arcade, so it is location-dependent.
Emergence as Incompressible Unfolding
Simulation-based method: a systemic feature is emergent if and only if it can be predicted in detail only by modeling the interactions that give rise to it. Drawback: too restrictive; seemingly emergent properties can sometimes be predicted by simulating only some of the interactions, and so would not count as emergent.
Emergence as Collective Self-Organization
System organizes itself, but there is no agent inside the system doing the organizing. Activity in simple components leads to a larger pattern.
Name an example of agent-agent and agent-environment emergent and collective effects.
Termite arch building: 1. Balls of dirt are randomly rolled and marked with scent 2. Dropped randomly at first 3. Natural clumping into columns 4. Dirt is deposited between columns where scent is greatest
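The steps above can be sketched as a toy stigmergy simulation (all parameters invented; real termite models are richer): dirt is dropped randomly at first, then mostly where scent is strongest, so a few "columns" emerge without any agent representing the goal.

```python
# Toy stigmergy: scent-biased dropping concentrates dirt into columns.
# The 10% random-drop rate and 0.99 evaporation factor are invented.
import random

def build(steps=2000, cells=20, seed=0):
    rng = random.Random(seed)        # fixed seed for a reproducible run
    scent = [0.0] * cells
    for _ in range(steps):
        if max(scent) == 0 or rng.random() < 0.1:
            spot = rng.randrange(cells)                       # random drop
        else:
            spot = max(range(cells), key=lambda c: scent[c])  # follow scent
        scent[spot] += 1.0           # deposit a scented ball of dirt
        scent = [s * 0.99 for s in scent]                     # evaporation
    return scent
```

After a run, almost all the scent (and hence dirt) ends up concentrated in a few cells, the "columns", even though each drop follows only the two local rules.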
What did Collins argue about Asimo?
Asimo-type walkers cannot overcome the energy and computational advantages of passive dynamics. "Human-like motion might come naturally to a human-like mechanism." Example: the capabilities of helicopters vs. gliders and planes.
The Extended Mind Hypothesis
The machinery of the mind isn't limited to our bodies. Some cognitive processes and mental states extend beyond the body.
Why is it important to study the interaction of brain, body and environment?
The naked brain won't tell us much - we must look at how we interact with our environments.
Emergence as Interactive Complexity
The process by which complex, cyclic characteristics give rise to stable patterns: 1. Weakly emergent = linear interactions + simple feedback loops 2. Strongly emergent = non-linear interactions + multiple asynchronous loops
What is an example of external memory storage?
The structuring of the external world to store information: -A bartender arranges drinking glasses in order of shape to recall drink orders -Tying a string around your finger -Sticky notes
Describe the relationship between information structuring and information processing.
They are continuously linked through sensorimotor loops: 1. Nervous system processes streams of sensory stimulation 2. Generates motor actions 3. Actions guide further production and selection of sensory information
Clark's Turbocharger Metaphor
Turbochargers are self-stimulating systems, and true cognizing systems are similar, sometimes utilizing extra-neural and extra-bodily circuitry such as gestures and Otto's notebook: -Cognitive processes create outputs -Outputs are recycled into inputs and drive the cognitive process along; e.g. speech, gesture, expressive movements, written words
Describe Brooks' concerns about the simple-to-complex piecemeal method of developing artificial intelligence.
Unclear that human intelligence could be broken down into pieces; even if it could, unlikely we would pick correctly. Instead, we need practice working with simpler, whole intelligence systems.
Why does emergence matter?
Understanding emergence could help us understand how simple inputs and agent-world interactions lead to complex behaviors.
Name an early example of complex contributions of body, action and environment to adaptive behavior.
Walking robots such as Asimo: -26 degrees of freedom -Joint-angle control systems -Powered operations -Careful planning and movement coordination Drawbacks: -Stiff motion -Energy-inefficient
Describe some key takeaways from Eileen Crist's paper on earthworms.
We can't see into each other's heads and compare qualia but we can learn about the experience of other humans. Could we therefore learn about the potential qualia of animals?
Discuss the response to the EMH objection regarding information transfer.
We should not underestimate the importance of information transfer. The pattern of information flow enforced by something can be what allows a cognitive system to perform. Example: a tri-layer connectionist net uses a feedback layer that simply transfers information and does not transform it, yet it drastically changes what the system can do.
How do we use words as more than external memory storage?
Words are used as objects and then we do things with those objects that we couldn't do before; with words and labels, we create new objects and interact with them.
What is the problem with emergence as collective self-organization?
Works well for systems of large numbers of identical elements following simple rules but can't handle systems with fewer elements that are different from each other. Example: robot cricket
The Extended Mind Hypothesis focuses on ______________.
vehicles of content