COGST 1101 Prelim 2


eye movements as function of grade level

average duration of fixations decreases as grade increases; regressions per 100 words decrease as grade increases; fixations per 100 words decrease as grade increases; number of words per fixation increases as grade increases; reading rate with comprehension (WPM) increases as grade increases

the lack of invariance (challenges in speech perception)

context determines the acoustic signal for a given phoneme; little difference between some phonemes and big difference between instances of the same phoneme [know diagrams]

cohort model

initial phoneme used to activate words starting with that phoneme (activated words form a cohort, words in the cohort are activated according to frequency); initial activation is bottom-up; context effects only after initial cohort activation (top-down); word recognition: elimination of cohort members as more input is received -- until only one candidate remains [know diagram]

the connectionist model

input = phonetic features and boundary cues; output = word boundary activation; hidden layer adjusted based on feedback [know diagrams]

high-amplitude sucking (speech discrimination studies)

infants tend to pay more attention to novel stimuli compared to habituated stimuli; infants increase their sucking rate when listening to novel stimuli (one-month-olds perceive the /ba/-/pa/ distinction) [know diagram]

modeling neurons

input from thousands of neurons; input is combined (summation); effectiveness depends on synaptic weight; output when there is enough excitatory input; inhibitory input counteracts excitatory input
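
The summation-and-threshold idea above can be sketched in a few lines of Python (the weights, threshold, and function name are my own illustration, not from the course):

```python
def neuron_output(inputs, weights, threshold):
    """Sum weighted inputs; inhibitory inputs carry negative weights."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two excitatory inputs (weight +0.6) and one inhibitory input (weight -1.0):
print(neuron_output([1, 1, 0], [0.6, 0.6, -1.0], 1.0))  # enough excitation -> 1
print(neuron_output([1, 1, 1], [0.6, 0.6, -1.0], 1.0))  # inhibition counteracts -> 0
```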

word perception summary

overarching theme: integration of bottom-up and top-down processing; phoneme recognition: problem of invariance and speaker variability with solution: categorical perception; speech segmentation: problem of speech segmentation with solution multiple cue integration; spoken word recognition: context effect and cohort model

neural networks

parallel, distributed representations, no distinction between procedures and representations, learning built in, bottom-up

visualizing speech signals

spectrogram: shows the frequencies of sounds produced over time

mathematical model of neural summation

[know diagram] neurons sum their inputs

physical symbol systems

"A physical symbol system has the necessary and sufficient means for intelligent action." -Newell & Simon; necessary: if something is intelligent, it depends on a physical symbol system; sufficient: if there is a physical symbol system present, it is intelligent

why computer science?

"By reasoning I understand computation. And to compute is to collect the sum of many things added together at the same time, or to know the remainder when one thing has been taken from another. To reason is therefore the same as to add or to subtract." -Thomas Hobbes; Shimon Edelman: everything computes, but the mind does it with purpose (e.g., for survival)

syntactic ambiguity

"One day, I shot an elephant in my pajamas." "One day, I shot an elephant in my pajamas (how he got in my pajamas, I don't know...)" (know trees)

4 simple ideas stemming from SHRDLU (1-2)

(1) its world was so simple that the entire set of objects and locations could be described by using as few as 50 words (nouns like block/cone and verbs like place on and move to and adjectives like big/blue) -- the possible combinations of these basic language building blocks were quite simple and the program was fairly adept at figuring out what the user meant (2) SHRDLU also included a basic memory to supply context; one could ask to "put the green cone on the red block" and then "take the cone off"; the cone would be taken to mean the green cone one had just talked about -- SHRDLU could search back further through the interactions to find the proper context in most cases when additional adjectives were supplied; one could also ask questions about the history (ex. did you pick up anything before the cone?)

the role of sentence context in visual word recognition

(1) the pirate found the treasure; (2) the person liked the treasure; (3) the house was destroyed by the treasure; subjects are faster at recognizing "treasure" in 1 than 2 than 3

4 simple ideas stemming from SHRDLU (3-4)

(3) a side effect of this memory, and the original rules SHRDLU was supplied with, is that the program could answer questions about what was possible in the world and what was not (ex. SHRDLU would deduce that blocks could be stacked by looking for examples, but would realize triangles couldn't be stacked after having tried it); the world contained basic physics to make blocks fall over, independent of the language parser (4) SHRDLU would remember names given to objects or arrangements of them (ex. one could say "a steeple is a small triangle on top of a tall rectangle" and SHRDLU could then answer questions about steeples in the blocks world and build new ones)

compounds

(ex. blackboard, president-elect, orange juice) have multiple roots and non-compositional meaning (a blackboard isn't a board that's black); different from 2-word phrases: stress is on first root rather than second (BLACKboard vs. black BOARD)

recursive tree

(know picture)

the computer metaphor of the mind

(know the diagram)

syntax & semantics in language of thought hypothesis (1)

I ate a muffin for breakfast. Ate a I breakfast muffin for. -> syntax is needed for meaning I ate a dinosaur for breakfast. -> meaning reflects the relationship between symbols and the things they refer to

problem solving with physical symbol systems

The cook team (Walt, Jesse, and Saul) is traveling with the distribution team (Gus, Mike, and Hector) [PROBLEM STATE]. They're trying to get across a river to their hideout [GOAL STATE]. There's a boat to move across the river, but it can only hold 2 people [OPERATIONS]. The cook team knows that the distribution team wants to kill them [PROBLEM STATE]. How can everyone get to the other side of the river so that [GOAL STATE] the cook team is never outnumbered by the distribution team? [CONSTRAINTS]
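
This puzzle can be solved mechanically by searching the problem space, which is exactly how a physical symbol system would treat it. A rough sketch (the state encoding and all names are my own; following the classic puzzle, only the banks are checked for the outnumbering constraint):

```python
from collections import deque

# A state is (cooks on start bank, distributors on start bank, boat on start bank?).
# Goal: everyone on the far bank -> (0, 0, False).

def safe(c, d):
    # cooks must never be outnumbered on a bank where any cooks stand
    return (c == 0 or c >= d) and (3 - c == 0 or 3 - c >= 3 - d)

def solve():
    start, goal = (3, 3, True), (0, 0, False)
    moves = [(1, 0), (2, 0), (0, 1), (0, 2), (1, 1)]  # the boat holds 1 or 2 people
    frontier, seen = deque([[start]]), {start}
    while frontier:
        path = frontier.popleft()          # breadth-first: shortest plan found first
        c, d, boat = path[-1]
        if path[-1] == goal:
            return path
        for dc, dd in moves:
            nc, nd = (c - dc, d - dd) if boat else (c + dc, d + dd)
            state = (nc, nd, not boat)
            if 0 <= nc <= 3 and 0 <= nd <= 3 and safe(nc, nd) and state not in seen:
                seen.add(state)
                frontier.append(path + [state])

solution = solve()
print(len(solution) - 1)  # number of crossings in the shortest plan -> 11
```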

recursion

The dog is in the house. The dog is in the house in the woods. The dog is in the house in the woods in PA. The dog is in the house in the woods in PA... this could go on infinitely long (although you wouldn't be able to process it after a few iterations)

mathematical model of neural output

[know diagram] output is based on activation level, where output is some function of total input and threshold

learning perceptron convergence rule

[know diagram] slowly adjust threshold and weights until they converge to a solution; requires a "teacher" that knows what output should be
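
A possible sketch of the convergence rule, with the "teacher" supplying target outputs; the learning rate, epoch count, and the choice of logical AND as the target function are my own illustration:

```python
def train_perceptron(examples, lr=0.25, epochs=20):
    w, threshold = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in examples:
            out = 1 if x1 * w[0] + x2 * w[1] >= threshold else 0
            error = target - out            # teacher signal: desired minus actual
            w[0] += lr * error * x1         # slowly adjust the weights...
            w[1] += lr * error * x2
            threshold -= lr * error         # ...and the threshold
    return w, threshold

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, t = train_perceptron(AND)
print([1 if x1 * w[0] + x2 * w[1] >= t else 0 for (x1, x2), _ in AND])  # [0, 0, 0, 1]
```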

logic circuits (single layer network)

[know diagram] this is computation; this is a "perceptron" that computes a boolean function

wason card selection task (meaning matters for problem solving)

[E] [G] [2] [5] if a card has a vowel on one side, then the other side has an even number... which cards should you turn to test this? [D] [ND] [25] [19] anyone who's drinking must be older than 21... which cards should you turn to test this? the task's importance is that it demonstrates how inconsistently people apply logical rules when the same problem is set in 2 different contexts with a very similar connection between the facts

neurons and synapses and neural summation

[know diagram]

turing machine

a subtracting algorithm: 5 - 3 = ? --(recode input)--> [1 1 1 1 1] - [1 1 1] --(recursively apply the "subtracting" rule)--> [1 1 1 1] - [1 1] .... [1 1] - [] --(recode end state for output)--> 2; this is analogous to being able to apply the "embed prepositional phrase" rule
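
The unary subtraction walkthrough above can be sketched as a recursive rule applied until a base case (a toy Python illustration, not an actual tape machine):

```python
def unary(n):
    return [1] * n                        # recode input: 5 -> [1, 1, 1, 1, 1]

def subtract(a, b):
    if not b:                             # nothing left to subtract:
        return len(a)                     # recode the end state for output
    return subtract(a[:-1], b[:-1])       # strip one mark from each side and recurse

print(subtract(unary(5), unary(3)))  # 5 - 3 -> 2
```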

chinese room

a thought experiment presented by John Searle to challenge the claim that it's possible for a computer running a program to have a "mind" and "consciousness" in the same sense that people do, simply by virtue of running the right program; to contest this view Searle states "Suppose that I'm locked in a room and ... that I know no Chinese, either written or spoken". He further supposes that he has a set of rules in English that "enable me to correlate one set of formal symbols with another set of formal symbols", that is, the Chinese characters. These rules allow him to respond, in written Chinese, to questions, also written in Chinese, in such a way that the posers of the questions - who do understand Chinese - are convinced that Searle can actually understand the Chinese conversation too, even though he cannot. Similarly, he argues that if there is a computer program that allows a computer to carry on an intelligent conversation in a written language, the computer executing the program would not understand the conversation either (possible for there to be syntactic symbol manipulation without any form of intelligence or understanding)

long-term potentiation (LTP)

action potential can trigger an increase in receptors in the dendrite [know diagram]

breadth first search

an algorithm for traversing or searching tree or graph data structures; it starts at the tree root (or some arbitrary node of a graph, sometimes referred to as a search key) and explores the neighbor nodes first, before moving to the next-level neighbors
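
A minimal sketch of breadth-first search over a dictionary-based tree (the graph and node names are made up):

```python
from collections import deque

def bfs(graph, root):
    visited, order = {root}, []
    queue = deque([root])
    while queue:
        node = queue.popleft()              # take the oldest frontier node
        order.append(node)
        for neighbor in graph[node]:        # explore all neighbors first...
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)      # ...before moving a level deeper
    return order

tree = {'A': ['B', 'C'], 'B': ['D', 'E'], 'C': ['F'], 'D': [], 'E': [], 'F': []}
print(bfs(tree, 'A'))  # level by level: ['A', 'B', 'C', 'D', 'E', 'F']
```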

depth first search

an algorithm for traversing or searching tree or graph data structures; it starts at the root (selecting some arbitrary node as the root in the case of a graph) and explores as far as possible along each branch before backtracking
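
A minimal sketch of depth-first search on a toy dictionary-based graph (names are made up): one whole branch is exhausted before backtracking to try siblings.

```python
def dfs(graph, node, visited=None):
    if visited is None:
        visited = []
    visited.append(node)
    for neighbor in graph[node]:
        if neighbor not in visited:
            dfs(graph, neighbor, visited)   # go deeper before trying siblings
    return visited

tree = {'A': ['B', 'C'], 'B': ['D', 'E'], 'C': ['F'], 'D': [], 'E': [], 'F': []}
print(dfs(tree, 'A'))  # one whole branch first: ['A', 'B', 'D', 'E', 'C', 'F']
```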

mcgurk effect (evidence for the influence of nonspeech input)

best fit between visual and auditory input; subjects see a speaker producing /ga/, subjects hear the sound /ba/, subjects perceive the sounds as /da/

neural network

biologically inspired models of the mind; bottom-up approach: can we model minds by modeling neurons and their connections? first step: can we model neurons and connections? YES, second step: how do we combine neurons into networks? CONNECT LOTS OF UNITS IN LAYERS AND USE BACKPROPAGATION

combining morphemes

can't just combine morphemes any old way you choose; morphemes are put together using rules, much like phrases are put together using phrase structure rules in syntax

morpheme combinations

carelessness -> morphemes have to be combined in a certain order: care + -less > careless; careless + -ness > carelessness; these morphemes, much like phrases in syntax, have hierarchical structure (study tree diagram)

connectionism

cognitive psychology: observe behavior under different conditions; cognitive neuroscience: observe neural activity (large scale -- EEG and fMRI; small scale -- electrophysiology); modeling: can a proposed system produce the right kind of output? e.g., logical mistakes, developmental stages, response time distributions; existence proofs, not proofs

game trees

combinations that can happen as a result of a game; the chess game tree has 20^50 states, which causes a combinatorial explosion; what's the best move? [know diagram]
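
The "what's the best move?" question is what minimax answers over a game tree: assume the opponent minimizes your outcome, and pick the branch whose worst case is best. A toy sketch (the tree and its leaf values are invented):

```python
def minimax(tree, maximizing=True):
    if isinstance(tree, int):                # leaf: the value of a final position
        return tree
    values = [minimax(child, not maximizing) for child in tree]
    return max(values) if maximizing else min(values)

# Two moves for us, each answered by two replies from the opponent:
game = [[3, 5], [2, 9]]
print(minimax(game))  # max(min(3, 5), min(2, 9)) -> 3
```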

sentences (S)

composed of an NP and a VP; S -> NP VP (Jackie sang, Sarah likes rabbits, Bob passed the ball to Sue)

the segmentation problem (challenges in speech perception)

continuous stream of speech, no breaks between words; difficult to segment the speech signal into words (to recognize speech vs. to wreck a nice beach)

productivity of morphology

creating and understanding utterances that we've never heard before; we know the rules that are used to create past tenses (for example) and are able to extend those rules to novel morphemes; we put together morphemes productively using non-declarative memory -- the knowledge is implicit

the language properties

description, arbitrariness, productivity, and ambiguity; hallmark of human intelligence, can we program it?

categorical perception (potential solution to speech perception problems)

different acoustic versions of a phoneme are perceived as the same phoneme; a perceptual discontinuity across a continuously varying physical dimension [know diagrams]

chinese room argument

directed against the philosophical positions of functionalism and computationalism, which hold that the mind may be viewed as an information-processing system operating on formal symbols

roots

do not have a predictable pattern of combination with other morphemes; often they can stand alone and their meaning percolates to the word as a whole (ex. washable -> wash)

intelligence

does it need to be human, organic, or evolved? (species chauvinism); what kinds of behaviors are necessary before we say something is intelligent? (functionalist); turing test: something is intelligent if you can't tell it apart from a person; chat bots typically work by matching input to appropriate output; but WAIT, everything so far is just rule-based transformation -- where is the understanding? symbol grounding problem and problem of intentionality

multi-layer networks and backpropagation

expected output is unknown for hidden layers; backpropagation algorithm; hidden units get credit for error in the next layer; units that contribute more to an output that is wrong get more adjustment <- error learning [know diagram]

heuristic search

exploits additional knowledge about the problem that helps direct search to more promising paths; a heuristic function provides an estimate of the cost of the path from a given node to the closest goal state
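
A sketch of heuristic search as greedy best-first on a small grid; the grid, the Manhattan-distance heuristic, and all names are my own illustration of "directing search to more promising paths":

```python
import heapq

def greedy_best_first(start, goal):
    def h(pos):  # heuristic: estimated cost from pos to the closest goal state
        return abs(pos[0] - goal[0]) + abs(pos[1] - goal[1])

    frontier = [(h(start), start, [start])]
    seen = {start}
    while frontier:
        _, (x, y), path = heapq.heappop(frontier)  # most promising node first
        if (x, y) == goal:
            return path
        for nx, ny in [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]:
            if 0 <= nx < 5 and 0 <= ny < 5 and (nx, ny) not in seen:
                seen.add((nx, ny))
                heapq.heappush(frontier, (h((nx, ny)), (nx, ny), path + [(nx, ny)]))

path = greedy_best_first((0, 0), (4, 4))
print(len(path) - 1)  # 8 steps on an open 5x5 grid: straight toward the goal
```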

neural network

extremely powerful, data-driven approach; networks are used for: modeling the development of grammar, Google image classification, Netflix recommendations, translation, speech recognition, weather, computationally obscure problems; desired output emerges from the system as a whole

supervised learning

faster and more flexible (error-driven); backpropagation (no known biological mechanism)

AI approaches to the mind

formal/rule-based AI: serial, logical, algorithmic operations (if-then); knowledge and operations can be described explicitly. parallel/connectionist systems: parallel processing; solutions, operations, and knowledge emerge through learning

spoken word recognition

frequency effect: high-frequency words are recognized more easily (cat vs. platypus); phonological neighborhood effect: word recognition is more difficult for words that have many similar-sounding neighbors (mail: rail, bail, wail...)

word superiority effect

from easiest to hardest to detect a letter: when presented in a word--D or J in WORD? when presented in a nonword--D or J in ORWD? when the letter is presented alone--D or J in D?

the continuity hypothesis (theory of language evolution)

gradual evolution of language; some language ability in existing non-human primates; overlap between language and other cognitive abilities; a large part of language is learned

affixes

have predictable patterns of combination with other morphemes; they generally cannot stand alone, and they usually don't carry the bulk of a word's meaning; in English, we mostly have prefixes and suffixes (ex. washable -> -able)

noun phrases (NP)

have the same distribution as a noun; NP -> N (water, cheese, rabbits), NP -> determiner N (the tree, a rabbit), NP -> Adj N (pretty colors, small rabbits)

verb phrases (VP)

have the same distribution as a verb; VP -> V (sleeps, hopped), VP -> V NP (ate an apple, plays the piano), VP -> V NP PP (gave a book to John)

the role of context in spoken word recognition

helps with word recognition: meaningful sentence context, steady speaking rate information, familiarity with speaker

means-end analysis

heuristic search: trim the search space down to make the search process more tractable; bring the current state closer to the goal (or sub-goal) state; are they different? what will make them less different? is it allowed? if yes, do it; if no, do something else; repeat... until the goal state is reached

what can we learn from neural networks?

how does it work? hidden layers transform input into something useful for the next layer [know diagram]; biological plausibility? neurons/networks are more complicated, time matters for real networks, cortex isn't homogeneous, what biological process implements backpropagation? some types of learning are more plausible than others

major types of writing systems (the nature of visual stimulus)

logographic: unique symbol for each morpheme (Chinese); syllabic: arbitrary symbols for each syllable (Japanese); alphabetic: each symbol approximates a phoneme (English)

syntax

looks at how we combine words in sentences; like morphemes, phrases are the building blocks of syntax; phrases are constructed and combined using phrase structure rules; like morphology, syntax is hierarchically organized and productive

frequency x regularity interaction

low-frequency words: regulars are recognized more easily than irregulars (yak vs. yacht); high-frequency words: regularity doesn't affect recognition rate (cat vs. have)

problem solving -- people

may not have a clear progression to a solution; depends on how the problem is represented; restructure to overcome fixedness or recognize analogy; PSS: look for search heuristics that reproduce human behavior (top-down analysis of what people do)

how to study language acquisition

methods for studying receptive language ability: high amplitude sucking, conditioned head turn, preferential looking paradigm; methods for studying expressive language: corpus study and elicited production

morphology

morphemes are the smallest units of meaning and the building blocks of words; can be either roots or affixes and are combined using rules; sound-meaning correspondences are generally arbitrary at the morpheme level

speech discrimination

phoneme: smallest unit of sound that makes a difference to meaning; contrastive phonemes: back vs. pack; noncontrastive phonemes: sport vs. pit; contrast isn't universal: Japanese doesn't distinguish /r/ and /l/ -> red = led

speech perception: smallest units

phonemes: the smallest units of sound that make a difference to meaning (e.g., cat vs. bat -- /k/ from cat and /b/ from bat are each a phoneme)

subfields (levels of analysis) of linguistics

phonetics -- physical sounds; phonology -- sound systems; morphology -- words (roots + affixes); syntax -- phrases, sentences; semantics -- word/sentence meaning; pragmatics -- discourse meaning

possible cues to help speech segmentation

phonological regularities (helpful: spaces, speech, grasp; exception: cats pajamas) lexical stress (helpful: spaces, fluent, basics, quickly; exception: between) utterance boundaries (helpful: speech -> each; exception: the) integration of multiple cues

general problem solver

physical symbol system (symbolic knowledge); production system: if...then rules; decision procedure moves the system to the goal

what is a physical symbol system?

physical: can be engineered symbol: patterns that represent something else (symbol structure); procedures are a way to transform symbol structures (these can be represented symbolically); thinking = algorithmic manipulation of symbols; like a turing machine (rules, action [read/write/move], and input: infinite tape of symbols)

symbol grounding problem (how do symbols become meaningful?)

problem about how words and thoughts become meaningful to speakers and thinkers; more fundamental than problem of intentionality; answer: words in a language are meaningful for us because we attach meanings to them, but we can't say the same thing about thoughts

problem solving with PSS

problem space; states defined by symbol structures; procedures (rule-based transformations and constraint satisfaction); problem solving through programming; can human thinking be characterized the same way?

coincidence model (implementation matters)

process by which a neuron can encode information by detecting the occurrence of temporally close but spatially distributed input signals; coincidence detectors influence neural processing by reducing temporal jitter, reducing spontaneous activity, and forming associations between separate neural events

the discontinuity hypothesis (theory of language evolution)

punctuated evolution of language; no language ability in existing non-human primates; language is a module separate from other cognitive abilities; a large part of language is innate

productivity of syntax

recursion is one way in which syntax is productive: an algorithm is performed on an input to get an output, then the same algorithm is performed on that output to get a second output, etc.
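
The "algorithm applied to its own output" idea can be sketched directly in code (the phrases echo the dog-in-the-house card; the function name is my own):

```python
def embed(base, modifiers):
    """Apply the 'embed prepositional phrase' rule once per modifier."""
    if not modifiers:                       # no more phrases to embed
        return base + "."
    # the same rule is applied to its own output:
    return embed(base + " in " + modifiers[0], modifiers[1:])

print(embed("The dog is in the house", ["the woods", "PA"]))
# -> The dog is in the house in the woods in PA.
```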

access routes in visual word recognition

a semantic route going directly from orthography to meaning; a phonological route going from orthography to sound to meaning [know diagram]

formal and rule-based approaches to AI

serial, logical, transparent; powerful -- capable of solving a variety of problems, but there are some difficulties

rule-based approaches

serial, symbolic representations, if-then rules, transparent, learning is a challenge, top-down

unsupervised learning

slow and limited (raw signal driven); hebbian learning (LTP and long-term depression)

review of challenges faced by vision and speech perception

solved by integrating knowledge with data (top-down and bottom-up processing); reading involves mapping visual words to meaning and to sound; connectionist models can solve these problems by learning from available data

the motor theory of speech perception (one possible explanation for mcgurk effect)

speech sounds are analyzed in terms of how they're produced; categorical perception: speech sounds are perceived categorically because speech is processed as intended phonetic gestures; mcgurk effect: both visual and auditory information about sound production is used in speech perception

mutilated checkerboard problem (meaning matters for problem solving)

suppose a standard 8x8 chessboard has 2 diagonally opposite corners removed, leaving 62 squares; is it possible to place 31 dominoes of size 2x1 so as to cover all of these squares?
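
The answer is no, and a coloring count shows why: two diagonally opposite corners share a color, yet every 2x1 domino must cover one light and one dark square. A quick check in Python (coordinates and names are my own):

```python
# Build the 8x8 board and remove two diagonally opposite corners:
board = {(r, c) for r in range(8) for c in range(8)}
board -= {(0, 0), (7, 7)}

# Color squares by coordinate parity, as on a real chessboard:
light = sum((r + c) % 2 == 0 for r, c in board)
dark = sum((r + c) % 2 == 1 for r, c in board)
print(light, dark)  # 30 32 -> unequal, so 31 dominoes can never cover the board
```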

symbols broken down into features

symbols are recognized in terms of distinctive and separable visual features; it's harder to find a particular letter if it's embedded among other letters with similar features; the context effect [know diagram]

syntax & semantics in language of thought hypothesis (1)

syntax is a set of formal, logic-based rules; allow for combination of ideas/inference; a is an object, F and G are properties: (1) Fa (2) Ga -> Fa & Ga (logical deducibility); a is an animal, F is large, G is sitting -> a is a large sitting animal (logical consequence)

3 types of procedures

syntax: language parsing and production (e.g., sentence procedure) [know diagram]; semantics: labels, sizes, spatial relationships and constraints (e.g., cleartop procedure); cognitive deduction: retrieval and comparison

hebbian learning

the connection between 2 neurons is strengthened when firing of one leads to firing of the other and weakened when firing of one does not lead to firing of the other; does this depend on a teacher as well? how does learning occur? -> occurs locally, there is no teacher or error signal
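
A sketch of the rule as stated above: strengthen the connection on joint firing, weaken it when the presynaptic unit fires alone; no teacher or error signal is involved (the rate and firing data are illustrative, not from the course):

```python
def hebbian_update(w, pre, post, rate=0.1):
    if pre and post:
        return w + rate        # fired together: strengthen the connection
    if pre and not post:
        return w - rate        # pre fired but post didn't: weaken it
    return w                   # no presynaptic firing: no local evidence

w = 0.5
for pre, post in [(1, 1), (1, 1), (1, 0), (0, 1)]:
    w = hebbian_update(w, pre, post)
print(round(w, 2))  # 0.5 + 0.1 + 0.1 - 0.1 + 0 -> 0.6
```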

interaural time difference (implementation matters)

the difference in arrival time of a sound between 2 ears; it's important in the localization of sounds as it provides a cue to the direction or angle of the sound source from the head; if a signal arrives at the head from one side, the signal has further to travel to reach the far ear than the near ear; this path length difference results in a time difference between the sound's arrivals at the ears, which is detected and aids the process of identifying the direction of the sound source

debate on the phonological route

the dual-route account: a rule-based mechanism dealing with regular orthography-to-phonology mappings (lint, mint); a table look-up mechanism for exceptions (pint) [know diagram]

descriptive grammar

the grammar that we spontaneously use and understand in everyday speech; this is what linguists care about

problem of intentionality

the property symbols have of being about things in the world; problem about how words and thoughts connect up with the world (what makes it the case that a given Chinese character refers to tables is that that is how it's used by people in China; if this is right, then we have a good answer to the question of how the symbols connect up with the world)

prescriptive grammar

the rules of "proper" language are referred to as prescriptive: they tell you how your language ought to be used based on some standard of educated speech/writing; these rules are often arbitrary/irrelevant to the way we use language every day

SHRDLU

the user instructed SHRDLU to move various objects around in the "blocks world" containing various basic objects; what made SHRDLU unique was the combination of 4 simple ideas that added up to make the simulation of "understanding" far more convincing; interacting modules, algorithms, language rules can be physically implemented, but *limited* to the micro-world [know diagram]

the language of thought hypothesis

thinking = syntactic operations on symbolic representations; representations = propositional attitudes; proposition: a statement with truth value (my nephew is adorable); attitude: beliefs, desires, positions; intentional realism: beliefs and desires are the causes of behavior; thinking has some properties of language; syntax vs. semantics

language of thought hypothesis summary

thinking has language-like properties; representations are physical symbol structures; syntax respects formal (logical) properties of representations; semantics follows from these

speaker variability (challenges in speech perception)

vocal tract differences among people

modeling human intelligence

what makes human intelligence different? general -- people can solve many kinds of problems; good enough, if not optimal or logical; coordinated with behavior; fast when it needs to be; learning; novel situations; creativity and insight

multi-layer networks

why do we need more than one layer? perceptrons can only compute some kinds of functions (XOR: true if A and B are different, false if they are the same); to compute any function more layers are needed, just add more units...units are the same as units in the single-layer network. how should the weights and thresholds be set? -> feedforward processing [know diagram]
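
Why one layer isn't enough can be shown concretely with XOR. The weights below are hand-set rather than learned, to keep the sketch short: one hidden unit computes OR, another computes AND, and the output unit fires for OR-but-not-AND, which is XOR.

```python
def unit(inputs, weights, threshold):
    """A single threshold unit, as in the single-layer network."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

def xor(x1, x2):
    h1 = unit([x1, x2], [1, 1], 1)     # hidden unit 1: OR
    h2 = unit([x1, x2], [1, 1], 2)     # hidden unit 2: AND
    return unit([h1, h2], [1, -1], 1)  # fires only when OR fires but AND doesn't

print([xor(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
```

No single perceptron can draw one line separating these four cases; the hidden layer re-represents the input so the output unit can.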

ambiguity

words and sentences can have more than one distinct meaning (so can items at other levels of linguistic analysis); the ambiguity in words/sentences is often a result of there being more than one way to derive the morphemic or sentential structure (ex. undoable: (1) un + do > undo; undo + able > undoable = 'able to be undone', (2) do + able > doable; un + doable > undoable = 'not able to be done')

