Chapter 6


Emotional Overtones

As learners pay attention to and think about new information, their thoughts and memories sometimes become emotionally charged—a phenomenon called hot cognition. For example, learners might get excited when they read about advances in science that could lead to effective treatments for cancer, spinal cord injuries, or mental illness. Or they might feel sadness and compassion when they read about severely impoverished living conditions in certain parts of the world. And they will, we hope, get angry when they learn about atrocities committed against African American slaves in the pre-Civil War days of the United States or about large-scale genocides carried out in more recent times in Europe, Africa, and Asia. When information is emotionally charged in such ways, learners are more likely to pay attention to it, continue to think about it for an extended period, and repeatedly elaborate on it (Bower, 1994; Heuer & Reisberg, 1992; Manns & Bass, 2016). And over the long run, learners can usually retrieve material with high emotional content more easily than they can recall relatively nonemotional information (LaBar & Phelps, 1998; Phelps & Sharot, 2008; Reisberg & Heuer, 1992). It appears that students' emotional reactions to classroom topics become integral parts of their network of associations in long-term memory (Barrett, 2017; Immordino-Yang & Gotlieb, 2017).

Academic subject matter certainly doesn't need to be dry and emotionless. In addition to presenting subject matter that evokes students' emotions, we can promote hot cognition by revealing our own feelings about a topic. For instance, we might bring in newspaper articles and other outside materials about which we're excited, or we might share the particular questions and issues about which we ourselves are concerned (Brophy, 2004; R. P. Perry, 1985).

Critiquing the three-component model

As mentioned earlier, the three-component model just described oversimplifies—and probably overcompartmentalizes—the nature of human memory. For example, attention may be an integral part of working memory, rather than the separate entity depicted in Figure 6.1 (Cowan, 2007; Kiyonaga & Egner, 2014; Oberauer & Hein, 2012). Furthermore, studies conducted by neuropsychologists and other researchers have yielded mixed results about whether working memory and long-term memory are distinctly different entities (e.g., Nee et al., 2008; Öztekin, Davachi, & McElree, 2010; Talmi, Grady, Goshen-Gottstein, & Moscovitch, 2005). Some psychologists have proposed that working memory and long-term memory simply reflect different activation states of a single memory (e.g., J. R. Anderson, 2005; Campo et al., 2005; Nee et al., 2008; Postle, 2016). According to this view, all information stored in memory is in either an active or an inactive state. Active information, which may include both incoming information and a certain bit of information previously stored in memory, is what people are currently paying attention to and thinking about— information we've previously described as being in working memory. As attention shifts, other pieces of information in memory become activated, and the previously activated information gradually fades into the inactive background. The bulk of information stored in memory is in an inactive state, such that people aren't consciously aware of it; this is information we've previously described as being in long-term memory. Despite its imperfections, the three-component model can help us remember aspects of human learning and memory that we teachers should take into account as we plan and teach lessons. 
For example, the model highlights the critical role of attention in learning, the limited capacity of attention and working memory, the interconnectedness of the knowledge that learners acquire, and the importance of relating new information to existing knowledge.
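To make the activation-state view described above more concrete, here is a minimal toy sketch in Python. It is our own didactic illustration, not a published cognitive model; all names (MemoryStore, attend_to, the decay constant, and the activation threshold) are invented for the example.

```python
# Toy illustration of the single-store "activation state" view of memory:
# all items live in one store; attention activates an item, and previously
# activated items gradually fade toward the inactive background.
# This is a didactic sketch, not a published cognitive model.

class MemoryStore:
    DECAY = 0.5             # fraction of activation kept when attention shifts
    ACTIVE_THRESHOLD = 0.5  # activation at or above this counts as "active"

    def __init__(self):
        self.activation = {}  # item -> activation level in [0, 1]

    def store(self, item):
        # New information enters the store in an inactive state by default.
        self.activation.setdefault(item, 0.0)

    def attend_to(self, item):
        """Shifting attention activates one item and lets others fade."""
        self.store(item)
        for other in self.activation:
            if other != item:
                self.activation[other] *= self.DECAY  # gradual fading
        self.activation[item] = 1.0

    def working_memory(self):
        """'Working memory' = the currently active subset of the one store."""
        return {i for i, a in self.activation.items()
                if a >= self.ACTIVE_THRESHOLD}

memory = MemoryStore()
for fact in ["nasal bone", "humerus", "sternum"]:
    memory.store(fact)

memory.attend_to("humerus")
print(memory.working_memory())  # only "humerus" is currently active
```

Note how nothing moves between separate stores here: "working memory" is just a view over whichever items happen to be active, which is the core of the single-memory proposal.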

Regular Practice and Review

As noted earlier, rehearsal—mindlessly repeating information over and over within the course of a few seconds or minutes—is a relatively ineffective way of getting information into long-term memory. But by "regular practice" here, we mean repetition over a lengthy time span: reviewing and using information and skills at periodic intervals over the course of a few weeks, months, or years. When practice is spread out in this manner—ideally in a variety of contexts—people of all ages learn something better and remember it longer (Lindsey, Shroyer, Pashler, & Mozer, 2014; Rohrer & Pashler, 2010; Soderstrom & Bjork, 2015). When learners continue to practice things they've already mastered, they eventually achieve automaticity: They can retrieve what they've learned quickly and effortlessly and can use it almost without thinking (J. R. Anderson, 2005; Pashler, Rohrer, Cepeda, & Carpenter, 2007; Proctor & Dutta, 1995). The next exercise will almost certainly illustrate your automaticity for one of your most basic skills. The correct answers are "red," "purple," "pink," and "green," in that order. Many skilled readers have trouble with this task because those particular letter combinations automatically evoke different responses: "blue," "green," "yellow," and "orange" (e.g., J. D. Cohen, Dunbar, & McClelland, 1990; Stroop, 1935). Most young children—those who haven't yet acquired automaticity for basic word recognition—have a much easier time answering correctly. By and large, of course, automaticity for basic knowledge and skills is an advantage rather than a liability, in that learners can easily retrieve knowledge they frequently need. Learning some things to a level of automaticity has a second advantage as well. Remember that working memory has a limited capacity: The active, consciously thinking part of the human memory system can handle only so much at a time.
When much of its capacity must be used for recalling isolated facts or carrying out simple procedures, little room is left for addressing more complex situations or tasks. One key reason for learning some facts and procedures to the point of automaticity, then, is to free up working memory capacity for complex tasks and problems that require those simpler facts and procedures (De La Paz & McCutchen, 2011; L. S. Fuchs et al., 2013; Kalyuga, 2010). For example, second graders who are reading a story can better focus their efforts on understanding it if they don't have to sound out words like before and after. High school chemistry students can more easily interpret the expression Na2CO3 (sodium carbonate) if they don't have to stop to think about what the symbols Na, C, and O represent. Unfortunately, automaticity is achieved in only one way: practice, practice, and more practice. Practice doesn't necessarily make perfect, but it does make knowledge more durable and more easily retrievable. When learners use information and skills frequently, they essentially pave their retrieval pathways—in some cases creating superhighways. This is not to say that we should continually assign drill-and-practice exercises involving isolated facts and procedures (e.g., see Figure 6.12). Such activities promote rote (rather than meaningful) learning, are often boring, and are unlikely to convince students of the value of the subject matter (Mac Iver, Reuman, & Main, 1995). A more effective approach is to routinely incorporate basic knowledge and skills into a variety of meaningful and enjoyable activities, such as problem-solving tasks, brainteasers, group projects, and games.
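The "repetition over a lengthy time span" idea above can be turned into a concrete review plan. The sketch below generates review dates with expanding gaps between sessions; the doubling interval is our own illustrative choice for spreading practice out, not a schedule prescribed by the research cited above.

```python
# Sketch of an expanding-interval review schedule for spaced practice.
# The doubling rule is an illustrative assumption, not a research finding.

from datetime import date, timedelta

def review_schedule(first_practice, n_reviews, initial_gap_days=1):
    """Return review dates whose gaps double after each session."""
    schedule = []
    gap = timedelta(days=initial_gap_days)
    next_review = first_practice
    for _ in range(n_reviews):
        next_review += gap
        schedule.append(next_review)
        gap *= 2  # each successive review is spaced further out
    return schedule

# Reviews land 1, 3, 7, 15, and 31 days after the first practice session.
for review_date in review_schedule(date(2024, 9, 2), 5):
    print(review_date)
```

A teacher might use such a plan to fold previously mastered facts back into later warm-up activities or games, rather than as stand-alone drill.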

How knowledge can be organized

As we consider the nature of long-term memory storage, it's helpful to remember that to a considerable degree, learners construct their knowledge and understandings. In the process of constructing knowledge, learners often create well-integrated entities that encompass particular ideas or groups of ideas. For example, beginning in infancy, human beings form concepts that enable them to categorize objects and events (G. Mandler, 2011; J. M. Mandler, 2007; Quinn, 2002). Some concepts, such as butterfly, chair, and backstroke, refer to a fairly narrow range of objects or events. Other concepts are fairly general ones that encompass many more-specific concepts. For example, the concept insect includes ants, bees, and butterflies (e.g., see Figure 6.4), and the concept swimming includes the backstroke, dog paddle, and butterfly. In the examples here, notice how two different meanings of the word butterfly fall into two distinctly different category groups, which might lead you to follow a train of thought such as this one: horse → cowboy → lasso → rope → knot → Girl Scouts → camping → outdoors → nature → insect → butterfly → swimming By combining numerous objects or events into single entities, concepts take some of the strain off of working memory's limited capacity (G. Mandler, 2011; Oakes & Rakison, 2003). For instance, the concept molecule takes very little "space" in working memory despite the many things we know about molecules, such as their composition and very tiny size. The Into the Classroom feature "Teaching Concepts" offers suggestions for fostering concept learning in a variety of academic disciplines. Learners also pull some concepts together into general understandings of what things are typically like. Such understandings are sometimes called schemas (e.g., Rumelhart & Ortony, 1977; Kalyuga, 2010; Schraw, 2006). For example, let's return once again to the concept horse. 
You know what horses look like, of course, and you can recognize one when you see one. Thus you have a concept for horse. But now think about the many things you know about horses. What do they eat? How do they spend their time? Where are you most likely to see them? You can probably retrieve many facts about horses, perhaps including their fondness for oats and carrots, their love of grazing and running, and their frequent appearance in pastures and at racetracks. The various things you know about horses are closely interrelated in your long-term memory in the form of a "horse" schema. People have schemas not only about objects but also about events. When a schema involves a predictable sequence of events related to a particular activity, it's sometimes called a script. The next exercise provides an example. You probably had no trouble making sense of the passage because you've been to a doctor's office yourself and have a schema for how those visits usually go. You can therefore fill in a number of details that the passage doesn't tell you. For example, you probably inferred that John actually went to the doctor's office, although the story omits this essential step. Likewise, you probably concluded that John took off his clothes in the examination room, not in the waiting room, even though the story doesn't tell you where John did his striptease. When critical information is missing, as is true in the story about John, schemas and scripts often enable learners to fill in the gaps in a reasonable way. On a much larger scale, humans—young children included—construct general understandings and belief systems, or personal theories, about particular aspects of the world (Barger & Linnenbrink-Garcia, 2017; Gelman, 2003; Keil & Newman, 2008). Such theories include many concepts and the relationships among them (e.g., correlation, cause-and-effect). To see what some of your own theories are like, try the next exercise.
Chances are, you concluded that the coffeepot was transformed into a bird feeder but that the raccoon was still a raccoon despite its cosmetic makeover and stinky surgery. Now how is it possible that the coffeepot could be made into something entirely different, whereas the raccoon could not? Even young children seem to make a basic distinction between human-made objects (e.g., coffeepots, bird feeders) and biological entities (e.g., raccoons, skunks) (Gelman & Kalish, 2006; Inagaki & Hatano, 2006; Keil, 1986, 1989). For instance, human-made objects are defined largely by the functions they serve (e.g., brewing coffee, feeding birds), whereas biological entities are defined primarily by their origins (e.g., the parents who brought them into being, their DNA). Thus, when a coffeepot begins to hold birdseed rather than coffee, it becomes a bird feeder because its function has changed. But when a raccoon is cosmetically and surgically altered to look and smell like a skunk, it still has raccoon parents and raccoon DNA and so can't possibly be a skunk. By the time children reach school age, they've already constructed basic theories about their physical, biological, social, and psychological worlds (Flavell, 2000; Geary, 2005; Gelman, 2003; Torney-Purta, 1994). In general, self-constructed theories help children make sense of and remember personal experiences, classroom subject matter, and other new information. Yet because children's theories often evolve with little or no guidance from more knowledgeable individuals, they sometimes include erroneous beliefs that interfere with new learning (more about this point in the discussion of conceptual change later in the chapter).

Basic assumptions of cognitive psychology

At the core of cognitive psychology are several basic assumptions about how people learn. One of these assumptions is that cognitive processes influence what is learned. The specific things people mentally do as they try to interpret and remember what they see, hear, and study—that is, their cognitive processes—have a profound effect on what they specifically learn and remember. For example, in the opening case study, Kanesha thinks about the nasal bone and humerus in ways that should help her remember them. However, she thinks about the sternum in a way that interferes with her ability to remember it correctly, and she gives little or no thought to why certain other bones have particular names. The extent to which Kanesha thinks about the material she needs to learn—and also how she thinks about it—affects her performance on the quiz.

Diversity in Cognitive Processes

Children and adolescents differ considerably in the various factors that influence their ability to learn and remember in the classroom, including their attention spans, working memory capacity, executive functioning, long-term memory storage processes, and prior knowledge. For example, students with a relatively small working memory capacity and poorly developed executive functioning are apt to have trouble remembering instructions, tackling complex tasks, and keeping their minds on a task at hand—all of which adversely affect their academic achievement levels (Alloway, Gathercole, Kirkwood, & Elliott, 2009; DeMarie & López, 2014; Miyake & Friedman, 2012). Working memory and executive functioning difficulties are especially common in children who have grown up in chronically stressful living conditions, often as a result of living in extreme poverty (G. W. Evans & Schamberg, 2009; Masten et al., 2012; Noble, McCandliss, & Farah, 2007). Meanwhile, students who have grown up in multi-language environments and hence are fluent in at least two different languages—that is, students who are bilingual—tend to surpass their single-language peers in both working memory capacity and executive functioning, probably because they've had considerable practice in mentally inhibiting their knowledge of one language while speaking or writing in the other (Bialystok, Craik, & Luk, 2008; Grundy & Timmer, 2017). Another noteworthy source of diversity in cognitive processing is cultural background. Different cultures foster somewhat different ways of looking at physical and social events—different worldviews—that influence how students interpret classroom subject matter. For example, students whose cultures have taught them to strive to live in harmony with their natural environment may struggle with a science curriculum that urges them to change their environment in some way (Atran, Medin, & Ross, 2005; Medin, 2005). 
And whereas students of European ancestry are apt to view the Europeans' migration to North America in the 1600s and 1700s as a process of settlement, students with Native American backgrounds might instead view it as an invasion (Banks, 1991; VanSledright & Brophy, 1992). Children's varying cultural backgrounds may also have prepared them to handle different kinds of learning environments and tasks. For instance, African American and Hispanic students are more likely than European American students to be comfortable in environments in which several activities are going on at once and can more easily shift their attention from one activity to another (Correa-Chávez, Rogoff, & Mejía Arauz, 2005; Tyler, Uqdah et al., 2008). Students from North American, East Asian, and Middle Eastern cultures are apt to have had experience rote-memorizing specific facts and written materials (perhaps in the form of multiplication tables, poems, or religious teachings), whereas students from certain cultures in Africa, Australia, and Central America may have been encouraged to remember oral histories or particular landmarks in the local terrain (L. Chang et al., 2011; Rogoff, 2001, 2003; Rogoff et al., 2007; Q. Wang & Ross, 2007). The importance of wait time depends partly on students' cultural backgrounds as well. For example, some Native American students may wait several seconds before responding to a question as a way of showing respect for an adult (Castagno & Brayboy, 2008; Gilliland, 1988). And English language learners—students who have grown up in a non-English-speaking environment and are still developing their proficiency in English—are apt to require more mental translation time than their native-English-speaking peers (Igoa, 2007). To maximize each student's learning and achievement in the classroom, we must take such individual and group differences into account.
For example, we should be especially careful to engage the interest of—and also minimize distractions for—those students whose attention easily wanders. In addition, in our attempts to promote meaningful learning and other effective storage processes, we should relate classroom subject matter to the diverse background experiences that students have had. And we must also allow sufficient wait time after questions and comments so that all students can actively think about and elaborate on topics of discussion.

Model of human memory

Cognitive psychologists have offered many explanations regarding how people mentally process and remember new information and events—explanations that fall into the general category of information-processing theory. Some early explanations portrayed human thinking and learning as being similar to the ways computers operate. It has since become clear, however, that the computer analogy is too simple: People often think about and interpret information in ways that are hard to explain in the relatively simplistic, one-thing-always-leads-to-another ways that are typical for computers (e.g., Hacker, Dunlosky, & Graesser, 2009a; G. Marcus, 2008). Central to information processing theory is the concept of memory. In some instances, we'll use this term to refer to learners' ability to mentally save previously learned knowledge or skills over a period of time; in other instances, we'll use it when talking about a particular location where learners "put" what they learn—perhaps in working memory or long-term memory. The process of putting what is being learned into memory is called storage. For example, each time you go to class, you undoubtedly store some of the ideas presented in a lecture or class discussion. You may store other information from class as well—perhaps the name of the person sitting next to you (George) or the pattern of the instructor's shirt (a ghastly combination of orange and purple splotches). Yet learners rarely store information exactly as they receive it. Instead, they engage in encoding, modifying the information in some way. For instance, when listening to a history lecture, you might imagine what certain historical figures may have looked like—thus encoding some verbal input as visual images. And when you see your instructor's orange and purple shirt, you might think, "This person desperately needs a wardrobe makeover"—thus assigning a specific meaning and interpretation to what you've seen. 
At some point after storing a piece of information, you may discover that you need to use it. The process of recalling previously stored information—that is, finding it in memory—is called retrieval. The following exercise illustrates this process. As you probably noticed when you tried to answer these questions, retrieving some kinds of information from memory—your name, for instance—is quick and easy. Other things—perhaps the capital of France (Paris) and the year of Columbus's first voyage across the Atlantic (1492)—can be retrieved only after a little bit of thought and effort. Still other pieces of information, even though you may have stored them in memory at one time, may be almost impossible to retrieve. Perhaps the correct spelling of hors d'oeuvre falls into this category. Despite their common use of such terms as storage, encoding, and retrieval, information-processing theorists don't all agree about the precise nature of human memory. However, many suggest that memory has three key components: a sensory register, a working (short-term) memory, and a long-term memory. A three-component model of human memory, based loosely on one proposed by Atkinson and Shiffrin in 1968 with modifications to reflect more recent research findings, is presented in Figure 6.1. The model oversimplifies the nature of memory to some degree (more about this point later), but it provides a good way to organize much of what we know about how memory works. Please note that in referring to three components of memory, we're not necessarily referring to three separate parts of the brain. The model of memory we describe here has been derived largely from studies of human behavior, rather than from studies of the brain.
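The storage, encoding, and retrieval terminology introduced above can be loosely illustrated with a toy key-value store. This is only an analogy we are adding for illustration; the function names mirror the chapter's vocabulary, but the code does not come from the text, and real human encoding is far richer than a string transformation.

```python
# Didactic sketch mapping the chapter's three memory processes onto
# operations over a simple key-value store. An illustrative analogy only,
# not a model from the text.

long_term_memory = {}

def encode(raw_input):
    """Encoding: learners modify input rather than copying it verbatim."""
    # Stand-in for assigning a personal meaning or interpretation.
    return f"my interpretation of: {raw_input}"

def store(topic, raw_input):
    """Storage: putting (encoded) information into memory."""
    long_term_memory[topic] = encode(raw_input)

def retrieve(topic):
    """Retrieval: finding previously stored information; it can fail."""
    return long_term_memory.get(topic)  # None models a failed retrieval

store("capital of France", "Paris")
print(retrieve("capital of France"))
print(retrieve("spelling of hors d'oeuvre"))  # never stored: retrieval fails
```

The point of the analogy is the asymmetry: what comes out of memory is the encoded interpretation, not the original input, and a retrieval attempt can simply come up empty.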

Visual imagery

Earlier we mentioned imagery as one possible way in which information might be encoded in long-term memory. Many research studies have shown that visual imagery—forming mental pictures of objects or ideas—can be a highly effective method of storing information (Sadoski & Paivio, 2001; D. L. Schwartz & Heiser, 2006; Urgolites & Wood, 2013). We authors are classifying it as a form of meaningful learning in part because it often involves making use of other visual images that are already stored in long-term memory. To show you how effective visual imagery can be, the next exercise will teach you a few words in Mandarin Chinese. Did the Chinese words remind you of the visual images you stored? Did the images, in turn, help you remember the English meanings? You may have remembered all five words easily, or you may have remembered only one or two. People differ in their ability to use visual imagery: Some form images quickly and easily, whereas others form them only slowly and with difficulty (Behrmann, 2000; J. M. Clark & Paivio, 1991; Kosslyn, 1985). In the classroom, we can encourage the use of visual imagery in several ways. We can ask students to imagine how certain events in literature or history might have looked (Johnson-Glenberg, 2000; Sadoski & Paivio, 2001). We can provide visual materials (pictures, charts, graphs, etc.) that illustrate important but possibly abstract ideas (R. K. Atkinson et al., 1999; R. Carlson, Chandler, & Sweller, 2003; Verdi, Kulhavy, Stock, Rittschof, & Johnson, 1996). We can also ask students to create their own illustrations or diagrams of the things they're studying, as 9-year-old Trisha has done in the art shown in Figure 6.7 (Schwamborn, Mayer, Thillmann, Leopold, & Leutner, 2010; van der Veen, 2012; Van Meter & Garner, 2005). Visual imagery can be especially powerful when used in combination with other forms of encoding.
For example, students more readily learn and remember information they receive in both a verbal form (e.g., a lecture or textbook passage) and a graphic form (e.g., a picture, map, or diagram) (R. E. Mayer, 2011b; Moreno, 2006; Rau, Aleven, & Rummel, 2015). They're also likely to benefit from being explicitly asked to represent information both verbally and visually—that is, as both words and pictures.

How declarative knowledge is learned

Especially when talking about the kinds of declarative knowledge acquired at school, learning theorists distinguish between two general forms of long-term memory storage processes—rote learning and meaningful learning—and among more specific storage processes that differ considerably in their effectiveness (see Table 6.3).

Promoting Conceptual Change

For the reasons just identified, promoting conceptual change can be quite a challenge. Not only must we help students learn new things, but we must also help them unlearn—or at least inhibit—their existing beliefs. Following are strategies that seem to have an impact, especially when used in combination.

Why Learners Sometimes Forget

Fortunately, people don't need to remember everything they've stored. For example, you may have no reason to remember the Internet address of a website you looked at yesterday, the plot of last week's episode of a certain television show, or the due date of an assignment you turned in last semester. Much of the information learners encounter is, like junk mail, not worth keeping, and forgetting it enables learners to get rid of needless clutter. But sometimes learners have trouble recalling what they do need to remember. Here, we look at several possible explanations for why students may sometimes forget important information.

Developmental Trends in Storage Processes for Declarative Information

Meaningful learning—relating new information to prior knowledge—probably occurs in one form or another at virtually all age levels. More specific strategies—such as rehearsal, internal organization, and visual imagery—are fairly limited in the early elementary years but increase in both frequency and effectiveness over the course of childhood and adolescence. The frequency of elaboration—especially as a process that learners intentionally use—picks up a bit later, often not until adolescence, and is more common in high-achieving students. Table 6.4 provides more detailed information on the nature of long-term memory storage processes at different grade levels.

Learning, memory, and the brain

Historically, theorists and researchers have believed that the physiological basis for most learning and memory lies in changes in the interconnections among neurons—in particular, in forming new synapses, strengthening existing ones, or eliminating counterproductive ones (e.g., M. I. Posner & Rothbart, 2007; Siegel, 2012; Trachtenberg et al., 2002). In addition, some learning may involve the formation of new neurons, especially in a small, seahorse-shaped structure in the middle of each side of the brain—a structure called the hippocampus—and possibly also in certain areas of the cortex. New learning experiences appear to enhance the survival rate and maturation of the young neurons; without such experiences, these neurons slowly die away (Kaku, 2014; Leuner et al., 2004; Shors, 2014; Spalding et al., 2013). Some neuroscientists have proposed that certain star-shaped cells in the brain, known as astrocytes, are just as important as neurons—possibly even more important—in learning and memory. Figure 6.3 illustrates the general nature of an astrocyte and its connections with both neurons and the local blood supply. In humans, astrocytes far outnumber neurons, have many connections with one another as well as with neurons, and appear to have considerable control over what neurons do and don't do and how much neurons communicate with one another. A normal brain produces many new astrocytes throughout its lifespan (X. Han et al., 2013; Koob, 2009; Oberheim et al., 2009). As for where learning occurs, the answer is: many places. Key in the process is the cortex, the large, lumpy structure that covers the top and sides of the brain. The part of the cortex that's right behind the forehead—the prefrontal cortex—seems to be the primary headquarters for working memory and its central executive, although all of the cortex may be active to a greater or lesser extent in interpreting new input in light of previously acquired knowledge (C. 
Blair, 2016; Chein & Schneider, 2012; Gonsalves & Cohen, 2010; Nee, Berman, Moore, & Jonides, 2008). The hippocampus is also actively involved in learning, in that it pulls together the information it simultaneously receives from various parts of the brain (Davachi & Dobbins, 2008; Shohamy & TurkBrowne, 2013; Shors, 2014). As you might guess, a healthy brain is essential for effective learning. The Applying Brain Research feature "Enhancing Students' Brain Functioning" presents four general recommendations that are grounded in brain research.

Nature of the sensory register

If you have ever played with a lighted sparkler at night, you've seen a trail of light that follows the sparkler as you wave it about. If you have ever daydreamed in class, you may have noticed that when you tune back in to a lecture, you can still hear the three or four words that were spoken just before you started paying attention to your instructor again. The sparkler's trail and the words that linger aren't actually out there in the environment. Instead, they're recorded in your sensory register. The sensory register is the component of memory that holds the information you receive—the input—in more or less its original, unencoded form. Thus, visual input is stored in a visual form, and auditory input is stored in an auditory form (e.g., Coltheart, Lea, & Thompson, 1974; Cowan, 1995). The sensory register has a large capacity: It can hold a great deal of information at any one time. That's the good news. The bad news is that information stored in the sensory register doesn't last very long (e.g., Cowan, 1995; Dahan, 2010; Wingfield & Byrnes, 1981). Visual information (i.e., what you see) probably lasts for less than a second. Auditory information (i.e., what you hear) probably lasts slightly longer, perhaps for 2 or 3 seconds. To keep information for any time at all, then, learners need to move it to working memory. Whatever information isn't moved is probably lost, or forgotten.
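The sensory register's two key properties, large capacity but very brief duration, can be sketched as a toy model. The duration constants below follow the rough figures just given in the text; everything else (class and method names, the timestamp mechanism) is our own illustrative scaffolding.

```python
# Toy sketch of the sensory register: input is held in unencoded form,
# briefly, with auditory traces outlasting visual ones. Durations follow
# the rough figures in the text; the rest is illustrative scaffolding.

VISUAL_DURATION = 1.0    # seconds ("probably lasts for less than a second")
AUDITORY_DURATION = 3.0  # seconds ("perhaps for 2 or 3 seconds")

class SensoryRegister:
    def __init__(self):
        self.traces = []  # (modality, content, time_received); large capacity

    def receive(self, modality, content, now):
        # Input is stored in more or less its original, unencoded form.
        self.traces.append((modality, content, now))

    def available(self, now):
        """Traces still readable at time `now`; older ones have faded."""
        limit = {"visual": VISUAL_DURATION, "auditory": AUDITORY_DURATION}
        return [content for modality, content, t in self.traces
                if now - t < limit[modality]]

register = SensoryRegister()
register.receive("visual", "sparkler trail", now=0.0)
register.receive("auditory", "last few words of the lecture", now=0.0)
print(register.available(now=2.0))  # only the auditory trace remains
```

Two seconds after both inputs arrive, the sparkler trail is gone but the lingering words can still be "replayed", which is why the daydreaming student can catch up on the last few words of the lecture.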

Relevant Retrieval Cues

If you were educated in North America, then at one time or another you probably learned the names of the five Great Lakes. Yet, at any given moment, you might have trouble retrieving all five, even though they're all still stored somewhere in your long-term memory. Perhaps Lake Michigan doesn't come to mind when you recall the other four. The HOMES mnemonic presented in Figure 6.9 provides a retrieval cue—a hint about where to "look" in long-term memory. The mnemonic tells you that one lake begins with the letter M, prompting you to search among the M words you know until (we hope) you find Michigan. Learners are more likely to retrieve information when relevant retrieval cues are present to start their search of long-term memory in the right direction (e.g., Morris, Bransford, & Franks, 1977; Tulving & Thomson, 1973). Providing retrieval cues is often useful in the classroom, especially when students have trouble recalling information that might help them remember or apply other information. An example comes from Jess Jensen, a former teacher intern of one of us authors. A student in her eighth-grade history class had been writing about the Battle of New Orleans, which was a decisive victory for the United States in the War of 1812. The following exchange took place:

Student: Why was the Battle of New Orleans important?
Jess: Look at the map. Where is New Orleans? [The student finds New Orleans.]
Jess: Why is it important?
Student: Oh! It's near the mouth of the Mississippi. It was important for controlling transportation up and down the river.

In the early grades, teachers typically provide many retrieval cues for their students; for instance, they remind students about tasks they need to do at certain times ("I hear the fire alarm. Remember, we must all walk quietly during a fire drill"). But as students grow older, they must develop greater independence, relying more on themselves and less on their teachers for the things they need to remember.
At all grade levels we can teach students ways of providing retrieval cues for themselves. For example, if we expect first graders to get a permission slip signed, we might ask them to write a reminder on a piece of masking tape that they put on their jacket or backpack. If we give junior high school students a major assignment due in several weeks, we might suggest that they tape a note with the due date to their bedside table or add one or more reminders to their cell phone calendar. One tenth grader developed several effective retrieval cues, each appropriate for certain situations:

Meaningful Learning

In contrast to rote learning, meaningful learning involves recognizing a relationship between new information and one or more things already stored in long-term memory. Whenever we use such words as comprehension and understanding, we're talking about meaningful learning. In the vast majority of cases, meaningful learning is more effective than rote learning for storing information in long-term memory. It's especially effective when learners relate new ideas not only to what they already know about the world but also to what they know or believe about themselves—for instance, to self-descriptions or personal life experiences (Craik, 2006; Heatherton, Macrae, & Kelley, 2004; Nairne & Pandeirada, 2016). Meaningful learning takes a variety of forms. Three forms that researchers have studied in depth are elaboration, organization, and visual imagery. All three are constructive in nature: They involve combining several pieces of information into a meaningful whole.

Elaboration

In elaboration, learners use their prior knowledge to embellish on a new idea, thereby storing more information than was actually presented. For example, when one of us authors had a class in Mandarin Chinese in high school, she learned that the Chinese word wŏmen means "we." "Aha!" she thought to herself, "the sign on the restroom that we girls use says wŏmen!" Similarly, when a student learns that the crew on Christopher Columbus's first trip across the Atlantic threatened to revolt, the student might speculate, "I bet the men were really frightened when they continued to travel west day after day without ever seeing signs of land." As we'll see later in the chapter, learners sometimes elaborate on new information in inaccurate and counterproductive ways. On average, however, the more students elaborate on new material—that is, the more they use what they already know to help them understand and interpret new material—the more effectively they will store and remember it. Thus, students who regularly elaborate on what they learn in school usually show higher achievement than those who simply take information at face value (J. R. Anderson, 2005; McDaniel & Einstein, 1989; Paxton, 1999). One effective way to encourage elaboration in the classroom is to have students talk or write about a topic—for instance, to summarize what they've learned, relate new concepts to their personal experiences, or express and defend certain positions on controversial topics (e.g., Bangert-Drowns, Hurley, & Wilkinson, 2004; Shanahan, 2004). Another good strategy is to ask questions that require students to draw inferences from or in some other way expand on something they've just learned—questions such as

Moving Information to Long-Term Memory

In the model of memory depicted in Figure 6.1, the arrow between working memory and long-term memory points in both directions. In most cases, effectively storing new information in long-term memory involves connecting it to relevant information that's already in long-term memory—a process that requires bringing the "old" information back into working memory. The following exercise can give you an idea of how this might happen. No doubt the second letter string was easier to learn because you could relate it to something you already knew: the words familiar and words. How easily were you able to learn and remember the picture? Do you think you could draw it from memory a week from now? Do you think you could remember it more easily if it had the title "Bird's Eye View of a Cowboy Riding a Bicycle"? The answer to the last question is almost certainly yes, because the title would help you relate the picture to familiar shapes, such as those of a bicycle and a cowboy hat (e.g., see Bower, Karlin, & Dueck, 1975).

Distinctiveness

Learners are more likely to remember things that are unique in some way—for instance, things that are new, unusual, or a bit bizarre (R. R. Hunt & Worthen, 2006). For example, second graders are more likely to remember a visit to the local firehouse than, say, their teacher's explanation of what an adverb is. And when U.S. high school students recall what they've learned about events leading up to the American Revolution, they're more likely to remember the Boston Tea Party—a unique and colorful illustration of colonists' dissatisfaction with British taxation policies—than, say, the Quartering Act or the publication of Thomas Paine's Common Sense. Certainly, learners are more likely to pay attention to distinctive information, increasing the odds that they store it in long-term memory in the first place. But even when attention and initial learning have been the same, distinctive information is easier to retrieve than dull-and-ordinary information (Craik, 2006; Mather & Sutherland, 2011).

Rote Learning

Learners engage in rote learning when they try to learn and remember something without attaching much meaning to it. For example, in the "Letters and a Picture" exercise presented earlier, you would be engaging in rote learning if you tried to remember the letter string FAMILIARWORDS simply as a list of isolated letters or if you tried to remember the cowboy/bicycle drawing as a collection of random, unrelated lines and curves. One common form of rote learning is rehearsal, repeating something over and over within a short timeframe (typically a few minutes or less), either by saying it aloud or by continuously thinking about it in an unaltered, verbatim fashion. Earlier we described how maintenance rehearsal—verbally repeating something over and over—helps us keep information in working memory indefinitely. Contrary to what many students think, however, rehearsal is not an effective way of storing information in long-term memory. If a learner repeats something often enough, it might eventually "sink in," but the process is slow, laborious, and not much fun. Furthermore, for reasons we'll identify later, people who use rehearsal and other forms of rote learning often have trouble remembering what they've learned (J. R. Anderson, 2005; Craik & Watkins, 1973; McDermott & Naaz, 2014). Verbally rehearsing information is probably better than not actively processing it at all, and rehearsal may be one of the few strategies students can use when they have little prior knowledge to draw on to help them understand new material (E. Wood, Willoughby, Bolger, & Younger, 1993). For example, in the opening case study Kanesha resorts to rehearsal in her efforts to remember such seemingly nonsensical bone names as the coccyx, clavicle, and patella. Ideally, however, we should encourage students to engage in meaningful learning whenever possible.

Nature of Long-Term Memory

Long-term memory is where learners store their general knowledge and beliefs about the world, the things they've learned from formal instruction (e.g., the capital of France, the correct spelling of hors d'oeuvre), and their recollections of events in their personal lives. It's also where learners store their knowledge about how to perform various actions, such as how to dribble a basketball, use a cell phone, and do long division. Much of the information stored in long-term memory is interconnected. To see what we mean, try the next exercise. The last word in your sequence might be one with little or no obvious relationship to horses. Yet you can probably see a logical connection between each pair of items in the sequence. Related pieces of information tend to be associated with one another in long-term memory, perhaps in a network similar to the one depicted in Figure 6.2. Information stored in long-term memory lasts much, much longer than information stored in working memory—perhaps it lasts a day, a week, a month, a year, or a lifetime, depending on a variety of factors that we'll examine in upcoming sections of the chapter. In addition to its indefinitely long duration, long-term memory seems to be capable of holding as much information as a learner needs to store there. There's probably no such thing as "running out of room." In fact, for reasons you'll discover shortly, the more information already stored in long-term memory, the easier it is to learn new things.

Internal Organization

On average, we humans learn and remember a body of new information more easily when we pull it together in some reasonable way (e.g., McNamara & Magliano, 2009; Nesbit & Adesope, 2006; D. H. Robinson & Kiewra, 1995). Such internal organization involves making connections among various pieces of new information and forming an overall cohesive structure. For example, a learner might group information into categories, as you probably did in the "Remembering 12 Words" exercise near the beginning of the chapter. An even better way of organizing information is to identify interrelationships among its various parts. For instance, when learning about velocity, acceleration, force, and mass in a physics class, a student might better understand these concepts by seeing how they're interconnected—perhaps by learning that velocity is the product of acceleration and time (v = a × t) and that an object's force is determined by both its mass and its acceleration (f = m × a). The trick is not simply to memorize the formulas (that would be rote learning) but rather to make sense of the relationships that the formulas represent.

It's often helpful to give students specific structures they can use to organize information. For example, the weblike notetaking form shown in Figure 6.5 can help elementary students organize what they learn about tarantulas. Another effective structure is a two-dimensional matrix or table that enables students to compare several items with respect to various characteristics—for instance, how various geographical regions differ in topography, climate, economic activities, and cultural practices (R. K. Atkinson et al., 1999; Kiewra, DuBois, Christian, & McShane, 1988; D. H. Robinson & Kiewra, 1995). A third approach is to teach students how to create concept maps—diagrams that depict the concepts of a unit and their interrelationships (Hattie, 2009; Nesbit & Adesope, 2006; Novak, 1998).
Figure 6.6 shows concept maps that two different students might construct after a lesson about gorillas. The concepts themselves are circled, and their interrelationships are indicated by lines with words or short phrases. Several student-friendly concept-mapping software programs are available for creating and modifying concept maps quickly and easily; three examples are Coggle, Kidspiration, and MindMapper Jr. Not only can self-constructed organizational structures help students learn more effectively, but they can also help teachers assess students' learning. For example, the concept map on the left side of Figure 6.6 reveals only spotty, fragmented knowledge about gorillas. Furthermore, the student has two ideas that need correction. First, contrary to a common stereotype, gorillas don't regularly swing from trees, although young ones may occasionally climb a tree to escape danger. Second, gorillas aren't especially "fierce" creatures. For the most part, they live a peaceful existence within their family group; they get nasty (e.g., by beating their chests) only when an unfamiliar human, non-family-member gorilla, or other potential encroacher threatens their territory.
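The velocity and force formulas discussed earlier in this section (v = a × t and f = m × a) can be made concrete with a worked example. The numbers below are invented for illustration; the point is that the results follow from the relationships among the concepts, not from memorized symbol strings.

```python
# A minimal sketch, with made-up numbers, of the relationships
# v = a * t and f = m * a discussed above.

def velocity(acceleration, time):
    # Velocity of an object starting from rest: v = a * t
    return acceleration * time

def force(mass, acceleration):
    # Force as the product of mass and acceleration: f = m * a
    return mass * acceleration

# An object accelerating at 3 m/s^2 for 4 seconds reaches 12 m/s:
v = velocity(3, 4)   # 12 m/s
# A 2 kg object accelerating at 3 m/s^2 experiences a 6-newton force:
f = force(2, 3)      # 6 N
```

A student who understands the interrelationships can reconstruct either formula from the other concepts; a student who has only memorized the symbols cannot.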

Long-Term Memory Storage

Regardless of whether we have three distinctly different components of our memories, we human beings remember a great many things for a considerable length of time— often for our entire lifespans. In this sense, at least, a good deal of what we know and can do is in long-term memory. It appears that information stored in long-term memory can be encoded in a variety of forms. A good deal of it is probably encoded semantically—as relatively abstract meanings and understandings. Some of it may also be encoded in a verbal form, perhaps as actual words. Things you remember word for word (e.g., your name, your hometown, song lyrics) are all verbally encoded. Other information may be encoded as imagery— as it appears perceptually. For example, if, in your mind, you can see the face of a relative, hear that person's voice, or conjure up a mental whiff of the person's favorite perfume or aftershave lotion, you're retrieving images. And certain emotional reactions are apt to be closely connected with some of the things you've stored in memory. For example, the thought of a beloved family member or favorite pet might immediately evoke a smile or sense of contentment, whereas the memory of a horror movie you've recently seen might still send shivers up your spine. The preceding examples all illustrate declarative knowledge—knowledge that relates to the nature of how things are, were, will be, or might be. Declarative knowledge encompasses both general world knowledge (collectively known as semantic memory) and recollections of specific life experiences (collectively known as episodic memory). Yet people also acquire procedural knowledge; that is, they learn how to do things (e.g., J. R. Anderson, 1983; Phye, 1997; Tulving, 1983). You probably know how to ride a bicycle, wrap a birthday gift, and multiply a three-digit number by a two-digit number. To perform such actions successfully, you must adapt your behavior to changing conditions. 
For example, when you ride a bike, you must be able to turn left or right when an object blocks your path, and you must be able to come to a complete stop when you reach your destination. Accordingly, procedural knowledge often includes information about how to respond under different circumstances—it involves knowing when to do certain things either physically or mentally. In such instances it's also known as conditional knowledge. Most declarative knowledge is explicit knowledge: Once we recall it, we're quite conscious of what it is we know. But a good deal of procedural knowledge is implicit knowledge: We can't consciously recall or explain it, but it affects our thinking or behavior nonetheless (P. A. Alexander, Schallert, & Reynolds, 2009; J. R. Anderson, 2005; M. I. Posner & Rothbart, 2007). Another difference is that declarative knowledge can sometimes be learned very quickly, perhaps after a single presentation, whereas procedural knowledge is often acquired slowly and only with considerable practice.

Long-Term Memory Retrieval

Retrieving information from long-term memory appears to involve following a pathway of associations; it's a process of mentally going down Memory Lane. One idea reminds us of another idea—that is, one idea activates another—the second idea reminds us of a third idea, and so on. The process is similar to what happened when you followed your train of thought from the word horse earlier in the chapter. If the pathway of associations eventually leads us to what we're trying to remember, we do indeed remember it. If the path takes us in another direction, we're out of luck. Thus, we're more likely to remember something later on if, in the process of storing it, we connect it with one or more things already in our long-term memory. By making connections to existing knowledge—that is, by engaging in meaningful learning—we'll have some idea about how to "find" the information when we need it. Otherwise, we may never retrieve it again.
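The pathway-of-associations idea can be loosely pictured as a search through a network of linked ideas. The sketch below is only an illustrative analogy, not a claim about how memory is actually implemented; the network and its links are invented for the example.

```python
from collections import deque

# A toy associative network (all nodes and links invented for illustration).
associations = {
    "horse": ["cowboy", "farm"],
    "cowboy": ["hat", "lasso"],
    "farm": ["barn"],
    "hat": [], "lasso": [], "barn": [],
}

def retrieve(cue, target):
    """Follow associations outward from a retrieval cue; succeed only
    if some pathway of links eventually reaches the target memory."""
    seen, frontier = {cue}, deque([cue])
    while frontier:
        idea = frontier.popleft()
        if idea == target:
            return True          # a pathway of associations led to the memory
        for linked in associations.get(idea, []):
            if linked not in seen:
                seen.add(linked)
                frontier.append(linked)
    return False                 # no pathway: the memory goes unretrieved

print(retrieve("horse", "lasso"))  # True: horse -> cowboy -> lasso
print(retrieve("hat", "barn"))     # False: no associative path from "hat"
```

The analogy captures why meaningful learning matters: a memory with many incoming links can be reached from many starting cues, whereas an isolated node may never be reached at all.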

Moving Information to Working Memory

Sensory information, such as the light cast by a sparkler, doesn't last very long no matter what we do. But we can preserve a memory of it by encoding it in some minimal way. In the model of memory presented in Figure 6.1, the first step in this process is attention: Whatever someone mentally pays attention to moves into working memory. If information in the sensory register doesn't get a person's attention, it presumably disappears from the memory system. Paying attention involves directing not only the appropriate sensory receptors (in the eyes, ears, etc.) but also the mind toward whatever needs to be learned and remembered. Imagine yourself reading a textbook for one of your classes. Your eyes are moving down each page, but you're thinking about something altogether different—a recent argument with a friend, a high-paying job advertised on the Internet, or your growling stomach. What will you remember from the textbook? Absolutely nothing. Even though your eyes have been focused on the words in the book, you haven't been mentally attending to the words. Young children's attention often moves quickly from one thing to another and is easily drawn to objects and events unrelated to the task at hand. For example, although decorating walls with colorful pictures and other images can make a kindergarten classroom more appealing for children, a lot of colorful décor is likely to distract many kindergartners from their lessons (Fisher, Godwin, & Seltman, 2014). As children grow older, they become better able to focus their attention on a particular task and keep it there, and they're less distracted by irrelevant thoughts and events. Yet even adult learners can't keep their minds entirely on a single task for a prolonged time period (S. M. Carlson & Moses, 2001; Immordino-Yang, Christodoulou, & Singh, 2012; Plebanek & Sloutsky, 2017). Even when learners are paying attention, they can attend to only a very small amount of information at any one time. 
In other words, attention has a limited capacity (Cherry, 1953; Cowan, 2007). For example, if you're sitting in front of your television set with your textbook open in your lap, you can attend to a Big Bang Theory rerun playing on TV or to your book, but not to both simultaneously. And if, in class, you're preoccupied with your instructor's desperate need for a fashion makeover, you're unlikely to be paying attention to the lecture itself. Exactly how limited is the limited capacity of human attention? People can often perform two or three well-learned, automatic tasks at once. For example, you can walk and chew gum simultaneously, and you can probably drive a car and drink a cup of coffee at the same time. But when a stimulus or event is detailed and complex (as both textbooks and television shows are) or when a task requires considerable thought (as understanding a lecture and driving a car on an icy mountain road would), then people can usually attend to only one thing at a time. Despite our best efforts, we humans are not very good at multitasking (Lien, Ruthruff, & Johnston, 2006; Ophir, Nass, & Wagner, 2009). As teachers, we must remember that attention isn't just an observable behavior; it's also a mental process. The Into the Classroom feature "Getting and Keeping Students' Attention" presents several effective strategies for keeping students' minds on classroom topics.

Facilitating Cognitive Processing

Some diversity in learning and cognitive processes is the result of certain disabilities on the one hand, or giftedness on the other. For example, some students with disabilities have particular trouble attending to and effectively processing classroom subject matter. This is certainly true for students with learning disabilities (who, by definition, have difficulty processing certain kinds of information). In contrast, students who are gifted often have a longer attention span and can process new ideas more rapidly and elaboratively than many of their classmates. Table 6.5 identifies commonly observed cognitive processing differences in students who have special educational needs. As teachers, we must keep in mind that students with disabilities almost invariably have strengths as well as weaknesses. For example, some students with ADHD have a keen memory for events they've personally experienced and may generate more detailed narratives than their nondisabled classmates (Skowronek, Leichtman, & Pillemer, 2008). And some students with autism spectrum disorders notice and remember many subtle nuances in the things they see, and may produce highly detailed and skillful drawings that are unusual for their age group (I. L. Cohen, 2007; Kaku, 2014; S. Moran & Gardner, 2006). The far-right column of Table 6.5 presents many useful strategies for working with students who have special educational needs. The first of these—analyzing students' errors for clues about possible processing difficulties—is illustrated in 9-year-old Nicholas's lab report in Figure 6.13. Nick's description of what he observed can be translated as "We poured so many cubes [that] the cup overflowed. The blocks took up all the room." We can only speculate about why Nick wrote up the left side of the glass, across the top, and then down the other side. One possible explanation is that, with his limited language skills, Nick hadn't yet mastered the conventional direction of written English.
This hypothesis seems unlikely, however, as other samples of Nick's writing (not shown here) correctly begin at the top of the page and proceed downward. Another possibility is that Nick was thinking about the direction of the water flow (up and out) as he wrote and either intentionally or unintentionally followed the water's direction in his writing. His limited working memory capacity may have been a factor here: Perhaps he had insufficient mental "room" to think simultaneously about his observations plus the spellings of words and conventions of written English. Virtually all students occasionally have trouble learning or remembering class material. Accordingly, many of the instructional strategies in Table 6.5 need not be limited to use with students with special needs. All students can benefit from guidance and support that enable them to process information more effectively.

Obstacles to Conceptual Change

Some misconceptions are easily corrected. Yet students of all ages can hold quite stubbornly to certain counterproductive beliefs about the world, even after considerable instruction that explicitly contradicts those beliefs. Theorists have offered several possible explanations about why students' misconceptions can be so resistant to change.

How Procedural Knowledge Is Learned

Some of the procedures people learn—for example, baking a cake, serving a volleyball, and driving a car—consist primarily of overt behaviors. Many others—for example, writing a persuasive essay, solving for x in an algebraic equation, and surfing the Internet—have a significant mental component as well. Many procedures involve a combination of physical behaviors and mental activities. Procedural knowledge ranges from relatively simple actions (e.g., using scissors or correctly holding a pencil) to far more complex skills. Complex procedures usually aren't learned in one fell swoop; instead, they're acquired slowly over a period of time, often only with a great deal of practice (Ericsson, 2003; Macnamara, Hambrick, & Oswald, 2014; Proctor & Dutta, 1995). People appear to learn simple physical procedures primarily as actual behaviors— in other words, as specific actions that, with practice, are strengthened and gradually refined (Ennis & Chen, 2011; Féry & Morizot, 2000; Willingham, 1999). Yet many complex skills, especially those that have a mental component, may also be learned as declarative knowledge—in other words, as information about how to do something (J. R. Anderson, 1983; Baroody, Eiland, Purpura, & Reid, 2013; Beilock & Carr, 2004). Learners may initially use their declarative knowledge to guide them as they perform a new skill, but to the extent that they must do so, their performance is apt to be slow and laborious and to require a lot of concentration—that is, it can consume considerable working memory capacity. As learners continue to practice the skill, their declarative knowledge gradually evolves into procedural knowledge, perhaps eventually to the point that they can perform the activity quickly, efficiently, and effortlessly (we'll look at such automaticity more closely a bit later). 
People who show exceptional talent in a particular skill domain—say, in figure skating or playing the piano—typically practice a great deal, often a minimum of 3 to 4 hours a day over a period of 10 years or more (Ericsson, 1996; Horn, 2008). Some of the storage processes we've already discussed play a role in acquiring procedural knowledge as well as declarative knowledge. For instance, verbally rehearsing a sequence of steps in a motor skill enhances people's ability to perform the skill (Vintere, Hemmes, Brown, & Poulson, 2004; Weiss & Klint, 1987). Illustrations or live demonstrations of a procedure, which presumably foster visual imagery, are also beneficial (Kitsantas, Zimmerman, & Cleary, 2000; SooHoo, Takemoto, & McCullagh, 2004). In fact, imagining oneself performing a new skill (e.g., executing a basketball shot or a gymnastics move) can enhance acquisition of the skill, although this strategy obviously isn't as effective as actual practice (Feltz, Landers, & Becker, 1988; Kosslyn, 1985; SooHoo et al., 2004). Perhaps the most effective way to teach new procedures is to model them for students, including both the overt behaviors and the internal thought processes involved. The Into the Classroom feature "Helping Students Acquire New Procedures" illustrates several additional strategies for facilitating procedural learning.

Using mnemonics

Some things are hard to make sense of—that is, hard to learn meaningfully. For instance, why do bones in the human body have such names as fibula, humerus, and ulna? And why is Augusta the capital of the state of Maine, rather than, say, Portland or Bar Harbor? From most students' perspectives, there's no rhyme or reason to such facts. When students have trouble finding relationships between new material and their prior knowledge, or when a body of information seemingly has no organizational structure (as is true for many lists), special memory tricks known as mnemonics can often help (e.g., Pressley, Levin, & Delaney, 1982; Soemer & Schwan, 2012). Their effectiveness lies in their conformity with a basic principle of long-term memory storage: Learners find some sort of meaning—even if that "meaning" is a bit contrived—in what might otherwise be nonsensical information. The artificial organizational structure that some mnemonics provide is an additional plus. Imposing rhythm on a body of information—for instance, embedding the information in a song or hip-hop lyrics—is one way of giving it structure and can be especially beneficial when music is a significant part of students' cultures (B. A. Allen & Boykin, 1991; Barton, Tan, & Rivet, 2008; Tyler, Uqdah, et al., 2008).

Multiple Connections with Existing Knowledge and a Variety of Contexts

Sometimes learners acquire and practice certain behaviors and ways of thinking in a very limited set of environments—say, in their science or civics classes. When this happens, the learners may associate those behaviors and ways of thinking only with those particular environments and thus fail to retrieve what they've learned when they're in other contexts (Day & Goldstone, 2012; Gresalfi & Lester, 2009; Kirsh, 2009). This tendency for some responses and cognitive processes to be associated with and retrieved in some contexts but not in others is often called situated learning or situated cognition. For example, if students associate principles of geometry only with math classes, they may not retrieve those principles at times when geometry would come in handy—say, when trying to determine whether a 10-inch pizza that costs $8.00 is a better value than an 8-inch pizza that costs $6.00. In general, learners are more likely to retrieve information when they have many possible pathways to it—in other words, when they have associated the information with many other things they know and with many different contexts in which they might use it. 
For instance, we might show students how classroom topics relate to some or all of the following:

• Concepts and ideas within the same subject area (e.g., showing how multiplication is related to addition)
• Concepts and ideas in other subject areas, perhaps through interdisciplinary instruction that integrates two or more academic domains (e.g., talking about how various scientific discoveries have played important roles in historical events)
• Students' general knowledge of the world (e.g., relating the concept of inertia to how passengers are affected when a car quickly turns a sharp corner)
• Students' personal experiences (e.g., finding similarities between the family feud in Romeo and Juliet and students' own interpersonal conflicts)
• Students' current activities and needs outside the classroom (e.g., showing how persuasive writing skills might be used to craft an essay for a college application)
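The pizza comparison mentioned earlier in this section is a case where retrieving geometry outside a math class pays off. A rough check, treating the advertised sizes as diameters:

```python
import math

# The pizza value comparison worked out as price per square inch.
# (Assumes the advertised sizes are diameters, as is customary.)

def price_per_square_inch(diameter_inches, price_dollars):
    area = math.pi * (diameter_inches / 2) ** 2  # area of a circle
    return price_dollars / area

ten_inch = price_per_square_inch(10, 8.00)   # about $0.10 per sq. in.
eight_inch = price_per_square_inch(8, 6.00)  # about $0.12 per sq. in.

# The larger pizza costs less per square inch, so it's the better value.
print(ten_inch < eight_inch)  # True
```

A student who associates circle area only with geometry class may never think to run this comparison at the pizzeria, which is exactly the situated-learning problem the passage describes.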

Roles of Prior Knowledge and Working Memory in Long-Term Memory Storage

Students are more likely to engage in meaningful learning when they have a relevant knowledge base—that is, when they have existing knowledge to which they can connect whatever new information and skills they're trying to master. When, in contrast, they have little relevant knowledge on which to build, they're apt to struggle in their efforts to make sense of new material, as Kanesha sometimes does while studying bone names for her biology quiz. Occasionally, students' prior knowledge interferes with something they need to learn; this is the case when Kanesha tries to remember where the sternum is located. In general, however, a relevant knowledge base helps students learn and remember new material more effectively than they would otherwise (e.g., P. A. Alexander, Kulikowich, & Schulze, 1994; Kintsch, 2009; P. L. Morgan, Farkas, Hillemeier, & Maczuga, 2016). For example, students will better understand scientific principles if they've already seen those principles in action, either in their personal lives or in the classroom, and they'll better understand how big some dinosaurs were if they have previously seen skeletons of, say, a Diplodocus and a Tyrannosaurus at a natural history museum. Children's knowledge about the world grows by leaps and bounds every year; on average, then, older students have more knowledge to help them understand and elaborate on new ideas and events than younger ones do. Children don't all acquire the same knowledge bases, of course, and their differing knowledge can lead them to construct different meanings from the same situation. The next exercise illustrates this point. What did you think the passage was about? A prison escape? A wrestling match? Or perhaps something else altogether?
When a longer version of this passage was used in an experiment with college students, many physical education majors interpreted it as a wrestling match, but music education majors—most of whom had little or no knowledge of wrestling—were more likely to think it was about a prison break (R. C. Anderson et al., 1977). Yet it isn't enough that students have the knowledge they need to make sense of new material. They must also be aware that some of their existing knowledge is relevant. They must retrieve that knowledge from long-term memory while thinking about the new material, so that they have both the old and the new in working memory at the same time and thus can make the appropriate connections (Bellezza, 1986; Glanzer & Nolan, 1986; Kalyuga, 2010). As teachers, we should use students' existing knowledge as a starting point whenever we introduce a new topic. Furthermore, we should explicitly remind students of things they know that bear directly on a topic of classroom study—an instructional strategy called prior knowledge activation. For instance, we might begin a first-grade unit about plants by asking students to describe what their parents do to keep flowers or vegetable gardens growing. In a secondary English literature class, we might introduce Sir Walter Scott's Ivanhoe (in which Robin Hood is a major character) by asking students to tell the tale of Robin Hood as they know it. We should also remember that students from diverse cultural backgrounds may have somewhat different knowledge bases, and adjust our starting points accordingly (E. Fox, 2009; L. E. Matthews, Jones, & Parker, 2013; McIntyre, 2010; Nelson-Barber & Estrin, 1995). Furthermore, we should encourage students to retrieve relevant knowledge on their own as they study. One approach is to model this strategy for students. 
For example, we might read aloud a portion of a textbook, stopping occasionally to tie an idea in the text to something previously studied in class or to something in our own personal experience. We can then encourage students to do likewise, giving suggestions and guiding their efforts as they proceed. Especially when working with students in the elementary grades, we might also want to provide specific questions that encourage students to reflect on their existing knowledge and beliefs as they read and study—for instance, asking themselves, "What do I already know about this topic?" and "Might I discover that something I think about this topic isn't correct?" (Baer & Garrett, 2010; Spires & Donley, 1998; H. Thompson & Carr, 1995).

Wait Time

Wait time is the length of time a teacher allows to pass after the teacher or a student says something before the teacher says something else. It's often unreasonable to expect students to formulate insightful, creative responses in a split second. When teachers allow several seconds of quiet wait time either after they have asked a question or after a student has made a statement, more students participate in class—this is especially true for females and minority-group members—and students are more likely to respond to their classmates' answers and opinions. In addition, students are more likely to support their reasoning with evidence or logic and more likely to speculate when they don't know an answer (Moon, 2008; Rowe, 1974; Tobin, 1987). When our objective is simple recall—when students need to retrieve classroom material very quickly, to "know it cold"—then wait time should be short. Students may sometimes benefit from rapid-fire drill and practice to learn information and skills to automaticity. But when our instructional goals include more complex processing of ideas and issues, a longer wait time may give both our students and us the time everyone needs to think things through.

Encouraging a Meaningful Learning Set

We can't always blame students when they take a relatively meaningless approach to their studies. Inadvertently, some teachers tend to encourage students to learn school subjects by rote. Think back to your own experiences in school. How many times were you allowed to define a word by repeating its dictionary definition, rather than being expected to explain it in your own words? In fact, how many times were you required to learn something word for word? And how many times did an exam assess your knowledge of facts or principles without ever assessing your ability to relate those facts and principles to everyday life or to things you learned in previous lessons or courses? When assignments and assessments require memory of isolated facts—and perhaps even require word-for-word recall—students are apt to engage in rote rather than meaningful learning, believing that a rote-learning approach will yield them better grades (Crooks, 1988; N. Frederiksen, 1984b; L. Shepard, Hammerness, Darling-Hammond, & Rust, 2005). Rather than inadvertently encouraging rote learning, we teachers should explicitly encourage students to adopt a meaningful learning set—an earnest effort to understand rather than simply memorize classroom material. For example, we might frequently ask students to explain their reasoning, and our assignments and assessment tasks should require true understanding rather than rote memorization (Ausubel, Novak, & Hanesian, 1978; Middleton & Midgley, 2002; L. Shepard et al., 2005). Ideally, students should acquire a conceptual understanding of classroom topics; that is, they should form logical connections among related concepts and principles. For example, rather than simply memorize basic mathematical computation procedures, students should learn how those procedures reflect underlying principles of mathematics. 
And rather than learn historical facts as lists of unrelated people, places, and dates, students should place those facts within the context of general social and religious trends, migration patterns, economic conditions, human personality characteristics, and other relevant phenomena. The more interrelationships students form within the subject matter they're learning—in other words, the better they internally organize it—the more easily they'll be able to remember and apply it later on (Baroody et al., 2013; M. C. Linn & Eylon, 2011; J. J. White & Rumsey, 1994). Constructing an integrated understanding of any complex topic inevitably takes time. Accordingly, many experts advocate the principle Less is more: Less material studied thoroughly (rather than superficially) is learned more completely and with greater understanding. Following are several more specific strategies for promoting conceptual understanding of classroom subject matter:

Organize units around a few core ideas or themes, always relating specific content back to this core.

Explore each topic in depth—for example, by considering many examples, examining cause-and-effect relationships, and discovering how specific details relate to more general principles.

Regularly connect new ideas to students' personal experiences and to things students have previously learned at school.

Emphasize that conceptual understanding is far more important than knowledge of specific facts—not only through the statements you make but also through the questions you ask, the assignments you give, and the criteria you use to evaluate achievement.

Ask students to teach what they've learned to others. Teaching others encourages them to focus on and pull together main ideas in ways that make sense (Brophy, 2004; Brophy et al., 2009; Hatano & Inagaki, 1993; Leung, 2015; Middleton & Midgley, 2002; Perkins & Ritchhart, 2004; Roscoe & Chi, 2007; VanSledright & Brophy, 1992; J. J. White & Rumsey, 1994).

When Knowledge Construction Goes Awry: Addressing Learners' Misconceptions

When learners construct their own understandings, there's no guarantee that they'll construct accurate ones. Occasionally, they may instead construct misconceptions— beliefs that are inconsistent with commonly accepted and well-validated explanations of phenomena or events. For example, in science, some of students' beliefs might be at odds with data collected over the course of decades or centuries of scientific research. And in history, students' understandings of certain events might be inconsistent with existing historical records and artifacts from the time period in question. Figure 6.10 presents misconceptions that researchers have often observed in students—not only in children and adolescents, but occasionally in college students as well. Existing misconceptions can sometimes wreak havoc on new learning. As a result of elaborating on new information—a process that usually facilitates learning—students may interpret or distort the information to be consistent with what they already "know" and thus continue to believe what they've always believed. For example, one eleventh-grade physics class was studying the idea that an object's mass and weight do not, by themselves, affect the speed at which the object falls. Students were asked to build egg containers that would keep eggs from breaking when dropped from a third-floor window. They were told that on the day of the egg drop, they would record the time it took for the eggs to reach the ground. Convinced that heavier objects fall faster, a student named Barry added several nails to his egg's container. Yet when he dropped it, classmates timed its fall at 1.49 seconds—a time very similar to that for other students' lighter containers. Rather than acknowledge that light and heavy objects fall at the same rate, Barry explained the result by rationalizing that "the people weren't timing real good" (Hynd, 1998a, p. 34). 
When students have misunderstandings such as Barry's, teachers must work hard to promote conceptual change, a process of revising or overhauling an existing theory or belief system in such a way that new, discrepant information can be better understood and explained. Don't let the term conceptual mislead you here: For the most part, we're talking about changing tightly interconnected sets of ideas rather than changing single, isolated concepts.

Using Technology to Facilitate Meaningful Learning

Wide variability in students' current knowledge and skills often calls for some degree of individualized instruction. Different students may need different "starting points" in their explorations of new topics, and some students may require more practice than others before they can completely master new skills. Many instructional software programs—often referred to as computer-based instruction (CBI)—are specifically designed to take individual differences into account. Effective CBI programs embody many basic principles of cognitive psychology. For instance, they capture and hold students' attention with engaging tasks and graphics, encourage students to relate new ideas to things they already know, present diverse examples and practice exercises, and provide constructive feedback that explains why each student response is either good or not-as-good. Some forms of CBI are housed within a single computer; many others are available on the Internet. Some instructional software programs provide drill and practice of basic knowledge and skills (e.g., math facts, typing, fundamentals of music). Others, known as intelligent tutoring systems, skillfully guide students through complex subject matter and can anticipate and address a wide variety of misconceptions and learning difficulties. A good example of an intelligent tutoring system is My Science Tutor, or MyST, in which students in the upper elementary grades have one-on-one conversations with "Marni," a computer-animated woman who both talks to them and—through the software's voice recognition and language-processing components—also listens to and understands what they say in response to her questions (W. Ward et al., 2013). Marni typically begins a conversation with a student by activating the student's prior knowledge about the topic, saying something such as "What have you been studying in science recently?" 
Then, after the topic of the lesson has been identified, Marni presents a series of illustrations, animations, and interactive simulations and asks more specific questions—for instance, she might ask, "So, what's going on here?" or "What could you do to . . . ?" (W. Ward et al., 2013, pp. 1118-1119). She tailors subsequent instruction to the student's current understandings and addresses any misconceptions she "thinks" they might have. Figure 6.8 presents examples of what a student might see on the computer screen in lessons on electric circuits and electromagnetism, respectively. Well-designed CBI programs can be quite effective in helping students learn academic subject matter (e.g., Slavin & Lake, 2008; Steenbergen-Hu & Cooper, 2013; Tamin et al., 2011; Van der Kleij, Feskens, & Eggen, 2015). They can also be highly motivating, piquing students' interest, providing occasional opportunities for choice-making, and ensuring the kinds of success experiences that are likely to enhance students' self-confidence (Blumenfeld, Kempler, & Krajcik, 2006; Means, Bakia, & Murphy, 2014; Swan, Mitrani, Guerrero, Cheung, & Schoener, 1990). But one caveat is in order here: Too much independence in choice-making in a program can lead students to flounder aimlessly and not make much progress in their learning. Good CBI programs provide considerable guidance about what students should do at various steps along the way (Kanar & Bell, 2013; Karich, Burns, & Maki, 2014). Computer-based instructional programs offer several advantages that we sometimes don't have with more traditional forms of instruction. For one thing, CBI can seamlessly include animations, video clips, and spoken messages— components that aren't possible with traditional textbooks and other printed materials. 
Second, a computer can record and maintain ongoing data for every student, including such information as how far each of them has progressed in a program, how quickly they respond to questions, and how often they're right and wrong. With such data, we can monitor each student's progress and identify students who appear to be struggling with the material. Finally, a computer can be used to provide instruction when flesh-and-blood teachers aren't available. For example, in many instances of online learning, learners receive instruction either at a physical location far away from an instructor or without any human instructor at all.

Nature of Working (Short-Term) Memory

Working memory is the component of human memory where we hold attended-to information for a short time while we try to make sense of it. Working memory is also where much of our active cognitive processing occurs. For instance, it's where we think about the content of a lecture, analyze a textbook passage, or solve a problem. Basically, this is the component that does most of the mental work of the memory system—hence its name, working memory. Rather than being a single entity, working memory probably has several components for holding and working with different kinds of information—for example, visual information, auditory information, and the underlying meanings of events—as well as a component that integrates multiple kinds of information. As shown in Figure 6.1, working memory may also include a central executive that focuses attention, oversees the flow of information throughout the memory system, selects and controls complex voluntary behaviors, and inhibits counterproductive thoughts and actions (Baddeley, 2001; Krakowski et al., 2016; Logie, 2011; H. L. Swanson, 2017). Such processes—collectively known as executive functions—improve over the course of childhood and adolescence (partly as a result of brain maturation) and significantly enhance students' academic performance (Atkins et al., 2012; J. R. Best & Miller, 2010; Masten et al., 2012). Information stored in working memory doesn't last very long—perhaps 5 to 20 seconds at most—unless the learner consciously does something with it (e.g., Baddeley, 2001; Camos, 2015; W. Zhang & Luck, 2009). Accordingly, this component is sometimes called short-term memory. For example, imagine that you need to call a neighbor, so you look up the neighbor's number in a telephone directory. Because you've paid attention to the number, it's presumably in your working memory. But then you discover that you can't find your cell phone. You have no paper and pencil handy. 
What do you do to remember the number until you have access to a phone? If you're like most people, you probably repeat it to yourself over and over again. This process, known as maintenance rehearsal, keeps information in working memory for as long as you're willing to continue talking to yourself. But once you stop, the number will disappear fairly quickly. The amount of information children can hold in working memory increases a bit with age, probably as a result of both brain maturation and the acquisition of more effective cognitive processes (Barrouillet & Camos, 2012; Kail, 2007; Sørensen & Kyllingsbæk, 2012). Yet even adults have only so much "room" to simultaneously hold and think about information. To see what we mean, put your working memory to work for a moment in the following exercise. Did you find yourself having trouble remembering some parts of the problem while you were dealing with other parts? Did you ever arrive at the correct answer of 837? Most people can't solve a division problem with this many digits unless they write it down. Working memory just doesn't have enough space both to hold all that information and to perform mathematical calculations with it. Like attention, working memory has a limited capacity—perhaps just enough for a telephone number or very short grocery list (Cowan, 2010; Logie, 2011; G. A. Miller, 1956). Virtually any learning activity imposes a cognitive load—a certain amount of information that learners must simultaneously think about, along with certain ways that they must think about it, in order to make sense of and remember what they're studying (R. E. Mayer, 2011b; Plass, Moreno, & Brünken, 2010; Sweller, 1988, 2008). As teachers, then, when we design and conduct lessons, we must consider just how much of a load students' working memories can reasonably handle at any given time. For example, we should minimize information that's irrelevant to the topic at hand.
We should pace the presentation of important information slowly enough that students have time to effectively process what they're seeing and hearing. And we might repeat the same idea several times (perhaps rewording it each time), stop to write important points on the board, and provide several examples and illustrations. We authors sometimes hear students talking about putting class material in "short-term memory" so that they can do well on an upcoming exam. Such a statement reflects the common misconception that this component of memory lasts for several hours, days, or weeks. Now you know otherwise. Working memory is obviously not the place to leave information that you need for an exam later in the week or even information you need for a class later in the day. For such information, storage in long-term memory—the final component of the memory system—is in order.

