A teacher stands at a whiteboard in front of her fourth-grade class and begins teaching one of math’s most fundamental concepts: the meaning of an equal sign in the middle of an equation. This is not easy. Young students tend to think of the equal sign as the endpoint of a problem. Now, instead of the usual 8 + 4 = ?, they are asked to ponder 8 + 4 = ? + 6. Mastering this concept will open the door to algebra and higher math.
Almost any teacher giving this lesson will instinctively move her hands in predictable ways, pointing to the equal sign, sweeping her hand toward the left side of the equation and then sweeping it toward the right. She might hold both hands palms-up in a balancing gesture to suggest equivalency.
Now imagine the teacher giving the same lesson, using the same words, but with her hands flat on her desk or her arms at her sides. It turns out her students will be much less likely to grasp the concept.
Susan Wagner Cook, an associate professor of psychological and brain sciences at the University of Iowa, has conducted numerous studies with scenarios like these – both with live teachers and with animated avatars (see video). Whether it’s a lesson in math, foreign language vocabulary or science, the result is the same: kids learn better with gesture.
“Gesture seems to help build understanding across really abstract things and really concrete things – numbers, words, a whole bunch of stuff,” Cook says.
Why this is so is not entirely clear, but gesture seems to lighten the load on our cognitive systems. Cook has shown, for instance, that if you ask people to do two things at once — explain a math problem while remembering a sequence of letters — they do a far better job if permitted to gesture while explaining.
Research suggests that when we see and use gestures, we recruit more parts of the brain than when we use language alone, and we may activate more memory systems – such as procedural memory (the type that stores automatic processes such as how to type or ride a bike) in addition to our memory for events and experiences.
Cook is among a cadre of researchers who study learning in the context of “embodied cognition” – the theory that our thoughts are shaped by the physical experiences of our body. According to this view, even when we think about abstract ideas, our brains link them to concrete, physical things that we experience through our hands, our senses and other body parts.
Studies that use functional magnetic resonance imaging (fMRI) and other brain imaging techniques provide fascinating evidence for embodied cognition. For instance, when we hear verbs such as lick, pick and kick, they activate parts of the brain associated with the tongue, the hands and the legs, respectively. When we read about a happy event, there is greater activity in the nerves and muscles that control smiling.
[Video: The gestures used by this animated figure help children learn math, according to research by Susan Wagner Cook at the University of Iowa. Video courtesy of Voicu Popescu and Susan Wagner Cook.]
One of the more remarkable findings in this field is that people who get Botox injections to reduce frown lines actually take longer to read sad and angry passages right after the injections than before, although there is no change of pace for reading happy tales.
Arthur Glenberg, a professor of psychology at Arizona State University, one of the authors of the Botox study and many others on embodied cognition, is applying the theory to help struggling readers succeed.
For more than a decade, Glenberg and colleagues have been developing systems that allow novice readers to physically simulate the content of books to enhance their understanding. The latest version is an iPad-based system called EMBRACE in which children can move characters and props around on a touch screen to bring the text alive. Unlike some multimedia picture books in which bells and whistles can distract from the story, the EMBRACE actions are tightly aligned with the text. If the story says that a farmer puts a pig in the pen, the child can slide a finger to do the same. If the text explains how blood flows from the heart’s right ventricle to the lungs, the reader can make it happen onscreen.
Glenberg has tested this system and an earlier version called Moved by Reading with struggling readers, including kids with learning disabilities, and has found sizeable increases in comprehension. The kids begin by acting out what they are reading — with support from a teacher or from the EMBRACE programming. Later they learn to simply “imagine” the physical actions.
The approach works across a variety of content areas — including story problems in math. In a 2011 study with 97 third- and fourth-graders, kids trained in the method solved 44 percent of math problems versus 33 percent for those in a control group. The trained kids were also much less likely (38 percent versus 61 percent) to mistakenly use irrelevant information in their calculations.
Word problems are notoriously hard for many students. “Kids sort of give up on trying to figure out what the meaning is and go right to playing with the numbers,” Glenberg explains. What the embodied approach does, he says, is help them develop “a sensorimotor representation” of the math problem. It “forces you to imagine the situation and that makes doing the math much easier.”
The same is true in reading. Many kids are able to sound out the text, but don’t actually understand it. This is particularly true of English language learners, Glenberg says. He has been testing the EMBRACE system for such students in the U.S. and in China. In a 2017 study with 93 native Spanish-speaking children in Arizona, he reports a “large positive benefit in story comprehension.” An enhanced version of the system offers some basic support in the child’s native language.
A big question about the approach is whether kids who learn to read on this platform can make the leap to reading fluently without its support, internalizing the habit of picturing the story in their mind’s eye. Glenberg is in the process of studying this.
Using our bodies and gesture to teach is something parents and preschool teachers do instinctively (just think about rhymes like “The Eensy-weensy Spider”). But work by Glenberg, Cook and many others indicates that the benefits can go far beyond preschool and extend to teaching advanced and abstract concepts.
Cook’s quick advice to teachers: “Use your hands. Make sure you don’t always have your smartboard controller in your hand. And if the students have their backs to you, it’s not as good.” She hopes that her work with gesturing avatars will eventually improve digital instruction, much of which makes poor use of body language.
As more and more of education comes to depend on technology and virtual instruction, it will be vital to capture under-appreciated aspects of human interaction that engage both body and mind.