Carnegie Mellon University’s Lori Holt and Sung-Joo Lim and Stockholm University’s Francisco Lacerda are using video game training with a mock "alien" language to replicate the challenges of learning languages as an infant. The research found that listeners were quick to recognize word-like units. The study was funded by grants from the National Science Foundation, National Institutes of Health and Riksbankens Jubileumsfond.
To uncover how spoken sounds are decoded by the brain, the research team designed a video game narrated in deliberately distorted speech. The soundtrack (babble unintelligible in any language) was the only source of instruction for the 77 adult players in the study. After only two hours of play, the participants could reliably extract word-length sound categories from continuous alien sounds and apply that learning to advance through the game.
"Traditionally, when we study adult learning in the lab, it’s nothing like how infants learn language," said Holt, professor of psychology at CMU and a specialist in auditory cognitive neuroscience. "This video game models for adults the challenge language learning poses to infants. This presents the opportunity to study learning in ways that are just not feasible with infants."
Lacerda, professor of phonetics and an expert in language acquisition, agrees that using video games is a promising new way to explore language learning.
"This is a wonderful opportunity to approximate the task facing infants by creating a setting where adults are forced to infer what the meaning of different sound elements might be, and to do it in a functional way."
The research has the potential to help researchers better understand and more effectively treat conditions such as dyslexia, and to improve second language learning.
Lim, a graduate student in psychology at CMU and lead author of the study, has used the game to help adults learn English.
"Native speakers of Japanese can use this type of training to learn English consonants they have difficulty distinguishing," she said
Holt, director of CMU’s Speech Perception and Learning Laboratory, is interested in taking the study further to determine how the video game and its alien soundtrack engage different areas of the brain to produce rapid learning. The next step is to investigate this by observing players with functional magnetic resonance imaging (fMRI) to view real-time brain reactions to the video game.
Researchers will present their findings at the Acoustical Society of America’s annual meeting May 23-27 in Seattle.
For more information on research at Carnegie Mellon, visit www.cmu.edu/research/brain.