Catching a brain wave

When you’re editing the final draft of an essay, it might take you a second or two to recognize that you’ve used the wrong tense — but how long does it take your brain?

Researchers at the UW are mapping the brain’s response to errors in language and studying how that response varies with a person’s language background. By attaching electrodes to a participant’s scalp, they can record real-time, unfiltered measurements of how the brain processes mistakes in writing.

An ongoing study examined both hearing and deaf participants, comparing how the brains of ASL signers who also read and write in English recognize language errors against the brains of native English speakers. The goal of the study was to see how a brain’s reaction to mistakes in grammar or meaning depends on its proficiency in that language.

“We’re really interested in looking at how the brain processes and responds to language, but we need to look at all language environments,” said Alison Mehravari, the lead researcher of the project. “So we’re looking at how does the brain respond differently based on someone’s language background, or [how] proficiently they’re reading.”

For the study, researchers measured the electrical activity along the scalp — or “brain waves,” as it’s colloquially known — while participants read sentences presented one word at a time. When a sentence reached a mistake, the electrodes picked up the brain’s reaction about 400 milliseconds after the word was shown.

“[It’s measuring] what the brain is doing in real time,” said Lee Osterhout, a professor of psychology at UW. “It’s not mediated by conscious thinking or behavior. Language is a reflex; the input comes in and you’ve got it. But it’s not that way for a deaf person learning to read a language, so how does that vary?”

American Sign Language (ASL) presents a unique case: Although ASL users typically read and write in English, ASL and English are separate languages. This means many everyday situations require ASL signers to switch between two languages.

“With people who are using sign language, or who might be deaf, their brains work differently than those who might just be getting two auditory languages,” said Lindsay Klarman, who collaborated on the research and assisted with sign language interpreting. “They’re using a visual language, but then they’re reading and writing in English. So they’re bilingual, but they’re a different kind of bilingual than someone who’s just an auditory bilingual.”

For the research, Mehravari measured responses from hearing native English speakers and from people who had severe hearing loss before they were two years old. Mehravari said deaf participants typically show much more variation in their language backgrounds, which affects how their brains process language.

“For someone whose first language is ASL, reading English is reading in a second language,” Mehravari said. “Even if people grow up with spoken and written English as their first language, it’s still a different language environment than [for] someone who’s hearing, because someone who’s deaf isn’t getting the same auditory information about that language.”

Mehravari said she hopes the results of this study will help educators better understand how to teach deaf students to read English.

“Most of the ways we teach reading in schools are geared towards hearing children,” Mehravari said. “There may be better ways to teach reading to someone who’s deaf, depending on how they access information about language … We’re just trying to figure out how language is processed by deaf people, and hope to provide some useful information on what is helpful for [learning to read].”

The researchers are still collecting data; the next step is to analyze the brain-wave recordings and draw connections. Overall, though, Klarman is grateful for the willingness of deaf participants to take part in the study.

“It’s good to be able to collect data on diverse populations,” Klarman said. “For a long time science was only collecting data on right-handed [monolingual] speakers, but our world is changing and we’re getting more people who are speaking different languages and learning more about the brain. So I’m really happy that the Deaf community can be part of that and be accepting of this research. We couldn’t do it without them.”

Reach reporter Zosha Millman at science@dailyuw.com. Twitter: @zosham