Scientists can now read minds. An animated dragon helped prove it

Researchers have used machine learning, MRI scans and an animated clip of an evil dragon to peer inside people’s minds and deduce the meaning of stories they heard or watched – and even ones they imagined.

A language decoder developed at the University of Texas can analyse brain activity and predict, to varying levels of accuracy, what’s happening in stories played to study participants.

Previous technologies that decode language from brain activity rely on invasive neurosurgery to implant electrodes in the brain, or can interpret only one word or phrase at a time, the researchers said.

“The goal of language decoding is to take recordings of a user’s brain activity, and predict the words that the user was hearing, saying or imagining,” Jerry Tang, co-author of the research published in Nature Neuroscience, said.

“Our study is the first to decode continuous language, meaning more than single words or sentences, from non-invasive brain recordings, which we collect using functional MRI.”

For the study, three participants each listened to 16 hours of stories while undergoing functional MRI scans. The researchers fed the MRI data to a model, which they trained to recognise the links between brain activity and narrative meaning.

Then, the participants were scanned as they listened to new stories. The decoder analysed their brain activity and wrote a kind of script that predicted the meaning of the stories the participants had listened to and, at times, produced exact words and phrases used in the original stories.
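The underlying idea can be illustrated with a toy sketch: fit an "encoding model" that predicts brain responses from the features of a sentence, then decode by asking which candidate sentence best explains the observed activity. This is a minimal illustration with invented data and a bag-of-words feature map, not the authors' actual pipeline (which used far richer language models and real fMRI recordings).

```python
# Toy sketch of encoding-model decoding. All data, names and the
# feature map are invented for illustration; the real study used
# fMRI recordings and neural-network language features.
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["dragon", "cave", "man", "sword", "river", "sleep"]

def embed(sentence: str) -> np.ndarray:
    """Bag-of-words feature vector over a tiny fixed vocabulary."""
    words = sentence.lower().split()
    return np.array([words.count(w) for w in VOCAB], dtype=float)

# Simulated training data: story sentences plus noisy "voxel"
# responses generated from a hidden linear ground truth.
train_sentences = [
    "the dragon sleeps in the cave",
    "a man draws his sword",
    "the river flows past the cave",
    "the man walks to the river",
]
true_W = rng.normal(size=(len(VOCAB), 20))         # features -> 20 voxels
X = np.stack([embed(s) for s in train_sentences])  # (sentences, features)
Y = X @ true_W + 0.05 * rng.normal(size=(len(train_sentences), 20))

# Fit the encoding model with ridge regression (closed form).
lam = 0.1
W_hat = np.linalg.solve(X.T @ X + lam * np.eye(len(VOCAB)), X.T @ Y)

def decode(observed: np.ndarray, candidates: list[str]) -> str:
    """Return the candidate whose predicted brain response is closest."""
    errs = [np.linalg.norm(embed(c) @ W_hat - observed) for c in candidates]
    return candidates[int(np.argmin(errs))]

# "Scan" a held-out stimulus and decode it from a pool of candidates.
stimulus = "the dragon sleeps in the cave"
observed = embed(stimulus) @ true_W
guess = decode(observed, ["a man draws his sword",
                          "the dragon sleeps in the cave",
                          "the river flows past the cave"])
print(guess)  # recovers the stimulus sentence
```

Note the decoder never "reads out" words directly: it searches over candidate meanings and keeps whichever one best predicts the measured activity, which is why the real system recovers the gist rather than a verbatim transcript.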

In another part of the experiment, participants watched a clip from the animated film Sintel, which features a dragon, with the sound cut off. Based on the viewers’ brain activity, the decoder produced a basic description of what was happening on screen.

Co-author of the research, neuroscientist Dr Alexander Huth, said their approach differed from “mind reading” implants that have allowed vocally paralysed stroke victims to spell out words.

“They record from the motor areas of the brain, the areas that control the mouth, larynx, tongue, etc,” Huth said. “What they can decode is: how is the person trying to move their mouth to say something?

“Our system works at a very different level. Our system really works at the level of ideas, of semantics, of meaning.”

That’s why the text generated by the decoder doesn’t exactly reproduce the words a participant imagines or hears.

“It’s the gist. It’s like the same idea, but expressed in different words.”

The study generated excitement in the research world, said Professor Greig de Zubicaray, a cognitive scientist at the Queensland University of Technology, but elements of the approach limit its practical applications.

“In English, in particular, you have a very predictable subject, verb, object word order, or SVO,” de Zubicaray said. “They’ve acknowledged that their decoder struggled with figuring out who was actually performing an action or who was the subject of the sentences.”

The decoder was also better at interpreting thoughts about concrete objects as opposed to abstract ideas.

“Recovering meaning about something more abstract, such as truth, happiness or love, would be far more difficult for this particular approach.”

The researchers noted the technology could become more useful – and contribute to devices that allow people who can’t talk or use sign language to communicate – if it can be applied using cheaper, more portable scanning techniques such as EEG or MEG scans.

The researchers showed their technique could decode thoughts only with the participant’s full cooperation, and it depended on first gathering 16 hours of training data, so the technology couldn’t be used to tune in to private thoughts.

“Of course, this could all change as tech gets better, so we believe it’s important to keep researching the privacy implications of brain decoding and enact policies that protect each person’s mental privacy,” Tang said.
