Scientists at the University of California, Berkeley, believe they are on their way to learning, and possibly seeing, what exactly is going on in the human mind through advancements in brain imaging.
Researchers at the university were able to peer into the brain activity of subjects who were watching Hollywood movie trailers and reconstruct what the viewers saw.
Currently, researchers can only reconstruct movie clips people have already seen. However, with the latest breakthrough in brain imaging, they are a step closer to reproducing dreams, memories and other mental movies that people experience but that no one else has ever seen.
They also believe this may one day help people affected by coma, stroke or neurodegenerative disease to communicate what's on their minds. Practical applications of the technology may also lay the foundation for brain-machine interfaces, which could benefit people with cerebral palsy or paralysis by enabling them to guide computers with their minds, researchers say.
"This is a major leap toward reconstructing internal imagery," Jack Gallant, a neuroscientist at UC Berkeley, said in a press statement. "We are opening a window into the movies in our minds."
He is a coauthor of the study published online Thursday in the journal Current Biology.
Gallant and his team managed to decode brain signals generated by moving pictures by using brain imaging technologies and computational models.
"Our natural visual experience is like watching a movie," said lead author Shinji Nishimoto, one of three research team members who served as subjects for the experiment. "In order for this technology to have wide applicability, we must understand how the brain processes these dynamic visual experiences."
Subjects watched two different sets of Hollywood movie trailers while fMRI measured blood flow through the visual cortex, the part of the brain that processes visual information. As the subjects watched the movie trailers, researchers recorded their brain activity, and the data were fed to a computer program that, second by second, learned to associate visual patterns in the movie with the corresponding brain activity.
Researchers then used the computer models to predict the brain activity that would be evoked by 18 million seconds of random YouTube videos. The program picked the 100 YouTube clips whose predicted brain-activity profiles were most similar to the activity measured from the test subjects. Those 100 best clips were then merged into a blurry reconstruction of the original movie, researchers said.
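The matching-and-averaging step described above can be sketched in a few lines of Python. This is a toy illustration, not the study's actual model: random vectors stand in for video clips, a fixed linear map stands in for the fitted fMRI encoding model, and all names and sizes are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the study): a "clip" is a 64-value
# frame vector; 32 voxels; a library of 2,000 candidate clips.
n_pixels, n_voxels, n_library, n_best = 64, 32, 2000, 100

# A fixed random linear map plays the role of the trained
# computational model that predicts brain activity from a clip.
encoder = rng.normal(size=(n_pixels, n_voxels))

def predict_activity(clips):
    """Predicted voxel responses for one or more clips."""
    return clips @ encoder

# Candidate library (stands in for the 18 million seconds of YouTube).
library = rng.normal(size=(n_library, n_pixels))
library_activity = predict_activity(library)

# The viewed clip and the noisy brain activity it evokes.
viewed = rng.normal(size=n_pixels)
measured = predict_activity(viewed) + 0.1 * rng.normal(size=n_voxels)

# Rank library clips by how closely their *predicted* activity matches
# the *measured* activity (cosine similarity), then average the best
# matches into a blurry reconstruction.
sims = (library_activity @ measured) / (
    np.linalg.norm(library_activity, axis=1) * np.linalg.norm(measured)
)
best = library[np.argsort(sims)[::-1][:n_best]]
reconstruction = best.mean(axis=0)

# The blurry average resembles the viewed clip more than chance.
corr = np.corrcoef(reconstruction, viewed)[0, 1]
print(f"correlation with viewed clip: {corr:.2f}")
```

Averaging the top matches is what makes the reconstructions blurry: details that differ across the 100 clips cancel out, while features they share with the viewed clip survive.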
Most previous attempts to decode brain activity have focused on photographs instead of moving pictures.
Nishimoto said researchers need to know how the brain works under what he calls naturalistic conditions, and for that they must first understand how the brain works while people are watching movies.
Learn more about the experiment from the YouTube video below.
The left clip is a segment of the film trailer subjects viewed while their brain activity was being recorded. The right clip shows the reconstruction of this movie from brain activity. The reconstruction was obtained using only each subject's brain activity and a library of 18 million seconds of random YouTube video that did not include the movies used as stimuli. Brain activity was sampled once per second, and each one-second section of the viewed movie was reconstructed separately.