Research & Development World


Brain imaging reveals the movies in our mind

By R&D Editors | September 22, 2011

[Image: The approximate reconstruction (right) of a movie clip (left), achieved through brain imaging and computer simulation.]

Imagine tapping into the mind of a coma patient, or watching one’s own dream on YouTube. With a cutting-edge blend of brain imaging and computer simulation, scientists at the University of California, Berkeley, are bringing these futuristic scenarios within reach.

Using functional magnetic resonance imaging (fMRI) and computational models, UC Berkeley researchers have succeeded in decoding and reconstructing people’s dynamic visual experiences—in this case, watching Hollywood movie trailers.

As yet, the technology can only reconstruct movie clips people have already viewed. However, the breakthrough paves the way for reproducing the movies inside our heads that no one else sees, such as dreams and memories, according to researchers.

“This is a major leap toward reconstructing internal imagery,” said Professor Jack Gallant, a UC Berkeley neuroscientist and coauthor of the study published online this week in the journal Current Biology. “We are opening a window into the movies in our minds.”

Eventually, practical applications of the technology could include a better understanding of what goes on in the minds of people who cannot communicate verbally, such as stroke victims, coma patients and people with neurodegenerative diseases.

It may also lay the groundwork for brain-machine interfaces, allowing people with cerebral palsy or paralysis, for example, to guide computers with their minds.

However, researchers point out that the technology is decades from allowing users to read others’ thoughts and intentions, as portrayed in such sci-fi classics as “Brainstorm,” in which scientists recorded a person’s sensations so that others could experience them.

Previously, Gallant and fellow researchers recorded brain activity in the visual cortex while a subject viewed black-and-white photographs. They then built a computational model that enabled them to predict with overwhelming accuracy which picture the subject was looking at.

In their latest experiment, researchers say they have solved a much more difficult problem by actually decoding brain signals generated by moving pictures.

[Image: Mind-reading through brain imaging technology is a common sci-fi theme.]

“Our natural visual experience is like watching a movie,” said Shinji Nishimoto, lead author of the study and a postdoctoral researcher in Gallant’s lab. “In order for this technology to have wide applicability, we must understand how the brain processes these dynamic visual experiences.”

Nishimoto and two other research team members served as subjects for the experiment, because the procedure requires volunteers to remain still inside the MRI scanner for hours at a time.

They watched two separate sets of Hollywood movie trailers, while fMRI was used to measure blood flow through the visual cortex, the part of the brain that processes visual information. On the computer, the brain was divided into small, three-dimensional cubes known as volumetric pixels, or “voxels.”

“We built a model for each voxel that describes how shape and motion information in the movie is mapped into brain activity,” Nishimoto said.
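As a rough sketch of such voxel-wise encoding models, the snippet below fits a regularized linear map from stimulus features to voxel responses on synthetic data. The dimensions, the feature set, and the ridge-regression fit are illustrative assumptions, not the lab's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: T seconds of training movie, F stimulus
# features per second (e.g. outputs of shape/motion filters), V voxels.
T, F, V = 600, 50, 200

X = rng.standard_normal((T, F))           # stimulus features over time
true_W = rng.standard_normal((F, V))      # unknown feature-to-voxel map
Y = X @ true_W + 0.1 * rng.standard_normal((T, V))  # noisy fMRI responses

# Ridge regression, one linear model per voxel (all solved jointly):
# W = (X^T X + lambda I)^{-1} X^T Y
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(F), X.T @ Y)

# The fitted model predicts brain activity for a new second of video.
X_new = rng.standard_normal((1, F))
predicted_activity = X_new @ W            # shape (1, V)
```

Once a map like W is learned for a subject, the model can be run forward on arbitrary new footage to predict the activity it should evoke, which is the step the reconstruction stage relies on.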

The brain activity recorded while subjects viewed the first set of clips was fed into a computer program that learned, second by second, to associate visual patterns in the movie with the corresponding brain activity.

Brain activity evoked by the second set of clips was used to test the movie reconstruction algorithm. The program was fed 18 million seconds of random YouTube videos and, for each subject, predicted the brain activity that each clip would most likely evoke.

Finally, the 100 clips whose predicted brain activity most closely matched the activity actually recorded were merged to produce a blurry yet continuous reconstruction of the original movie.
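That match-and-average step can be sketched as follows. The library size, noise level, and correlation-based scoring are illustrative assumptions standing in for the real 18-million-second YouTube library and the paper's actual similarity measure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: a library of candidate clips, each with a
# model-predicted brain response, plus one measured response from
# the subject (here, clip 1234 plus noise).
n_clips, n_voxels = 5000, 200
predicted = rng.standard_normal((n_clips, n_voxels))
measured = predicted[1234] + 0.5 * rng.standard_normal(n_voxels)

# Rank clips by correlation between predicted and measured activity.
pz = (predicted - predicted.mean(1, keepdims=True)) / predicted.std(1, keepdims=True)
mz = (measured - measured.mean()) / measured.std()
scores = pz @ mz / n_voxels

# Average the 100 best-matching clips to form a blurry composite
# reconstruction of what the subject viewed.
top100 = np.argsort(scores)[-100:]
reconstruction = predicted[top100].mean(axis=0)
```

Averaging many near-matches rather than picking a single winner is what gives the reconstructions their characteristic blurry, dreamlike quality.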

Reconstructing movies using brain scans has been challenging because the blood flow signals measured using fMRI change much more slowly than the neural signals that encode dynamic information in movies, researchers said. For this reason, most previous attempts to decode brain activity have focused on static images.

“We addressed this problem by developing a two-stage model that separately describes the underlying neural population and blood flow signals,” Nishimoto said.
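The two-stage idea can be illustrated by convolving a fast neural signal with a slow hemodynamic response function (HRF). The gamma-shaped kernel below is a common textbook choice, assumed for illustration rather than taken from the paper.

```python
import numpy as np

# Stage 1: a fast underlying neural response to the movie
# (hypothetical, one sample per second).
neural = np.zeros(60)
neural[10] = 1.0  # brief burst of neural activity at t = 10 s

# Stage 2: the sluggish blood-flow (BOLD) signal that fMRI actually
# measures, modeled as the neural signal convolved with an HRF
# (a simple gamma-shaped kernel, peaking about 5 s after the event).
t = np.arange(0, 20)
hrf = (t ** 5) * np.exp(-t)
hrf /= hrf.sum()
bold = np.convolve(neural, hrf)[:60]

# The BOLD peak trails the neural event by several seconds.
lag = int(np.argmax(bold)) - 10
```

Because the measured signal lags and smears the underlying activity, matching fast-changing movie content to slow fMRI measurements requires modeling both stages, as the researchers describe.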

Ultimately, Nishimoto said, scientists need to understand how the brain processes dynamic visual events that we experience in everyday life.

“We need to know how the brain works in naturalistic conditions,” he said. “For that, we need to first understand how the brain works while we are watching movies.”

Other coauthors of the study are Thomas Naselaris with UC Berkeley’s Helen Wills Neuroscience Institute; An T. Vu with UC Berkeley’s Joint Graduate Group in Bioengineering; and Yuval Benjamini and Professor Bin Yu with the UC Berkeley Department of Statistics.

Source: Gallant Lab, “Reconstructing Visual Experiences from Brain Activity Evoked by Natural Movies,” Current Biology


