Washington: The human brain is constantly bombarded with a jumble of information from sensory organs -- the eyes, ears, nose, mouth and skin.
Now, scientists at the University of Rochester, Washington University and Baylor College of Medicine have unravelled how the brain manages to process those complex, rapidly changing, and often conflicting sensory signals to make sense of our world.
The discovery may eventually lead to new therapies for people with Alzheimer's disease and other disorders that impair a person's sense of self-motion, says study co-author Greg DeAngelis, professor in brain and cognitive sciences at Rochester.
The answer lies in a rather simple computation done by single nerve cells (neurons), an operation that can be described mathematically as a straightforward weighted average, the journal Nature Neuroscience reports.
The key is that the neurons have to apply the correct weights to each sensory cue, and the authors reveal how this is done, according to a Rochester statement.
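The weighting scheme described above can be illustrated with a small sketch. This is not the study's model or code; it is a hypothetical example of the standard reliability-weighted averaging rule, in which each cue's weight is proportional to the inverse of its variance (its reliability), so the less noisy cue dominates the combined estimate. The function name, cue values, and variances are all invented for illustration.

```python
# Illustrative sketch (not from the study): combining two conflicting
# self-motion cues by a weighted average, with each weight proportional
# to the cue's reliability (1 / variance).

def combine_cues(estimates, variances):
    """Return the reliability-weighted average of cue estimates and the weights used."""
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    combined = sum(w * x for w, x in zip(weights, estimates))
    return combined, weights

# Hypothetical IMAX scenario: the visual cue suggests a 10-degree heading
# change but is noisy (variance 4.0); the vestibular cue says 0 degrees
# and is more reliable (variance 1.0). The vestibular cue gets the larger
# weight, pulling the combined percept close to "not moving".
heading, weights = combine_cues([10.0, 0.0], [4.0, 1.0])
print(heading, weights)  # -> 2.0 [0.2, 0.8]
```

With these made-up numbers the reliable vestibular cue receives 80 percent of the weight, which mirrors the article's point: the brain does not simply average conflicting signals, it weights them by how trustworthy each one is.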
It represents the first direct evidence of how the brain combines multiple sources of sensory information to form as accurate a perception as possible of its environment, the researchers report.
For example, during IMAX theatre footage of an aircraft rolling into a turn, "you may find yourself grabbing the seat," says DeAngelis.
The large visual input makes you feel like you are moving, but the balance cues conveyed by sensors in your inner ear indicate that your body is in fact safely glued to the theatre seat. So how does your brain decide how to interpret these conflicting inputs?
The study shows that low-level computations performed by single neurons in the brain, when repeated by millions of neurons performing similar computations, account for the brain's complex ability to know which sensory signals to weigh as more important.