How does the brain decide which sensory input is relevant and which is not? A study conducted by Oldenburg hearing researchers shows that the rules that apply to humans do not hold for all living beings.
If we close our eyes and concentrate on the sounds around us, we perceive the world in a special way: we receive signals from all directions. From the right, we may be able to hear the noise of the traffic on the street below, while from the front we hear the tapping of the keyboard, and from the left the sound of laughter from the next room, or the humming of the laser printer in the hallway – this is what everyday life at the office sounds like.
Under normal circumstances we are able to ignore background noise in such an environment and concentrate on those sounds that are important to us – for example the voice of a colleague with whom we are having a conversation. But exactly how we manage to focus our attention on specific sounds in the hearing process is currently the subject of heated debate among scientists. “This is an interesting question, among other things because many people who no longer hear so well have big problems with this,” says Prof. Dr. Jannis Hildebrandt, director of the “Auditory Neuroscience” division at the University of Oldenburg’s Department of Neuroscience. He and his team have now carried out a study to find out whether mice also have the ability to concentrate on specific sounds.
Different priorities to humans
Hildebrandt, his colleague Meike Rogalla and several other researchers presented the results of the study in the Royal Society’s biological research journal, Proceedings B. According to the paper, rodents are indeed able to focus on relevant sounds while ignoring less relevant sounds – but they set different priorities to humans when they do this. “Mice can perceive unfamiliar, surprising signals better than they perceive more familiar stimuli,” Hildebrandt explains.
With humans the opposite is the case, as hearing researchers have known for some time. Tests have shown that subjects subconsciously filter signals according to the probability of their occurrence. In these tests, the subjects listen to two different noise sources simultaneously, for example two female speakers or two alarm clocks beeping at different pitches. The task is to tune into a specific signal among the confusion of sounds – a certain word, a louder beep or a sequence of notes with a specific rhythm.
The cocktail party effect
“People subconsciously track how often the designated signal occurs in each of the two sources and adjust their attention to these statistics,” explains Hildebrandt. Test subjects then concentrate, for example, on the speaker who mentions the designated signal word most frequently, and systematically tune out the other voice. “We shift our resources to the frequency band on which this speaker is talking, in order to be able to hear their voice better,” he explains.
This is a skill that humans usually master very well: the ability to single out one voice amid competing sounds is known as the “cocktail party effect”. People who no longer hear well, however, often have difficulties with this adjustment. Hildebrandt explains that the cause is not always damaged hearing, but sometimes lies directly in the brain. To find out how attention is directed to a specific source in principle, the Oldenburg researchers carried out similar experiments with mice. They trained the rodents to sit on a small platform and simultaneously played two noise sources to them. The designated signal occurred frequently in one source and only rarely in the other. The mouse’s task was to jump off the platform whenever it detected the signal.
Focus on the less familiar signal
If the mice completed this task correctly, they received a small scoop of dry food as a reward. The results of the experiment surprised the team: “The mice systematically paid attention to the acoustic channel on which the designated signal was played less frequently,” Hildebrandt reports. The surprising thing is that, from the mouse’s point of view, it would have been more worthwhile to direct its attention to the noise source on which the designated signal was played more frequently, because then it would have received far more rewards.
After confirming this behaviour in the mice in several tests using different configurations, the team concluded that the mice, just like humans, subconsciously monitor the noise frequency statistics for several minutes. However, they do not seem to be able to focus their attention on the source on which the probability of the designated signal occurring is higher – even if they stand to benefit from doing so.
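The reward asymmetry the team points to can be made concrete with a toy calculation. The probabilities below are hypothetical illustrations, not the study’s actual parameters: we assume the designated signal occurs on 30% of trials in the “frequent” channel and on 5% in the “rare” channel, and that a mouse detects signals only on the channel it attends to.

```python
# Toy model of the two-channel detection task: expected food rewards
# as a function of which channel the mouse attends to.
# The signal rates are hypothetical, chosen only to illustrate the
# asymmetry described in the text.

N_TRIALS = 1000
P_SIGNAL = {"frequent": 0.30, "rare": 0.05}  # assumed per-trial signal rates

def expected_rewards(attended_channel: str, n_trials: int = N_TRIALS) -> float:
    """Expected number of rewards if every signal on the attended channel
    is detected (and rewarded) and every signal elsewhere is missed."""
    return n_trials * P_SIGNAL[attended_channel]

for channel in ("frequent", "rare"):
    print(channel, expected_rewards(channel))
# → frequent 300.0
# → rare 50.0
```

Under these assumptions, attending to the frequent channel would earn six times as many rewards, which is why the mice’s consistent preference for the rare channel is striking.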
Switching modes as required
The team believes that the reason for this is that mice, as potential prey, need above all to be ready for surprises: “They divide up their acoustic environment differently than we humans do,” Hildebrandt suspects. Hearing unfamiliar noises can be a matter of life and death for mice, and therefore such noises stand out in their perception. “For a mouse, it is a matter of survival to perceive every little snap or click that might indicate a cat moving towards them, or the quiet rustling of leaves when an owl takes flight,” Hildebrandt points out. All other sounds – familiar noises such as birds’ twittering – are of less interest to them.
Hildebrandt and Rogalla have found indications in the specialised literature that humans react like mice when required to listen for threatening, non-neutral stimuli in experiments. “So it could be that we possess a similar mechanism to that in mice, but are able to switch modes as required – whereas mice can’t,” Hildebrandt concludes.
Sensory impressions not necessarily objective
For the neurobiologist, the study first of all confirms that mice can be used as model animals to research the phenomenon of selective attention – and to develop potential therapies for people with impaired hearing. Secondly, it sheds light on how every living being perceives the world in its own special way.
“We tend to believe that our sensory impressions are objective: what we see is the way the world is. And what we hear is how the world sounds,” says Hildebrandt. But this is not necessarily the case: other animals can construct an entirely different picture on the basis of the same information, despite having a similar sensory system, the researcher notes. “Our perception is specific to how we deal with the world – and what our needs are.”