Project A - Research goal
The aim of this subproject is to assess methods for adapting an individualized hearing system to the acoustic environment by means of gestural interaction techniques. Three levels of control are targeted:
- Detection of specific (spatial) acoustic situations. Providing subconscious and intentional, intuitive gestures as additional input features to classification algorithms may extend classification performance beyond its current technical limits (see the first sketch after this list).
- Localization of sources, i.e., control of directional filters by eye, head, or hand movements (see the second sketch after this list).
- Selection of acoustic objects based on their content in the case of simultaneous, acoustically equivalent sources.
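As a minimal illustration of the first control level, the sketch below concatenates gesture-derived features with acoustic features before classification. The feature names, dimensions, random data, and choice of classifier are all assumptions made for illustration; the project does not prescribe a particular algorithm.

```python
# Sketch: gesture signals as additional input features for
# acoustic situation classification (all values are toy assumptions).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Assumed toy data: 200 frames with 12 acoustic features (e.g. spectral
# descriptors) and 6 gesture features (e.g. head-yaw rate, gaze angle).
acoustic = rng.normal(size=(200, 12))
gesture = rng.normal(size=(200, 6))
labels = rng.integers(0, 4, size=200)  # four hypothetical acoustic situations

# Augment the acoustic features with the user's gestural reactions so
# the classifier can exploit information beyond the microphone signals.
features = np.hstack([acoustic, gesture])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features, labels)
print("training accuracy:", clf.score(features, labels))
```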
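For the second control level, the following sketch steers a simple two-microphone delay-and-sum beamformer with the head orientation reported by a tracker. The microphone geometry, sample rate, and test signal are assumed values; an actual hearing system would use its own directional filter design.

```python
# Sketch: steering a directional filter with head orientation.
import numpy as np

FS = 16000        # sample rate in Hz (assumed)
MIC_DIST = 0.15   # microphone spacing in metres (assumed)
C = 343.0         # speed of sound in m/s

def steer_delay(head_yaw_deg: float) -> float:
    """Inter-microphone delay (in samples) that points the beam
    towards the direction the user is currently facing."""
    theta = np.deg2rad(head_yaw_deg)
    return MIC_DIST * np.sin(theta) / C * FS

def delay_and_sum(ch_left: np.ndarray, ch_right: np.ndarray,
                  head_yaw_deg: float) -> np.ndarray:
    """Delay one channel by the steering delay and average the two,
    attenuating sound arriving from other directions."""
    d = steer_delay(head_yaw_deg)
    n = np.arange(len(ch_right))
    # Fractional delay via linear interpolation (sufficient for a sketch).
    delayed = np.interp(n - d, n, ch_right, left=0.0, right=0.0)
    return 0.5 * (ch_left + delayed)

# Example: the user turns 30 degrees towards a talker.
t = np.arange(FS) / FS
ch_left = np.sin(2 * np.pi * 440 * t)
ch_right = np.roll(ch_left, 3)  # toy inter-channel delay
out = delay_and_sum(ch_left, ch_right, 30.0)
```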
The assessment of gestural interactive control techniques considers four major components. At the center is the hearing aid user (the "subject"), who reacts to acoustic events with subconscious body movements or learned intuitive gestures. These reactions are recorded by a body sensor network, and the derived information is passed to the hearing aid. In laboratory experiments, the acoustic environment is simulated by a loudspeaker array.
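The loop below sketches how these four components could interact in one laboratory trial: the loudspeaker array renders a scene, the subject reacts, the body sensor network records the reaction, and the derived information is passed to the hearing aid. All class, function, and sensor names are hypothetical placeholders, not part of the project specification.

```python
# Sketch: loudspeaker array -> subject -> body sensor network -> hearing aid.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    head_yaw_deg: float  # from a head tracker (assumed sensor)
    gaze_deg: float      # from an eye tracker (assumed sensor)

def simulate_scene(loudspeaker_angles_deg: list[float]) -> list[float]:
    """Stand-in for the loudspeaker array: the source directions
    rendered in the laboratory."""
    return loudspeaker_angles_deg

def read_body_sensors() -> SensorFrame:
    """Stand-in for the body sensor network recording the subject's
    reaction to the rendered scene."""
    return SensorFrame(head_yaw_deg=30.0, gaze_deg=25.0)

def hearing_aid_update(frame: SensorFrame) -> float:
    """Pass the derived information to the hearing aid: here, a simple
    average of head and gaze direction as the steering angle."""
    return 0.5 * frame.head_yaw_deg + 0.5 * frame.gaze_deg

sources = simulate_scene([-60.0, 0.0, 30.0])
frame = read_body_sensors()
print("steering direction:", hearing_aid_update(frame), "degrees")
```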