Head of division

Prof. Dr. Dr. Birger Kollmeier

+49 (0)441 798 5466 or 5470

W30 3-313


Katja Warnken

+49 (0)441 798 5470

+49 (0)441 798-3902

W30 3-312

Kirsten Scheel

+49 (0)441 798-3813

+49 (0)441 798-3902

W30 3-312

Mailing address

Medizinische Physik, Fakultät VI
Universität Oldenburg
26111 Oldenburg

Location / How to find us

For specific questions regarding one of our research topics, please contact the respective people directly (see staff list).

Paper Wendt Brand Kollmeier 2014

An eye-tracking paradigm for analyzing the processing time of sentences with different linguistic complexities

Dorothea Wendt, Thomas Brand, Birger Kollmeier (2014). PLoS ONE 9(6): e100186, 20 Jun 2014.

An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics.
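The analysis described above can be illustrated with a minimal sketch. This is not the authors' actual implementation; it assumes a hypothetical data format in which each trial is a list of per-time-bin fixation labels ('target' or 'competitor'), computes the proportion of trials fixating the target in each bin, and takes the first bin from which that proportion stays above a chosen threshold as the decision moment.

```python
def decision_moment(fixations, threshold=0.75):
    """Estimate the decision moment from binned fixation data.

    fixations: list of trials; each trial is a list with one label per
    time bin, either 'target' or 'competitor' (hypothetical format).
    threshold: minimum sustained proportion of target fixations.
    Returns (bin index of the decision moment or None, per-bin proportions).
    """
    n_trials = len(fixations)
    n_bins = len(fixations[0])

    # Proportion of trials fixating the target picture in each time bin.
    props = [
        sum(trial[b] == 'target' for trial in fixations) / n_trials
        for b in range(n_bins)
    ]

    # Decision moment: first bin after which the target-fixation
    # proportion never drops below the threshold again.
    for b in range(n_bins):
        if all(p >= threshold for p in props[b:]):
            return b, props
    return None, props


# Toy example with three trials and four time bins.
trials = [
    ['competitor', 'competitor', 'target', 'target'],
    ['competitor', 'target', 'target', 'target'],
    ['target', 'target', 'target', 'target'],
]
bin_idx, props = decision_moment(trials, threshold=0.6)
```

In this toy example the target-fixation proportion rises from 1/3 in the first bin to 1.0 in the last two, so the sustained crossing of the 0.6 threshold occurs at the second bin. A sentence structure with higher linguistic complexity would be expected to shift this crossing later in time.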

Link to the publication

(Changed: 05 Apr 2023)