
Event

Gaze-based Multimodal Interaction: Methods for Active and Passive Gaze-based Interaction

On Monday, 26 August 2024, at 16:15,
Michael Barz
Universität Oldenburg
will give a talk as part of his planned dissertation, entitled
Gaze-based Multimodal Interaction: Methods for Active and Passive Gaze-based Interaction
The talk will take place in a hybrid format:
OFFIS, Escherweg 2, Room F02, and https://meeting.uol.de/rooms/j8q-yqj-ku8-jtn/


Abstract:
Humans use their sense of vision to perceive their environment and make situation-aware decisions. The eyes naturally coordinate with, for instance, hand movements and speech production to focus on the visual information relevant to the task at hand: people tend to look at objects when they aim to grasp them or refer to them in a dialog. Modern eye tracking technology allows developers to incorporate real-time gaze information as input to multimodal human-computer interfaces. Gaze can serve as an active or a passive input modality: a user can influence a system through explicit eye movements (active), and a system can implicitly derive information about the user and their environment by observing eye movement behavior and the objects being fixated (passive). However, many issues remain underexplored, so the technology cannot yet be widely deployed in interactive intelligent systems.
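
To make the active/passive distinction concrete, the following minimal Python sketch routes a single gaze stream to both an active dwell-based selector and a passive attention logger. All names and the dwell mechanism are illustrative assumptions made for this announcement, not part of the thesis:

DWELL_THRESHOLD_S = 0.5  # dwell time (seconds) that triggers an active selection

def process_gaze_stream(gaze_samples, hit_test, on_select, attention_log):
    """Route one gaze stream to an active and a passive consumer.

    gaze_samples:  iterable of (timestamp_s, x, y) tuples
    hit_test:      maps (x, y) to the object under gaze, or None
    on_select:     active path, called when dwell exceeds the threshold
    attention_log: passive path, records what the user looked at
    """
    current, since = None, None
    for t, x, y in gaze_samples:
        obj = hit_test(x, y)
        attention_log.append((t, obj))       # passive: observe only
        if obj != current:
            current, since = obj, t          # gaze moved to a new object
        elif obj is not None and t - since >= DWELL_THRESHOLD_S:
            on_select(obj)                   # active: explicit dwell selection
            since = t                        # reset to avoid repeated triggers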
This thesis investigates new approaches and develops new methods to enable effective and efficient gaze-based user interfaces. We contribute two methods for active gaze-based interaction and three approaches for passively interpreting the human gaze signal. Concerning active gaze-based interaction, we develop a methodology for modeling the gaze estimation error of head-mounted eye trackers and demonstrate the positive effects of incorporating error estimates into an error-aware gaze-based interface for object selection.
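
As one way to picture the error-aware idea, the sketch below treats the reported gaze point as an isotropic Gaussian whose standard deviation is the estimated gaze error and scores candidate targets by that likelihood. This is an assumed, minimal formulation for illustration, not the method developed in the thesis:

import math
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    x: float       # centre x, pixels
    y: float       # centre y, pixels
    radius: float  # approximate visual extent, pixels

def select_target(gaze_x, gaze_y, error_px, targets):
    """Error-aware selection: score each target by the Gaussian
    likelihood (std = estimated gaze error, error_px > 0) of the true
    gaze falling near its centre, scaled by the target's size."""
    best, best_score = None, 0.0
    for t in targets:
        d = math.hypot(t.x - gaze_x, t.y - gaze_y)
        score = math.exp(-0.5 * (d / error_px) ** 2) * t.radius
        if score > best_score:
            best, best_score = t, score
    return best

A larger error estimate flattens the scores across nearby targets, which an interface could use as a signal to fall back to confirmation or magnification rather than selecting directly.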
Further, we present a method for calibration-free authentication via PIN entry based on saccadic eye movements using a remote eye tracker; it does not require accurate gaze estimates. Concerning passive gaze-based interaction, we investigate novel approaches for interpreting the human gaze signal. We present techniques for interpreting human eye movements over image and text content: we aim to infer the search target in an ongoing visual search and to estimate the perceived relevance of a paragraph the user is reading. Further, we develop a method for the automatic detection of visual attention to ambient objects (sketched below), which can reduce the human effort of annotating mobile eye tracking recordings and support the development of situation-aware human-computer interfaces.
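
A simple way to operationalize such attention detection, under assumed inputs (fixations with durations and per-object bounding boxes, e.g., from an object detector), is to accumulate fixation dwell per object and report objects exceeding a threshold. This is a hedged sketch, not the detection method of the thesis:

def attended_objects(fixations, objects, min_dwell_s=0.3):
    """Accumulate fixation duration per object and report those whose
    total dwell exceeds a threshold as having received visual attention.

    fixations: iterable of (x, y, duration_s)
    objects:   dict mapping object name -> (x0, y0, x1, y1) bounding box
    """
    dwell = {}
    for x, y, duration in fixations:
        for name, (x0, y0, x1, y1) in objects.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                dwell[name] = dwell.get(name, 0.0) + duration
    return [name for name, total in dwell.items() if total >= min_dwell_s]

Applied frame by frame to mobile eye tracking recordings, such per-object dwell aggregation could pre-annotate which objects a wearer attended to, leaving only verification to a human annotator.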
Finally, we outline a framework for gaze-informed multimodal interaction that relates these methods to multimodal-multisensor and intelligent user interfaces.
Supervisor: Prof. Dr. Daniel Sonntag

26.08.2024 16:15 – 17:45
