Modelling Timing of Explanations in Autonomous Vehicles

On Monday, 29 April 2024, at 16:15,

Akhila Bairy
Universität Oldenburg

will give a talk as part of her intended dissertation, entitled

Modelling Timing of Explanations in Autonomous Vehicles

The talk will take place in hybrid mode:
OFFIS F02, Escherweg 2, and online

The emergence of Automated Cyber-Physical Systems (ACPS) has introduced Autonomous Driving as a prominent
application. However, a significant challenge facing the widespread adoption of Autonomous Vehicles (AVs)
lies in societal acceptance and trust. One way to mitigate this challenge is to have the AV provide explanations
of its decisions.
There are three essential dimensions to designing explanations, namely content, frequency, and timing. Our goal
is to develop an algorithm that optimises explanations in AVs. While existing research predominantly focuses on
the content aspect, the fine-grained aspects of frequency and timing remain underexplored. Previous studies concerning
“when to explain” have typically categorized the timing of explanations into broad phases: before,
during, or after an action is performed. For AVs, studies have shown that passengers prefer to receive an explanation
before an autonomous action takes place.
This dissertation focuses on fine-grained timing by modelling the cognitive impact of explanation
timing, specifically tailored to AVs. The primary objective is to devise an algorithm that generates optimally
timed explanations to minimize passengers’ cognitive load, leveraging the SEEV (Salience, Effort, Expectancy,
Value) attention model within a probabilistic reactive game framework. The present work further investigates
the effect of multi-step explanations on the timing of the explanations.
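To illustrate the kind of attention model referred to above: in Wickens’ SEEV model, the probability of attending to an area of interest grows with its Salience, Expectancy, and Value and shrinks with the Effort of attending to it. The following is a minimal, hypothetical sketch of such a scoring, not the dissertation’s actual algorithm; the weights, the candidate timing moments, and all numeric values are illustrative assumptions.

```python
# Hypothetical SEEV-style attention scoring (illustrative only, not the
# dissertation's algorithm). Each candidate moment for showing an
# explanation is described by (salience, effort, expectancy, value).

def seev_score(salience, effort, expectancy, value,
               w_s=1.0, w_ef=1.0, w_ex=1.0, w_v=1.0):
    """Raw SEEV score: effort lowers attention, the other terms raise it."""
    return w_s * salience - w_ef * effort + w_ex * expectancy + w_v * value

def attention_distribution(candidates):
    """Clamp scores at zero and normalise them into a probability distribution."""
    scores = {name: max(seev_score(*params), 0.0)
              for name, params in candidates.items()}
    total = sum(scores.values()) or 1.0
    return {name: s / total for name, s in scores.items()}

# Two illustrative moments to deliver an explanation (values in [0, 1]):
moments = {
    "before_maneuver": (0.6, 0.2, 0.8, 0.9),  # cheap to attend to, valuable
    "during_maneuver": (0.9, 0.7, 0.4, 0.5),  # salient but high-effort
}
dist = attention_distribution(moments)
```

Under these assumed values, the pre-maneuver moment receives the larger share of attention, which is consistent with the finding cited above that passengers prefer explanations before an autonomous action.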
Additionally, the research extends its scope to explore how multiple subjects influence the timing of both single-step
and multi-step explanations. The efficacy of the generated timing strategy is evaluated through real-world
experiments employing a game-based setup.
In conclusion, this dissertation contributes to narrowing the gap in explainability research, specifically concerning
the timing of explanations. While the focus of this dissertation lies on AVs, the insights gained can be potentially
extended to other domains as well.
Supervisor: Prof. Dr. Martin Fränzle

29.04.2024 16:15 – 17:45
