Contact

Director

Prof. Dr. Ernst-Rüdiger Olderog

Department of Computing Science
FK II
University of Oldenburg
D-26111 Oldenburg, Germany

Coordinator

Ira Wempe

Department of Computing Science
FK II
University of Oldenburg
D-26111 Oldenburg, Germany

[Colloquium 25.2.2019] Worzyk

Internal Colloquium

Nils Worzyk:
Adversarial Inputs and How to Treat Them

Adversarial examples are slightly perturbed inputs whose perturbations are imperceptible to humans; they aim to fool a neural network classifier into misclassifying the given input, e.g. classifying a stop sign as a priority sign. This poses risks, especially in safety-critical environments.
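
The abstract does not name a specific attack; as one concrete illustration, the following minimal Python/PyTorch sketch crafts such a perturbation with the fast gradient sign method (FGSM, Goodfellow et al.). The model, image size, and epsilon are illustrative assumptions, not details from the talk.

    import torch
    import torch.nn.functional as F

    def fgsm_attack(model, x, label, epsilon=0.03):
        """Perturb x by one signed-gradient step of size epsilon (FGSM)."""
        x = x.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(x), label)
        loss.backward()
        # Step along the sign of the gradient; clamp to a valid image range.
        x_adv = x + epsilon * x.grad.sign()
        return x_adv.clamp(0.0, 1.0).detach()

    # Dummy linear classifier and CIFAR-sized image, purely for illustration.
    model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
    x = torch.rand(1, 3, 32, 32)
    label = torch.tensor([0])
    x_adv = fgsm_attack(model, x, label)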

In my thesis, I present a defense technique based on attacking the input and tracking the differences between the input and the manipulated image. Based on these differences, we are able to detect adversarial inputs and to restore the original image class.
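
The sketch below illustrates one reading of this detect-by-re-attacking idea, reusing the fgsm_attack helper from the sketch above. The choice of attack, the L2 distance, and the threshold tau are assumptions for illustration, not the exact procedure from the thesis.

    def detect_and_restore(model, x, epsilon=0.03, tau=1.0):
        """Re-attack an incoming input; unusually large shifts flag adversarials."""
        pred = model(x).argmax(dim=1)                # class assigned to the input
        x_re = fgsm_attack(model, x, pred, epsilon)  # attack the input itself
        # Per-sample L2 difference between the input and its re-attacked version.
        diff = torch.norm((x_re - x).flatten(start_dim=1), dim=1)
        is_adversarial = diff > tau                  # threshold tau is an assumption
        restored = model(x_re).argmax(dim=1)         # re-attacking may revert the class
        return is_adversarial, restored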

A further defense approach is based on using different architectures to classify images. I present a comparison between different architectures and propose a new one, which can be described as a mixture of ensemble and hierarchical architectures. In the remaining time of my PhD, I plan to apply the introduced approaches to sound and speech recognition.
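
The abstract gives no architectural details; the sketch below shows one plausible reading of such a mixture, in which a coarse classifier routes each input to a class-group-specific ensemble whose members' logits are averaged. The class name, the routing scheme, and all parameters are assumptions.

    import torch
    import torch.nn as nn

    class EnsembleHierarchy(nn.Module):
        """A coarse classifier routes each input to a group-specific ensemble."""

        def __init__(self, coarse: nn.Module, experts: list):
            super().__init__()
            self.coarse = coarse                     # predicts a coarse class group
            self.experts = nn.ModuleList(nn.ModuleList(e) for e in experts)

        def forward(self, x):
            group = self.coarse(x).argmax(dim=1)     # hierarchical routing step
            outs = []
            for i in range(x.size(0)):
                xi = x[i : i + 1]
                members = self.experts[int(group[i])]
                # Ensemble step: average the routed group's member logits.
                outs.append(torch.stack([m(xi) for m in members]).mean(dim=0))
            return torch.cat(outs, dim=0)

Routing first and averaging within the routed group keeps the hierarchical coarse-to-fine structure while retaining ensemble redundancy.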
