Contact

Email: scare@uol.de

DIRECTOR

Prof. Dr. Ernst-Rüdiger Olderog,

Department of Computing Science, FK II, University of Oldenburg,

D-26111 Oldenburg, Germany

olderog@informatik.uni-oldenburg.de

COORDINATOR

Ira Wempe,

Department of Computing Science, FK II, University of Oldenburg,

D-26111 Oldenburg, Germany

ira.wempe@informatik.uni-oldenburg.de

Internal Colloquium

Nils Worzyk:
Adversarial Inputs and How to Treat Them

Adversarial examples are slightly perturbed inputs whose perturbations are imperceptible to humans but cause a neural network classifier to misclassify them, e.g. classifying a stop sign as a priority sign. This poses risks, especially in safety-critical environments.
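To make the idea concrete, here is a minimal sketch of how such a perturbation can be generated with the fast gradient sign method (FGSM); the model, the epsilon value, and the function name are illustrative assumptions, not details from the talk:

    # Minimal FGSM sketch (PyTorch): perturb an image so that a classifier
    # misclassifies it while the change stays small.
    import torch
    import torch.nn.functional as F

    def fgsm_attack(model, image, label, epsilon=0.03):
        """Return an adversarially perturbed copy of `image`."""
        image = image.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(image), label)
        loss.backward()
        # Step in the direction that increases the loss, bounded by epsilon.
        adv = image + epsilon * image.grad.sign()
        return adv.clamp(0.0, 1.0).detach()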

In my thesis, I present a defense technique based on attacking the input again and tracking the differences between the input and the manipulated image. Based on these differences, we are able to detect adversarial inputs and to restore the original image class.
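A minimal sketch of this detection idea as I read it: re-attack the incoming image and measure how large a perturbation is needed to flip its predicted class, since adversarial inputs tend to sit close to a decision boundary. The reuse of iterated FGSM as the second attack, the step size, and the detection threshold are my assumptions for illustration, not the thesis method itself:

    # Sketch: re-attack the input and track the difference between the
    # input and the manipulated image; a small difference that already
    # flips the class suggests an adversarial input. Assumes batch size 1
    # and the hypothetical fgsm_attack from the sketch above.
    import torch

    def detect_and_restore(model, image, step=0.002, max_steps=50):
        pred = model(image).argmax(dim=1)
        adv, new_pred = image, pred
        for _ in range(max_steps):
            adv = fgsm_attack(model, adv, pred, epsilon=step)
            new_pred = model(adv).argmax(dim=1)
            if new_pred.item() != pred.item():
                break
        diff = (adv - image).abs().mean().item()  # input vs. manipulated image
        # Hypothetical threshold; the class the re-attack moves toward
        # serves as a candidate for the restored original class.
        is_adversarial = diff < 0.01
        restored = new_pred if is_adversarial else pred
        return is_adversarial, restored.item()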

A further defense approach is based on using different architectures to classify images. I present a comparison of several architectures and propose a new one that can be described as a mixture of ensemble and hierarchical architectures. In the remaining time of my PhD, I plan to apply the introduced approaches to sound and speech recognition.
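As one possible reading of such a mixture, the sketch below lets an ensemble vote on a coarse class and then routes each input to a coarse-class-specific expert for the fine prediction. The two-level split and all module names are my assumptions; the talk's actual architecture may differ:

    # Hypothetical mixture of ensemble and hierarchical classification.
    import torch
    import torch.nn as nn

    class EnsembleHierarchy(nn.Module):
        def __init__(self, coarse_models, experts):
            super().__init__()
            self.coarse_models = nn.ModuleList(coarse_models)  # ensemble level
            self.experts = nn.ModuleList(experts)              # hierarchy level

        def forward(self, x):
            # Ensemble: average the coarse-class probabilities.
            coarse = torch.stack(
                [m(x).softmax(dim=1) for m in self.coarse_models]).mean(dim=0)
            coarse_id = coarse.argmax(dim=1)
            # Hierarchy: route each sample to the expert for its coarse class.
            return torch.stack(
                [self.experts[c](xi.unsqueeze(0)).squeeze(0)
                 for c, xi in zip(coarse_id.tolist(), x)])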