
Robust Physical Attacks on Neural Networks


Supervisor(s): Jan-Philipp Schulze, Philip Sperl
Status: finished
Topic: Others
Author: Paul Andrei Sava
Submission: 2021-11-15
Type of Thesis: Bachelor's Thesis
Thesis topic in cooperation with the Fraunhofer Institute for Applied and Integrated Security (AISEC), Garching

Description

Neural networks are known to be susceptible to adversarial attacks,
i.e., inputs specifically crafted to fool the network’s prediction.
Recent studies have shown that such attacks can also be performed
"over the air", for example, when the input is captured through a video camera.
One approach to creating physical adversarial attacks is the
"Expectation over Transformation" (EOT) method, which models physical
transformations as differentiable functions and incorporates them into
the optimization of the adversarial example. We investigate the impact of
the chosen transformations on the physical robustness of the attack
against a state-of-the-art object detector. We show that some
transformations are essential for achieving a robust attack, while
others control the positions from which the attack is most
effective. Additionally, we
provide an outlook on simulated physical environments and how they
could be used as a reliable test setup for physical adversarial
attacks.
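
To make the EOT idea concrete, the following PyTorch sketch optimizes a
perturbation against the expected loss over randomly sampled
transformations. It is a minimal illustration, not the setup used in the
thesis: the thesis targets an object detector, while this example uses a
generic image classifier, and the rotation range, brightness range, step
counts, and perturbation bound are arbitrary placeholder values.

```python
import math

import torch
import torch.nn.functional as F


def random_transform(x):
    # Sample one physical transformation: a random rotation and a random
    # brightness change, standing in for camera pose and lighting.
    angle = math.radians(torch.empty(1).uniform_(-15.0, 15.0).item())
    c, s = math.cos(angle), math.sin(angle)
    theta = torch.tensor([[c, -s, 0.0], [s, c, 0.0]])
    theta = theta.unsqueeze(0).expand(x.size(0), -1, -1)
    grid = F.affine_grid(theta, x.size(), align_corners=False)
    x = F.grid_sample(x, grid, align_corners=False)
    brightness = torch.empty(1).uniform_(0.8, 1.2)
    return (x * brightness).clamp(0.0, 1.0)


def eot_attack(model, x, target, steps=200, lr=0.01, samples=8, eps=0.1):
    # Optimize a bounded perturbation delta so that the *expected* loss
    # over randomly sampled transformations pushes the prediction of the
    # transformed, perturbed input toward the target class.
    delta = torch.zeros_like(x, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        loss = 0.0
        for _ in range(samples):  # Monte-Carlo estimate of the expectation
            x_adv = random_transform((x + delta).clamp(0.0, 1.0))
            loss = loss + F.cross_entropy(model(x_adv), target)
        opt.zero_grad()
        (loss / samples).backward()
        opt.step()
        delta.data.clamp_(-eps, eps)  # keep the perturbation small
    return (x + delta).detach().clamp(0.0, 1.0)
```

For a classifier model and an image batch x of shape (1, 3, H, W),
eot_attack(model, x, torch.tensor([target_class])) returns an input that
remains adversarial under the sampled rotations and brightness changes.
Richer transformation distributions, e.g., perspective warps, noise, or
printing artifacts, follow the same pattern by extending random_transform.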