
Adversarial Machine Learning on Capsule Theory

Supervisor(s): Dr. Huang Xiao
Status: finished
Topic: Machine Learning Methods
Author: Andreas Hörmandinger
Submission: 2018-07-16
Type of Thesis: Master's thesis
Thesis topic in co-operation with the Fraunhofer Institute for Applied and Integrated Security AISEC, Garching

Description

At a time when autonomous cars are already being tested, the security of the
underlying technology, such as machine learning, is a major concern. To date,
there are few remedies against attacks on deep neural networks. A new concept
called 'Capsule Theory' is claimed to be more robust against such attacks.
This thesis explores Capsule Theory, in particular the newly introduced routing
algorithms, in the context of Adversarial Machine Learning.
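
One of these newly introduced routing algorithms is routing-by-agreement (dynamic routing) from Sabour et al. (2017); another is EM routing (Hinton et al., 2018). The following NumPy sketch illustrates the dynamic-routing idea only; the shapes, helper names and number of iterations are illustrative assumptions and not the exact implementation evaluated in the thesis.

import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Non-linear "squashing": keeps the vector's direction and maps its
    # length into [0, 1) so it can be read as a probability of presence.
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, num_iterations=3):
    # u_hat: predictions of each lower capsule i for each upper capsule j,
    # shape (num_in, num_out, out_dim).
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))              # routing logits, start uniform
    for _ in range(num_iterations):
        # Coupling coefficients: softmax over the upper capsules j.
        c = np.exp(b - b.max(axis=1, keepdims=True))
        c = c / c.sum(axis=1, keepdims=True)
        # Weighted sum of the predictions, then squash.
        s = (c[:, :, None] * u_hat).sum(axis=0)  # (num_out, out_dim)
        v = squash(s)
        # Agreement: raise logits where prediction and output align.
        b = b + (u_hat * v[None, :, :]).sum(axis=-1)
    return v

# Toy usage: 8 lower capsules routed to 3 upper capsules of dimension 16.
v = dynamic_routing(np.random.randn(8, 3, 16))
print(v.shape)  # (3, 16)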
An analysis of the differences between networks using Capsule Theory and
state-of-the-art Convolutional Neural Networks informs the parameters of the
experiments. The experiments cover several configurations of Capsule Networks,
popular attacks, two popular datasets, and one dataset closer to the real
world. The analysis of the results shows that Capsule Networks achieve higher
robustness. In addition, the most robust configuration of a Capsule Network is
singled out, and it is shown that even this configuration can be successfully
attacked.