Development of a continuously differentiable Model of arbitrary Instruction Sequences

Supervisor(s): Daniel Kowatsch
Status: finished
Topic: Others
Author: Salim Hertelli
Submission: 2022-08-15
Type of Thesis: Bachelor's thesis
Thesis topic in co-operation with the Fraunhofer Institute for Applied and Integrated Security AISEC, Garching

Description

Optimization problems can be solved in various ways. Evolutionary algorithms, as used in fuzzing, are one option; gradient-based optimization is another and is generally faster. Gradient-based approaches, however, require gradients to be computed or approximated efficiently, which in turn requires the problem to be differentiable. We propose a neural network model that predicts the effect of an arbitrary instruction sequence on the registers, given the register state prior to executing the sequence. Such a model lets us treat arbitrary instruction sequences as differentiable functions and thus compute their gradients, so gradient-based optimization techniques can be applied to problems involving instruction sequences, such as bug or vulnerability detection. In this thesis, we develop such a model using TREX as an embedding model for instruction sequences. We present the performance of the developed model alongside our methodology and discuss the limitations of our approach.
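The core idea can be illustrated with a minimal sketch: a differentiable surrogate (here a one-hidden-layer MLP with randomly initialized weights, standing in for the trained model) maps the initial register state and a sequence embedding to a predicted final register state, and its gradient with respect to the initial registers can be computed analytically. The dimensions, the loss, and all names below are illustrative assumptions, not the thesis's actual architecture; in the thesis, the embedding would come from TREX.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 8 registers, 16-dim sequence embedding, 32 hidden units.
N_REG, EMB, H = 8, 16, 32

# Randomly initialized surrogate weights (a trained model would supply these).
W1 = rng.normal(scale=0.1, size=(H, N_REG + EMB))
b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(N_REG, H))
b2 = np.zeros(N_REG)

def predict(reg0, emb):
    """Predict the final register state from the initial state and an embedding."""
    x = np.concatenate([reg0, emb])
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

def grad_wrt_reg0(reg0, emb, target):
    """Gradient of 0.5*||predict(reg0, emb) - target||^2 w.r.t. the initial registers."""
    x = np.concatenate([reg0, emb])
    h = np.tanh(W1 @ x + b1)
    out = W2 @ h + b2
    d_out = out - target            # dL/d(out)
    d_h = W2.T @ d_out              # backprop through the output layer
    d_pre = d_h * (1.0 - h**2)      # through tanh
    d_x = W1.T @ d_pre              # through the input layer
    return d_x[:N_REG]              # keep only the register part of the input

reg0 = rng.normal(size=N_REG)
emb = rng.normal(size=EMB)
target = np.zeros(N_REG)            # e.g. steer toward a desired register state

g = grad_wrt_reg0(reg0, emb, target)

# Sanity check the analytic gradient against a finite difference.
eps = 1e-6
e = np.zeros(N_REG); e[0] = eps
loss = lambda r: 0.5 * np.sum((predict(r, emb) - target) ** 2)
fd = (loss(reg0 + e) - loss(reg0 - e)) / (2 * eps)
print(abs(fd - g[0]) < 1e-5)
```

Because the surrogate is differentiable end to end, the same gradient could be taken with respect to the sequence embedding instead, which is what makes gradient-based search over instruction sequences possible.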