Frederik was a visiting researcher at the University of Edinburgh, collaborating with Ben Leimkuhler on adapting sampling methods to neural networks and exploring their loss manifolds. He received his PhD in Applied Mathematics from the Rheinische Friedrich-Wilhelms-Universität Bonn, Germany, in 2014 under the supervision of Michael Griebel. There, he learned a great deal about materials science, ab initio methods in quantum chemistry, C++ programming, third-party projects, and proper experimental practice in the computational sciences.

He held a postdoctoral position in Thomas Schuster's group on inverse problems in Saarbrücken from 2013 to 2016, where he worked on matrix factorization and hyperspectral imaging.

Research interests

Frederik's research interests are varied, but their common theme is the optimization of non-convex functions. He worked on extending the well-known Conjugate Gradient method to Banach spaces and applied the resulting method to low-rank matrix factorization in the context of hyperspectral images; see the library BASSO at
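For context, the classical Conjugate Gradient iteration in Euclidean (Hilbert) space, which the Banach-space extension generalises, can be sketched as follows. This is a generic textbook implementation, not code from BASSO; the matrix and right-hand side are made up for illustration.

```python
def matvec(A, x):
    # Matrix-vector product for a dense matrix given as a list of rows.
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cg(A, b, tol=1e-12, maxiter=100):
    # Conjugate Gradient for a symmetric positive-definite system A x = b.
    n = len(b)
    x = [0.0] * n
    r = [bi - axi for bi, axi in zip(b, matvec(A, x))]  # residual b - A x
    p = r[:]                                            # first search direction
    rs = dot(r, r)
    for _ in range(maxiter):
        if rs < tol:
            break
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)                 # exact line search along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        beta = rs_new / rs                      # Fletcher-Reeves update
        p = [ri + beta * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Example: a small SPD system with exact solution (1/11, 7/11).
x = cg([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```

In exact arithmetic CG terminates in at most n steps for an n-by-n system, which is why it converges on this 2x2 example almost immediately.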

He has a strong interest in materials science and biomolecules. He helped to realise a software library of various fast solvers for the Coulomb problem ( Moreover, he maintains a large software suite ( for constructing and equilibrating molecular systems and bootstrapping molecular simulations. There, the holy grail is to explore the potential energy surface, knowledge of which would allow one to assess material stability, devise chemical reactions, and ascertain reaction paths in biological processes. While local minima can be found by optimization, finding global minima requires enhanced sampling and other strategies.
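The gap between local and global minimization can be illustrated on a toy one-dimensional "potential energy surface". The function, step sizes, and restart strategy below are all illustrative stand-ins, not part of any of the libraries mentioned above: plain gradient descent gets stuck in whichever basin it starts in, while a simple global strategy (random restarts) finds the lower well.

```python
import random

def energy(x):
    # Tilted double well: local minimum near x = -1, global minimum near x = +1.
    return (x**2 - 1)**2 - 0.3 * x

def grad(x):
    # Derivative of the energy above.
    return 4 * x * (x**2 - 1) - 0.3

def descend(x, lr=0.01, steps=2000):
    # Plain gradient descent: converges to the nearest local minimum.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def random_restarts(n=20, seed=0):
    # A crude global strategy: restart local descent from random points
    # and keep the lowest-energy result.
    rng = random.Random(seed)
    candidates = [descend(rng.uniform(-2.0, 2.0)) for _ in range(n)]
    return min(candidates, key=energy)

local_min = descend(-1.2)      # trapped in the left (higher) well
global_min = random_restarts() # lands in the right (lower) well
```

Real enhanced-sampling methods are of course far more sophisticated than random restarts, but the underlying difficulty they address is the same.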

At the moment, he is bringing these sampling methods to the realm of data science, where they will (hopefully) help to explore the loss manifolds of (deep) neural networks and shed light on the baffling question of why neural networks work so well.
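One of the simplest such sampling methods is overdamped Langevin dynamics, sketched below on a toy one-dimensional "loss" standing in for a network's loss manifold. This is a generic illustration under assumed parameters (step size, inverse temperature), not Frederik's actual method: unlike pure gradient descent, the injected noise lets the chain hop over barriers and visit several minima.

```python
import math
import random

def loss(w):
    # Toy non-convex loss with basins at w = -1 and w = +1, barrier at w = 0.
    return (w**2 - 1)**2

def grad(w):
    return 4 * w * (w**2 - 1)

def langevin_samples(w0=1.0, h=0.01, beta=2.0, n=50000, seed=0):
    # Euler-Maruyama discretisation of overdamped Langevin dynamics:
    #     dw = -grad(loss(w)) dt + sqrt(2 / beta) dW
    # which (for small h) samples the Gibbs measure exp(-beta * loss(w)).
    rng = random.Random(seed)
    noise_scale = math.sqrt(2.0 * h / beta)
    w, out = w0, []
    for _ in range(n):
        w += -h * grad(w) + noise_scale * rng.gauss(0.0, 1.0)
        out.append(w)
    return out

samples = langevin_samples()
# Although started in the right well (w0 = 1), the chain should also
# visit the left well, exploring the landscape rather than one basin.
```

For neural networks, `grad` would be a (stochastic) gradient of the training loss over the weights, which is what connects this sampler family to exploring loss manifolds.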