
Personal Webpage of Max Horn


I'm a PhD student in Machine Learning and Computational Biology at ETH Zürich, working on the development of deep learning methods for real-world medical time series.
My interests include but are not limited to: Machine Learning for Healthcare, Probabilistic Modelling, Time Series Modelling and Interpretable Machine Learning.
Here I write about stuff I care about in the realm of science, programming and technology.


Two of our papers were accepted at ICML 2020

This week we presented two of our papers at ICML 2020. It was a great experience to talk with others about our research and to think about future directions and applications of our methods. I want to use this page to collect reference materials for these publications.

Set Functions for Time Series

“Set Functions for Time Series” is joint work with Michael Moor, Christian Bock, Bastian Rieck, and my PhD advisor Karsten Borgwardt.

At its core, we propose to rephrase learning on irregularly sampled time series as a set classification problem. This removes the need to impute the time series onto a regular grid before applying deep learning models, allowing them to be applied directly. Michael Larionov wrote a nice summary of our paper on Towards Data Science, in which he explains the core components of the model.
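To make the reframing concrete, here is a minimal sketch of the idea: each observation becomes a (time, value, modality) tuple, and a sum-decomposable set function classifies the resulting set. This is a simplification for illustration only — the paper's actual model uses an attention-based aggregation and learned time encodings, and all names and shapes below are assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(obs, W1, b1):
    """Per-observation encoder: one tanh layer on (time, value, modality)."""
    return np.tanh(obs @ W1 + b1)

def set_classify(observations, W1, b1, W2, b2):
    """Sum-decomposed set function: pool encoded observations, then score."""
    pooled = phi(observations, W1, b1).sum(axis=0)  # order-invariant pooling
    return pooled @ W2 + b2

# Toy irregular series: five observations, each (time, value, modality-id);
# no alignment or imputation onto a common grid is required.
series = np.array([[0.1, 2.3, 0.0], [0.4, 1.1, 1.0], [0.9, 0.7, 0.0],
                   [1.3, 3.2, 2.0], [2.0, 0.2, 1.0]])

W1, b1 = rng.normal(size=(3, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

logits = set_classify(series, W1, b1, W2, b2)
# Permutation invariance: reordering the observations leaves logits unchanged
assert np.allclose(logits, set_classify(series[::-1], W1, b1, W2, b2))
```

Because the pooling is a sum over observations, the classifier is invariant to observation order and handles a varying number of observations per series by construction.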

For further details please see the links below:

Paper: ICML 2020, arXiv

Talk: ICML 2020, Slides

Code: Models, Datasets

Topological Autoencoders

“Topological Autoencoders” is joint work with my colleagues Michael Moor and Bastian Rieck and my PhD advisor Karsten Borgwardt.

Training visualizations: Topological Autoencoder vs. Vanilla Autoencoder

In this work we propose to constrain the topology of the latent representation of an autoencoder using methods from topological data analysis. Michael wrote a wonderful blog post giving an intuitive introduction to the paper here. Below you can see our approach in action, compared to a vanilla autoencoder.
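A minimal sketch of the intuition behind such a topological constraint: the 0-dimensional persistence pairs of a point cloud correspond to the edges of its minimum spanning tree, so one can penalize mismatches between those topologically relevant distances in input and latent space. This is a deliberate simplification — the paper's actual loss also selects pairs from the latent space and is designed to be differentiable for training — and the function names here are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def mst_edges(X):
    """Index pairs (i, j) of the minimum-spanning-tree edges of the
    Euclidean distance graph of the points in X."""
    mst = minimum_spanning_tree(squareform(pdist(X))).tocoo()
    return list(zip(mst.row, mst.col))

def topological_loss(X, Z):
    """Squared mismatch of input-MST edge lengths, measured in input
    space (DX) and in latent space (DZ)."""
    DX, DZ = squareform(pdist(X)), squareform(pdist(Z))
    return sum((DX[i, j] - DZ[i, j]) ** 2 for i, j in mst_edges(X))

rng = np.random.default_rng(1)
X = rng.normal(size=(16, 10))       # a batch in input space
Z_bad = rng.normal(size=(16, 2))    # an unrelated latent embedding

# A latent space preserving all pairwise distances incurs zero loss;
# an arbitrary embedding does not.
assert topological_loss(X, X) == 0
assert topological_loss(X, Z_bad) > 0
```

Adding a term like this to the reconstruction loss encourages the autoencoder's latent space to preserve the connectivity structure of the data rather than distorting it arbitrarily.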

Paper: ICML 2020, arXiv

Talk: ICML 2020, Slides

Code: GitHub
