Abstract [eng]
Sequence data is a common pattern in real-world observational datasets. The extra temporal dimension poses new challenges when applying deep learning techniques. Research leaps were made possible by the introduction of Long Short-Term Memory (LSTM) cells and Gated Recurrent Units (GRUs). However, these classical sequence models are all based on the implicit assumption that sequences are sampled at regular intervals and are fully observed. In real-world scenarios, however, this is rarely the case: most time series are irregularly sampled and often only partially observed. The goal must therefore be a universal model capable of handling arbitrary sequential data at low computational cost. Most of the approaches proposed in the literature are heavily over-engineered to cope with specific data properties. While this thesis was being written, however, a new method was proposed that uses controlled differential equations (CDEs) to construct a continuous-time recurrent neural network that is indifferent to partial observations. In this thesis, I apply the idea of CDEs to create a continuous-time echo state network, which inherits the advantages that echo state networks have over standard recurrent neural networks. This is the main contribution of this work. As a secondary contribution, a non-conclusive analysis of the most popular and powerful methods from the literature is given.