

Poster

Extended Deep Adaptive Input Normalization for Preprocessing Time Series Data for Neural Networks

Marcus September · Francesco Sanna Passino · Leonie Goldmann · Anton Hinel

MR1 & MR2 - Number 56
Thu 2 May 8 a.m. PDT — 8:30 a.m. PDT

Abstract:

Data preprocessing is a crucial part of any machine learning pipeline, and it can have a significant impact on both performance and training efficiency. This is especially evident when using deep neural networks for time series prediction and classification: real-world time series data often exhibit irregularities such as multi-modality, skewness and outliers, and the model performance can degrade rapidly if these characteristics are not adequately addressed. In this work, we propose the EDAIN (Extended Deep Adaptive Input Normalization) layer, a novel adaptive neural layer that learns how to appropriately normalize irregular time series data for a given task in an end-to-end fashion, instead of using a fixed normalization scheme. This is achieved by optimizing its unknown parameters simultaneously with the deep neural network using back-propagation. Our experiments, conducted using synthetic data, a credit default prediction dataset, and a large-scale limit order book benchmark dataset, demonstrate the superior performance of the EDAIN layer when compared to conventional normalization methods and existing adaptive time series preprocessing layers.
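The abstract describes the mechanism only at a high level. The sketch below illustrates the general idea of an adaptive input normalization layer whose parameters are trained jointly with the downstream network via back-propagation, in the spirit of DAIN-style preprocessing layers that EDAIN extends. It is an illustrative assumption, not the authors' implementation: the class name AdaptiveInputNormalization, the (batch, features, time) input layout, and the use of per-window mean and standard deviation as the adapted statistics are choices made here purely for demonstration.

    import torch
    import torch.nn as nn

    class AdaptiveInputNormalization(nn.Module):
        """Illustrative adaptive normalization layer (not the authors' EDAIN code).

        Learns per-feature shift and scale transformations from summary
        statistics of each input window, so the normalization itself is
        optimised together with the downstream network by back-propagation.
        """

        def __init__(self, num_features: int):
            super().__init__()
            # Learnable linear maps from per-window statistics to shift/scale.
            self.shift = nn.Linear(num_features, num_features, bias=False)
            self.scale = nn.Linear(num_features, num_features, bias=False)
            # Initialise near the identity so training starts close to plain z-scoring.
            nn.init.eye_(self.shift.weight)
            nn.init.eye_(self.scale.weight)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, num_features, time_steps)
            mean = x.mean(dim=2)                   # per-window feature means
            x = x - self.shift(mean).unsqueeze(2)  # adaptive centering
            std = x.std(dim=2) + 1e-8              # per-window feature scales
            x = x / self.scale(std).unsqueeze(2)   # adaptive scaling
            return x

    # Usage: prepend the layer to any sequence model so its parameters are
    # updated by the same optimiser as the predictor.
    if __name__ == "__main__":
        layer = AdaptiveInputNormalization(num_features=4)
        dummy = torch.randn(8, 4, 50)              # (batch, features, time)
        print(layer(dummy).shape)                  # torch.Size([8, 4, 50])

Because the shift and scale are produced by learnable functions of the input statistics rather than fixed dataset-level constants, the layer can, in principle, adapt its normalization to the skewed, multi-modal, or outlier-heavy inputs described in the abstract; the paper's EDAIN layer adds further stages beyond this simplified shift-and-scale sketch.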
