Sebastian Raubitzek and Thomas Neubauer. A fractal interpolation approach to improve neural network predictions for difficult time series data. Expert Systems with Applications, 169:114474, May 2021. ISSN 0957-4174.

Abstract: Deep learning methods such as Long Short-Term Memory (LSTM) neural networks have proven capable of predicting real-life time series data. Crucial for this technique is a sufficient amount of data: either very long time series or fine-grained time series. If the data is insufficient in length or complexity, LSTM approaches perform poorly. We propose a fractal interpolation approach to generate a more fine-grained time series from insufficient data sets. The interpolation is dynamically adapted to the time series using a time-dependent complexity feature, so that the complexity properties of the interpolated time series are related to those of the original one. We also perform a linear interpolation with the same number of interpolation points to compare results. This paper shows that predictions on fractal-interpolated and linear-interpolated time series clearly outperform those on the original data for a test fit on unknown data. Although predictions on linear- and fractal-interpolated time series perform very similarly, the fractal-interpolated ones outperform the linear-interpolated ones on difficult time series data. Also, although the complexities of the sub-intervals are tailored to match those of the original data, the interpolated time series shows a much higher degree of persistency and, in terms of the Hurst exponent, a much higher degree of long-term memory.
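The paper's complexity-adaptive scheme is not reproduced here, but as background, fractal interpolation approaches of this kind build on Barnsley's fractal interpolation function: an iterated function system of affine maps, one per interval between data points, sampled via the chaos game. The sketch below is a minimal illustration under assumed fixed vertical scaling factors; the function name and parameters are illustrative, not the authors' implementation (which instead adapts the scaling to a time-dependent complexity measure).

```python
import random

def fractal_interpolate(xs, ys, s, n_iter=5000, seed=0):
    """Sample a fractal interpolation function through (xs[i], ys[i]).

    Illustrative sketch: for each interval i, an affine map
    w_i(x, y) = (a_i x + e_i, c_i x + s_i y + f_i) is built so that
    w_i maps the whole data range onto interval i while passing through
    the endpoints. s holds one vertical scaling factor per interval
    (|s_i| < 1); here they are fixed, not complexity-adapted.
    """
    x0, xN, y0, yN = xs[0], xs[-1], ys[0], ys[-1]
    maps = []
    for i in range(1, len(xs)):
        si = s[i - 1]
        a = (xs[i] - xs[i - 1]) / (xN - x0)
        e = (xN * xs[i - 1] - x0 * xs[i]) / (xN - x0)
        c = (ys[i] - ys[i - 1] - si * (yN - y0)) / (xN - x0)
        f = (xN * ys[i - 1] - x0 * ys[i] - si * (xN * y0 - x0 * yN)) / (xN - x0)
        maps.append((a, e, c, f, si))

    # Chaos game: repeatedly apply a randomly chosen map; the iterates
    # fill in the attractor, i.e. the fractal interpolating curve.
    rng = random.Random(seed)
    x, y = x0, y0
    pts = []
    for _ in range(n_iter):
        a, e, c, f, si = rng.choice(maps)
        x, y = a * x + e, c * x + si * y + f
        pts.append((x, y))
    return sorted(pts)

# Example: four data points, fixed vertical scaling 0.3 on each interval.
pts = fractal_interpolate([0, 1, 2, 3], [0.0, 1.0, 0.5, 2.0], [0.3, 0.3, 0.3])
```

Larger |s_i| yields a rougher, more persistent-looking curve, which is the knob the complexity-adaptive scheme tunes per sub-interval.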