On the Approximation Capabilities of Deep Neural Networks for Multivariate Time Series Modeling

Mohammad Jamhuri, Mohammad Isa Irawan, Ari Kusumastuti, Kartick Chandra Mondal, Juhari Juhari

Abstract


Multivariate time series forecasting plays a crucial role in various domains, including finance, where accurate stock price prediction supports strategic decision-making. Traditional methods such as Autoregressive Integrated Moving Average (ARIMA), Exponential Smoothing (ETS), and Vector Autoregression (VAR) often fall short when dealing with complex, non-linear data, particularly data exhibiting long-term temporal dependencies. This study evaluates deep learning approaches, namely the Multilayer Perceptron (MLP), Convolutional Neural Network (CNN), and Long Short-Term Memory (LSTM) network, using daily AAPL stock price data from January 2020 to November 2024. The results show that the MLP model with a 10-day time window achieves the best accuracy, yielding lower Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), and Mean Absolute Percentage Error (MAPE) than the CNN, LSTM, and VAR models. The findings suggest that the MLP is particularly effective at capturing complex patterns in multivariate time series forecasting.
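
The abstract describes a sliding-window setup: each forecast is made from the previous 10 days of a multivariate price series, and the competing models are scored with MAE, RMSE, and MAPE. The following minimal sketch illustrates that setup with tf.keras, assuming a flattened 10-day window as input, the closing price (taken here as column 0) as the one-day-ahead target, an illustrative 64-32 hidden-layer MLP, an 80/20 chronological split, and random data as a stand-in for the AAPL series; none of these details are taken from the paper itself.

    import numpy as np
    import tensorflow as tf

    def make_windows(series, window=10):
        # Turn a (T, n_features) array into flattened `window`-day inputs
        # and one-day-ahead targets (assumed target: column 0 = closing price).
        X, y = [], []
        for t in range(window, len(series)):
            X.append(series[t - window:t].ravel())
            y.append(series[t, 0])
        return np.asarray(X, dtype="float32"), np.asarray(y, dtype="float32")

    # Synthetic stand-in for the daily multivariate AAPL series
    # (e.g. open, high, low, close, volume).
    rng = np.random.default_rng(0)
    prices = 150.0 + np.cumsum(rng.normal(0.0, 1.0, size=(1200, 5)), axis=0)

    X, y = make_windows(prices, window=10)
    split = int(0.8 * len(X))  # simple chronological train/test split (assumed)
    X_train, X_test = X[:split], X[split:]
    y_train, y_test = y[:split], y[split:]

    # Illustrative MLP; the paper's exact architecture and hyperparameters
    # are not given on this page.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(X.shape[1],)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X_train, y_train, epochs=20, batch_size=32, verbose=0)

    # Evaluate with the three error measures reported in the abstract.
    y_pred = model.predict(X_test, verbose=0).ravel()
    mae = np.mean(np.abs(y_test - y_pred))
    rmse = np.sqrt(np.mean((y_test - y_pred) ** 2))
    mape = np.mean(np.abs((y_test - y_pred) / y_test)) * 100.0
    print(f"MAE={mae:.3f}  RMSE={rmse:.3f}  MAPE={mape:.2f}%")

Swapping the MLP for a CNN or LSTM (with each input kept as a 10 x n_features tensor rather than a flat vector) and reading real AAPL prices in place of the synthetic series would reproduce the kind of comparison summarized above.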

Keywords


Time series forecasting; temporal dependency; non-linear modeling; artificial neural networks; multivariate prediction



DOI: https://doi.org/10.18860/cauchy.v10i2.32760


Copyright (c) 2025 Mohammad Jamhuri, Mohammad Isa Irawan, Ari Kusumastuti

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
