Beyond Predictive Accuracy: Enhancing Parameter Stability in Multicollinear Time Series Forecasting via Regularisation

DOI:

https://doi.org/10.29408/edumatic.v10i1.33925

Keywords:

elastic net regularization, electricity consumption, multicollinearity, ridge regression, time series regression

Abstract

Multicollinearity in feature-based time series regression arises as a structural consequence of lagged and rolling feature construction. Existing studies of Ridge and ElasticNet regularization, however, adopt an accuracy-driven evaluation paradigm, paying limited attention to parameter stability, shrinkage behavior, and sensitivity to regularization strength. This study shifts the evaluation of regularized linear models from predictive accuracy toward stability-oriented assessment. Using daily electricity consumption data from the UCI Machine Learning Repository, Linear Regression, Ridge, and ElasticNet models are examined on engineered temporal features derived from stability-based lag pruning, rolling statistics, and correlation-informed feature selection. Model evaluation focuses on bias–variance behavior, coefficient shrinkage, regularization sensitivity, and training–testing performance gaps. The results show that regularization improves stability: the training–testing performance gap decreases from 0.0961 under Linear Regression to 0.0608 under ElasticNet. Ridge exhibits conservative shrinkage averaging 6.06%, whereas ElasticNet induces stronger shrinkage averaging 46.32% and shows higher sensitivity to penalty strength. Together, these comparisons show that the two penalties stabilize regression models through distinct shrinkage mechanisms, informing model selection beyond accuracy. The findings provide methodological evidence that regularization in feature-based time series regression should be treated as a stability strategy rather than an accuracy optimization tool, offering guidance for electricity load forecasting under structurally redundant temporal features.
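The pipeline the abstract describes (lagged and rolling features that induce multicollinearity, then OLS, Ridge, and ElasticNet compared on the training–testing gap and coefficient shrinkage) can be sketched as below. This is an illustrative sketch on synthetic autocorrelated data, not the authors' code: the lag set, rolling window, and penalty strengths (`alpha`, `l1_ratio`) are assumed values, and the norm-based shrinkage percentage is one possible reading of the shrinkage metric.

```python
import numpy as np
from sklearn.linear_model import ElasticNet, LinearRegression, Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic autocorrelated series standing in for daily consumption data.
n = 500
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.8 * y[t - 1] + rng.normal()

def make_features(series, lags=(1, 2, 3), window=7):
    """Lagged values plus a rolling mean: structurally redundant features."""
    start = max(max(lags), window)
    T = len(series)
    cols = [series[start - l : T - l] for l in lags]
    cols.append(np.array([series[t - window : t].mean() for t in range(start, T)]))
    return np.column_stack(cols), series[start:]

X, target = make_features(y)
# Chronological split (no shuffling) to respect the time ordering.
X_tr, X_te, y_tr, y_te = train_test_split(X, target, test_size=0.2, shuffle=False)

results = {}
for name, model in [
    ("OLS", LinearRegression()),
    ("Ridge", Ridge(alpha=1.0)),                          # assumed penalty strength
    ("ElasticNet", ElasticNet(alpha=0.1, l1_ratio=0.5)),  # assumed values
]:
    model.fit(X_tr, y_tr)
    # Training-testing performance gap as an instability indicator.
    gap = r2_score(y_tr, model.predict(X_tr)) - r2_score(y_te, model.predict(X_te))
    results[name] = (gap, model.coef_.copy())

ols_norm = np.linalg.norm(results["OLS"][1])
for name, (gap, coef) in results.items():
    # Shrinkage as percentage reduction of the coefficient norm relative to OLS.
    shrink = 100.0 * (1.0 - np.linalg.norm(coef) / ols_norm)
    print(f"{name:10s}  train-test gap={gap:.4f}  shrinkage={shrink:5.1f}%")
```

The exact numbers on this synthetic series will differ from the paper's (a gap shrinking from 0.0961 to 0.0608, and average shrinkage of 6.06% for Ridge versus 46.32% for ElasticNet), but the qualitative pattern of penalized models shrinking coefficients relative to OLS is the point of the sketch.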


Published

2026-03-04

How to Cite

Faisa, D. K. K., & Salam, A. (2026). Beyond Predictive Accuracy: Enhancing Parameter Stability in Multicollinear Time Series Forecasting via Regularisation. Edumatic: Jurnal Pendidikan Informatika, 10(1), 70–79. https://doi.org/10.29408/edumatic.v10i1.33925