CrowdSense: Interpretable and Efficient Multivariate Crowd Forecasting with Active Learning

Authors

Wachsenegger, Anahid; Graser, Anita; Weißenfeld, Axel; Dragaschnig, Melitta

Abstract

Accurate forecasting of multivariate time series is essential for high-stakes industrial applications, where real-time decisions rely not only on predictive accuracy but also on transparency and human oversight. In this work, we present a novel Explainable Active Learning (XAL) framework for multivariate time series forecasting that integrates human expertise into the learning loop while enhancing interpretability. Our approach is specifically designed for complex and dynamic environments, such as crowd density prediction in urban settings, where high-impact decisions depend on anticipating critical events. We combine classical and deep learning models (XGBoost, Temporal Convolutional Networks, Temporal Fusion Transformers, and TimeGPT) within an active learning loop that selects the most informative data points for expert review. Using SHAP-based explanations, our framework provides actionable insights into model behavior, allowing domain experts to iteratively refine predictions through guided feedback. Applied to real-world crowd density data over an 11-day horizon, our method demonstrates superior performance: XGBoost augmented with XAL achieves an R² of 0.8491 and the lowest RMSE of 0.3126, while increasing recall for high-density events by 27%. By bringing humans into the loop and ensuring explainability in multivariate forecasting, this work addresses key challenges in industrial domains, where understanding why a model makes a prediction is as important as the prediction itself. The proposed XAL framework offers a promising direction for deploying trustworthy AI in environments where safety, efficiency, and accountability are paramount.
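The active-learning step described above, selecting the most informative data points for expert review, can be sketched as a query-by-committee criterion over the ensemble of forecasters. This is a minimal illustration only; the function name `select_for_review` and the use of ensemble disagreement (standard deviation across model forecasts) as the informativeness measure are assumptions, not the authors' actual implementation.

```python
import numpy as np

def select_for_review(ensemble_preds: np.ndarray, k: int) -> np.ndarray:
    """Query-by-committee style selection (illustrative sketch).

    Ranks time steps by disagreement (standard deviation) across
    ensemble members and returns the indices of the k most
    uncertain points, which would be routed to a domain expert.

    ensemble_preds: shape (n_models, n_timesteps).
    """
    disagreement = ensemble_preds.std(axis=0)
    # Indices sorted by descending disagreement; keep the top k.
    return np.argsort(disagreement)[::-1][:k]

# Toy example: three model forecasts over six time steps.
preds = np.array([
    [0.1, 0.9, 0.5, 0.2, 0.8, 0.4],
    [0.1, 0.2, 0.5, 0.2, 0.9, 0.4],
    [0.1, 0.6, 0.5, 0.2, 0.7, 0.4],
])
print(select_for_review(preds, 2))  # → [1 4]: the models disagree most at steps 1 and 4
```

In the framework described by the abstract, the selected points would then be shown to an expert together with SHAP attributions, and the expert's feedback would be folded back into the next training iteration.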

Scientific Publications
CEUR Workshop Proceedings