Poster
VA-MoE: Variables-Adaptive Mixture of Experts for Incremental Weather Forecasting
Hao Chen · Tao Han · Song Guo · Jie Zhang · Yonghan Dong · Yunlong Yu · Lei Bai
This paper presents Variables-Adaptive Mixture of Experts (VA-MoE), a novel framework for incremental weather forecasting that dynamically adapts to evolving spatiotemporal patterns in real-time data. Traditional weather prediction models often struggle with prohibitive computational costs and the need to continuously update forecasts as new observations arrive. VA-MoE addresses these challenges with a hybrid architecture of experts, where each expert specializes in capturing distinct sub-patterns of atmospheric variables (e.g., temperature, humidity, wind speed). In addition, the method employs a variable-adaptive gating mechanism that dynamically selects and combines the relevant experts based on the input context, enabling efficient knowledge distillation and parameter sharing. This design significantly reduces computational overhead while maintaining high forecast accuracy. Experiments on the real-world ERA5 dataset show that VA-MoE performs comparably to state-of-the-art models in both short-term (e.g., 1–3 days) and long-term (e.g., 5 days) forecasting tasks, while using only about 25% of the trainable parameters and 50% of the initial training data.
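The abstract does not include an implementation, but the variable-adaptive gating idea can be illustrated with a minimal PyTorch-style sketch. Everything below is an assumption for illustration only: the class name `VariableAdaptiveMoE`, the use of a learned per-variable embedding to condition the gate, and the top-k routing are plausible readings of the description, not the authors' actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VariableAdaptiveMoE(nn.Module):
    """Hypothetical sketch of a variables-adaptive MoE layer.

    Each expert is a small MLP meant to specialize in sub-patterns of
    particular atmospheric variables; a gating network scores experts
    conditioned on both the token features and a learned embedding of
    the variable identity (e.g., temperature vs. wind speed), then
    combines the top-k selected experts.
    """

    def __init__(self, dim, num_experts=8, num_variables=5, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.var_embed = nn.Embedding(num_variables, dim)  # one embedding per variable
        self.gate = nn.Linear(dim, num_experts)

    def forward(self, x, var_id):
        # x: (batch, tokens, dim); var_id: (batch,) index of the input variable
        ctx = x + self.var_embed(var_id)[:, None, :]    # variable-conditioned context
        scores = self.gate(ctx)                         # (batch, tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # pick top-k experts per token
        weights = F.softmax(weights, dim=-1)            # normalize over the selected k
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., k] == e                 # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., k][mask, None] * expert(x[mask])
        return out

# Example usage (shapes only; data is random):
layer = VariableAdaptiveMoE(dim=64, num_experts=8, num_variables=5)
x = torch.randn(2, 16, 64)            # 2 samples, 16 tokens, 64 features
var_id = torch.tensor([0, 3])         # e.g., temperature and wind speed
y = layer(x, var_id)                  # (2, 16, 64)
```

Because only the top-k experts run per token, inference cost scales with k rather than with the total number of experts, which is consistent with the parameter-efficiency claim in the abstract.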