
Green-NAS: A Global-Scale Multi-Objective Neural Architecture Search for Robust and Efficient Edge-Native Weather Forecasting

MMM Fahim, SH Yesmin, S Islam, MPB Faruque, MA Salam, MM Uddin, S Islam, T Ahmed, M Binyamin, MR Karim

arXiv preprint · 2026 · arXiv:2602.00240

TL;DR

239× fewer parameters than GraphCast at near-identical accuracy: principled multi-objective NAS can find truly deployable models. Transfer learning adds ~5.2% accuracy when historical data is scarce.

Abstract

We introduce Green-NAS, a multi-objective neural architecture search (NAS) framework designed for low-resource environments using weather forecasting as a case study. Adhering to Green AI principles, the framework explicitly minimizes computational energy costs and carbon footprints, prioritizing sustainable deployment over raw computational scale. The search simultaneously optimizes model accuracy and efficiency to find lightweight architectures with very few parameters. Our best-performing model, Green-NAS-A, achieved an RMSE of 0.0988 (within 1.4% of a manually tuned baseline) using only 153k parameters—239 times fewer than globally deployed models such as GraphCast. Transfer learning improves forecasting accuracy by approximately 5.2% compared to training a new model per city when historical data is limited.
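To make the multi-objective search concrete, here is a minimal sketch of the kind of selection step such a search performs: keeping only Pareto-optimal candidates that trade off accuracy (RMSE) against parameter count. The function name, candidate names, and all numbers except the reported 153k-parameter / 0.0988-RMSE figure are illustrative assumptions, not taken from the paper.

```python
def pareto_front(candidates):
    """Return candidates not dominated on (rmse, params); lower is better for both."""
    front = []
    for name, rmse, params in candidates:
        # A candidate is dominated if another is at least as good on both
        # objectives and strictly better on at least one.
        dominated = any(
            r <= rmse and p <= params and (r < rmse or p < params)
            for _, r, p in candidates
        )
        if not dominated:
            front.append((name, rmse, params))
    return front

# Illustrative candidates (arch-B's size is a rough stand-in for a large model):
candidates = [
    ("arch-A", 0.0988, 153_000),     # small and accurate, as reported
    ("arch-B", 0.0975, 36_700_000),  # slightly more accurate, far larger
    ("arch-C", 0.1200, 153_000),     # dominated by arch-A
]
print(pareto_front(candidates))  # arch-A and arch-B survive; arch-C is dominated
```

A real NAS loop would evaluate many sampled architectures and re-apply this filter each generation; the point is that no single scalar objective is minimized, so both tiny-and-accurate and larger-but-sharper designs remain in play.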

Neural Architecture Search · Weather Forecasting · Edge Computing · Green AI · Transfer Learning · Multi-Objective Optimization

BibTeX

@article{fahim2026greennas,
  title   = {Green-NAS: A Global-Scale Multi-Objective Neural Architecture Search for Robust and Efficient Edge-Native Weather Forecasting},
  author  = {MMM Fahim and SH Yesmin and S Islam and MPB Faruque and MA Salam and MM Uddin and S Islam and T Ahmed and M Binyamin and MR Karim},
  year    = {2026},
  journal = {arXiv preprint},
  eprint  = {2602.00240},
  archivePrefix = {arXiv},
  url     = {https://arxiv.org/abs/2602.00240},
}