TY - JOUR
T1 - Update hydrological states or meteorological forcings? Comparing data assimilation methods for differentiable hydrologic models
AU - Jamaat, Amirmoez
AU - Song, Yalan
AU - Rahmani, Farshid
AU - Liu, Jiangtao
AU - Lawson, Kathryn
AU - Shen, Chaopeng
N1 - Publisher Copyright:
© 2025 Elsevier B.V.
PY - 2025/12
Y1 - 2025/12
N2 - Data assimilation (DA) enables hydrologic models to update their internal states using near-real-time observations for more accurate forecasts. With deep neural networks such as long short-term memory (LSTM) networks, variational DA has proven successful in improving forecasts. However, it remains unclear whether this method is also effective for physics-informed machine learning (“differentiable”) models, which represent only a small number of physically meaningful states while using deep networks to supply parameters or missing processes. Here we developed variational DA methods for differentiable models, including optimizing adjusters for precipitation data only, for model internal hydrological states only, or for both. Our results demonstrated that differentiable streamflow models using the CAMELS dataset can benefit from variational DA as strongly as LSTM does, with the one-day-lead-time median Nash-Sutcliffe efficiency (NSE) raised from 0.75 to 0.82. The resulting forecasts matched or outperformed LSTM with DA in the eastern, northwestern, and central Great Plains regions of the conterminous United States. Both the precipitation and state adjusters were needed to achieve these results, with the latter substantially more effective on its own and the former adding moderate benefits for high flows. Our DA framework does not need systematic training data and could serve as a practical DA scheme for whole river networks.
AB - Data assimilation (DA) enables hydrologic models to update their internal states using near-real-time observations for more accurate forecasts. With deep neural networks such as long short-term memory (LSTM) networks, variational DA has proven successful in improving forecasts. However, it remains unclear whether this method is also effective for physics-informed machine learning (“differentiable”) models, which represent only a small number of physically meaningful states while using deep networks to supply parameters or missing processes. Here we developed variational DA methods for differentiable models, including optimizing adjusters for precipitation data only, for model internal hydrological states only, or for both. Our results demonstrated that differentiable streamflow models using the CAMELS dataset can benefit from variational DA as strongly as LSTM does, with the one-day-lead-time median Nash-Sutcliffe efficiency (NSE) raised from 0.75 to 0.82. The resulting forecasts matched or outperformed LSTM with DA in the eastern, northwestern, and central Great Plains regions of the conterminous United States. Both the precipitation and state adjusters were needed to achieve these results, with the latter substantially more effective on its own and the former adding moderate benefits for high flows. Our DA framework does not need systematic training data and could serve as a practical DA scheme for whole river networks.
UR - https://www.scopus.com/pages/publications/105014738296
U2 - 10.1016/j.jhydrol.2025.134137
DO - 10.1016/j.jhydrol.2025.134137
M3 - Article
AN - SCOPUS:105014738296
SN - 0022-1694
VL - 663
JO - Journal of Hydrology
JF - Journal of Hydrology
M1 - 134137
ER -