Application of ML algorithms to infer streambed flux from subsurface pressure and temperature observations

Mohammad Moghaddam, P.A. Ty Ferre, Xingyuan Chen1, Kewei Chen1, Xuehang Song1, and Glenn Hammond2
Department of Hydrology and Atmospheric Sciences
The University of Arizona

We demonstrate the application of two simple machine learning tools—regression tree and gradient boosting analyses—to a hydrologic inference problem with two objectives. The first objective was to infer the flux between a river and the subsurface from high temporal resolution (5-minute) observations of subsurface pressure and temperature. The second was to identify an optimal set of observations to support these inferences; specifically, we examined how many observations were necessary, of what type (pressure and/or temperature), and at what depths. Using synthetic observations and surface fluxes provided by a fully resolved three-dimensional flow and heat transport model, we found that both machine learning tools could estimate the flux accurately using pressure and temperature measurements collected at three depths, even when considerable noise was added to the synthetic observations. Neither method could provide reasonable flux estimates given only noisy temperature data. A single shallow, collocated pair of temperature and pressure observations performed as well as the complete data set. These results show the promise of machine learning tools for designing hydrologic measurement networks, both for determining whether a proposed data set can constrain an inversion and for optimizing monitoring networks comprising multiple measurement types.

1Pacific Northwest National Laboratory, Richland, WA
2Sandia National Laboratory, Albuquerque, NM
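The workflow described in the abstract can be sketched in a few lines. The following is an illustrative example only, not the authors' code: it builds a synthetic data set in which a river-stage signal drives both the target flux and noisy pressure/temperature "observations" at three depths, then fits scikit-learn's regression tree and gradient boosting regressors. The forcing function, noise levels, and depths are assumptions chosen for demonstration.

```python
# Hedged sketch: regression tree and gradient boosting to infer a synthetic
# river-subsurface flux from noisy pressure/temperature observations.
# All synthetic quantities (forcing, damping, noise) are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 2000  # samples (the study used 5-minute observations; count is arbitrary here)

# Hypothetical forcing: river stage drives both the flux and subsurface signals.
stage = np.sin(np.linspace(0, 20, n)) + 0.1 * rng.standard_normal(n)
flux = 0.8 * stage + 0.05 * rng.standard_normal(n)  # target flux (synthetic)

# "Observations": pressure and temperature at three depths, damped with depth
# and corrupted with noise, mimicking attenuated subsurface responses.
features = []
for depth in (0.1, 0.5, 1.0):  # meters below streambed (assumed)
    damp = np.exp(-depth)
    features.append(damp * stage + 0.1 * rng.standard_normal(n))            # pressure
    features.append(10.0 + 2.0 * damp * stage + 0.2 * rng.standard_normal(n))  # temperature
X = np.column_stack(features)

X_tr, X_te, y_tr, y_te = train_test_split(X, flux, random_state=0)

for model in (DecisionTreeRegressor(max_depth=6, random_state=0),
              GradientBoostingRegressor(random_state=0)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, "test R^2 =",
          round(r2_score(y_te, model.predict(X_te)), 3))
    # feature_importances_ ranks the depths/measurement types, supporting
    # the abstract's second objective (observation-network design).
```

Inspecting `feature_importances_` on the fitted models is one way to ask, as the abstract does, which depths and measurement types carry most of the information about the flux.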

