The FAMU-FSU College of Engineering and Florida State University’s Resilient Infrastructure and Disaster Response Center released research on March 11 that examines various flood modeling methods and proposes a path toward integrating these approaches to improve predictions. The study, published in Reviews of Geophysics, brings together insights from multiple institutions to address the growing complexity and divergence of flood modeling techniques.
Flood models are essential tools for guiding decisions related to infrastructure design, emergency response, land-use planning, insurance, agriculture, water quality, and public safety. As new models have been developed for specific applications or used beyond their intended scope, opportunities to combine their strengths have often been missed.
Assistant Professor Ebrahim Ahmadisharaf, a co-author of the study, said: “As scientists and engineers pushed forward innovation in flood modeling, their work has diverged into a variety of methods, each with their own strengths and weaknesses. But integrating the improvements of various models is where we can really make the most impact across applications.”
The research categorizes flood models into four types: physics-based, data-driven, observational and experimental, and conceptual. While data-driven models are easier to implement and useful for analyzing complex patterns in data, they may lack reliability when applied outside their training conditions due to limited physical constraints. Ahmadisharaf said: “These patterns have inherent limitations. As new methods have developed in isolation from older paradigms, their improvements are siloed within their domains. That limits our ability to better prevent flood events.”
To address these challenges, the researchers recommend focusing on hybrid frameworks that combine different modeling approaches; enhancing physical representation within models; integrating data-based methods; and bridging the gap between scientific development and practical application. Ahmadisharaf said: “We have high-performance computing resources, which could overcome barriers for flood inundation modeling, but there is a trend of using simplified models that don’t take advantage of these new advancements.” He added: “People use simplified methods because they are faster and easier to implement. With data-driven models, however, there is a greater risk when there are data limitations… Integrating these different models would lead to improvements for both methods.”
The study emphasizes that refining flood modeling systems is crucial so they are not applied beyond what they can reliably predict. Ahmadisharaf concluded: “Flood modeling supports decisions for damage reduction, infrastructure design and more. We aim to make scientists rethink the direction that flood modeling is going… We need to use these models to support each other so that we can better predict flooding events and protect our infrastructure and communities.”
Researchers from the University of Bristol; University of Alabama; University of Central Florida; Purdue University; University of California, Irvine; U.S. Army Engineer Research and Development Center; the University of Tokyo; Tallahassee-based company Halff; and UK-based company Fathom contributed to this study.