A near real-time flood-mapping approach by integrating social media and post-event satellite imagery

Xiao Huang, Cuizhen Wang, Zhenlong Li

Research output: Contribution to journal › Article › peer-review

79 Scopus citations

Abstract

Rapid flood mapping is critical for timely damage assessment and post-event recovery support. Remote sensing provides spatially explicit information for the mapping process, but real-time imagery is often unavailable because of bad weather during the event. Using the 2015 South Carolina Flood in downtown Columbia as a case study, this article proposes a novel approach to retrieve a near real-time flood probability map by integrating post-event remote sensing data with real-time volunteered geographic information (VGI). Centred on each VGI point, an inverse-distance-weighted height filter was introduced to build a probability index distribution (PID) layer from high-resolution digital elevation model (DEM) data. For each PID layer, a Gaussian kernel was developed to extract its moisture weight from the normalized difference water index (NDWI) of an EO-1 Advanced Land Imager (ALI) image. Finally, a normalized flood probability map was produced by chaining the moisture-weighted PIDs in a Python environment. Results indicate that, by adding wetness information from post-event satellite observations, the proposed model can provide a near real-time flood probability distribution from real-time social media, which is of great importance for emergency responders seeking to quickly identify areas in need of immediate attention.
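The workflow the abstract describes — a per-VGI-point inverse-distance height filter on the DEM, a Gaussian moisture weight from NDWI, and a normalized combination of the weighted PID layers — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the exact filter and kernel formulas, and all parameter values (`decay`, `mu`, `sigma`) are assumptions for the sake of the example.

```python
import numpy as np

def height_pid(dem, vgi_row, vgi_col, decay=2.0):
    """Probability index distribution (PID) around one VGI point:
    terrain at or below the point's elevation is likelier flooded,
    with influence decaying by inverse distance (illustrative form)."""
    rows, cols = np.indices(dem.shape)
    dist = np.hypot(rows - vgi_row, cols - vgi_col) + 1.0  # avoid div by 0
    height_diff = dem - dem[vgi_row, vgi_col]
    pid = np.where(height_diff <= 0, 1.0, np.exp(-decay * height_diff))
    return pid / dist

def moisture_weight(ndwi, mu=0.3, sigma=0.15):
    """Gaussian kernel on NDWI: weight peaks where the post-event
    image looks wettest (mu and sigma are assumed values)."""
    return np.exp(-0.5 * ((ndwi - mu) / sigma) ** 2)

def flood_probability(dem, ndwi, vgi_points):
    """Chain the moisture-weighted PIDs from all VGI points and
    normalize to a 0-1 flood probability map."""
    acc = np.zeros_like(dem, dtype=float)
    for r, c in vgi_points:
        acc += height_pid(dem, r, c) * moisture_weight(ndwi)
    return acc / acc.max()

# Toy 5x5 scene: a low channel down the middle column, wet in NDWI.
dem = np.array([[3, 3, 1, 3, 3]] * 5, dtype=float)
ndwi = np.where(dem < 2, 0.4, -0.2)
prob = flood_probability(dem, ndwi, vgi_points=[(2, 2)])
```

In the toy scene, the low, wet channel cells receive visibly higher flood probability than the high, dry banks, which is the qualitative behaviour the combined height and moisture terms are meant to produce.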

Original language: English (US)
Pages (from-to): 113-123
Number of pages: 11
Journal: Annals of GIS
Volume: 24
Issue number: 2
DOIs
State: Published - Apr 3 2018

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • General Earth and Planetary Sciences
