NEAR-REALTIME FLOOD DETECTION FROM MULTI-TEMPORAL SENTINEL RADAR IMAGES USING ARTIFICIAL INTELLIGENCE

de la Cruz, R. M.; Olfindo Jr., N. T.; Felicen, M. M.; Borlongan, N. J. B.; Difuntorum, J. K. L.; Marciano Jr., J. J. S.

Flood extent delineation from RADAR images usually entails manual thresholding per scene, which is not feasible for large-scale floods that often cover multiple RADAR scenes. Processing is also computationally intensive with traditional remote sensing techniques, which limits their use during emergencies. To hasten the production of flood maps from RADAR images during flooding incidents, a deep learning model using a fully convolutional neural network (FCNN) has been developed to delineate flooded areas with minimal human intervention. The model was formulated from data gathered during a flooding event captured by both the Sentinel-1A SAR satellite and Planet's Dove optical satellites. Two pre-flood and one post-flood SAR scenes were used to detect the occurrence of water by analysing drops in backscatter values. The potential flood extents were verified using optical images, which were then used to train the AI model. The model is currently being used operationally to map flood extent across the Philippines with no human intervention from data download to detection of flooded areas. The technique can detect floods across five Sentinel-1 scenes in less than four hours of downloading new satellite data.
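The pre-/post-flood backscatter-drop check described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 3 dB threshold, the function name, and the use of a simple temporal mean of the pre-flood scenes are all assumptions made here for demonstration.

```python
import numpy as np

def flood_mask(pre_scenes, post, drop_db=3.0):
    """Flag pixels whose SAR backscatter (in dB) drops sharply after the
    event, a common signature of new open water (smooth water surfaces
    reflect the radar signal away from the sensor).

    pre_scenes : list of 2-D arrays of pre-flood backscatter (dB)
    post       : 2-D array of post-flood backscatter (dB)
    drop_db    : minimum drop, in dB, to count as flooded (assumed value)
    """
    # Baseline: per-pixel temporal mean over the pre-flood scenes
    baseline = np.mean(np.stack(pre_scenes), axis=0)
    # True where backscatter fell by at least drop_db relative to baseline
    return (baseline - post) >= drop_db

# Toy example: the first pixel drops from about -8 dB to -16 dB (flooded),
# the second stays near its baseline (not flooded).
pre1 = np.array([[-8.0, -7.5]])
pre2 = np.array([[-8.5, -7.0]])
post = np.array([[-16.0, -7.2]])
mask = flood_mask([pre1, pre2], post)  # → array([[ True, False]])
```

In the workflow the abstract describes, a mask like this would serve only as a candidate flood extent; the paper verifies candidates against optical (Planet Dove) imagery before using them as FCNN training labels.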

Cite

Citation format:

de la Cruz, R. M. / Olfindo Jr., N. T. / Felicen, M. M. / et al.: NEAR-REALTIME FLOOD DETECTION FROM MULTI-TEMPORAL SENTINEL RADAR IMAGES USING ARTIFICIAL INTELLIGENCE. 2020. Copernicus Publications.


Rights

Rights holder: R. M. de la Cruz et al.

