APPLICATION OF U-NET CONVOLUTIONAL NEURAL NETWORK TO BUSHFIRE MONITORING IN AUSTRALIA WITH SENTINEL-1/-2 DATA
This paper defines a pipeline architecture for near real-time identification of bushfire impact areas using the Australian Geoscience Data Cube (AGDC). A series of catastrophic bushfires from late 2019 to early 2020 captured international attention with their scale of devastation across four of the most populous states in Australia: New South Wales, Queensland, Victoria and South Australia. The extraction of burned areas from multispectral Sentinel-2 observations is straightforward when no cloud or haze obstruction is present; without clear-sky observations, however, precisely locating bushfire-affected regions is difficult. Sentinel-1 C-band dual-polarised (VH/VV) Synthetic Aperture Radar (SAR) data is therefore introduced to extract and analyse useful information from backscattering coefficients, which are unaffected by adverse weather conditions and lack of sunlight. Burned vegetation produces significant changes in volume scattering: the co- and cross-polarised responses decrease over leafless trees, and coherence changes over fire-disturbed areas; together with the shortened revisit time achieved by combining acquisitions from the two sensors over the same affected areas, these provide discriminative features for identifying burnt areas. Moreover, training a U-Net deep learning framework on recent and historical satellite data yields an effective pre-trained segmentation model of burnt and non-burnt areas, enabling more timely emergency response, more efficient hazard-reduction activities and evacuation planning during severe bushfire events. This approach could enable a more robust, timely and accurate method of bushfire detection, built on a scalable big-data processing framework, to predict the bushfire footprint and support fire-spread model development.
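The clear-sky Sentinel-2 case mentioned above is commonly handled with the differenced Normalised Burn Ratio (dNBR), computed from near-infrared and shortwave-infrared reflectances before and after the fire. The sketch below illustrates that idea per pixel; the band pairing (Sentinel-2 bands 8 and 12) and the 0.27 burn threshold follow common practice elsewhere and are assumptions, not values taken from this paper.

```python
# Hedged sketch: per-pixel dNBR burned-area flagging from (NIR, SWIR)
# reflectance pairs. NBR = (NIR - SWIR) / (NIR + SWIR); a large drop in
# NBR after the fire suggests burned vegetation.

def nbr(nir: float, swir: float) -> float:
    """Normalised Burn Ratio for one pixel; 0.0 for a zero-sum pixel."""
    total = nir + swir
    return (nir - swir) / total if total else 0.0

def burned_mask(pre, post, threshold=0.27):
    """Flag pixels whose pre-fire minus post-fire NBR exceeds the threshold.

    `pre` and `post` are sequences of (NIR, SWIR) reflectances for the same
    pixels; the 0.27 default is an assumed moderate-severity cutoff.
    """
    return [nbr(n0, s0) - nbr(n1, s1) > threshold
            for (n0, s0), (n1, s1) in zip(pre, post)]

# Example: pixel 1 goes from healthy vegetation (high NIR, low SWIR) to a
# charred surface (low NIR, high SWIR); pixel 2 is essentially unchanged.
pre = [(0.45, 0.10), (0.40, 0.35)]   # (NIR, SWIR) per pixel, pre-fire
post = [(0.12, 0.30), (0.39, 0.34)]  # post-fire
print(burned_mask(pre, post))  # → [True, False]
```

In the pipeline described here, such a spectral mask would supply training labels only where cloud-free optical observations exist; the SAR-driven U-Net then covers the obscured scenes.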