Effective detection and discrimination of surface deformation features in Synthetic Aperture Radar imagery is one of the most important applications of the data. Areas undergoing surface deformation can pose health and safety risks, necessitating an automatic and reliable means of surface deformation discrimination. Because subsidence features and false positives appear similar, advanced discrimination methods are required to obtain meaningful results. Convolutional neural networks have been shown to be effective discriminators in other image processing tasks, exploiting the spatial relations and underlying characteristics of images to classify inputs. This paper presents a convolutional neural network tailored to process interferometric Synthetic Aperture Radar imagery and identify subsidence features. Initial results indicate that the network and its trained kernel weights can effectively identify false positives in a small dataset, owing to careful network selection. Future work includes refining the initial detections to reduce false alarms and feeding multi-channel Synthetic Aperture Radar data directly into the network.
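To give a concrete picture of the kind of classifier the abstract describes, the sketch below is a minimal, hypothetical patch-level CNN that separates subsidence features from false positives in single-channel interferometric imagery. The patch size (64x64), layer widths, and two-class output are illustrative assumptions made here for clarity; they are not taken from the paper and do not represent the authors' architecture.

```python
# Hypothetical sketch of a patch-level subsidence discriminator.
# All layer sizes and the 64x64 input patch size are assumptions.
import torch
import torch.nn as nn

class SubsidenceCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # single-channel interferogram patch
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),                  # subsidence vs. false positive
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example forward pass on a batch of 8 random 64x64 patches.
model = SubsidenceCNN()
logits = model(torch.randn(8, 1, 64, 64))
print(logits.shape)  # torch.Size([8, 2])
```

In such a setup, the learned convolutional kernel weights capture the spatial patterns of genuine subsidence bowls, which is what allows the classifier to reject look-alike false positives.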
Reference:
Schwegmann, C.P., Kleynhans, W., Engelbrecht, J., Mdakane, L.W. and Meyer, R.G. 2017. Subsidence feature discrimination using deep convolutional neural networks in synthetic aperture radar imagery. 2017 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 23-28 July 2017, Fort Worth, TX, USA. http://hdl.handle.net/10204/9955
Copyright: 2017 IEEE. Due to copyright restrictions, the attached PDF file only contains the abstract of the full text item. For access to the full text item, please consult the publisher's website.