Generation of the NIR Spectral Band for Satellite Images with Convolutional Neural Networks
Abstract
1. Introduction
- We propose an approach to feature engineering based on NIR channel generation via conditional GANs (cGANs).
- We investigate the impact of artificially generated and real NIR data on model performance in the satellite image segmentation task. We also examine the NIR channel's contribution to reducing the labeled dataset size with minimal quality loss, and consider the NIR channel's role in cross-domain stability across satellites.
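As a minimal illustration of the feature-engineering idea in the highlights above, the sketch below stacks a generated NIR band with RGB into a four-channel segmentation input and computes the standard NDVI vegetation index from the NIR and red bands. The function names and the float-reflectance array convention are assumptions for illustration, not the paper's actual pipeline:

```python
import numpy as np

def stack_rgb_nir(rgb: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Stack an (H, W, 3) RGB image with an (H, W) NIR band
    into an (H, W, 4) array, e.g. as input to a segmentation CNN."""
    return np.concatenate([rgb, nir[..., None]], axis=-1)

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    A small eps avoids division by zero on dark pixels."""
    return (nir - red) / (nir + red + eps)
```

Whether the NIR band is real or cGAN-generated, the downstream model consumes the same four-channel tensor, which is what makes the two directly comparable in the segmentation experiments.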
2. Materials and Methods
2.1. Dataset
2.2. Artificial NIR Channel Generation
2.3. Forest Segmentation Task
2.4. NIR Channel Usage
2.5. Training Setup
3. Results and Discussion
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Satellite | MAE | RMSE | Mean Bias |
|---|---|---|---|
| WorldView | 0.09 | 0.31 | 0.058 |
| SPOT | 0.037 | 0.194 | −0.0029 |
| Planet | 0.16 | 0.41 | 0.088 |
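The per-satellite errors above compare generated NIR bands against real ones. A minimal sketch of how MAE, RMSE, and mean bias could be computed for a generated band follows; the function name and dict-based return format are assumptions for illustration:

```python
import numpy as np

def nir_generation_errors(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Pixel-wise error metrics between a real NIR band (y_true)
    and a generated one (y_pred)."""
    diff = y_pred - y_true
    return {
        "MAE": float(np.mean(np.abs(diff))),        # mean absolute error
        "RMSE": float(np.sqrt(np.mean(diff ** 2))), # root mean squared error
        "Mean Bias": float(np.mean(diff)),          # signed average offset
    }
```

Mean bias is reported alongside MAE/RMSE because it reveals a systematic over- or under-estimation of reflectance that the magnitude-only metrics hide.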
| Test images | U-Net, RGB | U-Net, RGB and NIR | U-Net, RGB and artificial NIR | RF, RGB | RF, RGB and NIR | RF, RGB and artificial NIR |
|---|---|---|---|---|---|---|
| SPOT | 0.954 | 0.961 | 0.96 | 0.874 | 0.892 | 0.889 |
| Planet | 0.857 | 0.939 | 0.936 | 0.815 | 0.863 | 0.861 |
| SPOT + Planet | 0.932 | 0.96 | 0.945 | 0.836 | 0.876 | 0.872 |
| Average | 0.914 | 0.953 (+0.039) | 0.947 (+0.033) | 0.841 | 0.877 (+0.036) | 0.874 (+0.033) |
| Satellite | Bands | All Data | 1/2 | 1/3 |
|---|---|---|---|---|
| SPOT | RGB | 0.97 | 0.956 | 0.942 |
| SPOT | RGB and NIR | 0.97 | 0.963 | 0.961 |
| Planet | RGB | 0.939 | 0.933 | 0.874 |
| Planet | RGB and NIR | 0.95 | 0.942 | 0.927 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Illarionova, S.; Shadrin, D.; Trekin, A.; Ignatiev, V.; Oseledets, I. Generation of the NIR Spectral Band for Satellite Images with Convolutional Neural Networks. Sensors 2021, 21, 5646. https://doi.org/10.3390/s21165646