|Recent advances in optical sensor technologies and geoinformatics support the acquisition of very-large-scale, high-definition multispectral and panchromatic images. This capability allows remote sensing to be used for the observation of complex Earth ecosystems. Application areas include biodiversity sustainability, precision agriculture, and the management of land, crops, and parasites. It also supports advanced quantitative studies of biophysical and biogeochemical cycles in coastal or inland waters. Precise and effective scene classification can significantly contribute to the development of new types of decision support systems, offering considerable advantages to business, science, and engineering. This research paper proposes a novel and effective approach to geographic object-based scene classification in remote sensing images. More specifically, it introduces an important upgrade of the well-known Residual Neural Network (ResNet) architecture. Skipping some layers in the early stages of training effectively simplifies the network and mitigates the “Vanishing Gradient Problem” (VGP), which limits the efficiency of other Deep Learning (DL) architectures. The use of the Softmax activation function instead of the Sigmoid in the last layer is the most important innovation of the proposed system. The ResNet has been trained with the recent AdaBound algorithm, which applies dynamic bounds to the learning rate. The result is a smooth transition towards stochastic gradient descent that handles noisy, dispersed misclassification points with high precision, something other spectral classification methods cannot achieve. The proposed algorithm was successfully tested on scene identification from remote sensing images.
This confirms that it could be further used in higher-level processes for Large-Scale Geospatial Data Analysis, such as cross-border classification, recognition and monitoring of specific patterns, and multi-sensor data fusion.|
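The distinction between a Softmax and a Sigmoid output layer mentioned in the abstract can be illustrated with a minimal NumPy sketch (an illustration only, not the paper's implementation): Softmax produces a single normalized distribution over mutually exclusive scene classes, whereas element-wise Sigmoid yields independent per-class scores that need not sum to one.

```python
import numpy as np

def softmax(z):
    """Softmax over class logits: a normalized probability distribution
    whose entries sum to 1, suited to single-label scene classification."""
    e = np.exp(z - z.max())       # subtract the max for numerical stability
    return e / e.sum()

def sigmoid(z):
    """Element-wise sigmoid: independent per-class activations that do not
    sum to 1, so no single coherent class distribution is produced."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical logits for three scene classes (illustrative values only)
logits = np.array([2.0, 1.0, 0.1])
p = softmax(logits)   # sums to 1; argmax gives the predicted class
s = sigmoid(logits)   # each entry in (0, 1); sum generally exceeds 1
```

Because the Softmax outputs form a proper distribution, they pair naturally with a cross-entropy loss for the single-label scene categories described here.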
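AdaBound's key mechanism, dynamic bounds on the per-coordinate step size that start loose and converge toward a final learning rate (yielding the smooth transition to SGD noted above), can be sketched as a single NumPy update. This is a simplified sketch in the spirit of the published algorithm; the hyperparameter names and bias-correction details are illustrative assumptions, not the reference implementation.

```python
import numpy as np

def adabound_step(theta, grad, m, v, t, lr=1e-3, final_lr=0.1,
                  beta1=0.9, beta2=0.999, gamma=1e-3, eps=1e-8):
    """One AdaBound-style update: Adam-style moment estimates with the
    adaptive step size clipped to bounds that both converge to final_lr,
    so the optimizer gradually behaves like SGD with momentum."""
    m = beta1 * m + (1 - beta1) * grad           # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # second-moment estimate
    bias_c = np.sqrt(1 - beta2 ** t) / (1 - beta1 ** t)   # bias correction
    step = lr * bias_c / (np.sqrt(v) + eps)      # Adam-like raw step size
    # Dynamic bounds: lower rises from 0, upper falls from a large value;
    # both approach final_lr as t grows.
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))
    step = np.clip(step, lower, upper)
    return theta - step * m, m, v

# Toy usage: minimize f(theta) = theta**2 with the sketched rule.
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    grad = 2.0 * theta
    theta, m, v = adabound_step(theta, grad, m, v, t)
```

As the bounds tighten, the clipped step size loses its per-coordinate adaptivity, which is the "smooth transition" toward stochastic gradient descent that the abstract credits for robustness to noisy misclassification points.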
*** Title, author list and abstract as seen in the Camera-Ready version of the paper that was provided to the Conference Committee. Small changes that may have occurred during processing by Springer may not appear in this window.