21st EANN 2020, 5-7 June 2020, Greece

Generalized entropy loss function in neural network: variable’s importance and sensitivity analysis

Krzysztof Gajowniczek, Tomasz Ząbkowski


  Artificial neural networks are powerful tools for data analysis, particularly suited to modelling relationships between variables in order to predict an outcome. A large number of error functions have been proposed in the literature to improve the predictive power of neural networks, yet only a few works employ Tsallis statistics, although the method itself has been successfully applied in other machine learning techniques. This paper examines various characteristics of the q-generalized loss function based on Tsallis entropy as an alternative loss measure in neural networks. To achieve this goal, we explore several methods for interpreting supervised neural network models: visualizing the model with the neural network interpretation diagram, assessing the importance of variables by disaggregating the model's weights (Olden's and Garson's algorithms), and performing a sensitivity analysis of model responses to input variables (Lek's profile).
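The abstract does not give the loss formula itself, so the following is only a minimal illustrative sketch of a q-generalized cross-entropy built from the Tsallis q-logarithm, ln_q(x) = (x^(1-q) - 1)/(1 - q), which recovers the natural logarithm (and hence the standard cross-entropy) as q approaches 1. The function names and the binary-classification setup are assumptions for illustration, not the paper's exact definition.

```python
import numpy as np

def q_log(x, q):
    """Tsallis q-logarithm: ln_q(x) = (x^(1-q) - 1) / (1 - q).

    For q == 1 this reduces to the ordinary natural logarithm.
    """
    if q == 1.0:
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_entropy_loss(y_true, y_pred, q=2.0, eps=1e-12):
    """Hypothetical q-generalized cross-entropy for binary targets.

    Replaces log(p) in the usual cross-entropy with q_log(p, q);
    probabilities are clipped away from 0 and 1 for numerical safety.
    """
    p = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * q_log(p, q) + (1.0 - y_true) * q_log(1.0 - p, q))
```

With q = 1 the sketch coincides with the standard binary cross-entropy, while other values of q reweight how strongly confident mistakes are penalized, which is the kind of behaviour the paper studies as an alternative loss measure.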
