On Tsallis extropy with an application to pattern recognition
Abstract
Recently, a new measure of information called extropy was introduced by Lad,
Sanfilippo and Agrò as the dual of Shannon entropy. Earlier, Tsallis
introduced a measure for discrete random variables, now known as Tsallis
entropy, as a generalization of Boltzmann-Gibbs statistics. In this work, a
new measure of discrimination, called Tsallis extropy, is introduced and some
of its properties are discussed. The relation between Tsallis extropy and
Tsallis entropy is given, and several bounds are presented. Finally, an
application of this extropy to pattern recognition is demonstrated.