School of Mathematics
Paper IPM / M / 15544  


Abstract:  
This paper presents Bayes Fisher information
measures, defined as the expected Fisher information under a
prior distribution for the parameter, for the arithmetic, geometric, and
generalized mixtures of two probability density functions. The
Fisher information of the arithmetic mixture about the mixing
parameter is related to the chi-square divergence, Shannon entropy,
and the Jensen-Shannon divergence. The Bayes Fisher measures
of the three mixture models are related to the Kullback-Leibler,
Jeffreys, Jensen-Shannon, Rényi, and Tsallis divergences. These
measures show that the farther apart the two components are,
the more informative the data are about the mixing parameter.
We also unify three different relative-entropy derivations of the
geometric mixture that are scattered across the statistics and
physics literature. Extending two of these formulations to the
minimization of the Tsallis divergence yields the generalized
mixture as the solution.
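As a concrete illustration of the arithmetic-mixture case: for f(x; α) = α f₁(x) + (1 − α) f₂(x), differentiating log f with respect to α gives the standard Fisher information I(α) = ∫ (f₁ − f₂)² / f dx, which vanishes when the components coincide and approaches 1/(α(1 − α)) as they separate. The following is a minimal numerical sketch, not code from the paper; the function names and the Gaussian components are illustrative choices.

```python
import numpy as np

def mixture_fisher_info(f1, f2, alpha, x):
    """Fisher information about the mixing weight alpha of the
    arithmetic mixture f = alpha*f1 + (1 - alpha)*f2, i.e.
    I(alpha) = integral of (f1 - f2)^2 / f dx, approximated
    by a Riemann sum on the grid x."""
    p1, p2 = f1(x), f2(x)
    f = alpha * p1 + (1 - alpha) * p2
    dx = x[1] - x[0]
    return float(np.sum((p1 - p2) ** 2 / f) * dx)

def gauss(mu):
    # Unit-variance Gaussian density centered at mu
    return lambda x: np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

x = np.linspace(-20.0, 25.0, 40001)
near = mixture_fisher_info(gauss(0.0), gauss(1.0), 0.5, x)  # close components
far = mixture_fisher_info(gauss(0.0), gauss(5.0), 0.5, x)   # well separated
print(near, far)
# far is the larger value, approaching 1/(alpha*(1 - alpha)) = 4
# as the components separate
```

The comparison mirrors the qualitative claim in the abstract: the data carry more information about the mixing parameter when the two component densities are farther apart.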
