Jun Zhang

University of Michigan, USA

Probability normalization, marginalization, and maximum entropy inference: An information geometric approach

Information Geometry is the differential geometric study of the manifold of probability models, where each probability distribution is a point on the manifold. Instead of using a metric to measure distances on such manifolds, applications often use “divergence functions,” which measure the proximity of two points without imposing symmetry or the triangle inequality; examples include the Kullback-Leibler (KL) divergence, Bregman divergences, and f-divergences. In this talk, I will present an information geometric analysis of probability normalization, marginalization, and maximum entropy inference. Normalization and marginalization of probability measures will be analyzed using the KL divergence and discussed in the context of “probability transport,” where joint distributions of random variables are, in general, order-dependent. Maximum entropy inference will be shown to lead to the exponential family of probability distributions in the classical case (Shannon entropy), and to deformed-exponential families for more general entropy functions. The IG perspective highlights the duality between the “natural parameter” (the parameter that defines a probability model) and the “expectation parameter.”
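
As a brief, standard illustration of the closing points (not part of the abstract itself, and using generic symbols $T(x)$, $\theta$, $\psi$, $\eta$): maximizing Shannon entropy subject to normalization and the expectation constraint $\mathbb{E}[T(x)]=\eta$ yields the exponential family, whose log-partition function $\psi$ links the natural parameter $\theta$ to the expectation parameter $\eta$ by Legendre duality, and whose Bregman divergence recovers the KL divergence.

\[
p_\theta(x) = \exp\bigl(\theta\cdot T(x) - \psi(\theta)\bigr),
\qquad
\psi(\theta) = \log\!\int \exp\bigl(\theta\cdot T(x)\bigr)\,dx,
\]
\[
\eta = \nabla\psi(\theta) = \mathbb{E}_{p_\theta}[T(x)],
\qquad
\varphi(\eta) = \sup_{\theta}\bigl\{\theta\cdot\eta - \psi(\theta)\bigr\},
\]
\[
D_{\mathrm{KL}}\bigl(p_{\theta_1}\,\|\,p_{\theta_2}\bigr)
= \psi(\theta_2) - \psi(\theta_1) - (\theta_2 - \theta_1)\cdot\nabla\psi(\theta_1),
\]
i.e., the Bregman divergence of $\psi$ evaluated on the natural parameters. Replacing $\exp$ by a deformed exponential gives the deformed-exponential families referred to above.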

References

  • Zhang, J. (2004). Divergence function, duality, and convex analysis. Neural Computation, 16, 159-195.
  • Zhang, J. (2013). Nonparametric information geometry: from divergence function to referential representational biduality on statistical manifolds. Entropy, 15, 5384-5418.
  • Zhang, J. (2015). Reference duality and representation duality in information geometry. In MaxEnt2014, Vol. 1641 (pp. 130-146). AIP Publishing.
  • Zhang, J. (2015). On monotone embedding in information geometry. Entropy, 17, 4485-4489.
  • Zhang, J. and Naudts, J. (2017). Information geometry under monotone embedding. Part I: divergence functions. GSI’17 (pp. 205-214).
  • Naudts, J. and Zhang, J. (2017). Information geometry under monotone embedding. Part II: geometry. GSI’17 (pp. 215-222).
  • Naudts, J. and Zhang, J. (2018). Rho-tau embedding and gauge freedom in information geometry. Information Geometry, 1, 79-115.
