Article

Keywords:
exponential family; variance function; Kullback–Leibler divergence; relative entropy; information divergence; mean parametrization; convex support
Summary:
This work studies the standard exponential families of probability measures on Euclidean spaces that have finite supports. In such a family, parameterized by means, the mean is assumed to move along a segment inside the convex support towards an endpoint on the boundary of the support. The limit behavior of several quantities related to the exponential family is described explicitly. In particular, the variance functions and information divergences are studied near the boundary.
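The simplest finite-support exponential family illustrating this boundary behavior is the Bernoulli family on {0, 1}, parameterized by its mean m in (0, 1): its variance function V(m) = m(1 - m) vanishes as m approaches either boundary point, while the Kullback-Leibler divergence from a fixed interior member stays finite. The sketch below is only an illustration of these two notions from the keywords; the function names and the choice of family are not taken from the paper.

```python
import math

def bernoulli_kl(m, m0):
    """KL divergence D(P_m || P_m0) between Bernoulli laws with means
    m in [0, 1] and m0 in (0, 1), using the convention 0 * log(0/q) = 0."""
    def term(p, q):
        return 0.0 if p == 0 else p * math.log(p / q)
    return term(m, m0) + term(1 - m, 1 - m0)

def bernoulli_variance(m):
    """Variance function V(m) = m(1 - m) of the Bernoulli family;
    it vanishes at the boundary points m = 0 and m = 1."""
    return m * (1 - m)

# As the mean moves along the segment (0, 1) towards the boundary point 1,
# the variance tends to 0, while the divergence from the fixed interior
# member with mean 1/2 tends to the finite limit D(P_1 || P_1/2) = log 2.
for m in (0.5, 0.9, 0.99, 1.0):
    print(m, bernoulli_variance(m), bernoulli_kl(m, 0.5))
```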
References:
[1] Ay, N.: An information-geometric approach to a theory of pragmatic structuring. The Annals of Probability 30 (2002), 416-436. DOI 10.1214/aop/1020107773 | MR 1894113 | Zbl 1010.62007
[2] Barndorff-Nielsen, O.: Information and Exponential Families in Statistical Theory. Wiley, New York 1978. MR 0489333 | Zbl 1288.62007
[3] Brown, L. D.: Fundamentals of Statistical Exponential Families. Inst. of Math. Statist. Lecture Notes - Monograph Series 9 (1986). MR 0882001 | Zbl 0685.62002
[4] Chentsov, N. N.: Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs, Amer. Math. Soc., Providence - Rhode Island 1982 (Russian original: Nauka, Moscow, 1972). MR 0645898 | Zbl 0484.62008
[5] Csiszár, I., Matúš, F.: Closures of exponential families. The Annals of Probability 33 (2005), 582-600. DOI 10.1214/009117904000000766 | MR 2123202 | Zbl 1068.60008
[6] Csiszár, I., Matúš, F.: Generalized maximum likelihood estimates for exponential families. Probability Theory and Related Fields 141 (2008), 213-246. DOI 10.1007/s00440-007-0084-z | MR 2372970 | Zbl 1133.62039
[7] Graham, R., Knuth, D., Patashnik, O.: Concrete Mathematics. Second edition. Addison-Wesley, Reading, Massachusetts 1994, p. 446. MR 1397498
[8] Letac, G.: Lectures on Natural Exponential Families and their Variance Functions. Monografias de Matemática 50, Instituto de Matemática Pura e Aplicada, Rio de Janeiro 1992. MR 1182991 | Zbl 0983.62501
[9] Matúš, F., Ay, N.: On maximization of the information divergence from an exponential family. In: Proc. WUPES'03 (J. Vejnarová, ed.), University of Economics, Prague 2003, pp. 99-204.
[10] Matúš, F.: Optimality conditions for maximizers of the divergence from an EF. Kybernetika 43 (2007), 731-746. MR 2376334
[11] Matúš, F.: Divergence from factorizable distributions and matroid representations by partitions. IEEE Trans. Inform. Theory 55 (2009), 5375-5381. DOI 10.1109/tit.2009.2032806 | MR 2597169
[12] Matúš, F., Rauh, J.: Maximization of the information divergence from an exponential family and criticality. In: Proc. IEEE ISIT 2011, St. Petersburg 2011, pp. 809-813. DOI 10.1109/isit.2011.6034269
[13] Montúfar, G., Rauh, J., Ay, N.: Maximal information divergence from statistical models defined by neural networks. In: Proc. GSI 2013, Paris 2013, Lecture Notes in Computer Science 8085 (2013), 759-766. DOI 10.1007/978-3-642-40020-9_85 | Zbl 1322.62060
[14] Rauh, J.: Finding the maximizers of the information divergence from an exponential family. IEEE Trans. Inform. Theory 57 (2011), 3236-3247. DOI 10.1109/tit.2011.2136230 | MR 2817016
[15] Rockafellar, R. T.: Convex Analysis. Princeton University Press, 1970. DOI 10.1017/s0013091500010142 | MR 0274683 | Zbl 1011.49013