
Article

Keywords:
Shannon entropy; alternative Shannon entropy; power entropies; alternative power entropies; Bayes error; Bayes risk; sub-Bayes risk
Summary:
This paper deals with Bayesian models given by statistical experiments and standard loss functions. The Bayes probability of error and the Bayes risk are estimated by means of classical and generalized information criteria applicable to the experiment, and the accuracy of these estimates is studied. Among the information criteria considered is the class of posterior power entropies, which includes the Shannon entropy as the special case of power $\alpha = 1$. It is shown that, within this class, the most accurate estimate is achieved by the quadratic posterior entropy of power $\alpha = 2$. The paper also introduces and studies a new class of alternative power entropies which, in general, estimate the Bayes error and risk more tightly than the classical power entropies. Concrete examples, tables and figures illustrate the obtained results.
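For orientation, a common parametrization of the power entropies of order $\alpha$ is the Havrda-Charvát form sketched below; the normalization used in the paper may differ, so this is an illustrative convention rather than the paper's definition. For a posterior distribution $p = (p_1, \dots, p_M)$,
$$H_\alpha(p) = \frac{1}{\alpha - 1}\Bigl(1 - \sum_{i=1}^{M} p_i^{\alpha}\Bigr), \quad \alpha \neq 1, \qquad H_1(p) = \lim_{\alpha \to 1} H_\alpha(p) = -\sum_{i=1}^{M} p_i \ln p_i,$$
so the quadratic case $\alpha = 2$ reduces to the Gini-type index $H_2(p) = 1 - \sum_{i=1}^{M} p_i^2$. As a simple illustration of how such an entropy brackets the Bayes error, take $M = 2$ and $p = (q, 1-q)$: the Bayes probability of error is $P_e = \min(q, 1-q)$, while $H_2(p) = 2q(1-q) = 2P_e(1 - P_e)$, hence $P_e \le H_2(p) \le 2P_e$. The paper's results concern sharper estimates of this kind for general experiments and loss functions.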