[1] Blackwell D. (1951): 
Comparison of experiments. Proc. 2nd Berkeley Symp. Berkeley: University of California Press, 93-102. 
MR 0046002 
[2] Burbea J. (1984): 
The Bose-Einstein Entropy of degree $\alpha$ and its Jensen Difference. Utilitas Math. 25, 225-240. 
MR 0752861 
[3] Burbea J., Rao C. R. (1982): 
Entropy Differential Metric, Distance and Divergence Measures in Probability Spaces: A Unified Approach. J. Multivariate Anal. 12, 575-596. 
DOI 10.1016/0047-259X(82)90065-3 | 
MR 0680530 
[4] Burbea J., Rao C. R. (1982): 
On the Convexity of some Divergence Measures based on Entropy Functions. IEEE Trans. on Inform. Theory IT-28, 489-495. 
MR 0672884 
[5] Capocelli R. M., Taneja I. J. (1984): Generalized Divergence Measures and Error Bounds. Proc. IEEE Internat. Conf. on Systems, Man and Cybernetics, Oct. 9-12, Halifax, Canada, pp. 43-47.
[6] Campbell L. L. (1986): 
An extended Čencov characterization of the Information Metric. Proc. Amer. Math. Soc. 98, 135-141. 
MR 0848890 
[7] Čencov N. N. (1982): 
Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs, 53, Amer. Math. Soc., Providence, R.I. 
MR 0645898 
[8] De Groot M. H. (1970): 
Optimal Statistical Decisions. McGraw-Hill, New York. 
MR 0356303 
[11] Kullback S., Leibler R. A. (1951): On Information and Sufficiency. Ann. Math. Statist. 22, 79-86.
[13] Marshall A. W., Olkin I. (1979): 
Inequalities: Theory of Majorization and its Applications. Academic Press, New York. 
MR 0552278 
[14] Morales D., Taneja I. J., Pardo L.: Comparison of Experiments based on $\phi$-Measures of Jensen Difference. Communicated.
[15] Pardo L., Morales D., Taneja I. J.: 
$\lambda$-measures of hypoentropy and comparison of experiments: Bayesian approach. To appear in Statistica. 
MR 1173196 | 
Zbl 0782.62011 
[17] Rao C. R., Nayak T. K. (1985): 
Cross Entropy, Dissimilarity Measures and Characterization of Quadratic Entropy. IEEE Trans. on Inform. Theory IT-31(5), 589-593. 
DOI 10.1109/TIT.1985.1057082 | 
MR 0808230 
[18] Sakaguchi M. (1964): Information Theory and Decision Making. Unpublished Lecture Notes, Statist. Dept., George Washington Univ., Washington DC.
[19] Sant'anna A. P., Taneja I. J.: 
Trigonometric Entropies, Jensen Difference Divergence Measures and Error Bounds. Information Sciences 25, 145-156. 
MR 0794765 
[22] Taneja I. J. (1983): 
On characterization of J-divergence and its generalizations. J. Combin. Inform. System Sci. 8, 206-212. 
MR 0783757 
[23] Taneja I. J. (1986): 
$\lambda$-measures of hypoentropy and their applications. Statistica, anno XLVI, n. 4, 465-478. 
MR 0887303 
[24] Taneja I. J. (1986): 
Unified Measure of Information applied to Markov Chains and Sufficiency. J. Combin. Inform. System Sci. 11, 99-109. 
MR 0966074 
[26] Taneja I. J. (1989): On Generalized Information Measures and their Applications. Adv. Elect. Phys. 76, 327-413. Academic Press.
[27] Taneja I. J. (1990): Bounds on the Probability of Error in Terms of Generalized Information Radius. Information Sciences 46.
[28] Taneja I. J., Morales D., Pardo L. (1991): 
$\lambda$-measures of hypoentropy and comparison of experiments: Blackwell and Lehmann approach. Kybernetika, 27, 413-420. 
MR 1132603 
[29] Vajda I. (1968): 
Bounds on the Minimal Error Probability and Checking a Finite or Countable Number of Hypotheses. Inform. Trans. Problems 4, 9-17. 
MR 0267685 
[30] Vajda I. (1989): Theory of Statistical Inference and Information. Kluwer Academic Publishers, Dordrecht/Boston/London.