A symmetric divergence measure and its bounds

K. C. Jain
Ruchi Mathur

Abstract

A new symmetric divergence measure, useful for comparing two probability distributions, is proposed. This non-parametric measure belongs to the class of Csiszár's $f$-divergences. Its properties are studied, and bounds are obtained in terms of some well-known divergence measures. A numerical illustration based on probability distributions is carried out.
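The paper itself defines the new measure; as background only, every member of Csiszár's class has the generic form $D_f(P,Q)=\sum_i q_i\, f(p_i/q_i)$ for a convex $f$ with $f(1)=0$, and a member is symmetric when $D_f(P,Q)=D_f(Q,P)$. A minimal Python sketch of this generic form (the function names and the choice of Jeffreys' J-divergence as the symmetric example are illustrative assumptions, not the paper's measure):

```python
import math

def csiszar_f_divergence(p, q, f):
    """Generic Csiszar f-divergence D_f(P, Q) = sum_i q_i * f(p_i / q_i).

    Assumes p and q are strictly positive and each sums to 1; zero
    probabilities would need the usual limiting conventions.
    """
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

def kl(p, q):
    """Kullback-Leibler divergence: the f-divergence with f(t) = t log t."""
    return csiszar_f_divergence(p, q, lambda t: t * math.log(t))

def j_divergence(p, q):
    """Jeffreys' J-divergence, a classic symmetric member of the class."""
    return kl(p, q) + kl(q, p)

P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]
print(j_divergence(P, Q))  # same value as j_divergence(Q, P)
```

Bounds of the kind studied in the paper relate one such $D_f$ to another (e.g. to the Kullback-Leibler or Hellinger measures) via inequalities on $f$.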

How to Cite
Jain, K. C., & Mathur, R. (2011). A symmetric divergence measure and its bounds. Tamkang Journal of Mathematics, 42(4), 493–503. https://doi.org/10.5556/j.tkjm.42.2011.1034

References

S. M. Ali and S. D. Silvey, A general class of coefficients of divergence of one distribution from another, J. Roy. Statist. Soc. B, 28(1966), 131-142.

J. Burbea and C. R. Rao, Entropy differential metric, distance and divergence measures in probability spaces: a unified approach, J. Multivariate Analysis, 12(1982), 575-596.

J. Burbea and C. R. Rao, On the convexity of some divergence measures based on entropy functions, IEEE Trans. Inform. Theory, IT-28(1982), 489-495.

I. Csiszár, Information measures: a critical survey, in Trans. Seventh Prague Conf. on Information Theory, Vol. A, Academia, Prague (1974), 73-86.

I. Csiszár, Information-type measures of difference of probability distributions and indirect observations, Studia Sci. Math. Hungar., 2(1967), 299-318.

S. S. Dragomir, A converse inequality for the Csiszár $\Phi$-divergence, Inequalities for the Csiszár $f$-divergence in Information Theory (Edited by S. S. Dragomir), Chapter 1, Article 2, http://rgmia.vu.edu.au/monographs/csiszar.htm

S. S. Dragomir, Some inequalities for $(m, M)$-convex mappings and applications for the Csiszár $\Phi$-divergence in information theory, Inequalities for the Csiszár $f$-divergence in Information Theory (Edited by S. S. Dragomir), http://rgmia.vu.edu.au/monographs/csiszar.htm, 2000.

S. S. Dragomir, Some inequalities for the Csiszár $\Phi$-divergence, Inequalities for the Csiszár $f$-divergence in Information Theory (Edited by S. S. Dragomir), http://rgmia.vu.edu.au/monographs/csiszar.htm, 2000.

S. S. Dragomir, Upper and lower bounds for Csiszár's $f$-divergence in terms of the Kullback-Leibler distance and applications, Inequalities for the Csiszár $f$-divergence in Information Theory (Edited by S. S. Dragomir), http://rgmia.vu.edu.au/monographs/csiszar.htm, 2000.

S. S. Dragomir, Upper and lower bounds for Csiszár's $f$-divergence in terms of the Hellinger discrimination and applications, Inequalities for the Csiszár $f$-divergence in Information Theory (Edited by S. S. Dragomir), http://rgmia.vu.edu.au/monographs/csiszar.htm, 2000.

S. S. Dragomir, V. Gluscevic and C. E. M. Pearce, in: Inequality Theory and Applications (Edited by Y. J. Cho, J. K. Kim and S. S. Dragomir), Nova Science Publishers, Huntington, New York, 2001.

S. S. Dragomir, S. Sunde and C. Buse, New inequalities for Jeffreys divergence measures, Tamsui Oxford Journal of Mathematical Sciences, 16(2000), 295-309.

K. Ferentinos and T. Papaioannou, New parametric measures of information, Information and Control, 51(1981), 193-208.

E. Hellinger, Neue Begründung der Theorie quadratischer Formen von unendlichvielen Veränderlichen, J. Reine Angew. Math., 136(1909), 210-271.

H. Jeffreys, An invariant form for the prior probability in estimation problems, Proc. Roy. Soc. London, Ser. A, 186(1946), 453-461.

R. A. Fisher, Theory of statistical estimation, Proc. Cambridge Philos. Soc., 22(1925), 700-725.

S. Kullback and R. A. Leibler, On information and sufficiency, Ann. Math. Stat., 22(1951), 79-86.

P. Kumar and A. Johnson, On a symmetric divergence measure and information inequalities, Journal of Inequalities in Pure and Applied Mathematics, 6(2005), Article 65.

F. Österreicher, Csiszár's $f$-divergences - basic properties, RGMIA Res. Report Collection, http://rgmia.vu.edu.au/monographs/csiszar.htm, 2002.

A. Rényi, On measures of entropy and information, in Proc. Fourth Berkeley Symp. on Math. Statist. and Prob., Vol. 1, University of California Press, Berkeley, CA, U.S.A. (1961), 547-561.

C. E. Shannon, A mathematical theory of communication, Bell Syst. Tech. Jour., 27(1948), 379-423 and 623-656.

R. Sibson, Information radius, Z. Wahrsch. Verw. Gebiete, 14(1969), 149-160.

I. J. Taneja, New developments in generalized information measures, Advances in Imaging and Electron Physics (Edited by P. W. Hawkes), 91(1995), 37-135.

I. J. Taneja and P. Kumar, Relative information of type $s$, Csiszár's $f$-divergence, and information inequalities, Information Sciences, 166(2004), 105-125.

I. J. Taneja, Relative divergence measures and information inequalities, in: Inequality Theory and Applications, Vol. 4 (Edited by Y. J. Cho, J. K. Kim and S. S. Dragomir), Nova Science Publishers, Huntington, New York, 2004.


F. Topsøe, Some inequalities for information divergence and related measures of discrimination, RGMIA Res. Rep. Collection, 2(1999), 85-98.