Some coding theorem on generalized Havrda-Charvat and Tsallis's entropy


Satish Kumar
Arun Choudhary

Abstract

A new measure $L_{\alpha}^{\beta}$, called the average codeword length of order $\alpha$ and type $\beta$, is defined, and its relationship with a result on generalized Havrda-Charvat and Tsallis entropy is discussed. Using $L_{\alpha}^{\beta}$, some coding theorems for the discrete noiseless channel are proved.
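The abstract does not reproduce the paper's definition of $L_{\alpha}^{\beta}$, so the exact measure cannot be coded from this page alone. As a hedged illustration of the classical ingredients the paper generalizes, the sketch below computes the Tsallis entropy $H_{\alpha}(P) = \frac{1}{\alpha-1}\bigl(1-\sum_i p_i^{\alpha}\bigr)$ and Campbell's exponentiated mean codeword length $L(t) = \frac{1}{t}\log_2\sum_i p_i\,2^{t\,l_i}$; the function names, the choice of base, and the example source are assumptions, not the paper's construction.

```python
import math

def tsallis_entropy(p, alpha):
    """Tsallis entropy H_alpha(P) = (1 - sum p_i^alpha) / (alpha - 1), alpha != 1.
    Recovers the Shannon entropy (in nats) in the limit alpha -> 1."""
    assert abs(alpha - 1.0) > 1e-12, "alpha = 1 is the Shannon limit; take it separately"
    return (1.0 - sum(pi ** alpha for pi in p)) / (alpha - 1.0)

def campbell_length(p, lengths, t):
    """Campbell's mean codeword length of order t (base 2), t > 0:
    L(t) = (1/t) * log2( sum p_i * 2^(t * l_i) ).
    As t -> 0+ this tends to the ordinary average length sum p_i * l_i."""
    return math.log2(sum(pi * 2.0 ** (t * li) for pi, li in zip(p, lengths))) / t

# A dyadic example source with an optimal uniquely decodable code (l_i = -log2 p_i).
p = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

# Kraft inequality, the standing assumption in noiseless coding theorems:
assert sum(2.0 ** (-l) for l in lengths) <= 1.0
```

By Jensen's inequality the exponentiated mean dominates the arithmetic mean, so for this source `campbell_length(p, lengths, t)` is at least the ordinary average length $\sum_i p_i l_i = 1.75$ for every $t > 0$; bounds of exactly this shape, between a generalized length and a generalized entropy, are what coding theorems of this type assert.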

Article Details

How to Cite
Kumar, S., & Choudhary, A. (2012). Some coding theorem on generalized Havrda-Charvat and Tsallis’s entropy. Tamkang Journal of Mathematics, 43(3), 437–444. https://doi.org/10.5556/j.tkjm.43.2012.711
Section: Papers
Author Biographies

Satish Kumar, Geeta Educational Trust

Department of Mathematics, Geeta Institute of Management & Technology, Kanipla-136131, Kurukshetra (Haryana) India

Arun Choudhary, Geeta Educational Trust

Department of Mathematics, Geeta Institute of Management & Technology, Kanipla-136131, Kurukshetra (Haryana) India

References

J. Aczel and Z. Daroczy, Über verallgemeinerte quasilineare Mittelwerte, die mit Gewichtsfunktionen gebildet sind, Publ. Math. Debrecen 10 (1963), 171--190.

J. Aczel and Z. Daroczy, On measures of information and their characterizations, Mathematics in Science and Engineering, vol. 115., Academic Press, New York-London, 1975.

L. L. Campbell, A coding theorem and Renyi's entropy, Information and Control 8(1965), 423--429.

Z. Daroczy, Generalized information functions, Information and Control 16 (1970), 36--51.

B. Ebanks, P. Sahoo and W. Sander, Characterizations of information measures, World Scientific Publishing Co., Inc., River Edge, NJ, 1998.

A. Feinstein, Foundation of Information Theory, McGraw Hill, New York, 1956.

Gurdial and F. Pessoa, On useful information of order $\alpha$, J. Comb. Information and Syst. Sci. 2 (1977), 30--35.

J. F. Havrda and F. Charvat, Quantification method of classification processes, the concept of structural $\alpha$-entropy, Kybernetika 3 (1967), 30--35.

D. S. Hooda and U. S. Bhaker, A generalized 'useful' information measure and coding theorems, Soochow J. Math. 23(1997), 53--62.

F. Jelinek, Buffer overflow in variable length coding of fixed rate sources, IEEE Trans. Inform. Theory IT-14(3) (1968), 490--501.

J. N. Kapur, Generalized entropy of order $\alpha$ and type $\beta$, Maths. Seminar, Delhi 4 (1967), 78--94.

A. B. Khan, B. A. Bhat and S. Pirzada, Some results on a generalized useful information measure, Journal of Inequalities in Pure and Applied Mathematics 6(4) (2005), Art. 117 .

J. C. Kieffer, Variable lengths source coding with a cost depending only on the codeword length, Information and Control 41 (1979), 136--146.

G. Longo, A noiseless coding theorem for sources having utilities, SIAM J. Appl. Math. 30 (1976), 739--748.

B. McMillan, Two inequalities implied by unique decipherability, IRE Trans. Inform. Theory IT-2 (1956), 115--116.

J. Mitter and Y. D. Mathur, Comparison of entropies of power distribution, ZAMM 52 (1972), 239--240.

Om Parkash and P. K. Sharma, Noiseless coding theorems corresponding to fuzzy entropies, Southeast Asian Bulletin of Mathematics 27(2004), 1073--1080.

A. Renyi, On measures of entropy and information, Proc. 4th Berkeley Symp. Math. Stat. Prob. 1 (1961), 547--561.

C.E. Shannon, A Mathematical Theory of Communication, Bell System Tech. J. 27(1948), 379--423, 623--656.

R. P. Singh, R. Kumar and R. K. Tuteja, Application of Holder's inequality in information theory, Information Sciences 152 (2003), 145--154.

C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1988), 479--487.

I. Vajda, Axioms for $\alpha$-entropy of a generalized probability scheme, Kybernetika (Prague) (1968), 105--112.