LOWER BOUNDS ON $L^t_{1:1}(D)$ IN TERMS OF RÉNYI ENTROPY
Abstract
In this paper we obtain lower bounds on the exponentiated mean codeword length (as defined by Campbell [4]) for one-one codes of size $D$, using functions that represent possible transformations from one-one codes of size $D$ to uniquely decodable codes.
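As background for the quantity in the title (a sketch following Campbell [4]; the notation $p_i$, $\ell_i$, $N$ is introduced here for illustration): for a source with probabilities $p_i$ and a $D$-ary code with codeword lengths $\ell_i$, the exponentiated mean codeword length of order $t > 0$ is

$$L(t) = \frac{1}{t} \log_D \sum_{i=1}^{N} p_i D^{t\ell_i},$$

and Campbell's coding theorem bounds $L(t)$ for uniquely decodable codes from below by the Rényi entropy of order $\alpha = 1/(1+t)$,

$$H_\alpha(P) = \frac{1}{1-\alpha} \log_D \sum_{i=1}^{N} p_i^{\alpha}.$$

The quantity $L^t_{1:1}(D)$ is the analogous length for one-one codes, which need not be uniquely decodable and can therefore have shorter codewords; the bounds obtained in the paper account for this gap via transformations from one-one codes to uniquely decodable ones.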

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
References
[1] Abramson, N.: Information Theory and Coding, McGraw-Hill, New York (1963).
[2] Ash, R.: Information Theory, Interscience Publishers, New York (1965).
[3] Leung-Yan-Cheong, S. K. and Cover, T. M.: "Some Equivalences Between Shannon Entropy and Kolmogorov Complexity," IEEE Trans. on Information Theory, Vol. IT-24, No. 3, pp. 331-338, May 1978.
[4] Campbell, L. L.: "A Coding Theorem and Rényi's Entropy," Information and Control, Vol. 8, pp. 423-429 (1965).
[5] Rényi, A.: "On Measures of Entropy and Information," Proc. 4th Berkeley Symp. Math. Stat. Prob., Vol. I, pp. 546-561 (1961).
[6] Kieffer, J. C.: "Variable-Length Source Coding with a Cost Depending Only on the Codeword Length," Information and Control, Vol. 41, pp. 136-146 (1979).
[7] Jelinek, F.: "Buffer Overflow in Variable Length Coding of Fixed Rate Sources," IEEE Trans. on Information Theory, Vol. IT-14, No. 3, pp. 490-501, May 1968.
[8] Parker Jr., D. S.: "Conditions for Optimality of the Huffman Algorithm," SIAM J. Comput., Vol. 9, No. 3, pp. 470-489 (1980).
[9] Aczél, J.: "Determination of All Additive Quasiarithmetic Mean Codeword Lengths," Z. Wahrscheinlichkeitstheorie Verw. Gebiete, Vol. 29, pp. 351-360 (1974).