A BAYESIAN APPROACH TO DETECT INFORMATIVE OBSERVATIONS IN A REGRESSION EXPERIMENT BASED ON GENERALIZED ENTROPY MEASURES
Abstract
In this paper we identify, within a Bayesian framework, subsets of the data that exert a disproportionate influence on the fitted normal regression model. Generalized entropy measures are used to detect the set of most informative observations in a given design.
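The idea sketched in the abstract can be illustrated with a small numerical experiment. The sketch below (an assumption-laden illustration, not the paper's actual procedure) uses a conjugate normal regression model with known noise variance and a zero-mean normal prior, and scores each observation by how much the Rényi entropy of order α of the posterior for the coefficients increases when that observation is deleted: the larger the increase, the more informative the design point. All function names, the prior precision `prior_prec`, and the choice α = 2 are illustrative choices, not taken from the paper.

```python
import numpy as np

def renyi_entropy_normal(cov, alpha=2.0):
    """Rényi entropy of order alpha of a multivariate normal with covariance cov.
    Uses the closed form 0.5*log((2*pi)^k |cov|) + 0.5*k*log(alpha)/(alpha-1);
    as alpha -> 1 this reduces to the Shannon entropy."""
    k = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    base = 0.5 * (k * np.log(2 * np.pi) + logdet)
    if np.isclose(alpha, 1.0):
        return base + 0.5 * k  # Shannon limit
    return base + 0.5 * k * np.log(alpha) / (alpha - 1.0)

def posterior_cov(X, sigma2=1.0, prior_prec=1e-2):
    """Posterior covariance of beta in y = X beta + eps, eps ~ N(0, sigma2 I),
    under a N(0, prior_prec^{-1} I) prior (known-variance conjugate model).
    In this model the posterior covariance depends on the design X only."""
    p = X.shape[1]
    return np.linalg.inv(prior_prec * np.eye(p) + X.T @ X / sigma2)

def information_per_observation(X, alpha=2.0, sigma2=1.0):
    """Entropy increase of the posterior when observation i is deleted;
    a larger value marks a more informative design point."""
    h_full = renyi_entropy_normal(posterior_cov(X, sigma2), alpha)
    return np.array([
        renyi_entropy_normal(posterior_cov(np.delete(X, i, axis=0), sigma2), alpha)
        - h_full
        for i in range(X.shape[0])
    ])

# A simple-regression design with one deliberately extreme design point.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(10), rng.normal(size=10)])
X[3, 1] = 8.0  # high-leverage point: expected to carry the most information
info = information_per_observation(X, alpha=2.0)
```

Because deleting a row removes a positive semidefinite term from the posterior precision, every score is nonnegative, and the high-leverage point dominates; for fixed dimension the α-dependent term cancels in the difference, so here the ranking agrees with the Shannon-entropy one.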
Article Details
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
References
M. A. Ali, "A Bayesian Approach to Detect Informative Observations in an Experiment", Commun. Statist. Theory Meth., 19 (7) (1990), 2567-2575.
S. Arimoto, "Information-Theoretical Considerations on Estimation Problems", Information and Control, 19 (1971), 181-191.
S. Chatterjee and A. S. Hadi, "Influential Observations, High Leverage Points, and Outliers in Linear Regression", Statistical Science, 1 (1986), 379-416.
R. D. Cook and S. Weisberg, "Characterization of an empirical Influence Function for Detecting Influential Cases in Regression", Technometrics, 22 (1980), 495-508.
R. D. Cook and S. Weisberg, "Residuals and Influence in Regression", Chapman and Hall, New York, 1982.
R. D. Cook, "Assessment of Local Influence", Journal of the Royal Statistical Society B, 48 (1986), 133-169.
M. H. DeGroot, "Optimal Statistical Decisions", McGraw-Hill, 1970.
S. Ghosh, "Information in an Observation in Robust Designs", Comm. Statist. Theor. Meth., 11 (1982), 1173-1184.
S. Ghosh, "Influential Observations in view of Design and Inference", Comm. Statist. Theor. Meth., 12 (1983), 1675-1683.
S. Ghosh and H. Narnini, "Influential Observations under Robust Designs", Proceedings of the Design and Coding Theory Conference, Institute of Mathematics and its Applications, University of Minnesota, Minneapolis, U. S. A. (1989).
J. Havrda and F. Charvát, "Quantification Method of Classification Processes: Concept of Structural α-Entropy", Kybernetika, 3 (1967), 30-35.
A. J. Lawrence, "Regression Transformation Diagnostics Using Local Influence", Journal of the American Statistical Association, 83 (1988), 1067-1072.
A. Rényi, "On Measures of Entropy and Information", Proc. 4th Berkeley Symp. Math. Statist. and Prob., 1 (1961), 547-561.
C. Shannon, "A Mathematical Theory of Communication", Bell System Tech. J., 27 (1948), 379-423.
B. D. Sharma and D. P. Mittal, "New Nonadditive Measures of Entropy for Discrete Probability Distributions", J. Math. Sci., 10 (1975), 28-40.
I. J. Taneja, L. Pardo, D. Morales and M. L. Menéndez, "On Generalized Information and Divergence Measures and their Applications: A Brief Review", Qüestiió, 13 (1,2,3) (1989), 47-73.