This paper uses information-theoretic methods, in the form of the Cressie-Read (CR) family of divergence measures, to introduce a new class of probability distributions and estimators for competing explanations of the data in the binary choice model. The statistical model is specified without an explicit parameterization of the function connecting the data to the Bernoulli probabilities. A large class of probability density functions emerges that includes the conventional logit model. The resulting class of statistical models and estimators requires minimal a priori model structure and non-sample information, and it provides the basis for a range of model and estimator extensions.
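For reference, the Cressie-Read power divergence family invoked above is commonly written, up to a choice of normalizing constant (the symbols p, q, and γ below are generic notation rather than the paper's), as

\[
I(\mathbf{p},\mathbf{q},\gamma) \;=\; \frac{1}{\gamma(\gamma+1)}\sum_{i=1}^{n} p_i\!\left[\left(\frac{p_i}{q_i}\right)^{\gamma}-1\right],
\qquad
\lim_{\gamma\to 0} I(\mathbf{p},\mathbf{q},\gamma)=\sum_{i=1}^{n} p_i \ln\frac{p_i}{q_i},
\]

where p and q denote probability distributions and γ indexes the family members. The γ → 0 limit is the Kullback-Leibler divergence, whose maximum-entropy solution under moment constraints on a binary outcome yields logistic probabilities; this is the general sense in which the conventional logit model arises as one member of the larger class, although the paper's own moment conditions determine exactly how that connection is made here.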
Keywords: semiparametric binary response models and estimators, conditional moment equations, squared error loss, Cressie-Read statistic, information-theoretic methods, minimum power divergence
AMS 1991 Classification: Primary 62E20
JEL Classifications: C10, C2.
1 Ron C. Mittelhammer, Regents Professor of Economic Sciences and Statistics, Washington State University, Pullman, WA 99164
2 George G. Judge, Professor in the Graduate School, 207 Giannini Hall, University of California, Berkeley, Berkeley, CA 94720