Statistical Adequacy and Reliability of Inference in Regression-like Models

Date
2010-05-04
Publisher
Virginia Tech
Abstract

Using theoretical relations as the source of econometric specifications can lead a researcher to models that neither adequately capture the statistical regularities in the data nor faithfully represent the phenomenon of interest. Moreover, the researcher is then unable to disentangle the statistical and substantive sources of error, and thus cannot use the statistical evidence to assess whether the theory, rather than the statistical model, is wrong. The Probabilistic Reduction Approach puts forward a modeling strategy in which theory can confront data without compromising the credibility of either. This approach explicitly derives testable model assumptions that, together with the standardized residuals, allow the researcher to assess the precision and reliability of statistical models via misspecification testing. It is argued that only when the statistical source of error has been ruled out can the researcher reconcile theory and data and establish the theoretical and/or external validity of econometric models.
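
A minimal sketch of such misspecification testing on standardized residuals, assuming a simple normal linear regression estimated on simulated data with statsmodels (the data, variable names, and choice of tests below are illustrative, not taken from the thesis):

```python
# Misspecification-testing sketch for a normal linear regression
# (simulated data; tests probe normality, homoskedasticity, no dependence).
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import jarque_bera
from statsmodels.stats.diagnostic import het_white, acorr_ljungbox

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)   # hypothetical data

X = sm.add_constant(x)
fit = sm.OLS(y, X).fit()

# Standardized (internally studentized) residuals
resid_std = fit.get_influence().resid_studentized_internal

# [1] Normality of the errors
jb_stat, jb_pval, _, _ = jarque_bera(resid_std)

# [2] Homoskedasticity (constant error variance)
_, white_pval, _, _ = het_white(fit.resid, X)

# [3] No autocorrelation / temporal dependence
lb = acorr_ljungbox(resid_std, lags=[5], return_df=True)

print(f"Jarque-Bera normality p-value:    {jb_pval:.3f}")
print(f"White heteroskedasticity p-value: {white_pval:.3f}")
print(f"Ljung-Box (lag 5) p-value:        {lb['lb_pvalue'].iloc[0]:.3f}")
# Small p-values signal statistical misspecification: inferences from the
# model are unreliable until it is respecified to capture those regularities.
```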

Through this approach, we derive the properties of Beta regression-like models, appropriate when the researcher deals with rates, proportions, or any other random variable with finite support, and of Lognormal models, appropriate for nonnegative data and especially important for the estimation of demand elasticities.
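
A minimal sketch of the Lognormal (log-linear) demand specification, under which the slope on log price is the price elasticity of demand; the simulated data, variable names, and use of OLS in statsmodels below are assumptions for illustration only:

```python
# Log-log demand sketch: with quantity and price in logs, the slope on
# log price is the price elasticity. Data simulated with a known elasticity.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
true_elasticity = -0.6

log_price = rng.normal(loc=1.0, scale=0.3, size=n)
log_income = rng.normal(loc=10.0, scale=0.2, size=n)
# Lognormal demand: log(quantity) is linear in log(price) and log(income)
log_quantity = (2.0 + true_elasticity * log_price + 0.4 * log_income
                + rng.normal(scale=0.1, size=n))

X = sm.add_constant(np.column_stack([log_price, log_income]))
fit = sm.OLS(log_quantity, X).fit()

print(f"estimated price elasticity: {fit.params[1]:.3f}  (true {true_elasticity})")
print(f"95% CI for the elasticity:  {fit.conf_int()[1]}")
# The misspecification checks sketched above would be run on the standardized
# residuals of this model before the elasticity estimate is treated as reliable.
```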

Keywords
probabilistic reduction approach, reliability, statistical misspecification, Monte Carlo, Beta models, elasticities, gasoline demand