Regarding the current econometric scene, in this review I argue that, first, traditional econometric modeling approaches do not provide a reliable basis for inference about the causal effect of a supposed treatment in observational and quasi-experimental settings; and, second, the focus on conventional reductionist models and information-recovery methods has led to irrelevant economic theories and questionable inferences, and has failed in terms of prediction and the extraction of information about the nature of underlying economic behavioral systems. Looking ahead, a nontraditional econometric approach is outlined. This approach recognizes that our knowledge of the underlying behavioral system and the observed data process is complex, partial, and incomplete. It then suggests a self-organized, agent-based, algorithmic-representation system that involves networks, machine learning, and an information-theoretic basis for estimation, inference, model evaluation, and prediction.


  • Article Type: Review Article