Introduction to Bayesian Statistics
Bayesian methods are a formal, quantitative way of combining empirical evidence with prior knowledge to solve real and practical problems. These methods are becoming increasingly important to fields as disparate as clinical decision-making, personnel management, user-interface design, and cognitive and neural modeling. Much of this growth can be attributed to shortcomings in classical statistical methods and to the ubiquity of computational power on the desktop. In this course we will introduce Bayesian methods and provide a bridge between them and the classical methods typically taught in graduate social science programs. In doing so, we hope to convey how a Bayesian thinks about an inference problem and to show how this thought process relates to intuitive reasoning. We will cover the key concepts associated with Bayes' theorem, including prior probabilities, likelihoods, conditional and marginal probabilities, posterior probabilities, and the decision process. Through hands-on experience with data from a variety of applications, you will learn how to apply these concepts to the design, implementation, interpretation, and communication of Bayesian analyses.
Specifically, we aim to:
- Contrast classical/frequentist approaches with Bayesian approaches, including the shortcomings of each
- Describe how a Bayesian thinks about inference, and how this does (and does not) resemble intuitive inference
- Explain key concepts including priors, likelihoods, conditionals and marginals, posteriors, decisions, Gibbs sampling, Markov chains, Markov chain Monte Carlo (MCMC) methods, and Bayesian networks
- Use data from different research designs (observational, experimental, quasi-experimental) and from a broad array of scientific areas (health/clinical, cognitive, physiological, and physical sciences)
- Engage students with hands-on computational tools such as TreeAge, WinBUGS, and OpenBUGS
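To preview one of the concepts listed above, the sketch below (not part of the course materials; the distribution and parameter values are illustrative assumptions) shows the idea behind Gibbs sampling for a bivariate normal with correlation rho, where each full conditional is itself normal and can be drawn directly:

```python
# Minimal Gibbs sampler sketch for a bivariate standard normal with
# correlation rho. The full conditionals are both normal:
#   x | y ~ N(rho * y, 1 - rho^2)
#   y | x ~ N(rho * x, 1 - rho^2)
# so we alternate draws from each conditional to explore the joint.
import math
import random

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    rng = random.Random(seed)
    sd = math.sqrt(1 - rho ** 2)    # conditional standard deviation
    x, y = 0.0, 0.0                 # arbitrary starting point
    samples = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)  # draw x from its full conditional
        y = rng.gauss(rho * x, sd)  # draw y from its full conditional
        samples.append((x, y))
    return samples

draws = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
xs = [d[0] for d in draws[1000:]]   # discard a short burn-in
ys = [d[1] for d in draws[1000:]]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / len(xs)
vx = sum((a - mx) ** 2 for a in xs) / len(xs)
vy = sum((b - my) ** 2 for b in ys) / len(ys)
corr = cov / math.sqrt(vx * vy)     # sample correlation, close to rho
```

Tools such as WinBUGS and OpenBUGS automate exactly this kind of conditional sampling for far more complicated models.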
Initial Reading List
Students unfamiliar with Bayesian methods are encouraged to consult the following articles, which document the advantages of Bayesian methods as both a complement to and a replacement for classical frequentist (null hypothesis significance testing, or NHST) methods.
- Gerd Gigerenzer’s article on the foundations of NHST presents a compelling case for Bayesian methods
- G.A. Barnard, in his presidential address to the Royal Statistical Society, offered what we shall refer to as the “middle-ground” approach. He advocates for the judicious application of both methods. The paper is written in a very clear, non-technical style and offers the novice some insights into the struggles between the frequentist and subjectivist camps.
- Noted scholar and statistician Bradley Efron gave a remarkable presidential address to the American Statistical Association. In that address, he forecasts a new era in which science may be the driver of statistical inquiry rather than the other way around. He also predicts that a combination of Bayesian and frequentist methods will dominate the successful approaches to solving complex problems.
We assume all students enrolled in the course will be familiar with the frequentist perspective. In particular, we expect that students know the general linear model (GLM), parameter estimation procedures (in general), and standard NHST methods. Successful completion of PSYC 611/612, along with independent use of statistical packages, should suffice for this prerequisite. Students enrolled and actively engaged in research will find these requirements trivial.
There will be no exams for this course. Instead, students will complete two projects: one at the beginning of the semester and one at the end. The first project is meant to orient and familiarize students with a dataset. Each student will have his/her own data (preferably a dataset related to his/her topic area), and the project will entail a traditional analysis. By the qualifier “traditional” we mean whatever analysis best reflects the current standard of the field. The second project will be a revision of the first, but instead of traditional analytic procedures the student will employ a Bayesian procedure and contrast the results. The aim of the two projects is not to put undue burden on students but rather to ensure that they acquire the full complement of knowledge and skills to apply Bayesian methods in the future.
Course Outline By Topic
Introduction to Bayes Theorem, Probability Theory, and Conditional Probability
D’Agostini (1995) offers us an excellent introduction to the world of Bayesian methods. Do not worry that the text was written by a physicist. It is a clear document that walks the reader through the basic principles of Bayesian methods.
Eugene Charniak’s article “Bayesian Networks without Tears” offers an introduction to Bayesian networks.
Cook (1995) provides a nice description of how Bayesian methods may be implemented in Mathematica.
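As a small taste of what these introductory readings cover, Bayes' theorem can be worked through in a few lines of Python. The numbers below are hypothetical (a made-up diagnostic test), not drawn from any of the readings:

```python
# Bayes' theorem for a diagnostic test with hypothetical numbers:
#   P(D | +) = P(+ | D) P(D) / [ P(+ | D) P(D) + P(+ | not D) P(not D) ]
prior = 0.01        # P(D): base rate of the condition
sensitivity = 0.95  # P(+ | D): probability of a positive test given disease
false_pos = 0.05    # P(+ | not D): false-positive rate

# The denominator is the marginal probability of a positive test, P(+).
evidence = sensitivity * prior + false_pos * (1 - prior)
posterior = sensitivity * prior / evidence  # P(D | +)
# posterior ~ 0.161: even after a positive result, the probability of
# disease stays low, because the small prior (base rate) dominates.
```

This is the kind of base-rate reasoning the course connects to intuitive inference: the posterior combines the likelihood of the evidence with the prior, rather than relying on the test result alone.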