Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan, Second Edition provides an accessible approach to conducting Bayesian data analysis, as the material is explained clearly with concrete examples. Included are step-by-step instructions on how to carry out Bayesian data analyses in the popular and free software R and WinBUGS, as well as new programs in JAGS and Stan. The new programs are designed to be much easier to use than the scripts in the first edition. In particular, there are now compact high-level scripts that make it easy to run the programs on your own data sets.
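For readers unfamiliar with that workflow, the following is a minimal sketch, not one of the book's own scripts, of how a simple Bayesian estimate of a binomial probability can be run from R with the rjags package; the model string, data values, and settings are purely illustrative.

    library(rjags)   # R interface to JAGS; assumes JAGS itself is installed

    # Illustrative model: z successes in N trials, Beta(1,1) prior on theta
    modelString <- "
    model {
      z ~ dbin(theta, N)
      theta ~ dbeta(1, 1)
    }
    "

    dataList <- list(z = 14, N = 20)   # hypothetical data, not from the book

    jagsModel <- jags.model(textConnection(modelString),
                            data = dataList, n.chains = 3, n.adapt = 500)
    update(jagsModel, n.iter = 500)                      # burn-in
    samples <- coda.samples(jagsModel, "theta", n.iter = 5000)
    summary(samples)                                     # posterior summary for theta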
The book is divided into three parts and begins with the basics: models, probability, Bayes’ rule, and the R programming language. The discussion then moves to the fundamentals applied to inferring a binomial probability, before concluding with chapters on the generalized linear model. Topics include metric-predicted variable on one or two groups; metric-predicted variable with one metric predictor; metric-predicted variable with multiple metric predictors; metric-predicted variable with one nominal predictor; and metric-predicted variable with multiple nominal predictors. The exercises found in the text have explicit purposes and guidelines for accomplishment.
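As a concrete illustration of the inference treated in Part II (an assumed example, not an excerpt from the book): with a conjugate Beta(a, b) prior on a binomial probability theta and z successes observed in N trials, Bayes' rule gives a Beta(z + a, N - z + b) posterior, which R can evaluate directly. Using the same hypothetical data as the sketch above:

    # Illustrative only: exact conjugate updating of a binomial probability
    a <- 1; b <- 1          # Beta(1, 1) prior, i.e., uniform on [0, 1]
    z <- 14; N <- 20        # hypothetical data: 14 successes in 20 trials

    thetaGrid <- seq(0, 1, length.out = 501)
    posterior <- dbeta(thetaGrid, z + a, N - z + b)   # Beta(15, 7) posterior density

    plot(thetaGrid, posterior, type = "l",
         xlab = "theta", ylab = "posterior density")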
This book is intended for first-year graduate students or advanced undergraduates in statistics, data analysis, psychology, cognitive science, social sciences, clinical sciences, and consumer sciences in business.
Chapter 1: What's in This Book (Read This First!)
Part I: The Basics: Models, Probability, Bayes’ Rule, and R
Introduction
Chapter 2: Introduction: Credibility, Models, and Parameters
Chapter 3: The R Programming Language
Chapter 4: What is This Stuff Called Probability?
Chapter 5: Bayes' Rule
Part II: All the Fundamentals Applied to Inferring a Binomial Probability
Introduction
Chapter 6: Inferring a Binomial Probability via Exact Mathematical Analysis
Chapter 7: Markov Chain Monte Carlo
Chapter 8: JAGS
Chapter 9: Hierarchical Models
Chapter 10: Model Comparison and Hierarchical Modeling
Chapter 11: Null Hypothesis Significance Testing
Chapter 12: Bayesian Approaches to Testing a Point (“Null”) Hypothesis
Chapter 13: Goals, Power, and Sample Size
Chapter 14: Stan
Part III: The Generalized Linear Model
Introduction
Chapter 15: Overview of the Generalized Linear Model
Chapter 16: Metric-Predicted Variable on One or Two Groups
Chapter 17: Metric-Predicted Variable with One Metric Predictor
Chapter 18: Metric-Predicted Variable with Multiple Metric Predictors
Chapter 19: Metric-Predicted Variable with One Nominal Predictor
Chapter 20: Metric-Predicted Variable with Multiple Nominal Predictors
Chapter 21: Dichotomous Predicted Variable
Chapter 22: Nominal Predicted Variable
Chapter 23: Ordinal Predicted Variable
Chapter 24: Count Predicted Variable
Chapter 25: Tools in the Trunk
John K. Kruschke
After attending the Summer Science Program as a high school student and considering a career in astronomy, Kruschke earned a bachelor's degree in mathematics (with high distinction in general scholarship) from the University of California at Berkeley. As an undergraduate, Kruschke taught self-designed tutoring sessions for many math courses at the Student Learning Center. During graduate school he attended the 1988 Connectionist Models Summer School and earned a doctorate in psychology, also from U.C. Berkeley. He joined the faculty of Indiana University in 1989. Professor Kruschke's publications can be found on his Google Scholar page. His current research interests focus on moral psychology.
Professor Kruschke taught traditional statistical methods for many years until reaching a point, circa 2003, when he could no longer teach corrections for multiple comparisons with a clear conscience. The perils of p values provoked him to find a better way, and after only several thousand hours of relentless effort, the first and second editions of Doing Bayesian Data Analysis emerged.