# An Introductory Handbook of Bayesian Thinking

- 1st Edition - April 17, 2024
- Author: Stephen C. Loftus
- Language: English
- Paperback ISBN: 978-0-323-95459-4
- eBook ISBN: 978-0-443-29111-1


As Bayesian techniques become more common across a variety of fields, it becomes important for experts in those fields to understand those methods.

*An Introductory Handbook of Bayesian Thinking* brings Bayesian thinking and methods to a wide audience beyond the mathematical sciences. Appropriate for students with some background in calculus and introductory statistics, particularly for nonstatisticians with a sufficient mathematical background, the text provides a gentle introduction to Bayesian ideas with a wide array of supporting examples from a variety of fields.

- Utilizes real datasets to illustrate Bayesian models and their results
- Guides readers on coding Bayesian models using the statistical software R, including a helpful introduction and supporting online resource
- Appropriate for an undergraduate statistics course, as well as for nonstatisticians with sufficient mathematical background (integral and differential calculus and an introductory statistics course)
- Covers more advanced topics that readers may be unfamiliar with, such as the basics of vectors and matrices, only as far as needed to foster understanding of the core concepts

- Students in undergraduate programs learning about Bayesian statistics
- Professionals, researchers, and academics applying Bayesian principles in research and applied settings, who require an introduction or refresher to the subject
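To give a flavor of the book's subject matter, the identity underpinning all of the models listed below is Bayes' Rule applied to a parameter and data. The notation here is the standard one, not drawn from the text itself:

```latex
% Bayes' Rule for a parameter \theta given observed data y:
% posterior = (likelihood x prior) / evidence
p(\theta \mid y)
  = \frac{p(y \mid \theta)\, p(\theta)}
         {\int p(y \mid \theta')\, p(\theta')\, d\theta'}
  \propto p(y \mid \theta)\, p(\theta)
```

The denominator (the marginal likelihood) is often intractable, which is why the book devotes several chapters to conjugate priors and sampling methods such as Gibbs sampling and Metropolis-Hastings, which work with the unnormalized product of likelihood and prior.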

- Cover image
- Title page
- Table of Contents
- Copyright
- Dedication
- About the author
- Stephen Loftus
- Preface
- Purpose of this book
- Format of this book
- Expectations of the reader
- My history with Bayes
- Part 1: An introduction to probability and R
- Chapter 1: Probability and random variables
- 1.1. Introduction
- 1.2. Probability and random events
- 1.3. Sets and combining events
- 1.4. The axioms of probability
- 1.5. Calculating probabilities
- 1.6. Random variables
- 1.7. Conclusion
- References
- Chapter 2: Probability distributions, expected value, and variance
- 2.1. Introduction
- 2.2. Probability distributions
- 2.3. Cumulative distribution functions
- 2.4. Probability mass functions
- 2.5. Probability distribution functions
- 2.6. Expected values
- 2.7. Variance
- 2.8. Conclusion
- References
- Chapter 3: Common probability distributions
- 3.1. Introduction
- 3.2. Discrete probability distributions
- 3.3. Continuous distributions
- 3.4. Conclusion
- References
- Chapter 4: Conditional probability and Bayes' Rule
- 4.1. Introduction
- 4.2. Conditional probability
- 4.3. Bayes' Rule
- 4.4. Conclusion
- References
- Chapter 5: Finding and using the distributions of data
- 5.1. Introduction
- 5.2. Joint distributions
- 5.3. Likelihood functions
- 5.4. Estimating parameters using the likelihood
- 5.5. Conclusion
- References
- Chapter 6: Marginal and conditional distributions
- 6.1. Introduction
- 6.2. Marginal distributions
- 6.3. Conditional distributions
- 6.4. Conditional distributions and independence
- 6.5. Conclusion
- References
- Chapter 7: The Bayesian switch
- 7.1. Introduction
- 7.2. The goal of statistics
- 7.3. The random parameter
- 7.4. Finding the posterior distribution
- 7.5. Why Bayes?
- 7.6. Conclusion
- References
- Chapter 8: A brief review of R
- 8.1. Introduction
- 8.2. Calculations
- 8.3. Data types and storing data
- 8.4. Subsetting data
- 8.5. Libraries and loading data
- 8.6. Data summaries
- 8.7. Creating graphs
- 8.8. Probability distributions
- 8.9. For loops
- 8.10. Custom functions
- 8.11. Practice problems
- 8.12. Conclusion
- References
- Part 2: Basic Bayesian models
- Chapter 9: Single-parameter inference
- 9.1. Introduction
- 9.2. Posterior distribution review
- 9.3. Finding our posterior
- 9.4. Posterior distributions calculation: fir seedlings
- 9.5. Conjugate prior distributions
- 9.6. Point summaries for the posterior
- 9.7. Practice problems
- 9.8. Conclusion
- References
- Chapter 10: Two-parameter inference
- 10.1. Introduction
- 10.2. Likelihoods of multiple parameters
- 10.3. Finding our posterior for multiple parameters
- 10.4. Monte Carlo integration
- 10.5. Gibbs sampling
- 10.6. Full conditionals: soil nitrogen example
- 10.7. Practice problems
- 10.8. Conclusion
- References
- Chapter 11: Gibbs Sampling in R
- 11.1. Introduction
- 11.2. Setup of Gibbs Samplers
- 11.3. Visualizing convergence and the posterior
- 11.4. Finding posterior summaries
- 11.5. Practice problems
- 11.6. Conclusion
- References
- Chapter 12: Bayesian linear regression
- 12.1. Introduction
- 12.2. Likelihood for regression data
- 12.3. Priors for parameters
- 12.4. Gibbs sampler for regression model
- 12.5. Extending to multiple regression
- 12.6. Practice problems
- 12.7. Conclusion
- References
- Chapter 13: Bayesian binary regression
- 13.1. Introduction
- 13.2. Generalized linear models
- 13.3. Regression for binary data
- 13.4. Data augmentation: likelihoods and priors
- 13.5. Gibbs sampler for data augmentation
- 13.6. An extension to ordinal regression
- 13.7. Practice problems
- 13.8. Conclusion
- References
- Chapter 14: Probabilistic clustering
- 14.1. Introduction
- 14.2. Univariate clustering with two groups: likelihoods and priors
- 14.3. Univariate clustering with two groups: Gibbs Sampler
- 14.4. Univariate clustering: beyond two groups
- 14.5. Practice problems
- 14.6. Conclusion
- References
- Chapter 15: Nonconjugate prior models
- 15.1. Introduction
- 15.2. When priors don't match
- 15.3. Metropolis-Hastings algorithm
- 15.4. Coding up Metropolis-Hastings
- 15.5. Metropolis-Hastings: calcium concentration in urine
- 15.6. Practice problems
- 15.7. Conclusion
- References
- Chapter 16: Models for count data
- 16.1. Introduction
- 16.2. Models for overdispersed data
- 16.3. Zero-inflated Poisson models
- 16.4. Poisson regression
- 16.5. Conclusion
- References
- Chapter 17: Testing hypotheses with Bayes
- 17.1. Introduction
- 17.2. Hypotheses and models
- 17.3. Calculating posterior model probabilities
- 17.4. Models with unknown parameters
- 17.5. Bayes Factors
- 17.6. Practice problems
- 17.7. Conclusion
- References
- Chapter 18: Bayesian inference beyond this book
- 18.1. The world of statistical challenges
- 18.2. The answers Bayesian methodology provide
- 18.3. Bringing everything back to basics
- References
- Appendix A: Matrix-form of Bayesian linear regression
- A.1. Introduction
- A.2. Multivariate normal likelihood
- A.3. Extension to data augmentation
- Appendix B: Multivariate clustering
- B.1. Introduction
- B.2. Multivariate normal likelihood
- B.3. The Wishart distribution
- B.4. Gibbs Sampler for bivariate clustering
- B.5. Gibbs Sampler for multivariate clustering
- Appendix C: List of probability distributions
- Discrete distributions
- Continuous distributions
- Appendix D: Solutions to practice problems
- Chapter 1
- Chapter 2
- Chapter 3
- Chapter 4
- Chapter 5
- Chapter 6
- Chapter 8
- Chapter 9
- Chapter 10
- Chapter 11
- Chapter 12
- Chapter 13
- Chapter 14
- Chapter 15
- Chapter 16
- Chapter 17
- References
- References
- Index

- No. of pages: 350
- Edition: 1
- Published: April 17, 2024
- Imprint: Academic Press


### Stephen C. Loftus

Dr. Stephen Loftus is an Analyst in Research & Development for the Atlanta Braves. Prior to this, he held academic positions at Randolph-Macon College and Sweet Briar College. Across academia and industry, Dr. Loftus has spent a great deal of time studying and developing Bayesian models for a variety of projects. These highly collaborative projects range from analysis in baseball to studies in numerical ecology. In developing these models, he often found himself needing to explain not only the decisions made in building the models, but also the rationale behind the Bayesian philosophy of statistics, to individuals with diverse mathematical backgrounds.

Affiliations and expertise

Analyst, Research & Development, Atlanta Braves Baseball Club