First Course in Bayesian Statistical Methods, 2009 edition, by Peter Hoff – Ebook PDF Instant Download/Delivery: 0387922997, 978-0387922997
Full download of First Course in Bayesian Statistical Methods, 2009 edition, after payment
Product details:
ISBN 10: 0387922997
ISBN 13: 978-0387922997
Author: Peter Hoff
- A self-contained introduction to probability, exchangeability and Bayes’ rule provides a theoretical understanding of the applied material.
- Numerous examples with R code that can be run “as-is” allow readers to perform the data analyses themselves.
- The development of Monte Carlo and Markov chain Monte Carlo methods in the context of data-analysis examples provides motivation for these computational methods.
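To give a flavor of the Monte Carlo methods mentioned above (the book's own examples are in R; the sketch below is an illustrative Python analogue with made-up prior parameters and data, not code from the book): approximating a beta-binomial posterior mean by simulation and comparing it to the closed form.

```python
import random

# Beta(a, b) prior on theta; observe y successes in n binary trials.
# The conjugate posterior is Beta(a + y, b + n - y); here we approximate
# its mean by Monte Carlo purely to illustrate the method, since the
# closed form is available for comparison.
a, b = 1.0, 1.0          # uniform prior (assumed for illustration)
y, n = 7, 10             # hypothetical data

random.seed(1)
S = 100_000
samples = [random.betavariate(a + y, b + n - y) for _ in range(S)]
mc_mean = sum(samples) / S

exact_mean = (a + y) / (a + b + n)   # closed-form posterior mean
print(round(mc_mean, 3), round(exact_mean, 3))
```

With 100,000 draws the Monte Carlo estimate agrees with the exact posterior mean to about three decimal places, which is the kind of approximation accuracy the book's Monte Carlo chapters quantify.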
First Course in Bayesian Statistical Methods, 2009 edition – Table of contents:
1 Introduction and examples
1.1 Introduction
1.2 Why Bayes?
1.2.1 Estimating the probability of a rare event
1.2.2 Building a predictive model
1.3 Where we are going
1.4 Discussion and further references
2 Belief, probability and exchangeability
2.1 Belief functions and probabilities
2.2 Events, partitions and Bayes’ rule
2.3 Independence
2.4 Random variables
2.4.1 Discrete random variables
2.4.2 Continuous random variables
2.4.3 Descriptions of distributions
2.6 Independent random variables
2.7 Exchangeability
2.8 de Finetti’s theorem
2.9 Discussion and further references
3 One-parameter models
3.1 The binomial model
3.1.1 Inference for exchangeable binary data
3.1.2 Confidence regions
3.2 The Poisson model
3.2.1 Posterior inference
3.2.2 Example: Birth rates
3.3 Exponential families and conjugate priors
3.4 Discussion and further references
4 Monte Carlo approximation
4.1 The Monte Carlo method
4.2 Posterior inference for arbitrary functions
4.3 Sampling from predictive distributions
4.5 Discussion and further references
5 The normal model
5.1 The normal model
5.2 Inference for the mean, conditional on the variance
5.3 Joint inference for the mean and variance
5.4 Bias, variance and mean squared error
5.5 Prior specification based on expectations
5.6 The normal model for non-normal data
5.7 Discussion and further references
6 Posterior approximation with the Gibbs sampler
6.1 A semiconjugate prior distribution
6.2 Discrete approximations
6.3 Sampling from the conditional distributions
6.4 Gibbs sampling
6.5 General properties of the Gibbs sampler
6.6 Introduction to MCMC diagnostics
6.7 Discussion and further references
7 The multivariate normal model
7.1 The multivariate normal density
7.2 A semiconjugate prior distribution for the mean
7.3 The inverse-Wishart distribution
7.4 Gibbs sampling of the mean and covariance
7.5 Missing data and imputation
7.6 Discussion and further references
8 Group comparisons and hierarchical modeling
8.1 Comparing two groups
8.2 Comparing multiple groups
8.2.1 Exchangeability and hierarchical models
8.3 The hierarchical normal model
8.3.1 Posterior inference
8.4 Example: Math scores in U.S. public schools
8.4.1 Prior distributions and posterior approximation
8.4.2 Posterior summaries and shrinkage
8.5 Hierarchical modeling of means and variances
8.5.1 Analysis of math score data
8.6 Discussion and further references
9 Linear regression
9.1 The linear regression model
9.1.1 Least squares estimation for the oxygen uptake data
9.2 Bayesian estimation for a regression model
9.2.1 A semiconjugate prior distribution
9.2.2 Default and weakly informative prior distributions
9.3 Model selection
9.3.1 Bayesian model comparison
9.3.2 Gibbs sampling and model averaging
9.4 Discussion and further references
10 Nonconjugate priors and Metropolis-Hastings algorithms
10.1 Generalized linear models
10.2 The Metropolis algorithm
10.3 The Metropolis algorithm for Poisson regression
10.4 Metropolis, Metropolis-Hastings and Gibbs
10.4.1 The Metropolis-Hastings algorithm
10.4.2 Why does the Metropolis-Hastings algorithm work?
10.5 Combining the Metropolis and Gibbs algorithms
10.5.2 Analysis of the ice core data
10.6 Discussion and further references
11 Linear and generalized linear mixed effects models
11.1 A hierarchical regression model
11.3 Posterior analysis of the math score data
11.4 Generalized linear mixed effects models
11.4.1 A Metropolis-Gibbs algorithm for posterior approximation
11.4.2 Analysis of tumor location data
11.5 Discussion and further references
12 Latent variable methods for ordinal data
12.1 Ordered probit regression and the rank likelihood
12.1.2 Transformation models and the rank likelihood
12.2 The Gaussian copula model
12.2.1 Rank likelihood for copula estimation
12.3 Discussion and further references
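As a taste of the posterior-approximation material in Chapters 5 and 6 (a semiconjugate prior for the normal model, sampled with the Gibbs sampler): the sketch below is an illustrative Python analogue, not the book's R code, and the data and hyperparameters are hypothetical, chosen only to show the alternating full-conditional updates.

```python
import random
import statistics

# Gibbs sampler for a normal model with semiconjugate priors:
# theta ~ N(mu0, tau0^2) on the mean, 1/sigma^2 ~ Gamma(nu0/2, nu0*s0^2/2)
# on the precision. All numbers below are made up for illustration.
random.seed(0)
y = [9.3, 10.1, 8.8, 10.6, 9.9, 10.2]      # hypothetical observations
n, ybar = len(y), statistics.fmean(y)
mu0, tau0_sq = 10.0, 4.0                    # prior on the mean
nu0, s0_sq = 1.0, 1.0                       # prior on the variance

theta, sigma_sq = ybar, statistics.variance(y)   # starting values
draws = []
for s in range(5000):
    # full conditional of theta given sigma^2 is normal
    var_n = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
    mu_n = var_n * (mu0 / tau0_sq + n * ybar / sigma_sq)
    theta = random.gauss(mu_n, var_n ** 0.5)
    # full conditional of 1/sigma^2 given theta is gamma
    nu_n = nu0 + n
    ss = sum((yi - theta) ** 2 for yi in y)
    precision = random.gammavariate(nu_n / 2.0, 2.0 / (nu0 * s0_sq + ss))
    sigma_sq = 1.0 / precision
    draws.append((theta, sigma_sq))

# posterior mean of theta after discarding a short burn-in
post_theta = statistics.fmean(t for t, _ in draws[500:])
print(round(post_theta, 2))
```

Each iteration samples one parameter from its full conditional given the current value of the other, which is exactly the alternating structure the Gibbs-sampler chapter builds up from the semiconjugate prior.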