
Doing Bayesian Data Analysis: A Tutorial with R and BUGS

Posted on 2013-2-15 18:33:02
Table of Contents

1 This Book’s Organization: Read Me First! 1
1.1 Real people can read this book . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Prerequisites . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.3 The organization of this book . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.3.1 What are the essential chapters? . . . . . . . . . . . . . . . . . . . 3
1.3.2 Where’s the equivalent of traditional test X in this book? . . . . . . 4
1.4 Gimme feedback (be polite) . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.5 Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
I The Basics: Parameters, Probability, Bayes’ Rule, and R 7
2 Introduction: Models we believe in 9
2.1 Models of observations and models of beliefs . . . . . . . . . . . . . . . . 10
2.1.1 Models have parameters . . . . . . . . . . . . . . . . . . . . . . . 11
2.1.2 Prior and posterior beliefs . . . . . . . . . . . . . . . . . . . . . . 13
2.2 Three goals for inference from data . . . . . . . . . . . . . . . . . . . . . . 13
2.2.1 Estimation of parameter values . . . . . . . . . . . . . . . . . . . . 13
2.2.2 Prediction of data values . . . . . . . . . . . . . . . . . . . . . . . 14
2.2.3 Model comparison . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.3 The R programming language . . . . . . . . . . . . . . . . . . . . . . . . 15
2.3.1 Getting and installing R . . . . . . . . . . . . . . . . . . . . . . . 15
2.3.2 Invoking R and using the command line . . . . . . . . . . . . . . . 15
2.3.3 A simple example of R in action . . . . . . . . . . . . . . . . . . . 16
2.3.4 Getting help in R . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
2.3.5 Programming in R . . . . . . . . . . . . . . . . . . . . . . . . . . 18
2.3.5.1 Editing programs in R . . . . . . . . . . . . . . . . . . . 18
2.3.5.2 Variable names in R . . . . . . . . . . . . . . . . . . . . 18
2.3.5.3 Running a program . . . . . . . . . . . . . . . . . . . . 19
2.4 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3 What is this stuff called probability? 21
3.1 The set of all possible events . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.1.1 Coin flips: Why you should care . . . . . . . . . . . . . . . . . . . 22
3.2 Probability: Outside or inside the head . . . . . . . . . . . . . . . . . . . . 23
3.2.1 Outside the head: Long-run relative frequency . . . . . . . . . . . 23
3.2.1.1 Simulating a long-run relative frequency . . . . . . . . . 23
3.2.1.2 Deriving a long-run relative frequency . . . . . . . . . . 24
3.2.2 Inside the head: Subjective belief . . . . . . . . . . . . . . . . . . 25
3.2.2.1 Calibrating a subjective belief by preferences . . . . . . . 25
3.2.2.2 Describing a subjective belief mathematically . . . . . . 26
3.2.3 Probabilities assign numbers to possibilities . . . . . . . . . . . . . 26
3.3 Probability distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
3.3.1 Discrete distributions: Probability mass . . . . . . . . . . . . . . . 27
3.3.2 Continuous distributions: Rendezvous with density† . . . . . . . . 27
3.3.2.1 Properties of probability density functions . . . . . . . . 29
3.3.2.2 The normal probability density function . . . . . . . . . 30
3.3.3 Mean and variance of a distribution . . . . . . . . . . . . . . . . . 32
3.3.3.1 Mean as minimized variance . . . . . . . . . . . . . . . 33
3.3.4 Variance as uncertainty in beliefs . . . . . . . . . . . . . . . . . . 34
3.3.5 Highest density interval (HDI) . . . . . . . . . . . . . . . . . . . . 34
3.4 Two-way distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
3.4.1 Marginal probability . . . . . . . . . . . . . . . . . . . . . . . . . 36
3.4.2 Conditional probability . . . . . . . . . . . . . . . . . . . . . . . . 38
3.4.3 Independence of attributes . . . . . . . . . . . . . . . . . . . . . . 39
3.5 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
3.5.1 R code for Figure 3.1 . . . . . . . . . . . . . . . . . . . . . . . . . 40
3.5.2 R code for Figure 3.3 . . . . . . . . . . . . . . . . . . . . . . . . . 41
3.6 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
4 Bayes’ Rule 43
4.1 Bayes’ rule . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
4.1.1 Derived from definitions of conditional probability . . . . . . . . . 44
4.1.2 Intuited from a two-way discrete table . . . . . . . . . . . . . . . . 45
4.1.3 The denominator as an integral over continuous values . . . . . . . 47
4.2 Applied to models and data . . . . . . . . . . . . . . . . . . . . . . . . . . 47
4.2.1 Data order invariance . . . . . . . . . . . . . . . . . . . . . . . . . 49
4.2.2 An example with coin flipping . . . . . . . . . . . . . . . . . . . . 50
4.2.2.1 p(D|θ) is not θ . . . . . . . . . . . . . . . . . . . . . . . 52
4.3 The three goals of inference . . . . . . . . . . . . . . . . . . . . . . . . . 52
4.3.1 Estimation of parameter values . . . . . . . . . . . . . . . . . . . . 52
4.3.2 Prediction of data values . . . . . . . . . . . . . . . . . . . . . . . 52
4.3.3 Model comparison . . . . . . . . . . . . . . . . . . . . . . . . . . 53
4.3.4 Why Bayesian inference can be difficult . . . . . . . . . . . . . . . 56
4.3.5 Bayesian reasoning in everyday life . . . . . . . . . . . . . . . . . 56
4.3.5.1 Holmesian deduction . . . . . . . . . . . . . . . . . . . 56
4.3.5.2 Judicial exoneration . . . . . . . . . . . . . . . . . . . . 57
4.4 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
4.4.1 R code for Figure 4.1 . . . . . . . . . . . . . . . . . . . . . . . . . 57
4.5 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
II All the Fundamentals Applied to Inferring a Binomial Proportion 63
5 Inferring a Binomial Proportion via Exact Mathematical Analysis 65
5.1 The likelihood function: Bernoulli distribution . . . . . . . . . . . . . . . . 66
5.2 A description of beliefs: The beta distribution . . . . . . . . . . . . . . . . 67
5.2.1 Specifying a beta prior . . . . . . . . . . . . . . . . . . . . . . . . 68
5.2.2 The posterior beta . . . . . . . . . . . . . . . . . . . . . . . . . . 70
5.3 Three inferential goals . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
5.3.1 Estimating the binomial proportion . . . . . . . . . . . . . . . . . 71
5.3.2 Predicting data . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
5.3.3 Model comparison . . . . . . . . . . . . . . . . . . . . . . . . . . 73
5.3.3.1 Is the best model a good model? . . . . . . . . . . . . . 75
5.4 Summary: How to do Bayesian inference . . . . . . . . . . . . . . . . . . 75
5.5 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
5.5.1 R code for Figure 5.2 . . . . . . . . . . . . . . . . . . . . . . . . . 76
5.6 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
6 Inferring a Binomial Proportion via Grid Approximation 83
6.1 Bayes’ rule for discrete values of θ . . . . . . . . . . . . . . . . . . . . . . 84
6.2 Discretizing a continuous prior density . . . . . . . . . . . . . . . . . . . . 84
6.2.1 Examples using discretized priors . . . . . . . . . . . . . . . . . . 85
6.3 Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
6.4 Prediction of subsequent data . . . . . . . . . . . . . . . . . . . . . . . . . 88
6.5 Model comparison . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
6.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
6.7 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
6.7.1 R code for Figure 6.2 etc. . . . . . . . . . . . . . . . . . . . . . . . 90
6.8 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
7 Inferring a Binomial Proportion via the Metropolis Algorithm 97
7.1 A simple case of the Metropolis algorithm . . . . . . . . . . . . . . . . . . 98
7.1.1 A politician stumbles upon the Metropolis algorithm . . . . . . . . 99
7.1.2 A random walk . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
7.1.3 General properties of a random walk . . . . . . . . . . . . . . . . . 101
7.1.4 Why we care . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
7.1.5 Why it works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
7.2 The Metropolis algorithm more generally . . . . . . . . . . . . . . . . . . 107
7.2.1 “Burn-in,” efficiency, and convergence . . . . . . . . . . . . . . . . 108
7.2.2 Terminology: Markov chain Monte Carlo . . . . . . . . . . . . . . 109
7.3 From the sampled posterior to the three goals . . . . . . . . . . . . . . . . 110
7.3.1 Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
7.3.1.1 Highest density intervals from random samples . . . . . . 111
7.3.1.2 Using a sample to estimate an integral . . . . . . . . . . 112
7.3.2 Prediction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
7.3.3 Model comparison: Estimation of p(D) . . . . . . . . . . . . . . . 113
7.4 MCMC in BUGS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
7.4.1 Parameter estimation with BUGS . . . . . . . . . . . . . . . . . . 116
7.4.2 BUGS for prediction . . . . . . . . . . . . . . . . . . . . . . . . . 118
7.4.3 BUGS for model comparison . . . . . . . . . . . . . . . . . . . . . 119
7.5 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
7.6 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
7.6.1 R code for a home-grown Metropolis algorithm . . . . . . . . . . . 121
7.7 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
8 Inferring Two Binomial Proportions via Gibbs Sampling 127
8.1 Prior, likelihood and posterior for two proportions . . . . . . . . . . . . . . 129
8.2 The posterior via exact formal analysis . . . . . . . . . . . . . . . . . . . . 130
8.3 The posterior via grid approximation . . . . . . . . . . . . . . . . . . . . . 133
8.4 The posterior via Markov chain Monte Carlo . . . . . . . . . . . . . . . . 134
8.4.1 Metropolis algorithm . . . . . . . . . . . . . . . . . . . . . . . . . 135
8.4.2 Gibbs sampling . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136
8.4.2.1 Disadvantages of Gibbs sampling . . . . . . . . . . . . . 139
8.5 Doing it with BUGS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140
8.5.1 Sampling the prior in BUGS . . . . . . . . . . . . . . . . . . . . . 141
8.6 How different are the underlying biases? . . . . . . . . . . . . . . . . . . . 142
8.7 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143
8.8 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 144
8.8.1 R code for grid approximation (Figures 8.1 and 8.2) . . . . . . . . 144
8.8.2 R code for Metropolis sampler (Figure 8.3) . . . . . . . . . . . . . 146
8.8.3 R code for BUGS sampler (Figure 8.6) . . . . . . . . . . . . . . . 149
8.8.4 R code for plotting a posterior histogram . . . . . . . . . . . . . . 151
8.9 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
9 Bernoulli Likelihood with Hierarchical Prior 157
9.1 A single coin from a single mint . . . . . . . . . . . . . . . . . . . . . . . 158
9.1.1 Posterior via grid approximation . . . . . . . . . . . . . . . . . . . 160
9.2 Multiple coins from a single mint . . . . . . . . . . . . . . . . . . . . . . . 164
9.2.1 Posterior via grid approximation . . . . . . . . . . . . . . . . . . . 166
9.2.2 Posterior via Monte Carlo sampling . . . . . . . . . . . . . . . . . 169
9.2.2.1 Doing it with BUGS . . . . . . . . . . . . . . . . . . . . 171
9.2.3 Outliers and shrinkage of individual estimates . . . . . . . . . . . . 175
9.2.4 Case study: Therapeutic touch . . . . . . . . . . . . . . . . . . . . 177
9.2.5 Number of coins and flips per coin . . . . . . . . . . . . . . . . . . 178
9.3 Multiple coins from multiple mints . . . . . . . . . . . . . . . . . . . . . . 178
9.3.1 Independent mints . . . . . . . . . . . . . . . . . . . . . . . . . . 178
9.3.2 Dependent mints . . . . . . . . . . . . . . . . . . . . . . . . . . . 182
9.3.3 Individual differences and meta-analysis . . . . . . . . . . . . . . . 184
9.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 185
9.5 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 185
9.5.1 Code for analysis of therapeutic-touch experiment . . . . . . . . . 185
9.5.2 Code for analysis of filtration-condensation experiment . . . . . . . 188
9.6 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
10 Hierarchical modeling and model comparison 195
10.1 Model comparison as hierarchical modeling . . . . . . . . . . . . . . . . . 195
10.2 Model comparison in BUGS . . . . . . . . . . . . . . . . . . . . . . . . . 197
10.2.1 A simple example . . . . . . . . . . . . . . . . . . . . . . . . . . 197
10.2.2 A realistic example with “pseudopriors” . . . . . . . . . . . . . . . 199
10.2.3 Some practical advice when using transdimensional MCMC with pseudopriors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
10.3 Model comparison and nested models . . . . . . . . . . . . . . . . . . . . 206
10.4 Review of hierarchical framework for model comparison . . . . . . . . . . 208
10.4.1 Comparing methods for MCMC model comparison . . . . . . . . . 208
10.4.2 Summary and caveats . . . . . . . . . . . . . . . . . . . . . . . . . 209
10.5 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 210
11 Null Hypothesis Significance Testing 215
11.1 NHST for the bias of a coin . . . . . . . . . . . . . . . . . . . . . . . . . . 216
11.1.1 When the experimenter intends to fix N . . . . . . . . . . . . . . . 216
11.1.2 When the experimenter intends to fix z . . . . . . . . . . . . . . . . 219
11.1.3 Soul searching . . . . . . . . . . . . . . . . . . . . . . . . . . . . 220
11.1.4 Bayesian analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . 222
11.2 Prior knowledge about the coin . . . . . . . . . . . . . . . . . . . . . . . . 222
11.2.1 NHST analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . 223
11.2.2 Bayesian analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . 223
11.2.2.1 Priors are overt and should influence . . . . . . . . . . . 223
11.3 Confidence interval and highest density interval . . . . . . . . . . . . . . . 224
11.3.1 NHST confidence interval . . . . . . . . . . . . . . . . . . . . . . 224
11.3.2 Bayesian HDI . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 227
11.4 Multiple comparisons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 227
11.4.1 NHST correction for experimentwise error . . . . . . . . . . . . . 228
11.4.2 Just one Bayesian posterior no matter how you look at it . . . . . 230
11.4.3 How Bayesian analysis mitigates false alarms . . . . . . . . . . . . 231
11.5 What a sampling distribution is good for . . . . . . . . . . . . . . . . . . . 231
11.5.1 Planning an experiment . . . . . . . . . . . . . . . . . . . . . . . . 231
11.5.2 Exploring model predictions (posterior predictive check) . . . . . . 232
11.6 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 233
12 Bayesian Approaches to Testing a Point (“Null”) Hypothesis 239
12.1 The estimation (single prior) approach . . . . . . . . . . . . . . . . . . . . 240
12.1.1 Is a null value of a parameter among the credible values? . . . . . . 240
12.1.2 Is a null value of a difference among the credible values? . . . . . . 241
12.1.2.1 Differences of correlated parameters . . . . . . . . . . . 242
12.1.3 Region of Practical Equivalence (ROPE) . . . . . . . . . . . . . . 244
12.2 The model-comparison (two-prior) approach . . . . . . . . . . . . . . . . . 245
12.2.1 Are the biases of two coins equal or not? . . . . . . . . . . . . . . 246
12.2.1.1 Formal analytical solution . . . . . . . . . . . . . . . . . 247
12.2.1.2 Example application . . . . . . . . . . . . . . . . . . . . 248
12.2.2 Are different groups equal or not? . . . . . . . . . . . . . . . . . . 249
12.3 Estimation or model comparison? . . . . . . . . . . . . . . . . . . . . . . 251
12.3.1 What is the probability that the null value is true? . . . . . . . . . . 251
12.3.2 Recommendations . . . . . . . . . . . . . . . . . . . . . . . . . . 251
12.4 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 252
12.4.1 R code for Figure 12.5 . . . . . . . . . . . . . . . . . . . . . . . . 252
12.5 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 255
13 Goals, Power, and Sample Size 259
13.1 The Will to Power . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 260
13.1.1 Goals and Obstacles . . . . . . . . . . . . . . . . . . . . . . . . . 260
13.1.2 Power . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261
13.1.3 Sample Size . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 262
13.1.4 Other Expressions of Goals . . . . . . . . . . . . . . . . . . . . . 264
13.2 Sample size for a single coin . . . . . . . . . . . . . . . . . . . . . . . . . 264
13.2.1 When the goal is to exclude a null value . . . . . . . . . . . . . . . 265
13.2.2 When the goal is precision . . . . . . . . . . . . . . . . . . . . . . 266
13.3 Sample size for multiple mints . . . . . . . . . . . . . . . . . . . . . . . . 267
13.4 Power: prospective, retrospective, and replication . . . . . . . . . . . . . . 269
13.4.1 Power analysis requires verisimilitude of simulated data . . . . . . 270
13.5 The importance of planning . . . . . . . . . . . . . . . . . . . . . . . . . . 271
13.6 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 272
13.6.1 Sample size for a single coin . . . . . . . . . . . . . . . . . . . . . 272
13.6.2 Power and sample size for multiple mints . . . . . . . . . . . . . . 274
13.7 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 281
III The Generalized Linear Model 289
14 Overview of the Generalized Linear Model 291
14.1 The generalized linear model (GLM) . . . . . . . . . . . . . . . . . . . . . 292
14.1.1 Predictor and predicted variables . . . . . . . . . . . . . . . . . . . 292
14.1.2 Scale types: metric, ordinal, nominal . . . . . . . . . . . . . . . . 293
14.1.3 Linear function of a single metric predictor . . . . . . . . . . . . . 294
14.1.3.1 Reparameterization to x threshold form . . . . . . . . . . 296
14.1.4 Additive combination of metric predictors . . . . . . . . . . . . . . 296
14.1.4.1 Reparameterization to x threshold form . . . . . . . . . . 298
14.1.5 Nonadditive interaction of metric predictors . . . . . . . . . . . . . 298
14.1.6 Nominal predictors . . . . . . . . . . . . . . . . . . . . . . . . . . 300
14.1.6.1 Linear model for a single nominal predictor . . . . . . . 300
14.1.6.2 Additive combination of nominal predictors . . . . . . . 302
14.1.6.3 Nonadditive interaction of nominal predictors . . . . . . 303
14.1.7 Linking combined predictors to the predicted . . . . . . . . . . . . 304
14.1.7.1 The sigmoid (a.k.a. logistic) function . . . . . . . . . . . 305
14.1.7.2 The cumulative normal (a.k.a. Phi) function . . . . . . . 307
14.1.8 Probabilistic prediction . . . . . . . . . . . . . . . . . . . . . . . . 308
14.1.9 Formal expression of the GLM . . . . . . . . . . . . . . . . . . . . 308
14.2 Cases of the GLM . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311
14.2.1 Two or more nominal variables predicting frequency . . . . . . . . 313
14.3 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
15 Metric Predicted Variable on a Single Group 317
15.1 Estimating the mean and precision of a normal likelihood . . . . . . . . . . 318
15.1.1 Solution by mathematical analysis . . . . . . . . . . . . . . . . . . 318
15.1.2 Approximation by MCMC in BUGS . . . . . . . . . . . . . . . . . 322
15.1.3 Outliers and robust estimation: The t distribution . . . . . . . . . . 323
15.1.4 When the data are non-normal: Transformations . . . . . . . . . . 326
15.2 Repeated measures and individual differences . . . . . . . . . . . . . . . . 328
15.2.1 Hierarchical model . . . . . . . . . . . . . . . . . . . . . . . . . . 330
15.2.2 Implementation in BUGS . . . . . . . . . . . . . . . . . . . . . . 331
15.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 333
15.4 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 333
15.4.1 Estimating the mean and precision of a normal likelihood . . . . . . 333
15.4.2 Repeated measures: Normal across and normal within . . . . . . . 335
15.5 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 338
16 Metric Predicted Variable with One Metric Predictor 343
16.1 Simple linear regression . . . . . . . . . . . . . . . . . . . . . . . . . . . 344
16.1.1 The hierarchical model and BUGS code . . . . . . . . . . . . . . . 346
16.1.1.1 Standardizing the data for MCMC sampling . . . . . . . 347
16.1.1.2 Initializing the chains . . . . . . . . . . . . . . . . . . . 348
16.1.2 The posterior: How big is the slope? . . . . . . . . . . . . . . . . . 349
16.1.3 Posterior prediction . . . . . . . . . . . . . . . . . . . . . . . . . . 350
16.2 Outliers and robust regression . . . . . . . . . . . . . . . . . . . . . . . . 352
16.3 Simple linear regression with repeated measures . . . . . . . . . . . . . . . 354
16.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 357
16.5 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 358
16.5.1 Data generator for height and weight . . . . . . . . . . . . . . . . . 358
16.5.2 BRugs: Robust linear regression . . . . . . . . . . . . . . . . . . . 359
16.5.3 BRugs: Simple linear regression with repeated measures . . . . . . 362
16.6 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 366
17 Metric Predicted Variable with Multiple Metric Predictors 371
17.1 Multiple linear regression . . . . . . . . . . . . . . . . . . . . . . . . . . . 372
17.1.1 The perils of correlated predictors . . . . . . . . . . . . . . . . . . 372
17.1.2 The model and BUGS program . . . . . . . . . . . . . . . . . . . 375
17.1.2.1 MCMC efficiency: Standardizing and initializing . . . . . 376
17.1.3 The posterior: How big are the slopes? . . . . . . . . . . . . . . . 376
17.1.4 Posterior prediction . . . . . . . . . . . . . . . . . . . . . . . . . . 378
17.2 Hyperpriors and shrinkage of regression coefficients . . . . . . . . . . . . . 378
17.2.1 Informative priors, sparse data, and correlated predictors . . . . . . 382
17.3 Multiplicative interaction of metric predictors . . . . . . . . . . . . . . . . 383
17.3.1 The hierarchical model and BUGS code . . . . . . . . . . . . . . . 384
17.3.1.1 Standardizing the data and initializing the chains . . . . . 385
17.3.2 Interpreting the posterior . . . . . . . . . . . . . . . . . . . . . . . 385
17.4 Which predictors should be included? . . . . . . . . . . . . . . . . . . . . 388
17.5 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 390
17.5.1 Multiple linear regression . . . . . . . . . . . . . . . . . . . . . . 390
17.5.2 Multiple linear regression with hyperprior on coefficients . . . . . . 394
17.6 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 399
18 Metric Predicted Variable with One Nominal Predictor 401
18.1 Bayesian oneway ANOVA . . . . . . . . . . . . . . . . . . . . . . . . . . 402
18.1.1 The hierarchical prior . . . . . . . . . . . . . . . . . . . . . . . . . 403
18.1.1.1 Homogeneity of variance . . . . . . . . . . . . . . . . . 404
18.1.2 Doing it with R and BUGS . . . . . . . . . . . . . . . . . . . . . . 404
18.1.3 A worked example . . . . . . . . . . . . . . . . . . . . . . . . . . 406
18.1.3.1 Contrasts and complex comparisons . . . . . . . . . . . 407
18.1.3.2 Is there a difference? . . . . . . . . . . . . . . . . . . . . 408
18.2 Multiple comparisons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 409
18.3 Two group Bayesian ANOVA and the NHST t test . . . . . . . . . . . . . . 412
18.4 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 413
18.4.1 Bayesian oneway ANOVA . . . . . . . . . . . . . . . . . . . . . . 413
18.5 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 417
19 Metric Predicted Variable with Multiple Nominal Predictors 421
19.1 Bayesian multi-factor ANOVA . . . . . . . . . . . . . . . . . . . . . . . . 422
19.1.1 Interaction of nominal predictors . . . . . . . . . . . . . . . . . . . 422
19.1.2 The hierarchical prior . . . . . . . . . . . . . . . . . . . . . . . . . 424
19.1.3 An example in R and BUGS . . . . . . . . . . . . . . . . . . . . . 425
19.1.4 Interpreting the posterior . . . . . . . . . . . . . . . . . . . . . . . 428
19.1.4.1 Metric predictors and ANCOVA . . . . . . . . . . . . . 428
19.1.4.2 Interaction contrasts . . . . . . . . . . . . . . . . . . . . 429
19.1.5 Non-crossover interactions, rescaling, and homogeneous variances . 430
19.2 Repeated measures, a.k.a. within-subject designs . . . . . . . . . . . . . . 432
19.2.1 Why use a within-subject design? And why not? . . . . . . . . . . 434
19.3 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 435
19.3.1 Bayesian two-factor ANOVA . . . . . . . . . . . . . . . . . . . . . 435
19.4 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 444
20 Dichotomous Predicted Variable 449
20.1 Logistic regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 450
20.1.1 The model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 451
20.1.2 Doing it in R and BUGS . . . . . . . . . . . . . . . . . . . . . . . 451
20.1.3 Interpreting the posterior . . . . . . . . . . . . . . . . . . . . . . . 452
20.1.4 Perils of correlated predictors . . . . . . . . . . . . . . . . . . . . 454
20.1.5 When there are few 1’s in the data . . . . . . . . . . . . . . . . . . 454
20.1.6 Hyperprior across regression coefficients . . . . . . . . . . . . . . 454
20.2 Interaction of predictors in logistic regression . . . . . . . . . . . . . . . . 455
20.3 Logistic ANOVA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 456
20.3.1 Within-subject designs . . . . . . . . . . . . . . . . . . . . . . . . 458
20.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 458
20.5 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 459
20.5.1 Logistic regression code . . . . . . . . . . . . . . . . . . . . . . . 459
20.5.2 Logistic ANOVA code . . . . . . . . . . . . . . . . . . . . . . . . 463
20.6 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 468
21 Ordinal Predicted Variable 471
21.1 Ordinal probit regression . . . . . . . . . . . . . . . . . . . . . . . . . . . 472
21.1.1 What the data look like . . . . . . . . . . . . . . . . . . . . . . . . 472
21.1.2 The mapping from metric x to ordinal y . . . . . . . . . . . . . . . 472
21.1.3 The parameters and their priors . . . . . . . . . . . . . . . . . . . 474
21.1.4 Standardizing for MCMC efficiency . . . . . . . . . . . . . . . . . 475
21.1.5 Posterior prediction . . . . . . . . . . . . . . . . . . . . . . . . . . 475
21.2 Some examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 476
21.2.1 Why are some thresholds outside the data? . . . . . . . . . . . . . 478
21.3 Interaction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 480
21.4 Relation to linear and logistic regression . . . . . . . . . . . . . . . . . . . 481
21.5 R code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 481
21.6 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 486
22 Contingency Table Analysis 489
22.1 Poisson exponential ANOVA . . . . . . . . . . . . . . . . . . . . . . . . . 490
22.1.1 What the data look like . . . . . . . . . . . . . . . . . . . . . . . . 490
22.1.2 The exponential link function . . . . . . . . . . . . . . . . . . . . 490
22.1.3 The Poisson likelihood . . . . . . . . . . . . . . . . . . . . . . . . 492
22.1.4 The parameters and the hierarchical prior . . . . . . . . . . . . . . 494
22.2 Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 494
22.2.1 Credible intervals on cell probabilities . . . . . . . . . . . . . . . . 495
22.3 Log linear models for contingency tables . . . . . . . . . . . . . . . . . . . 496
22.4 R code for Poisson exponential model . . . . . . . . . . . . . . . . . . . . 497
22.5 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 504
23 Tools in the Trunk 507
23.1 Reporting a Bayesian analysis . . . . . . . . . . . . . . . . . . . . . . . . 508
23.1.1 Essential points . . . . . . . . . . . . . . . . . . . . . . . . . . . . 508
23.1.2 Optional points . . . . . . . . . . . . . . . . . . . . . . . . . . . . 509
23.1.3 Helpful points . . . . . . . . . . . . . . . . . . . . . . . . . . . . 509
23.2 MCMC burn-in and thinning . . . . . . . . . . . . . . . . . . . . . . . . . 510
23.3 Functions for approximating highest density intervals . . . . . . . . . . . . 513
23.3.1 R code for computing HDI of a grid approximation . . . . . . . . . 513
23.3.2 R code for computing HDI of a MCMC sample . . . . . . . . . . . 513
23.3.3 R code for computing HDI of a function . . . . . . . . . . . . . . . 515
23.4 Reparameterization of probability distributions . . . . . . . . . . . . . . . 516
23.4.1 Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 516
23.4.2 Reparameterization of two parameters . . . . . . . . . . . . . . . . 517
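
As a small taste of the material in Part II, here is a minimal R sketch (my own illustration, not code from the book) of the conjugate beta-binomial update covered in Chapter 5: with a Beta(a, b) prior on the coin's bias theta and z heads observed in N flips, the posterior is Beta(z + a, N - z + b). The numbers below are made up for illustration.

# Conjugate beta-binomial update: Beta(a, b) prior, z heads in N flips.
a <- 1; b <- 1                        # uniform Beta(1, 1) prior on theta
z <- 11; N <- 14                      # hypothetical data: 11 heads in 14 flips
theta <- seq(0, 1, length.out = 501)
prior     <- dbeta(theta, a, b)
posterior <- dbeta(theta, z + a, N - z + b)
# Central 95% credible interval (the book prefers the highest density interval):
print(qbeta(c(0.025, 0.975), z + a, N - z + b))
plot(theta, posterior, type = "l", xlab = "theta", ylab = "density")
lines(theta, prior, lty = 2)          # dashed line shows the prior

The book's own programs, listed in the "R code" sections above, go well beyond this conjugate shortcut: grid approximation, a home-grown Metropolis sampler, Gibbs sampling, and BUGS models for the full hierarchical and generalized linear model examples.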

Download:
Doing_Bayesian_Data_Analysis__A_Tutorial_with_R_and_BUGS.rar (9.63 MB, downloads: 0, price: 5 points)

Note:
Many people have a habit of collecting piles of materials without ever reading them. To make better use of these resources and to encourage the habit of reading each book you download, downloads here cost forum points; please bear with this.
Take part in forum activities and help other members, and you will easily earn enough points.
Enjoy!