Appendix
 | mpg | cyl | disp | hp | drat | wt | qsec | vs |
---|---|---|---|---|---|---|---|---|
Mazda RX4 | 21.0 | 6 | 160.0 | 110 | 3.90 | 2.620 | 16.46 | 0 |
Mazda RX4 Wag | 21.0 | 6 | 160.0 | 110 | 3.90 | 2.875 | 17.02 | 0 |
Datsun 710 | 22.8 | 4 | 108.0 | 93 | 3.85 | 2.320 | 18.61 | 1 |
Hornet 4 Drive | 21.4 | 6 | 258.0 | 110 | 3.08 | 3.215 | 19.44 | 1 |
Hornet Sportabout | 18.7 | 8 | 360.0 | 175 | 3.15 | 3.440 | 17.02 | 0 |
Valiant | 18.1 | 6 | 225.0 | 105 | 2.76 | 3.460 | 20.22 | 1 |
Duster 360 | 14.3 | 8 | 360.0 | 245 | 3.21 | 3.570 | 15.84 | 0 |
Merc 240D | 24.4 | 4 | 146.7 | 62 | 3.69 | 3.190 | 20.00 | 1 |
Merc 230 | 22.8 | 4 | 140.8 | 95 | 3.92 | 3.150 | 22.90 | 1 |
Merc 280 | 19.2 | 6 | 167.6 | 123 | 3.92 | 3.440 | 18.30 | 1 |
I strongly recommend typing each code line rather than copying and pasting it. See below for a brief introduction to the R software.↩︎
Observe that I use the term “Bayes’ rule” rather than “Bayes’ theorem”. It was Laplace (P. S. Laplace 1774) who actually generalized Bayes’ theorem (Thomas Bayes 1763); his generalization is known as Bayes’ rule.↩︎
\(\lnot\) is the negation symbol. In addition, we have that \(P(B|A)=1-P(B|A^c)\) in this example, where \(A^c\) is the complement of \(A\); however, this identity does not hold in general.↩︎
https://www.wolframalpha.com/input/?i=number+of+people+who+have+ever+lived+on+Earth↩︎
https://www.r-bloggers.com/2019/04/base-rate-fallacy-or-why-no-one-is-justified-to-believe-that-jesus-rose/↩︎
From a Bayesian perspective, \(\mathbf{\theta}\) is fixed but unknown; consequently, it is treated as a random object.↩︎
\(\propto\) is the proportionality symbol.↩︎
See also (M. Bayarri and Berger 2000), which shows potential flaws due to using the data twice in the construction of predictive p-values, and alternative proposals, for instance the partial posterior predictive p-value.↩︎
\(\perp\) is the independence symbol.↩︎
Take into account that in the likelihood function the argument is \(\theta\); however, we keep the notation for ease of exposition.↩︎
Independent and identically distributed draws.↩︎
We should be aware that there may be technical problems when using this kind of hyperparameter in this setting (Andrew Gelman et al. 2006).↩︎
(Chernozhukov and Hong 2003) propose Laplace-type estimators (LTE) based on the quasi-posterior, \(p(\mathbf{\theta})=\frac{\exp\left\{L_n(\mathbf{\theta})\right\}\pi(\mathbf{\theta})}{\int_{\mathbf{\Theta}}\exp\left\{L_n(\mathbf{\theta})\right\}\pi(\mathbf{\theta})d\mathbf{\theta}}\), where \(L_n(\mathbf{\theta})\) is not necessarily a log-likelihood function. The LTE minimizes the quasi-posterior risk.↩︎
A Type I error is rejecting the null hypothesis when it is true, and a Type II error is failing to reject the null hypothesis when it is false.↩︎
A pivotal quantity is a function of unobserved parameters and observations whose probability distribution does not depend on the unknown parameters.↩︎
An ancillary statistic is a pivotal quantity that is also a statistic.↩︎
https://fivethirtyeight.com/features/not-even-scientists-can-easily-explain-p-values/↩︎
See https://joyeuserrance.wordpress.com/2011/04/22/proof-that-p-values-under-the-null-are-uniformly-distributed/ for a simple proof.↩︎
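The uniformity of p-values under the null is also easy to check by simulation. A minimal Python sketch (the book's examples use R; the sample sizes and seed below are illustrative, not taken from the text): simulate many two-sided z-tests under a true null and inspect the distribution of the resulting p-values.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_sims, n_obs = 20000, 30

# Data generated under the null: each row is one sample from N(0, 1)
x = rng.normal(size=(n_sims, n_obs))
z = x.mean(axis=1) * np.sqrt(n_obs)   # z statistic (variance known to be 1)
p = 2 * norm.sf(np.abs(z))            # two-sided p-values

# If the p-values are Uniform(0, 1) under H0, their mean should be close
# to 0.5 and the rejection rate at the 5% level close to 0.05.
print(round(p.mean(), 3), round((p < 0.05).mean(), 3))
```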
Another parametrization of the gamma density is the scale parametrization, where \(\kappa_0=1/\beta_0\). See the health insurance example in Chapter 1.↩︎
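As a quick numerical check that the rate and scale parametrizations describe the same density (a Python sketch; the shape and rate values below are hypothetical), one can compare the density written out with rate \(\beta_0\) against a shape/scale implementation with \(\kappa_0=1/\beta_0\):

```python
import numpy as np
from scipy.special import gamma as gamma_fn
from scipy.stats import gamma

alpha0, beta0 = 2.5, 1.7        # hypothetical shape and rate hyperparameters
x = np.linspace(0.1, 10.0, 200)

# Rate parametrization written out explicitly:
# p(x) = beta0^alpha0 / Gamma(alpha0) * x^(alpha0 - 1) * exp(-beta0 * x)
pdf_rate = beta0**alpha0 / gamma_fn(alpha0) * x**(alpha0 - 1) * np.exp(-beta0 * x)

# Scale parametrization with kappa0 = 1/beta0 (scipy uses shape/scale)
kappa0 = 1.0 / beta0
pdf_scale = gamma.pdf(x, a=alpha0, scale=kappa0)

print(np.allclose(pdf_rate, pdf_scale))  # the two parametrizations agree
```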
A particular case of the Woodbury matrix identity.↩︎
Using this result: \(({\bf{A}}+{\bf{B}}{\bf{D}}{\bf{C}})^{-1}={\bf{A}}^{-1}-{\bf{A}}^{-1}{\bf{B}}({\bf{D}}^{-1}+{\bf{C}}{\bf{A}}^{-1}{\bf{B}})^{-1}{\bf{C}}{\bf{A}}^{-1}\).↩︎
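A short numerical sanity check of this identity in Python (the random, diagonally dominant matrices below are chosen only so that every inverse involved exists; the dimensions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 3
inv = np.linalg.inv

# Diagonally dominant A and D so all required inverses exist
A = rng.standard_normal((n, n)) + n * np.eye(n)
D = rng.standard_normal((k, k)) + k * np.eye(k)
B = rng.standard_normal((n, k))
C = rng.standard_normal((k, n))

# Left- and right-hand sides of the Woodbury matrix identity
lhs = inv(A + B @ D @ C)
rhs = inv(A) - inv(A) @ B @ inv(inv(D) + C @ inv(A) @ B) @ C @ inv(A)
print(np.allclose(lhs, rhs))
```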
\(vec\) denotes the vectorization operator, and \(\otimes\) denotes the Kronecker product.↩︎
We can write down the former expression in a more familiar way using vectorization properties, \(\underbrace{vec({\bf{Y}})}_{\bf{y}}=\underbrace{({\bf{I}}_M\otimes {\bf{X}})}_{{\bf{Z}}}\underbrace{vec({\bf{B}})}_{\beta}+\underbrace{vec({\bf{U}})}_{\mu}\), where \({\bf{y}}\sim N_{NM}({\bf{Z}}\beta,\bf{\Sigma}\otimes {\bf{I}}_N)\).↩︎
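The identity underlying this rewriting, \(vec({\bf{X}}{\bf{B}})=({\bf{I}}_M\otimes {\bf{X}})vec({\bf{B}})\), can be verified numerically; a Python sketch with illustrative dimensions follows. Note that \(vec\) stacks columns, which corresponds to Fortran (column-major) order in NumPy:

```python
import numpy as np

rng = np.random.default_rng(2)
N, K, M = 6, 3, 2   # illustrative: observations, regressors, equations
X = rng.standard_normal((N, K))
B = rng.standard_normal((K, M))

# vec() stacks the columns of a matrix, i.e. column-major (Fortran) order
vec = lambda A: A.flatten(order="F")

lhs = vec(X @ B)                       # vec(XB)
rhs = np.kron(np.eye(M), X) @ vec(B)   # (I_M kron X) vec(B)
print(np.allclose(lhs, rhs))
```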
I strongly recommend typing each code line rather than copying and pasting it.↩︎
Users should take into account that formal inference (hypothesis testing) in a Bayesian framework is based on Bayes factors.↩︎
Tuning parameters should be set such that one obtains reasonable diagnostic criteria and acceptance rates.↩︎
Computing the acceptance rate on the \(\log\) scale helps to mitigate computational problems such as numerical underflow.↩︎
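To illustrate, here is a minimal random-walk Metropolis-Hastings sketch in Python with the acceptance step done on the log scale; the standard normal target, tuning values, and seed are hypothetical choices for this example:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical target: standard normal, known only up to a constant
def log_target(theta):
    return -0.5 * theta**2

theta, draws = 0.0, []
for _ in range(5000):
    prop = theta + rng.normal(scale=1.0)            # symmetric proposal
    log_alpha = log_target(prop) - log_target(theta)
    # Comparing log(u) with log_alpha avoids exponentiating large negative
    # numbers, which would underflow to zero for very unlikely proposals.
    if np.log(rng.uniform()) < log_alpha:
        theta = prop
    draws.append(theta)

draws = np.array(draws[1000:])                      # discard burn-in
print(round(draws.mean(), 2), round(draws.std(), 2))
```

The draws should be centered near 0 with standard deviation near 1, matching the standard normal target.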