Advanced Statistical Computing
Welcome
Stay in Touch!
Setup
1 Introduction
  1.1 Example: Linear Models
  1.2 Principle of Optimization Transfer
  1.3 Textbooks vs. Computers
    1.3.1 Using Logarithms
    1.3.2 Linear Regression
    1.3.3 Multivariate Normal Distribution
2 Solving Nonlinear Equations
  2.1 Bisection Algorithm
    2.1.1 Example: Quantiles
  2.2 Rates of Convergence
    2.2.1 Linear Convergence
    2.2.2 Superlinear Convergence
    2.2.3 Quadratic Convergence
    2.2.4 Example: Bisection Algorithm
  2.3 Functional Iteration
    2.3.1 The Shrinking Lemma
    2.3.2 Convergence Rates for Shrinking Maps
  2.4 Newton’s Method
    2.4.1 Proof of Newton’s Method
    2.4.2 Convergence Rate of Newton’s Method
    2.4.3 Newton’s Method for Maximum Likelihood Estimation
3 General Optimization
  3.1 Steepest Descent
    3.1.1 Example: Multivariate Normal
  3.2 The Newton Direction
    3.2.1 Generalized Linear Models
    3.2.2 Newton’s Method in R
  3.3 Quasi-Newton
    3.3.1 Quasi-Newton Methods in R
  3.4 Conjugate Gradient
  3.5 Coordinate Descent
    3.5.1 Convergence Rates
    3.5.2 Generalized Additive Models
4 The EM Algorithm
  4.1 EM Algorithm for Exponential Families
  4.2 Canonical Examples
    4.2.1 Two-Part Normal Mixture Model
    4.2.2 Censored Exponential Data
  4.3 A Minorizing Function
    4.3.1 Example: Minorization in a Two-Part Mixture Model
    4.3.2 Constrained Minimization with an Adaptive Barrier
  4.4 Missing Information Principle
  4.5 Acceleration Methods
    4.5.1 Louis’s Acceleration
    4.5.2 SQUAREM
5 Integration
  5.1 Laplace Approximation
    5.1.1 Computing the Posterior Mean
6 Independent Monte Carlo
  6.1 Random Number Generation
    6.1.1 Pseudo-random Numbers
  6.2 Non-Uniform Random Numbers
    6.2.1 Inverse CDF Transformation
    6.2.2 Other Transformations
  6.3 Rejection Sampling
    6.3.1 The Algorithm
    6.3.2 Properties of Rejection Sampling
    6.3.3 Empirical Supremum Rejection Sampling
  6.4 Importance Sampling
    6.4.1 Example: Bayesian Sensitivity Analysis
    6.4.2 Properties of the Importance Sampling Estimator
7 Markov Chain Monte Carlo
  7.1 Background
    7.1.1 A Simple Example
    7.1.2 Basic Limit Theorem
    7.1.3 Time Reversibility
    7.1.4 Summary
  7.2 Metropolis-Hastings
    7.2.1 Random Walk Metropolis-Hastings
    7.2.2 Independence Metropolis Algorithm
    7.2.3 Slice Sampler
    7.2.4 Hit and Run Sampler
    7.2.5 Single Component Metropolis-Hastings
  7.3 Gibbs Sampler
    7.3.1 Example: Bivariate Normal Distribution
    7.3.2 Example: Normal Likelihood
    7.3.3 Gibbs Sampling and Metropolis-Hastings
    7.3.4 Hybrid Gibbs Sampler
    7.3.5 Reparametrization
    7.3.6 Example: Bernoulli Random Effects Model
  7.4 Monitoring Convergence
    7.4.1 Monte Carlo Standard Errors
    7.4.2 Gelman-Rubin Statistic
  7.5 Simulated Annealing