### BayesianTools Package

For today's lab, we'll use the BayesianTools package, which has a bunch of different methods built in.

set.seed(7)
library(BayesianTools)
library(deSolve)
library(ggplot2)

### List of some MCMC approaches

Methods marked in italics are included in the BayesianTools package, but I've also included a couple of other common ones since they're in commonly used packages like Stan or POMP.

- *Metropolis-Hastings*
- Gibbs sampling
- Variations of the above:
  - *Prior optimization* - runs a standard optimization first (prior to sampling), to get better starting points for the sampler
  - *Adaptive methods* - adapt the proposal distribution as you go, based on how much variability the method estimates is needed
  - *Delayed rejection* - draws multiple samples before rejecting
  - *DRAM (Delayed Rejection Adaptive Metropolis-Hastings)*
- *Differential evolution* - runs multiple chains in parallel, with a proposal distribution that depends on the positions of all the chains (see ter Braak 2006 and ter Braak and Vrugt 2008)
- Hamiltonian methods (Stan) - uses a Hamiltonian approach to generate distant samples that are more likely to be accepted
- Particle filtering (in particular, Iterated Filtering in POMP) - a set of particles (points/samples in parameter space) is weighted by likelihood or posterior, and then resampled, or new particles are drawn, according to the weights
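To make the first item above concrete, here is a minimal random-walk Metropolis-Hastings sketch in base R, using the same normal-mean model we set up below. The proposal standard deviation of 5 and the burn-in cutoff are arbitrary illustrative choices, not tuned values:

```r
set.seed(7)
data = rnorm(50, 0, 30)   # same simulated data as in the example below
loglike = function(par) sum(dnorm(data, par, 30, log = TRUE))

metropolis = function(n_iter, start = 0, prop_sd = 5){
  chain = numeric(n_iter)
  chain[1] = start
  for(i in 2:n_iter){
    # symmetric random-walk proposal centered on the current value
    proposal = rnorm(1, chain[i - 1], prop_sd)
    # accept with probability min(1, ratio), computed on the log scale
    log_ratio = loglike(proposal) - loglike(chain[i - 1])
    if(log(runif(1)) < log_ratio){
      chain[i] = proposal
    } else {
      chain[i] = chain[i - 1]
    }
  }
  chain
}

chain = metropolis(5000)
mean(chain[1000:5000])   # should land near the true mean of 0
```

Everything else in the list above refines this basic loop: adaptive methods tune `prop_sd` on the fly, and differential evolution replaces the proposal with moves informed by parallel chains.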

### Intro to the BayesianTools Package

As a simple example to get to know the BayesianTools package, let's set up a similar normal distribution model to the one we built last time. We'll suppose the data comes from a normal distribution with unknown $$\mu$$, and known $$\sigma = 30$$.

Let's simulate some basic data to run our example:

data = rnorm(50, 0, 30) # here we're supposing the true mean is zero, and sampling 50 data points

The likelihood is just a normal distribution evaluated for our data set and whatever value of $$\mu$$ we're considering using:

loglike = function(par){
  sum(dnorm(data, par, 30, log = TRUE))
}

Now, we create a BayesianSetup: this is an object that holds the prior and likelihood information that BayesianTools needs.

We can set a prior using a distribution, or if we want to assume a uniform distribution with a given range, we can just provide the upper and lower limits. Note that if we don't specify a prior, BayesianTools will use an infinite uniform prior (basically equivalent to just using the likelihood).

We'll also set the total number of iterations to run. We can set the starting values for the parameters, but if we don't, BayesianTools will default to sampling them from the prior, which is often a good idea anyhow.

setup = createBayesianSetup(likelihood = loglike, lower = -20, upper = 20)
settings = list(iterations = 10000, message = FALSE)
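If we wanted an informative prior on $$\mu$$ instead of uniform bounds, BayesianTools lets us build one with `createPrior` from a log density and a sampler (used to draw starting values). The normal(0, 50) prior and the bounds below are illustrative choices, not recommendations; `data` and `loglike` are repeated from above so the snippet is self-contained:

```r
library(BayesianTools)

set.seed(7)
data = rnorm(50, 0, 30)
loglike = function(par) sum(dnorm(data, par, 30, log = TRUE))

# log prior density, and a sampler used to generate starting values
priorDensity = function(par) dnorm(par, mean = 0, sd = 50, log = TRUE)
priorSampler = function(n = 1) rnorm(n, mean = 0, sd = 50)

normalPrior = createPrior(density = priorDensity, sampler = priorSampler,
                          lower = -200, upper = 200)
setupNormal = createBayesianSetup(likelihood = loglike, prior = normalPrior)
```

This `setupNormal` object can then be passed to `runMCMC` exactly like the uniform-prior setup above.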

Now, let's run the MCMC! We'll use the default algorithm for now (a form of differential evolution). Let's also spit out some summary statistics and plots.

res = runMCMC(bayesianSetup = setup, settings = settings)
summary(res)
## # # # # # # # # # # # # # # # # # # # # # # # # #
## ## MCMC chain summary ##
## # # # # # # # # # # # # # # # # # # # # # # # # #
##
## # MCMC sampler:  DEzs
## # Nr. Chains:  3
## # Iterations per chain:  3334
## # Rejection rate:  0.545
## # Effective sample size:  2295
## # Runtime:  0.891  sec.
##
## # Parameters
##         psf   MAP   2.5% median  97.5%
## par 1 1.006 7.158 -1.092   7.09 15.186
##
## ## DIC:  483.844
## ## Convergence
##  Gelman Rubin multivariate psrf:
##
## ## Correlations
##       par 1
## par 1     1
marginalPlot(res)

plot(res)
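Beyond the built-in plots, we can pull out the raw posterior samples with `getSample` and compute our own summaries. The burn-in cutoff of 1000 here is an arbitrary illustrative choice; you'd normally pick it after inspecting the trace plots above:

```r
# extract posterior samples from the fitted chains, dropping early iterations
samples = getSample(res, start = 1000)
mean(samples)                          # posterior mean of mu
quantile(samples, c(0.025, 0.975))     # 95% credible interval for mu
```

These should roughly match the `MAP`, `2.5%`, and `97.5%` columns in the summary output above.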