For today's lab, we'll use the `BayesianTools` package, which has a bunch of different methods built in.

```
set.seed(7)
library(BayesianTools)
library(deSolve)
library(ggplot2)
```

Methods marked in italics are included in the `BayesianTools` package, but I've included a couple of other ones for us to talk over since they're in commonly used packages like `Stan` or `POMP`.

- *Metropolis-Hastings*
- *Gibbs sampling*
- Variations of the above:
  - *Prior optimization*
  - *Adaptive methods*
  - *Delayed rejection*
  - *DRAM* (Delayed Rejection Adaptive Metropolis-Hastings)
- *Differential evolution*
- Hamiltonian methods (Stan)
- Particle filtering (in particular, Iterated Filtering in POMP)

As a simple example to get to know the `BayesianTools` package, let's set up a similar normal distribution model to the one we built last time. We'll suppose the data comes from a normal distribution with unknown \(\mu\) and known \(\sigma = 30\).

Let's simulate some basic data to run our example:

`data = rnorm(50, 0, 30) # here we're supposing the true mean is zero, and sampling 50 data points`

The likelihood is just a normal distribution evaluated for our data set and whatever value of \(\mu\) we're considering using:

```
loglike = function(par){
  # log-likelihood: normal with unknown mean `par` and known sd of 30
  sum(dnorm(data, mean = par, sd = 30, log = TRUE))
}
```
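Before handing the likelihood to a sampler, it's worth a quick sanity check: evaluated at a few candidate values of \(\mu\), it should peak at the sample mean (the MLE when \(\sigma\) is known). A small sketch, repeating the setup from above so the chunk runs on its own (the exact values depend on the simulated data):

```
# repeat the setup so this chunk is self-contained
set.seed(7)
data = rnorm(50, 0, 30)
loglike = function(par){
  sum(dnorm(data, mean = par, sd = 30, log = TRUE))
}

# the log-likelihood is maximized at the sample mean when sigma is known
loglike(mean(data)) # the MLE, so the highest value
loglike(0)          # near the true mean, still fairly high
loglike(50)         # far from the data, much lower
```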

Now, we create a BayesianSetup: this is an object that holds the prior and likelihood information that `BayesianTools` needs.

We can set a prior using a distribution, or if we want to assume a uniform distribution with a given range, we can just provide the upper and lower limits. Note that if we don't specify a prior, `BayesianTools` will use an infinite uniform prior (basically equivalent to just using the likelihood).
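If you do want an explicit prior rather than just box constraints, `BayesianTools` provides constructors like `createUniformPrior()` and `createPrior()`. A sketch (the Normal(0, 10) prior here is just an illustration, not part of the lab):

```
library(BayesianTools)

# uniform prior on [-20, 20], equivalent to passing lower/upper directly
uprior = createUniformPrior(lower = -20, upper = 20)

# or a custom prior built from a log-density and a sampler
nprior = createPrior(density = function(par) dnorm(par, 0, 10, log = TRUE),
                     sampler = function(n = 1) rnorm(n, 0, 10),
                     lower = -20, upper = 20)

# either can be passed to createBayesianSetup via the `prior` argument
```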

We'll also set the total number of iterations to run. We can set the starting values for the parameters, but if we don't, `BayesianTools` will default to sampling them from the prior, which is often a good idea anyhow.
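If you do want to pin down the starting values, the settings list accepts a `startValue` entry. For the default DEzs sampler, which runs several internal chains, this can be a matrix with one row per chain (the three values below are arbitrary illustrations):

```
# assuming the default DEzs sampler with 3 internal chains;
# each row of startValue is one chain's starting point
settings_fixed = list(iterations = 10000,
                      startValue = matrix(c(-10, 0, 10), ncol = 1),
                      message = FALSE)
```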

```
setup = createBayesianSetup(likelihood = loglike, lower = -20, upper = 20)
settings = list(iterations = 10000, message = FALSE)
```

Now, let's run the MCMC! We'll use the default algorithm for now (a form of differential evolution). Let's also spit out some summary statistics and plots.

```
res = runMCMC(bayesianSetup = setup, settings = settings)
summary(res)
```

```
## # # # # # # # # # # # # # # # # # # # # # # # # #
## ## MCMC chain summary ##
## # # # # # # # # # # # # # # # # # # # # # # # # #
##
## # MCMC sampler: DEzs
## # Nr. Chains: 3
## # Iterations per chain: 3334
## # Rejection rate: 0.531
## # Effective sample size: 2276
## # Runtime: 0.922 sec.
##
## # Parameter MAP 2.5% median 97.5%
## # par 1 : 7.161 -1.034 6.984 15.223
##
## ## DIC: 483.869
## ## Convergence
## Gelman Rubin multivariate psrf:
##
## ## Correlations
## par 1
## par 1 1
```

`marginalPlot(res)`

`## Package 'sm', version 2.2-5.4: type help(sm) for summary information`

`plot(res)`
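Beyond the built-in plots, `getSample()` pulls the raw posterior draws out of the result object, which makes it easy to compute your own summaries. A sketch that reruns the whole setup so it stands alone (your numbers will differ slightly from the output above, since the sampler is stochastic):

```
library(BayesianTools)

set.seed(7)
data = rnorm(50, 0, 30)
loglike = function(par) sum(dnorm(data, mean = par, sd = 30, log = TRUE))

setup = createBayesianSetup(likelihood = loglike, lower = -20, upper = 20)
settings = list(iterations = 10000, message = FALSE)
res = runMCMC(bayesianSetup = setup, settings = settings)

# extract the draws, discarding early iterations as burn-in
draws = getSample(res, start = 500)
mean(draws)                       # posterior mean of mu
quantile(draws, c(0.025, 0.975))  # 95% credible interval
```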