Bayesian inference models the process of learning. That is, we start with a certain level of belief, however vague, and through the accumulation of experience, our belief becomes more finely tuned. Some people take a dislike to Bayesian inference because it is overtly subjective, and they like to think of statistics as being objective. We don't agree: any statistical analysis is necessarily subjective as a result of the need to make assumptions, and many approximations are accepted without question, often without any warning that they have been made. For that reason we appreciate the extra transparency of Bayesian inference, and it also frequently provides answers where classical statistics cannot. Perhaps more importantly for us as risk analysts, Bayesian inference encourages our clients to think about the level of knowledge they have about their problem, and what that means to them.
Bayesian inference is an extremely powerful technique, based on Bayes' Theorem (sometimes called Bayes' Formula), for using data to improve one's estimate of a parameter. There are essentially three steps involved:
Construct a confidence distribution of the parameter before analyzing the new data set. This is called the prior distribution;
Find an appropriate likelihood function for the observed data; and
Modify the prior distribution using the likelihood function to get a revised estimate known as the posterior distribution.
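The three steps can be sketched numerically. The following is a minimal pure-Python illustration rather than a spreadsheet model; the scenario (estimating a binomial probability after observing 7 successes in 10 trials, starting from a uniform prior) is our own hypothetical example:

```python
import math

def posterior_grid(s, n, grid_size=1001):
    """Three-step Bayesian update on a discrete grid of p values.

    Step 1: prior      -- uniform belief about p before seeing the data.
    Step 2: likelihood -- Binomial(n, p) probability of observing s successes.
    Step 3: posterior  -- prior * likelihood, renormalised to sum to 1.
    """
    grid = [i / (grid_size - 1) for i in range(grid_size)]
    prior = [1.0] * grid_size                      # uniform prior
    binom = math.comb(n, s)
    likelihood = [binom * p**s * (1 - p)**(n - s) for p in grid]
    unnorm = [pr * lk for pr, lk in zip(prior, likelihood)]
    total = sum(unnorm)
    posterior = [u / total for u in unnorm]
    return grid, posterior

# Posterior mean after 7 successes in 10 trials. With a uniform prior the
# exact posterior is Beta(8, 4), whose mean is 8/12.
grid, post = posterior_grid(7, 10)
mean = sum(p * w for p, w in zip(grid, post))
```

Because the prior here is flat, the posterior is simply the renormalised likelihood; with an informed prior the same three lines of arithmetic would pull the answer towards the prior's beliefs.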

This section starts with some introductory topics:
Introduction to the concept and some simple examples
How to determine prior distributions
How to determine likelihood functions
We then turn to the actual execution of a Bayesian inference for risk analysis models with Crystal Ball by looking at various techniques for arriving at the posterior distribution:
Simulation with accept/reject method
Markov Chain Monte Carlo method
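Both techniques can be sketched in a few lines of code. The Python sketch below (our own hypothetical example, not a Crystal Ball model) estimates a binomial probability p from 7 successes in 10 trials under a uniform prior, first by accept/reject sampling and then with a simple random-walk Metropolis chain, one common form of Markov Chain Monte Carlo:

```python
import math
import random

random.seed(1)

def log_likelihood(p, s, n):
    """Log of the Binomial likelihood of s successes in n trials
    (the constant binomial coefficient is dropped, as only ratios matter)."""
    if p <= 0.0 or p >= 1.0:
        return float("-inf")
    return s * math.log(p) + (n - s) * math.log(1 - p)

def accept_reject(s, n, draws=20000):
    """Accept/reject: draw p from the uniform prior and keep it with
    probability likelihood(p) / max likelihood (attained at p = s/n).
    The kept points are samples from the posterior."""
    log_max = log_likelihood(s / n, s, n)
    kept = []
    for _ in range(draws):
        p = random.random()                       # sample from the prior
        if math.log(random.random()) < log_likelihood(p, s, n) - log_max:
            kept.append(p)                        # accepted: a posterior draw
    return kept

def metropolis(s, n, steps=20000, width=0.1):
    """Random-walk Metropolis: propose a nearby p and accept it with
    probability min(1, posterior ratio). The chain visits each region of
    parameter space in proportion to its posterior probability."""
    p = 0.5
    chain = []
    for _ in range(steps):
        prop = p + random.uniform(-width, width)
        if math.log(random.random()) < log_likelihood(prop, s, n) - log_likelihood(p, s, n):
            p = prop
        chain.append(p)
    return chain[steps // 2:]                     # discard burn-in

# Both samplers should agree with the exact Beta(8, 4) posterior mean, 2/3.
ar = accept_reject(7, 10)
mc = metropolis(7, 10)
```

Accept/reject is simple but wasteful when the likelihood is sharply peaked, since most prior draws are rejected; the Metropolis chain concentrates its effort near the posterior mode, which is why MCMC scales to problems where accept/reject becomes impractical.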
Most important of all, we offer a number of worked examples:
Examples of Bayesian inference calculations
A simple Bayesian inference example using construction
Simple construction model showing the interaction between likelihood functions and informed priors
Gender of a random sample of people
A simple construction model illustrating the importance of the prior distribution
Microfractures on turbine blades
A model to show how to incorporate hyperparameters by simulation, as well as offering both simulation and construction approaches to determining the posterior distribution
A fun model of a classic problem showing the sometimes nonintuitive nature of the Bayesian result
Using cow pats to estimate infected animals in a herd
A bit gross, but this example shows how you can perform Bayesian inference calculations with very involved stochastic processes (i.e. very complicated likelihood functions) by using Monte Carlo simulation with accept/reject criteria.
Bayesian estimation of a component's mean time to failure (MTTF)
A simple construction example that shows how we use data that describe being above or below a threshold, instead of exact observations
Showing how Taylor series expansion lets you determine the normal approximation to posterior distributions, and a method for algebraically obtaining the standard deviation
Normal approximation to the Beta posterior distribution
Example of a Taylor series expansion
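As a sketch of the idea behind the Taylor-series approximation, assuming the posterior happens to be a Beta distribution (the numbers below are our own illustrative example): expanding the log-density to second order about its mode gives a quadratic, i.e. a Normal log-density, and the curvature at the mode gives the standard deviation algebraically.

```python
import math

def normal_approx_to_beta(alpha, beta):
    """Normal approximation to a Beta(alpha, beta) density (alpha, beta > 1).

    A second-order Taylor expansion of the log-density
        l(p) = (alpha - 1) ln p + (beta - 1) ln(1 - p)
    about its mode yields a Normal with
        mean = mode,   sd = 1 / sqrt(-l''(mode)).
    """
    mode = (alpha - 1) / (alpha + beta - 2)       # where l'(p) = 0
    # -l''(mode) = (alpha - 1)/mode^2 + (beta - 1)/(1 - mode)^2
    curvature = (alpha - 1) / mode**2 + (beta - 1) / (1 - mode)**2
    return mode, 1 / math.sqrt(curvature)

# Beta(8, 4) -- e.g. the posterior for 7 successes in 10 trials with a
# uniform prior -- is approximated by a Normal centred at the mode, 0.7.
mu, sd = normal_approx_to_beta(8, 4)
```

The approximation is centred at the posterior mode rather than the mean, so it works best when the posterior is reasonably symmetric; for strongly skewed Beta posteriors (small counts) the exact distribution should be kept.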
Two common statistical problems
Estimate of the mean of a Normal distribution with unknown standard deviation
A standard statistics problem with the same outcome as the classical method
Bayesian estimate of the mean of a Normal distribution with known standard deviation
Another standard statistics problem with the same outcome as the classical method
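For the known-standard-deviation case the posterior can be written down in closed form: with a Normal prior, precisions (reciprocal variances) simply add. A short Python sketch, in which the prior and data values are our own illustrative numbers:

```python
import math

def normal_mean_update(prior_mean, prior_sd, data, sigma):
    """Posterior for a Normal mean when the sampling sd `sigma` is known.

    With a Normal(prior_mean, prior_sd) prior, precisions add:
        posterior precision = 1/prior_sd^2 + n/sigma^2
    and the posterior mean is the precision-weighted average of the
    prior mean and the sample mean.
    """
    n = len(data)
    xbar = sum(data) / n
    prior_prec = 1 / prior_sd**2
    data_prec = n / sigma**2
    post_prec = prior_prec + data_prec
    post_mean = (prior_prec * prior_mean + data_prec * xbar) / post_prec
    return post_mean, 1 / math.sqrt(post_prec)

# Four observations with known sigma = 0.5 and a very vague prior:
m, s = normal_mean_update(0.0, 1e6, [9.8, 10.1, 10.3, 9.9], sigma=0.5)
```

With a very vague prior the posterior mean and standard deviation reduce to the classical x-bar and sigma/sqrt(n), which is why this problem gives the same outcome as the classical method.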
Once you have reviewed the material in this section, you might like to test how much you have learned by taking the self-test quiz:
A quiz on Bayesian inference.