To learn more about EpiX Analytics' work, please visit our modeling applications, white papers, and training schedule.



Like the binomial probability p, the mean number of events per period λ is a fundamental property of the stochastic system in question. It can never be observed and it can never be exactly known. However, we can become progressively more certain about its value as more data are collected. Statistics provides us with a means of quantifying the state of our knowledge as we accumulate data.


We discuss two approaches: Bayesian and classical statistics.


1. Bayesian inference

Assuming an uninformed prior π(λ) = 1/λ and the Poisson likelihood function for observing α events in period t:


l(\alpha \mid \lambda,t)=\frac{e^{-\lambda t}(\lambda t)^\alpha}{\alpha !}\propto e^{-\lambda t}\lambda^\alpha


The proportionality is acceptable because we can drop terms that do not involve λ. Multiplying the prior by this likelihood then gives the posterior distribution:


p(\lambda \mid \alpha) \propto \pi(\lambda)\,l(\alpha \mid \lambda,t)=\frac{1}{\lambda}\,e^{-\lambda t}\lambda^\alpha = e^{-\lambda t}\lambda^{\alpha-1}


which by comparison with a Gamma density function is a Gamma(0,1/t,α) distribution. The Gamma distribution can also be used to describe our uncertainty about λ if we start off with an informed opinion and then observe α events in time t. If we can reasonably describe our prior belief with a Gamma(0,b,a) distribution, the posterior is given by a Gamma(0, b/ (1 + b t),a + α) distribution.
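The conjugate update above can be sketched numerically. The following is a minimal illustration (not part of the original page) using scipy, with hypothetical values for the observed events α, the observation period t, and the informed prior parameters a and b. Note that the page's Gamma(0, scale, shape) notation maps to scipy's `gamma(a=shape, scale=scale)` with location 0.

```python
# Sketch of the conjugate Gamma update for a Poisson rate lambda.
# All numeric values are hypothetical, chosen only for illustration.
from scipy import stats

alpha_obs = 12   # observed events (hypothetical)
t = 4.0          # observation period (hypothetical)

# Uninformed prior pi(lambda) = 1/lambda -> posterior Gamma(0, 1/t, alpha)
post_uninformed = stats.gamma(a=alpha_obs, scale=1.0 / t)
print(post_uninformed.mean())   # posterior mean = alpha/t = 3.0

# Informed prior Gamma(0, b, a) -> posterior Gamma(0, b/(1 + b*t), a + alpha)
a, b = 3.0, 2.0  # hypothetical prior belief
post_informed = stats.gamma(a=a + alpha_obs, scale=b / (1.0 + b * t))
print(post_informed.mean())     # posterior mean pulled slightly by the prior

# A 95% interval for lambda under the uninformed prior
print(post_uninformed.interval(0.95))
```

The posterior object can then be sampled (`post_uninformed.rvs(...)`) or queried for any percentile, which is how the uncertainty about λ would typically be propagated through a risk model.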


More difficult: the effect of the prior

The following paragraph is fairly difficult (but interesting) and is not totally necessary to understand the use of the Gamma distribution in determining the mean number of events per period (λ). The choice of π(λ) = 1/ λ (which is equivalent to a Gamma(0,z,1/z) distribution where z is extremely large) as an uninformed prior is an uncomfortable one for many. We can get a feel for the importance of the prior with the following train of thought:


  1. A π(λ) = 1/λ prior is equivalent to Gamma(0,z,1/z) where z approaches infinity. You can see this from the Gamma probability density function by letting the shape parameter (1/z) tend to zero and the scale parameter (z) tend to infinity.

  2.  A flat prior (the opposite extreme to the π(λ) = 1/ λ prior) would be equivalent to a Gamma(0,z,1), where z approaches infinity, i.e. an infinitely drawn out Exponential distribution.

  3. We have seen that for a Gamma(0,b,a) prior, the resultant posterior is Gamma(0, b/(1 + b t), a + α), which means that the posterior for 1. would be Gamma(0, 1/t, α) and for 2. would be Gamma(0, 1/t, α + 1).

  4. Thus, the sensitivity of the posterior to the prior amounts to whether (α+1) is approximately the same as α. Moreover, Gamma(0,β,α) is the sum of α independent Exponential distributions with mean β, so one can think of the choice of prior as whether or not we add one extra Exponential distribution to the α Exponential distributions contributed by the data. Thus, if α were 100, for example, the distribution would be roughly 1% influenced by the prior and 99% influenced by the data. In this model, with enough data, the information contained in the data always overpowers the prior.
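The insensitivity argued in step 4 is easy to check numerically. The sketch below (hypothetical numbers, not from the original page) compares the two limiting posteriors, Gamma(0, 1/t, α) for the π(λ) = 1/λ prior and Gamma(0, 1/t, α + 1) for the flat prior, when α = 100:

```python
# With ample data the two extreme priors give nearly identical posteriors.
# All numeric values are hypothetical.
from scipy import stats

alpha_obs = 100  # observed events (hypothetical)
t = 50.0         # observation period (hypothetical)

post_1_over_lambda = stats.gamma(a=alpha_obs, scale=1.0 / t)      # pi = 1/lambda
post_flat = stats.gamma(a=alpha_obs + 1, scale=1.0 / t)           # flat prior

# Posterior means differ by only 1/t, about 1% of the estimate here
print(post_1_over_lambda.mean())  # 2.0
print(post_flat.mean())           # 2.02
```

With only a handful of observed events, by contrast, the extra unit of shape would shift the posterior noticeably, which is why the choice of uninformed prior matters most when data are scarce.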


2. Classical statistics

Various classical statistics approaches to estimating λ are discussed here.



3. Comparison of classical and Bayesian methods

A comparison of the classical and Bayesian approaches to estimating λ is discussed here.






