#### Motivation

If you have understood how the Poisson process works and are willing to accept that a Gamma distribution models the time to wait to observe *α* events, this section is superfluous to your needs. We explain here:

- How the Exponential distribution models the time to wait for the first event and arises naturally out of a memoryless system;
- And therefore why the distribution of the time to wait to observe an event remains the same even if one has waited a while; and
- How the Gamma distribution is the sum of a number of Exponential distributions, and thus is the waiting time distribution for *α* events.

#### Deriving the Exponential distribution

The Poisson process assumes that there is a constant probability that an event will occur per increment of time. If we consider a small element of time *Δt*, then the probability that an event occurs in that element of time is *kΔt*, where *k* is some constant. Now let *P(t)* be the probability that the event will not have occurred by time *t*. The probability that the event occurs for the first time during the small interval *Δt* after time *t* is then *kΔtP(t)*. This is also equal to *P(t) - P(t+Δt)*, and we have:

$$\bigg[\frac{P\big(t+\Delta t \big)-P\big(t\big)}{P\big(t\big)}\bigg]={-k\Delta t}$$

Making *Δt* infinitesimally small, this becomes the differential equation:

$$\frac{dP(t)}{P(t)}=-k\,dt$$

Integration, with the condition *P(0)* = 1 (the event cannot have occurred before time zero, so the constant of integration vanishes), gives:

$$\ln[P(t)]=-kt$$

$$P(t)=\exp[-kt]$$
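The limiting step above can be illustrated numerically. The following is a minimal sketch (the values of *k* and *t* are arbitrary illustrative choices, not from the text): if each small step *Δt* carries event probability *kΔt*, the chance of surviving *t*/*Δt* steps with no event is (1 - *kΔt*)^(*t*/*Δt*), which approaches exp(-*kt*) as *Δt* shrinks.

```python
import math

# Discrete survival probability: (1 - k*dt)**n with n = t/dt steps.
# As dt -> 0 this converges to the continuous result P(t) = exp(-k*t).
# k = 2.0 and t = 1.5 are illustrative values only.
k, t = 2.0, 1.5
for dt in (0.1, 0.01, 0.001):
    n = int(round(t / dt))
    p_discrete = (1.0 - k * dt) ** n
    print(f"dt={dt}: discrete={p_discrete:.6f}, exp(-kt)={math.exp(-k * t):.6f}")
```

Shrinking *dt* by a factor of ten shrinks the gap between the two columns by roughly the same factor.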

If we define *F(t)* as the probability that the event will have occurred before time *t* (i.e. 1 - *P(t)*, the cumulative distribution function for *t*), we then have:

$$F(t)=1-\exp[-kt]$$

which is the cumulative distribution function for an Exponential(*k*) distribution with mean 1/*k*. Thus 1/*k* is the mean time between occurrences of events or, equivalently, *k* is the mean number of events per unit time, which is the Poisson parameter *λ*. The parameter 1/*λ*, the mean time between occurrences of events, is given the notation *β*.
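The result can be checked by simulation. This is a Monte-Carlo sketch (the values of *k* and *t* and the sample size are illustrative choices, not from the text): waiting times drawn from Exponential(*k*) should have mean 1/*k*, and the fraction falling below *t* should match *F(t)* = 1 - exp(-*kt*).

```python
import math
import random

# Draw Exponential(k) waiting times and compare the empirical mean and
# empirical CDF at t against the theoretical values 1/k and 1 - exp(-k*t).
random.seed(42)
k, t, n = 2.0, 0.5, 200_000
samples = [random.expovariate(k) for _ in range(n)]  # expovariate takes the rate k

mean_wait = sum(samples) / n
frac_before_t = sum(s < t for s in samples) / n
print(mean_wait, 1 / k)                      # empirical vs theoretical mean
print(frac_before_t, 1 - math.exp(-k * t))   # empirical vs theoretical F(t)
```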

#### Derivation of the Gamma distribution

We have shown above that the time until occurrence of the first event in a Poisson process is given by:

*t*₁ = Exponential(1/*β*), where *β* = 1/*λ*

From the mathematics of convolutions we have:

$$Z=X+Y \\ f_Z(z)= \int\limits_{-\infty}^{\infty} f_X(x)\, f_Y(z-x)\,dx$$

For X = Gamma(0, *β*, *α*) and Y = Exponential(1/*β*) this gives:

$$f_Z(z)= \int\limits_{0}^{z} \frac{t^{\alpha-1}}{\beta^{\alpha}(\alpha-1)!}e^{-\frac{t}{\beta}}\,\frac{1}{\beta}e^{-\frac{z-t}{\beta}}\,dt \\ =\frac{1}{\beta^{\alpha+1}(\alpha-1)!}e^{-\frac{z}{\beta}}\int\limits_{0}^{z} t^{\alpha-1}\,dt \\ =\frac{z^{\alpha}}{\beta^{\alpha+1}\,\alpha!}e^{-\frac{z}{\beta}}$$

This is equal to a Gamma(0, *β*, *α*+1) distribution.

Since Gamma(0, *β*, 1) = Y = Exponential(1/*β*), we have proven by induction that:

$$\mathrm{Gamma}(0,\beta,\alpha)=\displaystyle\sum_{\alpha}\mathrm{Exponential}(1/\beta)$$
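The induction result can be checked by simulation. This is a sketch under illustrative assumptions (*α* = 3, *β* = 2 and the sample size are arbitrary choices): summing *α* independent Exponential(1/*β*) waiting times, each with mean *β*, should behave like a Gamma(0, *β*, *α*) variable with mean *αβ* and variance *αβ*².

```python
import random

# Sum alpha independent Exponential(1/beta) waiting times (expovariate
# takes the rate, here 1/beta, so each draw has mean beta) and compare
# the empirical mean and variance to the Gamma values alpha*beta and
# alpha*beta**2.
random.seed(0)
alpha, beta, n = 3, 2.0, 100_000
sums = [sum(random.expovariate(1.0 / beta) for _ in range(alpha))
        for _ in range(n)]

mean_sum = sum(sums) / n
var_sum = sum((s - mean_sum) ** 2 for s in sums) / n
print(mean_sum, alpha * beta)        # empirical vs theoretical mean
print(var_sum, alpha * beta ** 2)    # empirical vs theoretical variance
```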

The probability that the first event will occur at time *x*, given it has not yet occurred by time *t* (*x*>*t*), is given by:

$$f(x \mid x>t)= \frac{f(x)}{1-F(t)}=\frac{1}{\beta}\exp \bigg[ -\frac{(x-t)}{\beta}\bigg]$$

which is another Exponential distribution. Thus, although the event may not have occurred after time *t*, the remaining time until it will occur has the same probability distribution as it had at any prior point in time.
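The memoryless property can also be seen empirically. This is a sketch under illustrative assumptions (*β* = 1, *t* = 0.7 and the sample size are arbitrary choices): among Exponential(1/*β*) waiting times that exceed *t*, the remaining wait *x* - *t* is again Exponential(1/*β*), so its mean stays at *β* no matter how long one has already waited.

```python
import random

# Condition Exponential(1/beta) samples on exceeding t and check that
# the mean excess x - t is still beta (the unconditional mean).
random.seed(1)
beta, t, n = 1.0, 0.7, 200_000
samples = [random.expovariate(1.0 / beta) for _ in range(n)]

excess = [x - t for x in samples if x > t]
mean_excess = sum(excess) / len(excess)
print(mean_excess, beta)   # conditional mean stays near beta for any t
```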