
Motivation

If you have understood how the Poisson process works and are willing to accept that a Gamma distribution models the time to wait to observe α events, this section is superfluous to your needs. We explain here:

 


  • How the Exponential distribution models the time to wait for the first event and arises naturally out of a memoryless system;

  • And therefore why the distribution of the time to wait to observe an event remains the same even if one has waited a while;

  • How the Gamma distribution is the sum of a number of Exponential distributions, and thus is the waiting time distribution for α events.



Deriving the Exponential distribution

The Poisson process assumes that there is a constant probability that an event will occur per increment of time. If we consider a small element of time Δt, then the probability that an event will occur in that element of time is kΔt, where k is some constant. Now let P(t) be the probability that the event will not have occurred by time t. The probability that the event occurs for the first time during the small interval Δt after time t is then kΔt·P(t). This is also equal to P(t) − P(t+Δt), so we have:


LaTeX Math Block
alignmentleft
\bigg[\frac{P\big(t+\Delta t \big)-P\big(t\big)}{P\big(t\big)}\bigg]={-k\Delta t}

 


Making Δt infinitesimally small, this becomes the differential equation:



LaTeX Math Block
alignmentleft
\frac{dP(t)}{P(t)}=-kdt



Integrating, and using the boundary condition P(0) = 1 (the event certainly has not yet occurred at time zero), gives:



LaTeX Math Block
alignmentleft
\ln[P(t)]=-kt

and hence

LaTeX Math Block
alignmentleft
P(t)=\exp[-kt]
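As an optional illustrative check, the short sketch below solves the same differential equation symbolically with SymPy; the tool choice and the symbol names are simply convenient assumptions.

Code Block
languagepython
import sympy as sp

t, k = sp.symbols('t k', positive=True)
P = sp.Function('P')

# dP/dt = -k * P(t), with P(0) = 1 (no event has occurred at time zero)
ode = sp.Eq(P(t).diff(t), -k * P(t))
solution = sp.dsolve(ode, P(t), ics={P(0): 1})

print(solution)   # Eq(P(t), exp(-k*t)), matching the result above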

 


If we define F(t) as the probability that the event will have occurred before time t (i.e. F(t) = 1 − P(t), the cumulative distribution function for t), we then have:

 


LaTeX Math Block
alignmentleft
F(t)=1-\exp[-kt]

 


which is the cumulative distribution function for an Exponential distribution Exponential(k) with mean 1/k. Thus 1/k is the mean time between occurrences of events or, equivalently, k is the mean number of events per unit time, which is the Poisson parameter λ. The parameter 1/λ, the mean time between occurrences of events, is given the notation β.
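As a purely illustrative sketch, the following NumPy simulation of the constant-probability-per-increment process shows the waiting time to the first event behaving like an Exponential distribution with mean 1/λ = β; the rate λ = 2, the step Δt = 0.001, the seed and the sample size are all arbitrary choices.

Code Block
languagepython
import numpy as np

rng = np.random.default_rng(seed=1)

lam = 2.0          # assumed Poisson rate k = lambda (events per unit time)
dt = 1e-3          # small time increment Delta t
n_trials = 50_000  # number of simulated waiting times

# In each increment Delta t an event occurs with probability lam*dt,
# independently of the past; the waiting time to the first event is the
# number of increments until the first "success", times dt.
waits = rng.geometric(p=lam * dt, size=n_trials) * dt

print("simulated mean waiting time:", waits.mean())   # ~ 1/lam = beta = 0.5
print("theoretical mean 1/lambda:  ", 1 / lam)
print("simulated P(T > 1):   ", (waits > 1).mean())
print("theoretical exp(-lam):", np.exp(-lam))          # survival function at t = 1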

 

 



Derivation of the Gamma distribution

We have shown above that the time until occurrence of the first event in a Poisson process is given by:

 


t1 = Exponential(1/β)                         where β = 1/λ

 


From the mathematics of convolutions, for independent random variables X and Y we have:

 


LaTeX Math Block
alignmentleft
Z=X+Y \\ f_Z(z)= \int\limits_{-\infty}^\infty f_X(x)\, f_Y(z-x)\,dx

For X = Gamma(0,β,α) and Y = Exponential(1/β) this gives:

 


LaTeX Math Block
alignmentleft
f_Z(z)= \int\limits_{0}^z \frac{t^{\alpha-1}}{\beta^\alpha(\alpha-1)!}e^{-\frac{t}{\beta}}\,\frac{1}{\beta}e^{-\frac{(z-t)}{\beta}}\,dt \\
=\frac{1}{\beta^{\alpha+1}(\alpha-1)!}e^{-\frac{z}{\beta}} \int\limits_{0}^z t^{\alpha-1}\,dt =\frac{z^{\alpha}}{\beta^{\alpha+1}\,\alpha!}e^{-\frac{z}{\beta}}

 


This is equal to a Gamma(0,β,α+1) distribution.
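A quick numerical sanity check of this convolution is sketched below using SciPy; the values α = 3 and β = 0.5 are arbitrary, and scipy.stats parameterises the Gamma distribution by a shape a and a scale equal to β.

Code Block
languagepython
from scipy import stats
from scipy.integrate import quad

alpha, beta = 3, 0.5                                  # illustrative shape and scale

f_X = stats.gamma(a=alpha, scale=beta).pdf            # Gamma(0, beta, alpha) density
f_Y = stats.expon(scale=beta).pdf                     # Exponential with mean beta
f_target = stats.gamma(a=alpha + 1, scale=beta).pdf   # Gamma(0, beta, alpha + 1) density

def f_Z(z):
    # the convolution integral from the text: int_0^z f_X(t) f_Y(z - t) dt
    value, _ = quad(lambda t: f_X(t) * f_Y(z - t), 0, z)
    return value

for z in (0.5, 1.0, 2.0):
    print(z, round(f_Z(z), 6), round(f_target(z), 6))  # the two columns agree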

 


Since Gamma(0,β,1) = Y = Exponential(1/β), we have proven by induction that:

 


LaTeX Math Block
alignmentleft
Gamma(0,\beta,\alpha)=\displaystyle\sum_{\alpha}Exponential(1/\beta)
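This sum can also be checked by simulation; the sketch below uses NumPy and SciPy, with the arbitrary choices α = 4, β = 1.5, a fixed seed and 100,000 samples.

Code Block
languagepython
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=2)
alpha, beta = 4, 1.5        # illustrative number of events and mean time between events

# Sum alpha independent Exponential waiting times, each with mean beta
sums = rng.exponential(scale=beta, size=(100_000, alpha)).sum(axis=1)

print("simulated mean:  ", sums.mean())    # ~ alpha * beta
print("theoretical mean:", alpha * beta)

# Kolmogorov-Smirnov distance to the Gamma(0, beta, alpha) distribution is tiny
ks = stats.kstest(sums, stats.gamma(a=alpha, scale=beta).cdf)
print("KS statistic:", ks.statistic)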


Finally, consider the memoryless property noted above. The probability density for the first event occurring at time x, given that it has not yet occurred by time t (x > t), is:


LaTeX Math Block
alignmentleft
f(x \mid x>t)= \frac{f(x)}{1-F(t)}=\frac{1}{\beta}\exp\bigg[-\frac{(x-t)}{\beta}\bigg]

 


which is another Exponential distribution. Thus, although the event may not have occurred by time t, the remaining time until it occurs has the same probability distribution as it had at any prior point in time.
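The memoryless property can likewise be illustrated by simulation; in the sketch below (using NumPy) the mean β = 2, the conditioning time t = 1.5, the seed and the sample size are arbitrary choices.

Code Block
languagepython
import numpy as np

rng = np.random.default_rng(seed=3)
beta = 2.0                              # illustrative mean waiting time
x = rng.exponential(scale=beta, size=500_000)

t = 1.5                                 # suppose we have already waited until time t
remaining = x[x > t] - t                # remaining waiting time, given no event by time t

print("mean of all waiting times:      ", x.mean())          # ~ beta
print("mean of remaining waiting times:", remaining.mean())  # ~ beta as well
print("P(X > 1)            :", (x > 1).mean())
print("P(X - t > 1 | X > t):", (remaining > 1).mean())       # ~ the same probability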