# Gamma distribution

In probability theory and statistics, the gamma distribution is a two-parameter family of continuous probability distributions. For integer values of the shape parameter k it is also known as the Erlang distribution.


## Probability density function

The probability density function of the gamma distribution can be expressed in terms of the gamma function:

[itex] f(x|k,\theta) = x^{k-1} \frac{e^{-x/\theta}}{\theta^k \, \Gamma(k)}
\ \mathrm{for}\ x > 0 \,\![itex]


where [itex]k > 0[itex] is the shape parameter and [itex]\theta > 0[itex] is the scale parameter of the gamma distribution.

Alternatively, the gamma distribution can be parameterized in terms of a shape parameter [itex]\alpha = k[itex] and an inverse scale parameter [itex]\beta = 1/\theta[itex], called a rate parameter:

[itex] g(x|\alpha,\beta) = x^{\alpha-1} \frac{\beta^{\alpha} \, e^{-\beta\,x} }{\Gamma(\alpha)} \ \mathrm{for}\ x > 0 \,\![itex]

Both parameterizations are common; each is the more convenient form in different fields.
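As a quick illustration, both densities can be written down directly with Python's standard library (`math.gamma` is the Γ function); the function names here are illustrative, not from any particular package:

```python
import math

def gamma_pdf(x, k, theta):
    """Gamma density f(x|k, theta) in the shape/scale parameterization."""
    if x <= 0:
        return 0.0
    return x**(k - 1) * math.exp(-x / theta) / (theta**k * math.gamma(k))

def gamma_pdf_rate(x, alpha, beta):
    """The same density g(x|alpha, beta) in the shape/rate parameterization,
    obtained by substituting theta = 1/beta."""
    return gamma_pdf(x, alpha, 1.0 / beta)
```

For k = 1 the density reduces to the exponential density e^(-x/θ)/θ, which provides an easy spot check.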

## Properties

The cumulative distribution function can be expressed in terms of the incomplete gamma function,

[itex] F(x|k,\theta) = \int_0^x f(u|k,\theta)\,du
 = \frac{\gamma(k, x/\theta)}{\Gamma(k)} \,\![itex]
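The Python standard library has no incomplete-gamma routine (a real implementation would use something like `scipy.special.gammainc`), so a rough sketch is to approximate F by numerically integrating the density with the trapezoidal rule:

```python
import math

def gamma_pdf(x, k, theta):
    """Gamma density in the shape/scale parameterization."""
    if x <= 0:
        return 0.0
    return x**(k - 1) * math.exp(-x / theta) / (theta**k * math.gamma(k))

def gamma_cdf(x, k, theta, n=10_000):
    """F(x|k, theta) = gamma(k, x/theta) / Gamma(k), approximated here by
    trapezoidal integration of the density on (0, x).
    Note: for k < 1 the density is singular at 0, so this crude scheme
    is only reliable for k >= 1."""
    if x <= 0:
        return 0.0
    h = x / n
    total = 0.5 * (gamma_pdf(1e-12, k, theta) + gamma_pdf(x, k, theta))
    total += sum(gamma_pdf(i * h, k, theta) for i in range(1, n))
    return total * h
```

For k = 1, θ = 1 the CDF is 1 − e^(-x), and for k = 2, θ = 1 it is 1 − e^(-x)(1 + x), which gives two easy checks.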


The information entropy is given by:

[itex]S=k+\ln(\theta)+\ln(\Gamma(k))+(1-k)\psi(k)\,[itex]

where [itex]\psi(k)[itex] is the digamma function.
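As a sanity check (a sketch, not library code), the closed form S = k + ln θ + ln Γ(k) + (1 − k)ψ(k) can be compared against direct numerical integration of −∫ f ln f, with ψ approximated by a central difference of ln Γ since the standard library lacks a digamma function:

```python
import math

def gamma_pdf(x, k, theta):
    if x <= 0:
        return 0.0
    return x**(k - 1) * math.exp(-x / theta) / (theta**k * math.gamma(k))

def digamma(k, h=1e-5):
    # Central difference of log-gamma; no psi function in the stdlib.
    return (math.lgamma(k + h) - math.lgamma(k - h)) / (2 * h)

def entropy_formula(k, theta):
    """Closed-form differential entropy of Gamma(k, theta)."""
    return k + math.log(theta) + math.lgamma(k) + (1 - k) * digamma(k)

def entropy_numeric(k, theta, upper=60.0, n=200_000):
    """-integral of f ln f over (0, upper), simple trapezoidal sum.
    The integrand vanishes at both endpoints for k > 1."""
    h = upper / n
    total = 0.0
    for i in range(1, n):
        f = gamma_pdf(i * h, k, theta)
        if f > 0:
            total -= f * math.log(f) * h
    return total
```

The two values should agree to several decimal places for moderate k and θ.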

If [itex]X_i \sim \mathrm{Gamma}(\alpha_i, \beta)[itex] for [itex]i=1, 2, \ldots, N[itex] and [itex]\bar{\alpha} = \sum_{i=1}^N \alpha_i[itex] then

[itex]\left[ Y = \sum_{i=1}^N X_i \right] \sim \mathrm{Gamma} \left( \bar{\alpha}, \beta \right)[itex]

provided all [itex]X_i[itex] are independent. The gamma distribution exhibits infinite divisibility.

If [itex]X \sim \operatorname {Gamma} (k, \theta)[itex], then [itex]X/\theta \sim \operatorname {Gamma} (k, 1)[itex]. More generally, for any [itex]t > 0[itex] it holds that [itex]tX \sim \operatorname {Gamma} (k, t\theta)[itex]. This is what it means for [itex]\theta[itex] to be the scale parameter: rescaling the variable rescales [itex]\theta[itex] by the same factor (equivalently, the rate [itex]\beta = 1/\theta[itex] is divided by [itex]t[itex]).

## Parameter estimation

The likelihood function is

[itex]L=\prod_{i=1}^N f(x_i|k,\theta)[itex]

from which we calculate the log-likelihood function

[itex]\ell=(k-1)\sum_{i=1}^N\ln(x_i)-\frac{1}{\theta}\sum_{i=1}^N x_i-Nk\ln(\theta)-N\ln(\Gamma(k))[itex]

Finding the maximum with respect to [itex]\theta[itex] by taking the derivative and setting it equal to zero yields the maximum likelihood estimate of the [itex]\theta[itex] parameter:

[itex]\hat\theta=\frac{1}{kN}\sum_{i=1}^N x_i[itex]
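With the shape k known, the estimate is just the sample mean divided by k. A minimal sketch (the function name is illustrative):

```python
def theta_mle(xs, k):
    """Maximum likelihood estimate of the scale theta for known shape k:
    the sample mean divided by k."""
    return sum(xs) / (k * len(xs))
```

For example, the sample [1, 2, 3] with k = 2 has mean 2, giving the estimate 2/2 = 1.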

## Generating gamma random variables

Given the scaling property above, it is enough to generate gamma variables with [itex]\beta = 1[itex], since a draw can later be converted to any value of β by simple division.

Using the fact that if [itex]X \sim \operatorname {Gamma} (1, 1)[itex], then also [itex]X \sim \operatorname {Exponential} (1)[itex], and the method of generating exponential variables, we conclude that if U is uniformly distributed on (0, 1], then [itex]-\ln U \sim \operatorname {Gamma} (1, 1)[itex]. Now, using the "α-addition" property of gamma distribution, we expand this result:

[itex]\sum _{k=1} ^n {-\ln U_k} \sim \operatorname {Gamma} (n, 1)[itex],

where [itex]U_k[itex] are all uniformly distributed on (0, 1] and independent.
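A minimal sketch of this construction, using `1 - random()` so the uniform draw lies in (0, 1] and log(0) is avoided:

```python
import math
import random

def gamma_integer_shape(n):
    """Draw from Gamma(n, 1) for integer n as a sum of n Exponential(1)
    variables, each generated as -ln(U) with U uniform on (0, 1]."""
    return sum(-math.log(1.0 - random.random()) for _ in range(n))
```

Since a Gamma(n, 1) variable has mean n, the sample mean of many draws should settle near n.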

All that is left now is to generate a variable distributed as [itex]\operatorname {Gamma} (\delta, 1)[itex] for [itex]0 < \delta < 1[itex] and apply the "α-addition" property once more; this is the most difficult step.

We provide an algorithm without proof. It is an instance of the acceptance-rejection method:

1. Let m be 1.
2. Generate [itex]V_{2m - 1}[itex] and [itex]V_{2m}[itex] — independent uniformly distributed on (0, 1] variables.
3. If [itex]V_{2m - 1} \le v_0[itex], where [itex]v_0 = \frac e {e + \delta}[itex], then go to step 4, else go to step 5.
4. Let [itex]\xi_m = \left( \frac {V_{2m - 1}} {v_0} \right) ^{\frac 1 \delta}, \ \eta_m = V_{2m} \xi _m^ {\delta - 1}[itex]. Go to step 6.
5. Let [itex]\xi_m = 1 - \ln {\frac {V_{2m - 1} - v_0} {1 - v_0}}, \ \eta_m = V_{2m} e^{-\xi_m}[itex].
6. If [itex]\eta_m > \xi_m^{\delta - 1} e^{-\xi_m}[itex] then increment m and go to step 2.
7. Assume [itex]\xi = \xi_m[itex] to be the realization of [itex]\operatorname {Gamma} (\delta, 1)[itex].

Now, to summarize,

[itex]\frac 1 \beta \left( \xi - \sum _{k=1} ^{[\alpha]} {\ln U_k} \right) \sim \operatorname {Gamma} (\alpha, \beta)[itex],

where [itex][\alpha][itex] is the integer part of α, ξ has been generated using the algorithm above with [itex]\delta = \{\alpha\}[itex] (the fractional part of α), and the [itex]U_k[itex] and [itex]V_l[itex] are distributed as explained above and are all independent.
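Putting the pieces together, the whole procedure (the rejection sampler for the fractional part of the shape, exponential sums for the integer part, and a final division by the rate β) can be sketched as follows; the function names are illustrative:

```python
import math
import random

def gamma_fractional(delta):
    """Acceptance-rejection sampler for Gamma(delta, 1), 0 < delta < 1,
    following the numbered steps above."""
    v0 = math.e / (math.e + delta)
    while True:
        v1 = 1.0 - random.random()  # uniform on (0, 1]
        v2 = 1.0 - random.random()
        if v1 <= v0:
            xi = (v1 / v0) ** (1.0 / delta)
            eta = v2 * xi ** (delta - 1.0)
        else:
            xi = 1.0 - math.log((v1 - v0) / (1.0 - v0))
            eta = v2 * math.exp(-xi)
        if eta <= xi ** (delta - 1.0) * math.exp(-xi):
            return xi

def gamma_rv(alpha, beta=1.0):
    """Draw from Gamma(alpha, beta) with beta a rate parameter,
    combining the fractional-shape sampler with a sum of exponentials."""
    n = int(alpha)
    frac = alpha - n
    x = sum(-math.log(1.0 - random.random()) for _ in range(n))
    if frac > 0:
        x += gamma_fractional(frac)
    return x / beta
```

A Gamma(α, β) variable has mean α/β and variance α/β², so a Monte Carlo check on the sample moments is a reasonable smoke test.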

## Related distributions

• [itex]X \sim \mathrm{Exponential}(\theta)[itex] is an exponential distribution if [itex]X \sim \mathrm{Gamma}(1, \theta)[itex].
• [itex]Y \sim \mathrm{Gamma}(N, \theta)[itex] is a gamma distribution if [itex]Y = X_1 + \cdots + X_N[itex] and if the [itex]X_i \sim \mathrm{Exponential}(\theta)[itex] are all independent and share the same parameter [itex]\theta[itex].
• [itex]X \sim \chi^2(\nu)[itex] is a chi-square distribution if [itex]X \sim \mathrm{Gamma}(k=\nu/2, \theta = 2)[itex].
• If [itex]k[itex] is an integer, the gamma distribution is an Erlang distribution (so named in honor of A. K. Erlang) and is the probability distribution of the waiting time until the [itex]k^{th}[itex] "arrival" in a one-dimensional Poisson process with intensity [itex]1/\theta[itex].
• [itex]X \sim \mathrm{Gamma}(k, \theta)[itex] then [itex]Y \sim \mathrm{InvGamma}(k, \theta^{-1})[itex] if [itex]Y = 1/X[itex], where [itex]\mathrm{InvGamma}[itex] is the inverse-gamma distribution.
• [itex]Y = X_1/(X_1+X_2) \sim \mathrm{Beta}(\alpha_1, \alpha_2)[itex] is a beta distribution if [itex]X_1 \sim \mathrm{Gamma}(\alpha_1, \theta)[itex] and [itex]X_2 \sim \mathrm{Gamma}(\alpha_2, \theta)[itex] are independent and share the same scale parameter.
• If [itex]X \sim \mathrm{Gamma}(3/2, \theta)[itex], then [itex]Y = \sqrt{X}[itex] follows a Maxwell-Boltzmann distribution with scale parameter [itex]a = \sqrt{\theta/2}[itex].
• [itex]Y \sim N(\mu = \alpha \beta, \sigma^2 = \alpha \beta^2)[itex] is a normal distribution as [itex]Y = \lim_{\alpha \to \infty} X[itex] where [itex]X \sim \mathrm{Gamma}(\alpha, \beta)[itex].
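Two of the identities above, the exponential and chi-square special cases, are easy to confirm by comparing densities pointwise. A small sketch:

```python
import math

def gamma_pdf(x, k, theta):
    """Gamma density in the shape/scale parameterization."""
    if x <= 0:
        return 0.0
    return x**(k - 1) * math.exp(-x / theta) / (theta**k * math.gamma(k))

def exponential_pdf(x, theta):
    """Exponential density with scale (mean) theta."""
    return math.exp(-x / theta) / theta

def chi2_pdf(x, nu):
    """Chi-square density with nu degrees of freedom."""
    return x**(nu / 2 - 1) * math.exp(-x / 2) / (2**(nu / 2) * math.gamma(nu / 2))
```

At every point x > 0, Gamma(1, θ) matches Exponential(θ) and Gamma(ν/2, 2) matches χ²(ν).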

## References

• R. V. Hogg and A. T. Craig. Introduction to Mathematical Statistics, 4th edition. New York: Macmillan, 1978. (See Section 3.3.)
