Generalized inverse Gaussian distribution
Probability density function (plot not shown)
Parameters | $a > 0$, $b > 0$, $p$ real |
---|---|
Support | $x > 0$ |
PDF | $f(x) = \dfrac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})}\, x^{p-1} e^{-(ax + b/x)/2}$ |
Mean | $\dfrac{\sqrt{b}\, K_{p+1}(\sqrt{ab})}{\sqrt{a}\, K_p(\sqrt{ab})}$ |
Mode | $\dfrac{(p-1) + \sqrt{(p-1)^2 + ab}}{a}$ |
Variance | $\dfrac{b}{a}\left[\dfrac{K_{p+2}(\sqrt{ab})}{K_p(\sqrt{ab})} - \left(\dfrac{K_{p+1}(\sqrt{ab})}{K_p(\sqrt{ab})}\right)^{2}\right]$ |
MGF | $\left(\dfrac{a}{a-2t}\right)^{p/2} \dfrac{K_p\!\left(\sqrt{b(a-2t)}\right)}{K_p(\sqrt{ab})},\quad t < a/2$ |
CF | $\left(\dfrac{a}{a-2it}\right)^{p/2} \dfrac{K_p\!\left(\sqrt{b(a-2it)}\right)}{K_p(\sqrt{ab})}$ |
In probability theory and statistics, the generalized inverse Gaussian distribution (GIG) is a three-parameter family of continuous probability distributions with probability density function

$$f(x) = \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})}\, x^{p-1}\, e^{-(ax + b/x)/2}, \qquad x > 0,$$

where $K_p$ is a modified Bessel function of the second kind, $a > 0$, $b > 0$ and $p$ is a real parameter. It is used extensively in geostatistics, statistical linguistics, finance, and other fields. This distribution was first proposed by Étienne Halphen.[1][2][3] It was rediscovered and popularised by Ole Barndorff-Nielsen, who called it the generalized inverse Gaussian distribution. It is also known as the Sichel distribution, after Herbert Sichel.[4] Its statistical properties are discussed in Bent Jørgensen's lecture notes.[5]
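As a numerical illustration (not part of the original article), the sketch below implements this density directly and checks it against SciPy's `geninvgauss` distribution. The parameter values are arbitrary, and the comparison assumes the usual correspondence between the $(a, b, p)$ convention here and SciPy's shape/scale parameterization, namely shape $\sqrt{ab}$ and scale $\sqrt{b/a}$:

```python
import numpy as np
from scipy.special import kv          # modified Bessel function of the second kind, K_p
from scipy.stats import geninvgauss   # SciPy's generalized inverse Gaussian

def gig_pdf(x, a, b, p):
    """Density of GIG(a, b, p) as defined above, for x > 0."""
    omega = np.sqrt(a * b)
    norm = (a / b) ** (p / 2) / (2 * kv(p, omega))
    return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

a, b, p = 2.0, 3.0, 0.5               # arbitrary illustrative values
x = np.linspace(0.1, 5.0, 50)

# SciPy parameterizes the density as x^(p-1) exp(-b_s (x + 1/x) / 2) up to scale,
# so (a, b, p) here maps to shape b_s = sqrt(a*b) and scale = sqrt(b/a).
scipy_pdf = geninvgauss.pdf(x, p, np.sqrt(a * b), scale=np.sqrt(b / a))

assert np.allclose(gig_pdf(x, a, b, p), scipy_pdf)
```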
Properties
Summation
Barndorff-Nielsen and Halgreen proved that the GIG distribution is infinitely divisible.[6]
Entropy
The entropy of the generalized inverse Gaussian distribution is given as[citation needed]

$$H = \frac{1}{2}\log\!\left(\frac{b}{a}\right) + \log\!\left(2 K_p(\sqrt{ab})\right) - (p-1)\,\frac{\left[\frac{d}{d\nu}K_\nu(\sqrt{ab})\right]_{\nu=p}}{K_p(\sqrt{ab})} + \frac{\sqrt{ab}}{2 K_p(\sqrt{ab})}\left(K_{p+1}(\sqrt{ab}) + K_{p-1}(\sqrt{ab})\right)$$

where $\left[\frac{d}{d\nu}K_\nu(\sqrt{ab})\right]_{\nu=p}$ is the derivative of the modified Bessel function of the second kind with respect to its order $\nu$, evaluated at $\nu = p$.
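As a sanity check on this closed form (an illustrative sketch with arbitrary parameter values, not taken from the cited sources), it can be compared against direct numerical integration of $-\int_0^\infty f(x)\log f(x)\,dx$, approximating the order derivative of $K_\nu$ by a central finite difference:

```python
import numpy as np
from scipy.special import kv
from scipy.integrate import quad

a, b, p = 2.0, 3.0, 0.5          # arbitrary illustrative parameter values
omega = np.sqrt(a * b)
Kp = kv(p, omega)

# Closed-form entropy, with the order derivative of K_nu(omega) at nu = p
# approximated by a central finite difference.
h = 1e-5
dK_dnu = (kv(p + h, omega) - kv(p - h, omega)) / (2 * h)
H_closed = (0.5 * np.log(b / a) + np.log(2 * Kp)
            - (p - 1) * dK_dnu / Kp
            + omega * (kv(p + 1, omega) + kv(p - 1, omega)) / (2 * Kp))

# Numerical entropy: -int f(x) log f(x) dx over x > 0, computed via the log-density
# to avoid overflow in the tails.
def logpdf(x):
    return (p / 2) * np.log(a / b) - np.log(2 * Kp) + (p - 1) * np.log(x) - (a * x + b / x) / 2

H_numeric, _ = quad(lambda x: -np.exp(logpdf(x)) * logpdf(x), 0, np.inf)
print(H_closed, H_numeric)       # the two values should agree to several decimal places
```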
Differential equation
The pdf of the generalized inverse Gaussian distribution is a solution to the following differential equation:

$$2x^2 f'(x) + \left(a x^2 - 2(p-1)x - b\right) f(x) = 0.$$
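A short symbolic check (illustrative only) confirms that the unnormalized density $x^{p-1}e^{-(ax+b/x)/2}$ satisfies this equation; the normalizing constant cancels:

```python
import sympy as sp

x, a, b = sp.symbols('x a b', positive=True)
p = sp.symbols('p', real=True)

# Unnormalized GIG density; the normalizing constant cancels in the ODE.
f = x ** (p - 1) * sp.exp(-(a * x + b / x) / 2)

residual = 2 * x**2 * sp.diff(f, x) + (a * x**2 - 2 * (p - 1) * x - b) * f
assert sp.simplify(residual) == 0
```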
Related distributions
Special cases
The inverse Gaussian and gamma distributions are special cases of the generalized inverse Gaussian distribution for $p = -1/2$ and $b = 0$, respectively.[7] Specifically, an inverse Gaussian distribution of the form

$$f(x; \mu, \lambda) = \left(\frac{\lambda}{2\pi x^3}\right)^{1/2} \exp\!\left(\frac{-\lambda(x-\mu)^2}{2\mu^2 x}\right)$$

is a GIG with $a = \lambda/\mu^2$, $b = \lambda$, and $p = -1/2$. A gamma distribution of the form

$$g(x; \alpha, \beta) = \frac{\beta^\alpha}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\beta x}$$

is a GIG with $a = 2\beta$, $b = 0$, and $p = \alpha$.
Other special cases include the inverse-gamma distribution, for a=0, and the hyperbolic distribution, for p=0.[7]
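As a quick numerical check of the inverse Gaussian and gamma reductions above (an illustrative sketch with arbitrary parameter values, re-implementing the GIG density for self-containment), the following compares SciPy's standard densities against the GIG density, using a very small $b$ to approximate the $b \to 0$ gamma limit:

```python
import numpy as np
from scipy.special import kv
from scipy.stats import invgauss, gamma

def gig_pdf(x, a, b, p):
    omega = np.sqrt(a * b)
    return (a / b) ** (p / 2) / (2 * kv(p, omega)) * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

x = np.linspace(0.2, 4.0, 40)

# Inverse Gaussian(mu, lam) equals GIG(a = lam/mu^2, b = lam, p = -1/2) exactly.
mu, lam = 1.5, 2.0
ig_pdf = invgauss.pdf(x, mu / lam, scale=lam)   # SciPy's invgauss(mu/lam, scale=lam) is IG(mu, lam)
assert np.allclose(ig_pdf, gig_pdf(x, lam / mu**2, lam, -0.5))

# Gamma(shape alpha, rate beta) is the b -> 0 limit of GIG(a = 2*beta, b, p = alpha);
# a very small b gives a close numerical approximation.
alpha, beta = 2.5, 1.3
assert np.allclose(gamma.pdf(x, alpha, scale=1 / beta), gig_pdf(x, 2 * beta, 1e-12, alpha))
```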
Conjugate prior for Gaussian
The GIG distribution is conjugate to the normal distribution when serving as the mixing distribution in a normal variance-mean mixture.[8][9] Let the prior distribution for some hidden variable, say $z$, be GIG:

$$P(z \mid a, b, p) = \operatorname{GIG}(z \mid a, b, p),$$

and let there be $T$ observed data points, $X = x_1, \ldots, x_T$, with normal likelihood function, conditioned on $z$:

$$P(X \mid z, \alpha, \beta) = \prod_{i=1}^{T} N(x_i \mid \alpha + \beta z, z),$$

where $N(x \mid \mu, v)$ is the normal distribution with mean $\mu$ and variance $v$. Then the posterior for $z$, given the data, is also GIG:

$$P(z \mid X, a, b, p, \alpha, \beta) = \operatorname{GIG}\!\left(z \mid a + T\beta^2,\; b + S,\; p - \tfrac{T}{2}\right),$$

where $S = \sum_{i=1}^{T}(x_i - \alpha)^2$.
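To make the update concrete, the following sketch (with a hypothetical helper name `gig_posterior` and arbitrary example data) computes the posterior parameters from the prior and the observations:

```python
import numpy as np

def gig_posterior(x, a, b, p, alpha, beta):
    """Posterior GIG parameters for z given data x, a GIG(a, b, p) prior,
    and the likelihood prod_i N(x_i | alpha + beta * z, z)."""
    x = np.asarray(x, dtype=float)
    T = x.size
    S = np.sum((x - alpha) ** 2)
    return a + T * beta ** 2, b + S, p - T / 2

# Example: update a GIG(1, 1, 1/2) prior with four observations.
x_obs = [0.8, 1.4, 2.1, 0.5]
print(gig_posterior(x_obs, a=1.0, b=1.0, p=0.5, alpha=0.2, beta=1.0))
```

The conjugacy holds because, viewed as functions of $z$, both the prior and the likelihood are proportional to a power of $z$ times $e^{-(c_1 z + c_2/z)/2}$, so their product retains the GIG form.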
References
- ↑ (citation details not recoverable)
- ↑ (citation details not recoverable)
- ↑ Étienne Halphen was the uncle of the mathematician Georges Henri Halphen.
- ↑ Sichel, H. S. (1973). "Statistical valuation of diamondiferous deposits". Journal of the South African Institute of Mining and Metallurgy.
- ↑ Jørgensen, Bent (1982). Statistical Properties of the Generalized Inverse Gaussian Distribution. Lecture Notes in Statistics. Vol. 9. New York: Springer-Verlag.
- ↑ Barndorff-Nielsen, O.; Halgreen, C. (1977). "Infinite Divisibility of the Hyperbolic and Generalized Inverse Gaussian Distributions". Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete.
- ↑ (citation details not recoverable)
- ↑ Karlis, Dimitris (2002). "An EM type algorithm for maximum likelihood estimation of the normal–inverse Gaussian distribution". Statistics & Probability Letters. 57: 43–52.
- ↑ Barndorff-Nielsen, O. E. (1997). "Normal Inverse Gaussian Distributions and Stochastic Volatility Modelling". Scandinavian Journal of Statistics. 24: 1–13.