Inverse Gaussian distribution

July 8, 2024 — July 9, 2024

Lévy processes
probability
stochastic processes
time series

Placeholder notes on the Inverse Gaussian distribution, a tractable exponential-family distribution for non-negative random variables.

tl;dr

pdf
\(\sqrt{\frac{\lambda}{2 \pi x^3}} \exp\left[-\frac{\lambda (x-\mu)^2}{2 \mu^2 x}\right]\)
mean
\(\operatorname{E}[X] = \mu\)
variance
\(\operatorname{Var}[X] = \frac{\mu^3}{\lambda}\)
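
For numerical sanity checks, note that scipy's `invgauss` uses a single shape parameter rather than \((\mu, \lambda)\). A minimal sketch (assuming numpy and scipy are available) of the mapping and of the moments above:

```python
import numpy as np
from scipy import stats

mu, lam = 2.0, 5.0  # arbitrary example values

# scipy.stats.invgauss(m, scale=s) has mean m * s and variance m**3 * s**2;
# taking m = mu / lam and s = lam recovers the (mu, lambda) parameterization above.
ig = stats.invgauss(mu / lam, scale=lam)

print(ig.mean())  # == mu
print(ig.var())   # == mu**3 / lam

# the density matches the pdf in the tl;dr table
x = 1.3
pdf_by_hand = np.sqrt(lam / (2 * np.pi * x**3)) * np.exp(
    -lam * (x - mu) ** 2 / (2 * mu**2 * x)
)
print(np.isclose(ig.pdf(x), pdf_by_hand))  # True
```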

Being non-negative and infinitely divisible, it also induces a Lévy subordinator, the inverse Gaussian process.
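
A minimal sketch of simulating paths of that subordinator. It leans on the convolution identity that if \(X_1 \sim \operatorname{IG}(\mu, \lambda)\), then the increment over a time step \(\Delta t\) is \(\operatorname{IG}(\mu \Delta t, \lambda \Delta t^2)\); the helper name is my own.

```python
import numpy as np
from scipy import stats

def ig_subordinator(mu, lam, t_max=1.0, n_steps=1000, rng=None):
    """Cumulative path of an inverse Gaussian subordinator with X_1 ~ IG(mu, lam)."""
    rng = np.random.default_rng(rng)
    dt = t_max / n_steps
    # Increment over dt is IG(mu * dt, lam * dt**2); map onto scipy's parameterization
    # via shape = mean / scale = mu / (lam * dt).
    incr = stats.invgauss.rvs(
        (mu * dt) / (lam * dt**2), scale=lam * dt**2, size=n_steps, random_state=rng
    )
    return np.concatenate([[0.0], np.cumsum(incr)])

path = ig_subordinator(mu=1.0, lam=2.0, rng=0)
print(path[-1])  # the value at t=1 is IG(1.0, 2.0)-distributed across runs
```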

1 Conjugate prior

Banerjee and Bhattacharyya (1979) present a reasonably nice conjugate prior, albeit with an alternative parameterization of the distribution.

Write the IG pdf as

\[
f(x \mid \psi, \lambda)=\left(\frac{\lambda}{2 \pi}\right)^{1 / 2} x^{-3 / 2} \exp \left\{-\frac{\lambda x}{2}\left(\psi-\frac{1}{x}\right)^{2}\right\}, \quad x>0,\ \psi>0,\ \lambda>0,
\]
so that \(\psi = 1/\mu\) in terms of the parameterization above. The likelihood of a random sample \(\mathbf{x}=\left(x_1, \dots, x_n\right)\) from \(\operatorname{IG}(\psi, \lambda)\) is
\[
l(\psi, \lambda \mid \mathbf{x}) \propto \exp \left\{-\frac{n u}{2}\left[1+\frac{\bar{x}}{u}\left(\psi-\frac{1}{\bar{x}}\right)^{2}\right] \lambda\right\} \lambda^{n / 2},
\]
where
\[
\bar{x}=\frac{1}{n}\sum x_i \quad \text{and} \quad \bar{x}_r=\frac{1}{n} \sum \frac{1}{x_i}
\]
are respectively the sample mean of the observations and that of their reciprocals, and \(u=\bar{x}_r-1 / \bar{x}\).
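
The likelihood depends on the data only through \(\bar{x}\) and \(\bar{x}_r\), which suggests a small sketch for evaluating it (function names are my own, not the paper's):

```python
import numpy as np

def ig_suff_stats(x):
    """Sufficient statistics for the IG(psi, lambda) likelihood above."""
    x = np.asarray(x, dtype=float)
    xbar = x.mean()            # sample mean
    xbar_r = (1.0 / x).mean()  # mean of reciprocals
    u = xbar_r - 1.0 / xbar
    return xbar, xbar_r, u

def ig_log_likelihood(psi, lam, x):
    """Log of l(psi, lambda | x), up to an additive constant in the data."""
    n = len(x)
    xbar, _, u = ig_suff_stats(x)
    return (n / 2) * np.log(lam) - (n * u / 2) * (
        1 + (xbar / u) * (psi - 1 / xbar) ** 2
    ) * lam
```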

Their major result is as follows

[…] a bivariate natural conjugate family for \((\psi, \lambda)\) can be taken as
\[
p_c(\psi, \lambda)=K_1 \exp \left\{-\frac{r^{\prime} \alpha^{\prime}}{2}\left[1+\frac{\beta^{\prime}}{\alpha^{\prime}}\left(\psi-\frac{1}{\beta^{\prime}}\right)^{2}\right] \lambda\right\} \lambda^{r^{\prime} / 2-1}, \quad \psi>0,\ \lambda>0,
\]
where \(r^{\prime}>1, \alpha^{\prime}>0, \beta^{\prime}>0\) are parameters and the constant \(K_1\) is given by
\[
K_1=\frac{\left(\frac{\beta^{\prime}}{\alpha^{\prime}}\right)^{1 / 2}\left(\frac{r^{\prime} \alpha^{\prime}}{2}\right)^{r^{\prime} / 2}}{H_{\nu^{\prime}}\left(\xi^{\prime}\right)\, B\left(\frac{\nu^{\prime}}{2}, \frac{1}{2}\right) \Gamma\left(\frac{r^{\prime}}{2}\right)}, \qquad \nu^{\prime}=r^{\prime}-1, \quad \xi^{\prime}=\left(\frac{\nu^{\prime}}{\alpha^{\prime} \beta^{\prime}}\right)^{1 / 2}.
\]
[…]
\[
n u+r^{\prime} \alpha^{\prime}+n \bar{x}\left(\psi-\frac{1}{\bar{x}}\right)^{2}+r^{\prime} \beta^{\prime}\left(\psi-\frac{1}{\beta^{\prime}}\right)^{2}=r^{\prime \prime} \alpha^{\prime \prime}+r^{\prime \prime} \beta^{\prime \prime}\left(\psi-\frac{1}{\beta^{\prime \prime}}\right)^{2}.
\]

Hence the joint posterior pdf of \(\psi\) and \(\lambda\) can be reduced to the form
\[
p_c(\psi, \lambda \mid \mathbf{x}) \propto \exp \left\{-\frac{r^{\prime \prime} \alpha^{\prime \prime}}{2}\left[1+\frac{\beta^{\prime \prime}}{\alpha^{\prime \prime}}\left(\psi-\frac{1}{\beta^{\prime \prime}}\right)^{2}\right] \lambda\right\} \lambda^{r^{\prime \prime} / 2-1}
\]
[…] the marginal posterior distribution of \(\lambda\) is the modified gamma \(G^*\left(r^{\prime \prime} \alpha^{\prime \prime} / 2, \nu^{\prime \prime} / 2, r^{\prime \prime} / \beta^{\prime \prime}\right)\), and the marginal posterior pdf of \(\psi\) is the truncated \(t\) distribution \(t_d\left(1 / \beta^{\prime \prime}, q^{\prime \prime}, \nu^{\prime \prime}\right)\) with \(q^{\prime \prime}=\left[\alpha^{\prime \prime} /\left(\nu^{\prime \prime} \beta^{\prime \prime}\right)\right]^{1 / 2}\).
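
Matching coefficients of \(\psi\) in the identity quoted above gives an explicit hyperparameter update: \(r^{\prime\prime} = r^{\prime} + n\), \(r^{\prime\prime}\beta^{\prime\prime} = n\bar{x} + r^{\prime}\beta^{\prime}\), and \(r^{\prime\prime}\alpha^{\prime\prime} = n\bar{x}_r + r^{\prime}\alpha^{\prime} + r^{\prime}/\beta^{\prime} - r^{\prime\prime}/\beta^{\prime\prime}\). A sketch of that update (my own algebra, worth checking against the paper):

```python
import numpy as np

def ig_conjugate_update(x, r_prime, alpha_prime, beta_prime):
    """Posterior hyperparameters (r'', alpha'', beta'') for the conjugate prior above,
    obtained by matching coefficients of psi in the quoted identity."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    xbar_r = (1.0 / x).mean()
    # coefficient of psi:    r'' = r' + n
    r_post = r_prime + n
    # coefficient of psi^2:  r'' * beta'' = n * xbar + r' * beta'
    beta_post = (n * xbar + r_prime * beta_prime) / r_post
    # constant term:         r'' * alpha'' + r''/beta''
    #                        = n*u + n/xbar + r'*alpha' + r'/beta'
    #                        = n*xbar_r + r'*alpha' + r'/beta'
    alpha_post = (
        n * xbar_r + r_prime * alpha_prime + r_prime / beta_prime - r_post / beta_post
    ) / r_post
    return r_post, alpha_post, beta_post

# hypothetical data and prior settings, just to exercise the update
print(ig_conjugate_update([1.2, 0.8, 2.3, 1.7], r_prime=2.0, alpha_prime=1.0, beta_prime=1.0))
```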

The modified gamma is derived from this guy:

\[
p(\lambda \mid \mathbf{x})=\frac{(n u / 2)^{\nu / 2}}{\Gamma(\nu / 2)} \frac{\Phi\left((n \lambda / \bar{x})^{1 / 2}\right)}{H_{\nu}(\xi)} \exp \left(-\frac{n u}{2} \lambda\right) \lambda^{\nu / 2-1}, \quad \lambda>0.
\]

This looks … somewhat tedious, but basically feasible I suppose.

2 References

Banerjee, and Bhattacharyya. 1979. “Bayesian Results for the Inverse Gaussian Distribution with an Application.” Technometrics.
Joe, Seshadri, and Arnold. 2012. “Multivariate Inverse Gaussian and Skew-Normal Densities.” Statistics & Probability Letters.
Minami. 2003. “A Multivariate Extension of Inverse Gaussian Distribution Derived from Inverse Relationship.” Communications in Statistics - Theory and Methods.
———. 2007. “Multivariate Inverse Gaussian Distribution as a Limit of Multivariate Waiting Time Distributions.” Journal of Statistical Planning and Inference, Special Issue: In Celebration of the Centennial of The Birth of Samarendra Nath Roy (1906-1964).
Seshadri. 1993. The Inverse Gaussian Distribution: A Case Study in Exponential Families. Oxford Science Publications.
———. 2012. The Inverse Gaussian Distribution: Statistical Theory and Applications.