Generalized linear models

March 24, 2016 — August 5, 2021

machine learning
optimization
regression
statistics

Using the machinery of linear regression to predict in somewhat more general settings, via least-squares or quasi-likelihood approaches. You are still doing something like familiar linear regression, but outside the classic setting of a linear response with (possibly homoskedastic) Gaussian noise.

1 TODO

Discover the magical powers of log-concavity and what they enable.

2 Classic linear models

See linear models.

3 Generalised linear models

The original extension. Kenneth Tay’s explanation is simple and efficient.

To learn:

  • When can we do this? e.g. Must the response be from an exponential family for really real? What happens if not?
  • Does anything funky happen with regularisation? What?
  • Model selection theory
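
For concreteness, here is a minimal sketch of a GLM fit in Python using statsmodels (my library choice, not anything canonical from the references) on synthetic data: a Poisson response whose log-mean is linear in the covariate, estimated by iteratively reweighted least squares.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
# Synthetic Poisson response whose log-mean is linear in x
y = rng.poisson(np.exp(0.3 + 0.7 * x))

X = sm.add_constant(x)                              # linear predictor: intercept + x
model = sm.GLM(y, X, family=sm.families.Poisson())  # log link is the Poisson default
result = model.fit()                                # iteratively reweighted least squares
print(result.summary())
```

Swapping the `family` argument (Binomial, Gamma, Gaussian, …) and the link function changes the model; the fitting machinery stays the same.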

3.1 Response distribution

🏗 What constraints do we have here?

3.2 Linear predictor

🏗

3.4 Quasi-likelihood

A generalisation of likelihood used in some tricky corners of GLMs. Wedderburn (1974) introduced it to provide a unified rationale for GLM estimation. I don’t yet understand it. Heyde (1997) says:

Historically there are two principal themes in statistical parameter estimation theory […]

It is now possible to unify these approaches under the general description of quasi-likelihood and to develop the theory of parameter estimation in a very general setting. […]

It turns out that the theory needs to be developed in terms of estimating functions (functions of both the data and the parameter) rather than the estimators themselves. Thus, our focus will be on functions that have the value of the parameter as a root rather than the parameter itself.
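
To make the estimating-function idea concrete, here is a toy sketch (mine, not Heyde’s): a quasi-Poisson regression where we only assume that the variance is proportional to the mean. The quasi-score is a function of both the data and the parameter, and the estimate is its root; the dispersion is then estimated separately from Pearson residuals.

```python
import numpy as np
from scipy.optimize import root

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([0.5, 0.8])
mu_true = np.exp(X @ beta_true)
# Overdispersed counts: negative binomial with Var = mu + mu^2/5 > mu
y = rng.negative_binomial(5, 5 / (5 + mu_true))

def quasi_score(beta):
    # Estimating function: with a log link and variance function V(mu) = mu
    # it reduces to X'(y - mu). We seek its root, not a likelihood maximum.
    mu = np.exp(X @ beta)
    return X.T @ (y - mu)

beta_hat = root(quasi_score, x0=np.zeros(2)).x
mu_hat = np.exp(X @ beta_hat)
# Dispersion from Pearson residuals, rather than being fixed at 1 as in Poisson ML
phi_hat = np.sum((y - mu_hat) ** 2 / mu_hat) / (n - X.shape[1])
print(beta_hat, phi_hat)
```

The point estimate here coincides with the Poisson MLE, but the estimated dispersion inflates the standard errors, which is the practical payoff of the quasi-likelihood view.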

4 Hierarchical generalised linear models

GLM + hierarchical model = HGLM.

5 Generalised additive models

Generalised generalised linear models: semiparametric, simultaneous estimation of non-linear transformations of the predictors, under the assumption that their effects are additive on the link scale, \[ g(\operatorname{E}(Y))=\beta_0 + f_1(x_1) + f_2(x_2)+ \cdots + f_m(x_m). \]
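
A minimal sketch with the third-party pyGAM package (assuming it is installed; synthetic data), fitting one smooth term per predictor and choosing the smoothing penalties by grid search:

```python
import numpy as np
from pygam import LinearGAM, s  # third-party: pip install pygam

rng = np.random.default_rng(0)
n = 500
X = rng.uniform(-3, 3, size=(n, 2))
# Additive truth: a sine in the first predictor plus a quadratic in the second
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.3, size=n)

# One spline smooth per predictor; gridsearch picks the smoothing penalties
gam = LinearGAM(s(0) + s(1)).gridsearch(X, y)
gam.summary()
```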

These have now also been generalised in the obvious way.

6 Generalised additive models for location, scale and shape

Folding GARCH and other regression models into GAMs.

GAMLSS website:

GAMLSS is a modern distribution-based approach to (semiparametric) regression models, where all the parameters of the assumed distribution for the response can be modelled as additive functions of the explanatory variables
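
The core idea fits in a few lines. Here is a toy sketch (mine, not the GAMLSS software): a Gaussian response where both the mean and the log-scale are linear in the covariate, fit by maximum likelihood. Real GAMLSS replaces these linear predictors with penalised additive smooths and supports many more response distributions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 400
x = rng.uniform(0, 1, size=n)
# Both the mean and the (log) scale of the response depend on x
y = rng.normal(loc=1.0 + 2.0 * x, scale=np.exp(-1.0 + 1.5 * x))

def negloglik(theta):
    b0, b1, g0, g1 = theta
    mu = b0 + b1 * x             # location: identity link
    sigma = np.exp(g0 + g1 * x)  # scale: log link keeps it positive
    # Gaussian negative log-likelihood, dropping the additive constant
    return np.sum(0.5 * ((y - mu) / sigma) ** 2 + np.log(sigma))

theta_hat = minimize(negloglik, x0=np.zeros(4)).x
print(theta_hat)
```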

7 Vector generalised additive models

See Yee (2015).

8 Vector generalised hierarchical additive models for location, scale and shape

Exercise for the student.

9 Generalised estimating equations

🏗

But see Johnny Hong and Kellie Ottoboni. Is this just the quasi-likelihood thing again?
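
A hedged sketch using the GEE implementation in statsmodels, on synthetic clustered counts (the exchangeable working correlation is just one choice): the mean model is an ordinary GLM, but the standard errors come from a robust sandwich estimator that respects the within-group correlation.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_groups, per_group = 50, 8
group = np.repeat(np.arange(n_groups), per_group)
x = rng.normal(size=n_groups * per_group)
# A shared group-level effect induces within-group correlation
group_effect = rng.normal(scale=0.5, size=n_groups)[group]
y = rng.poisson(np.exp(0.2 + 0.6 * x + group_effect))
data = pd.DataFrame({"y": y, "x": x, "group": group})

model = smf.gee("y ~ x", groups="group", data=data,
                family=sm.families.Poisson(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```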

10 GGLLM

Generalized² Linear² models (Gordon 2002) unify GLMs with non-linear matrix factorisations.
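
To give the flavour (this is my own toy sketch, not Gordon’s algorithm, which handles much more general link/loss pairs): a low-rank Poisson factorisation with a log-link mean for every cell of the matrix, where both factors are fit by alternating gradient steps on the log-likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 60, 40, 3
# Synthetic counts with low-rank log-mean U V^T
U_true = rng.normal(scale=0.5, size=(m, k))
V_true = rng.normal(scale=0.5, size=(n, k))
X = rng.poisson(np.exp(U_true @ V_true.T))

U = rng.normal(scale=0.01, size=(m, k))
V = rng.normal(scale=0.01, size=(n, k))
lr = 1e-3
for _ in range(2000):
    mu = np.exp(U @ V.T)     # canonical (log) link applied cell-wise
    resid = X - mu           # the score has the usual GLM "response minus mean" form
    U += lr * (resid @ V)    # gradient ascent on the Poisson log-likelihood in U ...
    V += lr * (resid.T @ U)  # ... and in V, holding the other factor fixed

eta = U @ V.T
print("negative log-likelihood (up to constants):", np.sum(np.exp(eta) - X * eta))
```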

11 References

Atal. 2006. “The History of Linear Prediction.” IEEE Signal Processing Magazine.
Barbier, Krzakala, Macris, et al. 2017. “Phase Transitions, Optimal Errors and Optimality of Message-Passing in Generalized Linear Models.” arXiv:1708.03395 [Cond-Mat, Physics:math-Ph].
Bolker, Brooks, Clark, et al. 2009. “Generalized Linear Mixed Models: A Practical Guide for Ecology and Evolution.” Trends in Ecology & Evolution.
Boyd, Hastie, Boyd, et al. 2016. “Saturating Splines and Feature Selection.” arXiv:1609.06764 [Stat].
Breslow, and Clayton. 1993. “Approximate Inference in Generalized Linear Mixed Models.” Journal of the American Statistical Association.
Buja, Hastie, and Tibshirani. 1989. “Linear Smoothers and Additive Models.” Annals of Statistics.
Currie, Durban, and Eilers. 2006. “Generalized Linear Array Models with Applications to Multidimensional Smoothing.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Eichler, Dahlhaus, and Dueck. 2016. “Graphical Modeling for Multivariate Hawkes Processes with Nonparametric Link Functions.” Journal of Time Series Analysis.
Finke, and Singh. 2016. “Approximate Smoothing and Parameter Estimation in High-Dimensional State-Space Models.” arXiv:1606.08650 [Stat].
Friedman, Hastie, and Tibshirani. 2010. “Regularization Paths for Generalized Linear Models via Coordinate Descent.” Journal of Statistical Software.
Gordon. 2002. “Generalized² Linear² Models.” In Proceedings of the 15th International Conference on Neural Information Processing Systems. NIPS’02.
Hansen. 2010. “Penalized Maximum Likelihood Estimation for Generalized Linear Point Processes.” arXiv:1003.0848 [Math, Stat].
Hastie, and Tibshirani. 1990. Generalized Additive Models.
Heyde. 1997. Quasi-Likelihood and Its Application: A General Approach to Optimal Parameter Estimation. Springer Series in Statistics.
Hoaglin, and Welsch. 1978. “The Hat Matrix in Regression and ANOVA.” The American Statistician.
Lee, Nelder, and Pawitan. 2006. Generalized Linear Models with Random Effects. Monographs on Statistics and Applied Probability 106.
Lu. 2022. “A Rigorous Introduction to Linear Models.”
Mayr, Fenske, Hofner, et al. 2012. “Generalized Additive Models for Location, Scale and Shape for High Dimensional Data—a Flexible Approach Based on Boosting.” Journal of the Royal Statistical Society: Series C (Applied Statistics).
McCullagh. 1984. “Generalized Linear Models.” European Journal of Operational Research.
Nelder, and Baker. 2004. “Generalized Linear Models.” In Encyclopedia of Statistical Sciences.
Nelder, and Wedderburn. 1972. “Generalized Linear Models.” Journal of the Royal Statistical Society. Series A (General).
Scandroglio, Gori, Vaccaro, et al. 2013. “Estimating VaR and ES of the Spot Price of Oil Using Futures-Varying Centiles.” International Journal of Financial Engineering and Risk Management.
Stasinopoulos, D. Mikis, and Rigby. 2007. “Generalized Additive Models for Location Scale and Shape (GAMLSS) in R.” Journal of Statistical Software.
Stasinopoulos, Dimitrios, Rigby, Heller, et al. n.d. Flexible Regression and Smoothing: Using GAMLSS in R.
Thrampoulidis, Abbasi, and Hassibi. 2015. “LASSO with Non-Linear Measurements Is Equivalent to One With Linear Measurements.” In Advances in Neural Information Processing Systems 28.
Venables, and Dichmont. 2004. “GLMs, GAMs and GLMMs: An Overview of Theory for Applications in Fisheries Research.” Fisheries Research, Models in Fisheries Research: GLMs, GAMS and GLMMs.
Wedderburn. 1974. “Quasi-Likelihood Functions, Generalized Linear Models, and the Gauss—Newton Method.” Biometrika.
———. 1976. “On the Existence and Uniqueness of the Maximum Likelihood Estimates for Certain Generalized Linear Models.” Biometrika.
Wood. 2008. “Fast Stable Direct Fitting and Smoothness Selection for Generalized Additive Models.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Xia, Wang, and Jiang. 2014. “Asymptotic Properties of Maximum Quasi-Likelihood Estimator in Quasi-Likelihood Nonlinear Models with Misspecified Variance Function.” Statistics.
Yee. 2015. Vector Generalized Linear and Additive Models. Springer Series in Statistics.
Zoeter. 2007. “Bayesian Generalized Linear Models in a Terabyte World.” In 2007 5th International Symposium on Image and Signal Processing and Analysis.