Tutorial: GAMs and other smooth GLMs with R
 


Simon Wood, Mathematical Sciences, University of Bath, UK

Abstract


Generalized Additive Models (GAMs) are Generalized Linear Models (GLMs) in which the linear predictor is specified partly as a sum of smooth functions of covariates. Generalizing a little further, it is possible to include random effects terms in the linear predictor, yielding Generalized Additive Mixed Models (GAMMs), and also to include linear functionals of smooths as terms (leading to signal regression models, functional GLMs, varying coefficient models, etc.). Representing the smooth functions using low rank splines (P-splines or classical penalized regression splines) leads to a computationally convenient and fairly complete framework for such modelling, in particular allowing reliable estimation of the appropriate degree of smoothness for each component function. The workshop will outline this penalized regression spline approach to penalized GLMs as implemented in the R package mgcv, and will discuss practical issues of model building, checking and inference.
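As a rough orientation (not part of the abstract itself), the sketch below shows the kind of model formula this describes, fitted with mgcv's gam function. The data frame `dat', with response y, covariates x0, x1, x2 and grouping factor g, is purely hypothetical.

    library(mgcv)

    ## Poisson GAM: univariate smooths, a tensor product interaction and a
    ## simple random effect, all entering one linear predictor.
    m <- gam(y ~ s(x0) + s(x1)       # smooth functions of x0 and x1
                 + te(x1, x2)        # tensor product interaction smooth
                 + s(g, bs = "re"),  # simple random effect of factor g
             family = poisson,       # any GLM family may be used
             method = "REML",        # REML-based smoothness selection
             data = dat)             # 'dat' is hypothetical here

    summary(m)           # approximate inference for the fitted terms
    plot(m, pages = 1)   # estimated smooth components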


Topics

1) GAMs, GAMMs and other penalized GLMs.
2) Sketch of the theory for low rank penalized spline regression.
3) Smoothness selection.
4) Alternative smoothers: P-splines, thin plate regression splines, tensor
product smoothing and adaptive smoothing. What to use when.
5) Model checking: residuals, smoothing basis dimension, stability of results.
6) Model selection: shrinkage and other approaches.
7) Inference: using the Bayesian smoothing model.

Examples and exercises will use mgcv, and will include some penalized GLMs beyond GAMs.
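For concreteness, the following sketch touches on several of the topics above (alternative bases, REML smoothness selection, basis dimension checking and shrinkage-based model selection). The data frame `dat' and its variables are again hypothetical; the functions and arguments are those of mgcv.

    library(mgcv)

    m <- gam(y ~ s(x0, bs = "ps")           # P-spline basis
                 + s(x1, bs = "tp", k = 20) # thin plate regression spline, larger basis
                 + s(x2, bs = "ad"),        # adaptive smoother
             data = dat,                    # 'dat' as before, hypothetical
             method = "REML",               # smoothness selection by REML
             select = TRUE)                 # extra penalties let whole terms shrink to zero

    gam.check(m)   # residual plots and a rough check of the basis dimensions k
    summary(m)     # inference based on the Bayesian view of the smoothing model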


Audience

People interested in using GAMs and other penalized GLMs. Knowledge of GLMs and of the use of `glm' in R is assumed. Some previous experience of mixed modelling and smoothing (in particular, use of `gam' in the package `mgcv') would be useful, but is not essential.