Quantitative risk measurement
Mathematics of actuarial and financial disaster
April 30, 2015 — September 22, 2020
Actuarial bread-and-butter: the mathematical study of measuring the chances of something terrible happening. The terrible something is usually a financial loss, but can also be extreme weather, an earthquake, whatever.
This usage of “risk” is distinct from the “risk” in “statistical risk bounds”, which is the domain of statistical learning theory. Here we are talking about the probability that things go terribly wrong, rather than the expected badness of an estimator over some class of functions.
How do you evaluate how bad the worst cases are when deciding whether to do something? Generally, this involves ignoring how good the best scenario is; given financial history, that is probably the right niche to worry about filling. How do you trade off the badness and likelihood of the bad cases? That is the business of the risk measures themselves. For a useful class of distributions in this context, consider extreme value theory.
🏗 Introduce risk coherence; discuss applications to climate risk and scheduling.
1 Value-at-Risk
\(X\) is a random variable, the payoff of a portfolio at some future time, and \(\alpha \in (0,1)\) is a quantile level of interest.
The \(\alpha\)-\(\operatorname{VaR}\) of an asset \(X\) with cdf \(F\) is, up to minus signs, an evaluation of the inverse cdf (quantile function). It is defined as
\[ \begin{aligned} \operatorname{VaR}_\alpha(X) &:= \inf\{x\in\mathbb{R}:\mathbb{P}(X<-x)\leq 1-\alpha\}\\ &= \inf\{x\in\mathbb{R}:1-F(-x)\geq \alpha\} \end{aligned} \]
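For continuous \(F\) this reduces to \(\operatorname{VaR}_\alpha(X) = -F^{-1}(1-\alpha)\): minus a low quantile of the payoff. A minimal empirical version in Python (the portfolio numbers are made up):

```python
import numpy as np

rng = np.random.default_rng(42)

def var(payoffs: np.ndarray, alpha: float = 0.99) -> float:
    """Empirical alpha-VaR: minus the (1 - alpha)-quantile of the payoffs.

    Follows the sign convention above: X is a payoff (profit), so a
    positive VaR is a loss.
    """
    return -np.quantile(payoffs, 1.0 - alpha)

# Hypothetical daily P&L in dollars, with fattish Student-t tails.
pnl = rng.standard_t(df=4, size=100_000) * 1_000.0
print(f"99% VaR ≈ ${var(pnl, 0.99):,.0f}")
```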
2 Expected shortfall
“the expected loss of portfolio value given that a loss is occurring at or above the \(q\)-quantile.”
The expected shortfall (ES) is defined as
\[
\begin{aligned}
\operatorname{ES}_{\alpha} &:= \frac{1}{\alpha}\int_0^{\alpha} \operatorname{VaR}_{\gamma}(X)\,d\gamma\\
&= -\frac{1}{\alpha}\left(\mathbb{E}\left[X 1_{\{X \leq x_{\alpha}\}}\right] + x_{\alpha}\left(\alpha - \mathbb{P}[X \leq x_{\alpha}]\right)\right)
\end{aligned}
\]
where VaR is the value at risk and \(x_{\alpha} = \inf\{x \in \mathbb{R}: \mathbb{P}(X \leq x) \geq \alpha\}\). Beware the convention switch: here \(\alpha\) is a small tail probability (say 0.05), so the \(\operatorname{VaR}_{\gamma}\) in the integrand is taken at the deep-tail level \(\gamma\); it corresponds to \(\operatorname{VaR}_{1-\gamma}\) in the notation of the previous section.
It is also known as the Conditional Value-at-Risk, \(\alpha-\operatorname{CVaR}.\)
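Empirically, the second form above can be applied directly to the empirical measure of a Monte Carlo or historical sample. A sketch in Python (numbers hypothetical; note the small-\(\alpha\) convention):

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_shortfall(payoffs: np.ndarray, alpha: float = 0.05) -> float:
    """Empirical ES, applying the formula above to the empirical measure.

    Small-alpha convention: alpha = 0.05 averages the worst 5% of outcomes.
    """
    n = len(payoffs)
    srt = np.sort(payoffs)
    # x_alpha = inf{x : P(X <= x) >= alpha} is the ceil(n*alpha)-th order stat.
    x_alpha = srt[int(np.ceil(n * alpha)) - 1]
    partial_mean = srt[srt <= x_alpha].sum() / n   # E[X 1_{X <= x_alpha}]
    p = (srt <= x_alpha).mean()                    # P[X <= x_alpha]
    return -(partial_mean + x_alpha * (alpha - p)) / alpha

x = rng.standard_normal(1_000_000)
print(expected_shortfall(x))  # ≈ 2.063 for a standard normal at alpha = 0.05
```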
According to Wikipedia, I might care about the dual representation, \(ES_{\alpha} = -\inf_{Q \in \mathcal{Q}_{\alpha}} E^Q[X]\), with \(\mathcal{Q}_{\alpha}\) the set of probability measures absolutely continuous with respect to the physical measure \(P\), such that \(\frac{dQ}{dP} \leq \alpha^{-1}\) almost surely. (The minus sign is needed under the payoff convention used here: the infimum picks out the worst-case expected payoff.)
…Why might I care about that again? Presumably because it exhibits ES as a worst-case expectation over a whole family of reweighted scenario measures, and robust representations of exactly this type characterise coherent risk measures (next section).
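One payoff of the dual form is that it is trivial to check empirically, because with an atomless sample the optimal \(Q\) simply puts the maximum allowed weight on the worst \(\alpha\)-fraction of scenarios. A quick sanity check (my own construction, not anyone's API):

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, n = 0.05, 200_000
x = rng.standard_normal(n)  # payoffs

# Primal: minus the mean of the worst alpha-fraction of payoffs.
k = int(n * alpha)
es_primal = -np.sort(x)[:k].mean()

# Dual: sup_Q E^Q[-X] over dQ/dP <= 1/alpha. Empirically the optimum puts
# the maximum allowed weight, 1/(n*alpha), on each of the k worst scenarios.
w = np.zeros(n)
w[np.argsort(x)[:k]] = 1.0 / k  # each weight = 1/(n*alpha); weights sum to 1
es_dual = w @ (-x)

print(es_primal, es_dual)  # equal by construction; ≈ 2.06 for N(0, 1)
```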
3 Subadditivity/coherence
TBC, but the axioms are at least easy to state, as below.
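For reference, the standard axioms (Artzner et al. 1999), stated for payoff variables: a map \(\rho\) from payoffs to reals is a coherent risk measure if it satisfies
\[
\begin{aligned}
&\text{monotonicity:} && X \leq Y \Rightarrow \rho(X) \geq \rho(Y),\\
&\text{translation invariance:} && \rho(X + c) = \rho(X) - c \quad \text{for } c \in \mathbb{R},\\
&\text{positive homogeneity:} && \rho(\lambda X) = \lambda \rho(X) \quad \text{for } \lambda \geq 0,\\
&\text{subadditivity:} && \rho(X + Y) \leq \rho(X) + \rho(Y).
\end{aligned}
\]
Subadditivity formalises the intuition that diversification should not increase risk; it is the axiom that VaR famously fails in general, while expected shortfall satisfies all four, which is much of its appeal.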
4 G-expectation
I don’t understand this yet, but Shige Peng gave a talk in which he argued that the generalised, sublinear expectation operator derived from distributional uncertainty generates coherent risk measures, although it is not immediately obvious to me how this works. See, e.g., Peng (2004).
5 Sensitivity to parameters of risk measures
SWIM (an R package) is a method for sensitivity analysis of parameter assumptions in risk measures. I might use the accompanying article as a point of entry to this field if I need it.
It implements an efficient sensitivity analysis for stochastic models based on Monte Carlo samples: it provides weights on simulated scenarios such that stressed random variables fulfil given probabilistic constraints (e.g. specified values for risk measures) under the new scenario weights, the weights being chosen by constrained minimisation of the relative entropy to the baseline model. A sketch of the core entropy trick follows.
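I have not checked this against SWIM's actual API, but the core construction is easy to sketch independently: for a simple moment constraint, minimising relative entropy to the baseline weights yields an exponential tilt, with the tilt parameter found by one-dimensional root-finding. A minimal Python sketch (the function name and stress target are my own inventions):

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(1)

def entropy_tilt_weights(x: np.ndarray, target_mean: float) -> np.ndarray:
    """Scenario weights minimising relative entropy to the (uniform) baseline,
    subject to the stress constraint sum_i w_i * x_i = target_mean.

    The minimiser is an exponential tilt w_i ∝ exp(theta * x_i); theta is
    found by root-finding on the weighted-mean equation.
    """
    def tilted(theta: float) -> np.ndarray:
        z = theta * x
        w = np.exp(z - z.max())  # subtract max for numerical stability
        return w / w.sum()

    theta = brentq(lambda t: tilted(t) @ x - target_mean, -50.0, 50.0)
    return tilted(theta)

# Baseline Monte Carlo losses, then stress the mean upward by one sd.
losses = rng.lognormal(mean=0.0, sigma=0.5, size=50_000)
w = entropy_tilt_weights(losses, losses.mean() + losses.std())
print(losses.mean(), w @ losses)  # baseline mean vs. stressed mean
```

SWIM supports richer constraints than this (e.g. specified VaR or ES under stress) within the same relative-entropy program; the moment-constrained tilt above is just the simplest instance.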
6 Rosenblatt transform
Mentioned mnemonically because it seems to arise in QRM all the time. The Rosenblatt transform takes a random vector with a known (continuous) joint distribution and maps it to a uniform distribution on the unit hypercube, by chaining conditional cdfs: \(U_1 = F_1(X_1)\), \(U_2 = F_{2\mid 1}(X_2 \mid X_1)\), and so on.
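A minimal sketch for the bivariate Gaussian case, where the conditional cdf is available in closed form (here \(r\) is the correlation; all numbers are made up):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

def rosenblatt_gaussian(x1: np.ndarray, x2: np.ndarray, r: float):
    """Rosenblatt transform of a standard bivariate Gaussian, correlation r.

    U1 = F1(X1) and U2 = F_{2|1}(X2 | X1); the pair (U1, U2) is
    independent uniform on the unit square.
    """
    u1 = norm.cdf(x1)
    # X2 | X1 = x1 is N(r * x1, 1 - r^2): standardise, then apply Phi.
    u2 = norm.cdf((x2 - r * x1) / np.sqrt(1.0 - r**2))
    return u1, u2

# Simulate correlated Gaussians, transform, check independence/uniformity.
r, n = 0.7, 100_000
x1 = rng.standard_normal(n)
x2 = r * x1 + np.sqrt(1.0 - r**2) * rng.standard_normal(n)
u1, u2 = rosenblatt_gaussian(x1, x2, r)
print(np.corrcoef(u1, u2)[0, 1])  # ≈ 0: the transform makes them independent
print(u1.mean(), u2.mean())       # ≈ 0.5 each, as uniforms should be
```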
7 Knightian risk
See black swans.