Free SOA Exam SRM (Statistics for Risk Modeling) Formula Sheet (2026)

Every Exam SRM formula you need on the test, grouped by topic, rendered with full math notation. 22 formulas across 5 topics, calibrated to the 2026 syllabus. Free forever, no signup required.

22 Formulas
5 Topics
2026 Syllabus
Free Forever
Print-ready PDF: 1080x1350 portrait, math pre-rendered, fonts embedded. Download once, study anywhere.
Download PDF →

All Exam SRM Formulas

Basics of Statistical Learning 2 items
Bias-variance tradeoff
E[(y-\hat{f}(x))^2]=\text{Bias}^2(\hat{f})+\text{Var}(\hat{f})+\sigma^2_\varepsilon
The irreducible error \sigma^2_\varepsilon sets a noise floor that no model choice can beat
k-fold cross-validation error
CV_{(k)}=\dfrac{1}{k}\sum_{j=1}^k \text{MSE}_j
Each fold serves as validation once
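The averaging in CV_{(k)} can be sketched in a few lines. This is a minimal illustration, not exam material: the "model" is just the training-fold mean, and the data below is made up.

```python
# Sketch of k-fold cross-validation error CV_(k) = (1/k) * sum_j MSE_j,
# using a trivial "predict the training mean" model (illustrative choice).

def kfold_cv_error(y, k):
    """Average validation MSE over k equal folds for the mean predictor."""
    fold_size = len(y) // k
    mses = []
    for j in range(k):
        val = y[j * fold_size:(j + 1) * fold_size]           # fold j = validation set
        train = y[:j * fold_size] + y[(j + 1) * fold_size:]  # remaining k-1 folds
        y_hat = sum(train) / len(train)                      # "fitted" model
        mses.append(sum((yi - y_hat) ** 2 for yi in val) / len(val))
    return sum(mses) / k                                     # average of fold MSEs

cv = kfold_cv_error([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], k=3)
```

Each observation lands in the validation set exactly once, which is what distinguishes k-fold CV from repeated random splits.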
Linear Models 10 items
R-squared
R^2 = 1 - \dfrac{SS_{\text{res}}}{SS_{\text{tot}}} = 1 - \dfrac{\sum(y_i-\hat{y}_i)^2}{\sum(y_i-\bar{y})^2}
Adjusted R-squared
\bar{R}^2 = 1 - \dfrac{(1-R^2)(n-1)}{n-p-1}
p = number of predictors (excludes intercept)
OLS estimator (matrix form)
\hat{\boldsymbol{\beta}}=(\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top\mathbf{y}
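The matrix formula translates directly into numpy. A sketch on made-up data (the design matrix and response below are for illustration only); solving the normal equations is numerically preferable to forming the explicit inverse.

```python
import numpy as np

# Sketch of the OLS estimator beta_hat = (X'X)^{-1} X'y on toy data.
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])          # column of ones = intercept, plus one predictor
y = np.array([1.0, 3.0, 5.0, 7.0])  # exactly y = 1 + 2x, so the fit is perfect

# Solve (X'X) beta = X'y rather than inverting X'X explicitly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```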
F-statistic (regression)
F=\dfrac{(SS_{\text{tot}}-SS_{\text{res}})/p}{SS_{\text{res}}/(n-p-1)}
Tests H_0: all slope coefficients are zero
Variance inflation factor (VIF)
\text{VIF}_j = \dfrac{1}{1-R_j^2}
R_j^2 = R^2 from regressing X_j on all other predictors
VIF>5–10 indicates multicollinearity
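The auxiliary-regression definition of VIF can be sketched directly. This is an illustrative helper (not a library function); the design matrix in the test is made up, with two uncorrelated columns so VIF = 1.

```python
import numpy as np

# Sketch: VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
# predictor j on the remaining predictors (with an intercept).
def vif(X, j):
    others = np.delete(X, j, axis=1)
    Z = np.column_stack([np.ones(len(X)), others])   # intercept + other predictors
    beta = np.linalg.lstsq(Z, X[:, j], rcond=None)[0]
    resid = X[:, j] - Z @ beta
    ss_res = resid @ resid
    ss_tot = ((X[:, j] - X[:, j].mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot                       # R_j^2 of the auxiliary regression
    return 1.0 / (1.0 - r2)
```

When predictor j is uncorrelated with the rest, R_j^2 = 0 and VIF_j = 1, its theoretical minimum; VIF grows without bound as collinearity approaches perfection.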
AIC
AIC = 2p - 2\ln\hat{L}
p = number of parameters; lower is better
BIC
BIC = p\ln n - 2\ln\hat{L}
Penalizes complexity more heavily than AIC once n > 7 (since then \ln n > 2)
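The two criteria differ only in the per-parameter penalty: 2 for AIC, \ln n for BIC. A minimal sketch (the log-likelihood value below is made up):

```python
import math

# Sketch of AIC = 2p - 2 ln(L_hat) and BIC = p ln(n) - 2 ln(L_hat).
def aic(p, loglik):
    return 2 * p - 2 * loglik

def bic(p, n, loglik):
    return p * math.log(n) - 2 * loglik

# BIC's per-parameter penalty ln(n) exceeds AIC's 2 once n >= 8 (ln 8 ~ 2.08),
# so BIC tends to select smaller models than AIC in larger samples.
a = aic(3, -10.0)
b = bic(3, 8, -10.0)
```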
Ridge regression penalty
Minimize: \sum(y_i-\hat{y}_i)^2+\lambda\sum_{j=1}^p\beta_j^2
L_2 penalty; shrinks but does not zero out coefficients
LASSO penalty
Minimize: \sum(y_i-\hat{y}_i)^2+\lambda\sum_{j=1}^p|\beta_j|
L_1 penalty; produces sparse solutions (exact zeros)
Elastic net penalty
Minimize: \sum(y_i-\hat{y}_i)^2+\lambda_1\sum|\beta_j|+\lambda_2\sum\beta_j^2
Combines the LASSO (L_1) and ridge (L_2) penalties
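The three penalized objectives share one shape: residual sum of squares plus a penalty. A sketch evaluating that objective for a candidate coefficient vector (this only scores a given beta; it does not perform the minimization):

```python
import numpy as np

# Sketch: ridge, LASSO, and elastic net differ only in the penalty
# added to the residual sum of squares.
#   lam1 > 0, lam2 = 0  -> LASSO (L1)
#   lam1 = 0, lam2 > 0  -> ridge (L2)
#   both   > 0          -> elastic net
def penalized_rss(X, y, beta, lam1=0.0, lam2=0.0):
    resid = y - X @ beta
    return resid @ resid + lam1 * np.abs(beta).sum() + lam2 * (beta ** 2).sum()
```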
Time Series Models 5 items
AR(1) model
X_t=\phi X_{t-1}+\varepsilon_t,\quad\varepsilon_t\sim WN(0,\sigma^2)
Stationary iff |\phi|<1
MA(1) model
X_t=\varepsilon_t+\theta\varepsilon_{t-1},\quad\varepsilon_t\sim WN(0,\sigma^2)
Always stationary
ARMA(1,1) model
X_t=\phi X_{t-1}+\varepsilon_t+\theta\varepsilon_{t-1}
Stationary iff |\phi|<1
ACF of AR(1)
\rho(h)=\phi^h,\quad h=0,1,2,\ldots
Decays geometrically; PACF cuts off after lag 1
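The geometric decay \rho(h)=\phi^h is easy to see empirically. A sketch that simulates an AR(1) with a seeded generator (the parameter values are made up for illustration) and checks the sample lag-1 autocorrelation against \phi:

```python
import random

# Sketch: simulate X_t = phi * X_{t-1} + eps_t and compare the sample
# lag-1 autocorrelation with the theoretical rho(1) = phi.
def simulate_ar1(phi, n, seed=0):
    rng = random.Random(seed)
    x, xs = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)   # standard normal white noise
        xs.append(x)
    return xs

def sample_acf(xs, h):
    m = sum(xs) / len(xs)
    num = sum((xs[t] - m) * (xs[t - h] - m) for t in range(h, len(xs)))
    den = sum((xi - m) ** 2 for xi in xs)
    return num / den

series = simulate_ar1(phi=0.6, n=5000)
rho1 = sample_acf(series, 1)   # should sit near 0.6
```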
Ljung-Box test statistic
Q=n(n+2)\sum_{k=1}^m\dfrac{\hat{\rho}_k^2}{n-k}
Tests H_0: first m autocorrelations are zero
Distributed \chi^2(m) under H_0
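Given the sample autocorrelations, Q is a weighted sum of their squares. A minimal sketch (the helper name and the autocorrelation values in the test are made up):

```python
# Sketch: Ljung-Box Q = n(n+2) * sum_{k=1}^m rho_hat_k^2 / (n - k),
# taking precomputed sample autocorrelations rho_hat = [rho_1, ..., rho_m].
def ljung_box_q(rho_hat, n):
    m = len(rho_hat)
    return n * (n + 2) * sum(rho_hat[k - 1] ** 2 / (n - k)
                             for k in range(1, m + 1))
```

A large Q relative to the \chi^2(m) critical value rejects H_0 and signals leftover autocorrelation, e.g. in the residuals of a fitted time-series model.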
Decision Trees 3 items
Gini impurity
G = \sum_{k=1}^{K} \hat{p}_k(1-\hat{p}_k) = 1 - \sum_{k=1}^{K}\hat{p}_k^2
\hat{p}_k = fraction of class k observations in the node
Entropy (node impurity)
H = -\sum_{k=1}^{K}\hat{p}_k\log\hat{p}_k
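Both impurity measures take the node's class proportions and return 0 for a pure node. A minimal sketch (natural log used for entropy, matching the formula above):

```python
import math

# Sketch: node impurity from class proportions p = [p_1, ..., p_K].
def gini(p):
    return 1.0 - sum(pk ** 2 for pk in p)

def entropy(p):
    # Terms with p_k = 0 contribute 0 by convention (p log p -> 0).
    return -sum(pk * math.log(pk) for pk in p if pk > 0)
```

Both are maximized by a uniform split (e.g. a 50/50 binary node gives G = 0.5, H = ln 2) and vanish for a pure node, which is why tree-growing algorithms split to reduce them.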
Bagging (bootstrap aggregation)
Train BB trees on bootstrap samples; aggregate predictions
\hat{f}(x)=\dfrac{1}{B}\sum_{b=1}^B\hat{f}_b(x)
Reduces variance without increasing bias
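The resample-then-average loop can be sketched with a stand-in model. For illustration the base learner here is just the sample mean, not a tree, and the data in the test is made up:

```python
import random

# Sketch of bagging: fit a base "model" (here, the sample mean) on each
# of B bootstrap resamples, then average the B fitted predictions.
def bagged_mean(y, B, seed=0):
    rng = random.Random(seed)
    preds = []
    for _ in range(B):
        boot = [rng.choice(y) for _ in range(len(y))]  # bootstrap sample, with replacement
        preds.append(sum(boot) / len(boot))            # f_hat_b
    return sum(preds) / B                              # average over the B models
```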
Unsupervised Learning Techniques 2 items
PCA — proportion of variance explained
PVEk=λkj=1pλj\text{PVE}_k=\dfrac{\lambda_k}{\sum_{j=1}^p\lambda_j}
λk\lambda_k=kkth eigenvalue of covariance (or correlation) matrix
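PVE follows in two lines once the eigenvalues are in hand. A sketch on a made-up 2-column dataset whose sample covariance matrix is diagonal, so the eigenvalues are just the column variances:

```python
import numpy as np

# Sketch: PVE_k = lambda_k / sum_j lambda_j from the eigenvalues of the
# sample covariance matrix (toy data, chosen so the covariance is diagonal).
X = np.array([[ 2.0,  0.0],
              [ 0.0,  1.0],
              [-2.0,  0.0],
              [ 0.0, -1.0]])

S = np.cov(X, rowvar=False)               # 2x2 sample covariance matrix
lam = np.linalg.eigvalsh(S)[::-1]         # eigenvalues, sorted descending
pve = lam / lam.sum()                     # proportion of variance explained
```

Here the first component explains 80% of the variance; in practice the same ratios drive the scree plot used to choose how many components to keep.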
K-means objective function
\min_{C_1,\ldots,C_K}\sum_{k=1}^K\sum_{i\in C_k}\|x_i-\bar{x}_k\|^2
\bar{x}_k = centroid of cluster k
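The objective itself is just the total within-cluster sum of squares. A sketch evaluating it for a given 1-D clustering (this scores an assignment; it is not the K-means algorithm, which iterates assignment and centroid updates to reduce this value):

```python
# Sketch: K-means objective = sum over clusters of squared distances
# from each point to its cluster centroid (1-D toy data).
def within_cluster_ss(clusters):
    total = 0.0
    for pts in clusters:
        c = sum(pts) / len(pts)                  # centroid x_bar_k
        total += sum((x - c) ** 2 for x in pts)  # within-cluster SS
    return total
```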

Frequently Asked Questions

Is the Exam SRM formula sheet free?
Yes. The full Exam SRM formula sheet is free, with no signup, no email, and no credit card required. 22 formulas across 5 topics, all rendered with the same KaTeX math notation used in the FreeFellow study app.
Can I download the Exam SRM formula sheet as a printable PDF?
Yes. A 1080x1350 portrait PDF (Instagram and LinkedIn carousel native size, also great for tablet study) is linked at the top of this page. The PDF is fully self-contained: math is pre-rendered, fonts are embedded, no internet connection needed once downloaded.
What's covered on the Exam SRM formula sheet?
Every formula is grouped by official syllabus topic, with the formula in math notation plus a one-line note on when to use it (or a watch-out from CAIA, CFA, or other prep-provider commentary). Coverage is calibrated to the 2026 syllabus and refreshed when the corpus changes.
Is this formula sheet affiliated with SOA?
No. FreeFellow is not affiliated with the SOA or any examination body. This is an independent study aid covering the published syllabus.
What else is free at FreeFellow for Exam SRM candidates?
The full question bank with detailed solutions, mixed practice, readiness tracking, lessons (where available), and the formula sheet are all free forever. Fellow ($59/quarter or $149/year per track) unlocks timed mock exams, spaced-repetition flashcards, performance analytics, AI essay grading, and a personalized study plan.
Practice Exam SRM questions free →

About FreeFellow

FreeFellow is an AI-native exam prep platform for actuarial (SOA & CAS), CFA, CFP, CPA, CAIA, and securities licensing candidates — built around modern AI as a core capability rather than as a bolt-on. Every lesson ships with AI-narrated audio. Every constructed-response item has a copy-to-AI prompt builder so candidates can paste their answer into their own ChatGPT or Claude for self-graded feedback. Fellow members get instant AI grading on essays against the official rubric (currently CFA Level III, expanding to other essay-bearing sections).

The 70% you need to pass — question bank, written solutions, lessons, formula sheet, mixed practice, readiness tracking — is free forever, with no trial period and no credit card. Become a Fellow ($59/quarter or $149/year per track) to unlock mock exams, flashcards with spaced repetition, performance analytics, AI essay grading, and a personalized study plan.