SOA Exam P (Probability) Glossary
37 essential terms and definitions for SOA Exam P (Probability). Each definition is written for exam preparation, covering the concepts as they are tested on the 2026 syllabus.
B
- Bayes' Theorem
- Bayes' Theorem is a formula that describes how to update the probability of a hypothesis given new evidence, expressing the posterior probability in terms of the prior probability, the likelihood, and the marginal likelihood.
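A quick numerical sketch in Python (the screening-test numbers below are made up for illustration, not from the syllabus):

```python
# Bayes' theorem: P(D | +) = P(+ | D) * P(D) / P(+)
# Hypothetical disease-screening numbers, chosen only to illustrate the formula.
prior = 0.01          # P(disease)
sensitivity = 0.95    # P(positive | disease)
false_pos = 0.05      # P(positive | no disease)

# Marginal likelihood P(+) via the law of total probability.
p_positive = sensitivity * prior + false_pos * (1 - prior)
posterior = sensitivity * prior / p_positive   # about 0.161
```

Note how a small prior keeps the posterior modest even with a sensitive test, a classic exam trap.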
- Bernoulli Distribution
- Bernoulli distribution is a discrete probability distribution for a random variable that takes the value 1 with probability p and the value 0 with probability 1-p, modeling a single trial with two possible outcomes.
- Beta Distribution
- Beta distribution is a continuous probability distribution defined on the interval [0, 1] and parameterized by two positive shape parameters alpha and beta, commonly used to model proportions and prior distributions in Bayesian analysis.
- Binomial Distribution
- Binomial distribution is the discrete probability distribution of the number of successes in a fixed number of independent Bernoulli trials, each with the same probability of success.
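The PMF can be evaluated directly from the definition; a minimal sketch (the numbers are illustrative):

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# P(exactly 2 successes in 5 trials with success probability 0.3)
prob = binom_pmf(2, 5, 0.3)   # 10 * 0.09 * 0.343 = 0.3087
```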
C
- Central Limit Theorem
- Central Limit Theorem states that the sampling distribution of the sample mean of independent, identically distributed random variables with finite variance approaches a normal distribution as the sample size increases, regardless of the underlying distribution.
- Chebyshev's Inequality
- Chebyshev's Inequality is a probabilistic bound stating that for any random variable with finite mean and variance, and any k > 0, the probability that it deviates from its mean by at least k standard deviations is at most 1/k².
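The bound is usually loose; a sketch comparing it with the exact tail of a Uniform(0, 1) variable (example values are arbitrary):

```python
from math import sqrt

# Chebyshev: P(|X - mu| >= k*sigma) <= 1/k**2.
# For X ~ Uniform(0, 1): mu = 0.5, sigma = 1/sqrt(12), and the
# exact tail is 1 - 2*k*sigma whenever k*sigma <= 0.5.
mu, sigma = 0.5, 1 / sqrt(12)
k = 1.5
exact_tail = max(0.0, 1 - 2 * k * sigma)   # about 0.134
bound = 1 / k**2                           # about 0.444
```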
- Conditional Probability
- Conditional probability is the probability of an event A occurring given that another event B has already occurred, defined as the ratio of the joint probability of A and B to the probability of B.
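A sketch of the defining ratio on a finite sample space, using two fair dice as a toy example:

```python
from fractions import Fraction

# P(A | B) = P(A and B) / P(B), computed by counting equally likely outcomes.
# A: the sum is 8;  B: the first die shows more than 3.
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
b = [o for o in outcomes if o[0] > 3]
a_and_b = [o for o in b if sum(o) == 8]     # (4,4), (5,3), (6,2)
p_a_given_b = Fraction(len(a_and_b), len(b))  # 3/18 = 1/6
```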
- Convolution
- Convolution refers to the mathematical operation used to determine the probability distribution of the sum of two or more independent random variables, computed by summing or integrating the product of their individual mass or density functions over all combinations of values that produce a given total.
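The discrete case can be sketched directly, here for the sum of two independent fair dice:

```python
# Discrete convolution: PMF of S = X + Y for two independent fair dice.
# For each total s, accumulate P(X = x) * P(Y = s - x) over all pairs.
die = {k: 1 / 6 for k in range(1, 7)}
total = {}
for x, px in die.items():
    for y, py in die.items():
        total[x + y] = total.get(x + y, 0.0) + px * py
# total[7] = 6/36, the most likely sum
```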
- Covariance
- Covariance is a measure of the joint variability of two random variables, indicating the degree to which they tend to deviate from their respective means in the same direction.
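Covariance is typically computed on the exam as Cov(X, Y) = E[XY] − E[X]E[Y]; a sketch on a small made-up joint PMF:

```python
# Cov(X, Y) = E[XY] - E[X]E[Y], from a joint PMF (values are illustrative).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
ex = sum(x * p for (x, y), p in joint.items())        # E[X] = 0.5
ey = sum(y * p for (x, y), p in joint.items())        # E[Y] = 0.5
exy = sum(x * y * p for (x, y), p in joint.items())   # E[XY] = 0.4
cov = exy - ex * ey                                   # 0.15 > 0: X, Y move together
```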
- Cumulative Distribution Function
- Cumulative distribution function (CDF) is a function that gives the probability that a random variable takes a value less than or equal to a specified value, defined for all real numbers.
D
- Deductible
- Deductible is the amount of a loss that the policyholder must pay before the insurer begins to pay. In actuarial modeling, an ordinary deductible reduces the insurer's payment by the deductible amount, while a franchise deductible pays nothing for losses at or below the threshold but pays the loss in full once it exceeds the threshold.
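For an exponential loss with mean theta, the expected insurer payment per loss under an ordinary deductible d has the closed form E[(X − d)+] = theta·e^(−d/theta); a sketch with illustrative numbers:

```python
from math import exp

# Expected payment per loss under an ordinary deductible d,
# for an exponential loss with mean theta:
#   E[(X - d)+] = theta * exp(-d / theta)
theta, d = 1000.0, 250.0
expected_payment = theta * exp(-d / theta)   # about 778.80
```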
- Double Expectation Theorem
- Double Expectation Theorem (Law of Total Expectation) states that the expected value of a random variable equals the expected value of its conditional expectation given another random variable.
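A sketch verifying E[X] = E[E[X | N]] on a toy mixture (the setup is invented for illustration): N is 1 or 2 with equal probability, and given N = n, X is uniform on {1, ..., 6n}.

```python
from fractions import Fraction

# Conditioning route: E[X] = sum over n of P(N = n) * E[X | N = n].
p_n = {1: Fraction(1, 2), 2: Fraction(1, 2)}
cond_mean = {1: Fraction(7, 2), 2: Fraction(13, 2)}   # means of uniform {1..6}, {1..12}
ex = sum(p_n[n] * cond_mean[n] for n in p_n)          # (3.5 + 6.5) / 2 = 5

# Direct route: build the unconditional PMF of X and take its mean.
pmf = {}
for n, pn in p_n.items():
    for x in range(1, 6 * n + 1):
        pmf[x] = pmf.get(x, Fraction(0)) + pn * Fraction(1, 6 * n)
ex_direct = sum(x * p for x, p in pmf.items())        # also 5
```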
E
- Exponential Distribution
- Exponential distribution is a continuous probability distribution that models the time between events in a Poisson process, characterized by its constant hazard rate (memoryless property).
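The memoryless property states P(X > s + t | X > s) = P(X > t); a numerical sketch with arbitrary parameter values:

```python
from math import exp

# Memorylessness of the exponential: the conditional survival beyond
# s + t, given survival past s, equals the unconditional survival past t.
lam, s, t = 0.5, 2.0, 3.0
surv = lambda x: exp(-lam * x)      # S(x) = exp(-lam * x)
lhs = surv(s + t) / surv(s)         # P(X > s + t | X > s)
rhs = surv(t)                       # P(X > t)
```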
G
- Gamma Distribution
- Gamma distribution is a two-parameter family of continuous probability distributions that generalizes the exponential distribution, commonly used to model waiting times for multiple events in a Poisson process.
- Geometric Distribution
- Geometric distribution is a discrete probability distribution that models the number of Bernoulli trials needed to achieve the first success, with each trial having the same probability of success p. (An alternative convention counts the number of failures before the first success instead.)
H
- Hazard Rate Function
- Hazard rate function (failure rate) is the instantaneous rate of failure at time t given survival up to time t, defined as the ratio of the probability density function to the survival function.
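A sketch of the defining ratio h(t) = f(t)/S(t), checking that the exponential's hazard rate is the constant lambda (parameter value is arbitrary):

```python
from math import exp

# Hazard rate h(t) = f(t) / S(t); for Exp(lam) this is the constant lam,
# which is exactly the memoryless property seen from the hazard side.
lam = 0.3
f = lambda t: lam * exp(-lam * t)   # density
S = lambda t: exp(-lam * t)         # survival function
hazards = [f(t) / S(t) for t in (0.5, 1.0, 4.0)]   # all equal lam
```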
I
- Independence
- Independence refers to the property where the occurrence of one event does not affect the probability of another event. Two events A and B are independent if and only if the probability of their intersection equals the product of their individual probabilities.
J
- Joint Distribution
- Joint distribution is the probability distribution that describes the simultaneous behavior of two or more random variables, specifying the probability of each combination of values the variables can take.
L
- Law of Total Probability
- Law of Total Probability states that if a set of events partitions the sample space, the probability of any event can be expressed as the sum of conditional probabilities weighted by the partition probabilities.
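A sketch of P(A) = Σ P(A | B_i)P(B_i) with a classic two-urn setup (the counts are made up):

```python
from fractions import Fraction

# Partition the sample space by which urn is chosen (each with prob 1/2).
# Urn 1: 3 red of 5 balls; urn 2: 1 red of 4 balls.
p_urn = {1: Fraction(1, 2), 2: Fraction(1, 2)}
p_red_given = {1: Fraction(3, 5), 2: Fraction(1, 4)}
p_red = sum(p_urn[i] * p_red_given[i] for i in p_urn)   # 3/10 + 1/8 = 17/40
```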
- Law of Total Variance
- Law of Total Variance decomposes the variance of a random variable into the expected value of the conditional variance plus the variance of the conditional expectation.
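A sketch verifying both pieces of the decomposition on a toy mixture (invented for illustration): N is 1 or 2 with equal probability, and given N, X is uniform on {1, ..., 6N}, so Var(X | N) = ((6N)² − 1)/12.

```python
from fractions import Fraction

# Var(X) = E[Var(X | N)] + Var(E[X | N])
p_n = {1: Fraction(1, 2), 2: Fraction(1, 2)}
cond_mean = {n: Fraction(6 * n + 1, 2) for n in p_n}        # 7/2 and 13/2
cond_var = {n: Fraction((6 * n)**2 - 1, 12) for n in p_n}   # 35/12 and 143/12

ex = sum(p_n[n] * cond_mean[n] for n in p_n)                # E[X] = 5
expected_cond_var = sum(p_n[n] * cond_var[n] for n in p_n)  # 89/12
var_of_cond_mean = sum(p_n[n] * (cond_mean[n] - ex)**2 for n in p_n)  # 9/4
total_var = expected_cond_var + var_of_cond_mean            # 29/3
```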
- Lognormal Distribution
- Lognormal distribution is a continuous probability distribution of a random variable whose natural logarithm is normally distributed, commonly used to model quantities that are the product of many independent positive factors.
- Loss Elimination Ratio
- Loss Elimination Ratio is the proportion of expected losses eliminated by imposing a deductible, calculated as the ratio of expected payments below the deductible to total expected losses.
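For an exponential loss with mean theta, E[X ∧ d] = theta·(1 − e^(−d/theta)) and E[X] = theta, so the ratio simplifies; a sketch with illustrative numbers:

```python
from math import exp

# LER = E[X ^ d] / E[X]; for an exponential loss with mean theta
# this reduces to 1 - exp(-d / theta).
theta, d = 1000.0, 250.0
ler = 1 - exp(-d / theta)   # about 0.221: the deductible removes ~22% of expected losses
```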
M
- Marginal Distribution
- Marginal distribution is the probability distribution of a subset of random variables obtained by summing or integrating the joint distribution over the values of the other variables.
- Moment Generating Function
- Moment generating function (MGF) is a function that encodes all the moments of a probability distribution, defined as the expected value of e raised to the power tX, and uniquely determines the distribution when it exists in a neighborhood of zero.
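Since the n-th derivative of the MGF at t = 0 gives E[X^n], moments can be read off a closed-form MGF; a sketch using the exponential's MGF M(t) = lam/(lam − t) and a numerical derivative (parameter value is arbitrary):

```python
from math import exp

# MGF of X ~ Exp(rate lam): M(t) = lam / (lam - t) for t < lam.
# M'(0) = E[X] = 1/lam; check via a central-difference derivative.
lam = 2.0
M = lambda t: lam / (lam - t)
h = 1e-6
mean_est = (M(h) - M(-h)) / (2 * h)   # should be close to 1/lam = 0.5
```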
N
- Negative Binomial Distribution
- Negative binomial distribution is a discrete probability distribution that models the number of trials (or, in an alternative parameterization, the number of failures) needed to achieve a specified number of successes in a sequence of independent Bernoulli trials.
- Normal Distribution
- Normal distribution is a symmetric, bell-shaped continuous probability distribution parameterized by its mean and variance, central to probability theory due to the Central Limit Theorem.
O
- Order Statistics
- Order statistics refers to the values obtained by sorting a random sample in ascending order. The k-th order statistic is the k-th smallest value, used extensively in modeling insurance claim amounts and extreme values.
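The extreme order statistics have the simplest distributions: for an i.i.d. sample of size n, P(max ≤ x) = F(x)^n and P(min ≤ x) = 1 − (1 − F(x))^n. A sketch on a Uniform(0, 1) sample (values are illustrative):

```python
# Uniform(0, 1): F(x) = x on [0, 1], so the max and min CDFs are explicit.
n, x = 3, 0.5
p_max_le = x**n              # P(all three below 0.5) = 0.125
p_min_le = 1 - (1 - x)**n    # P(at least one below 0.5) = 0.875
```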
P
- Pareto Distribution
- Pareto distribution is a heavy-tailed continuous probability distribution used in actuarial science to model large losses, characterized by a power-law tail behavior.
- Poisson Distribution
- Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space, assuming events occur independently at a constant average rate.
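The PMF follows directly from the definition; a sketch computing a cumulative probability (the mean claim count is an illustrative number):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(N = k) for N ~ Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

# P(at most 2 claims) when the mean claim count is 3
p_le_2 = sum(poisson_pmf(k, 3.0) for k in range(3))   # exp(-3) * (1 + 3 + 4.5)
```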
- Policy Limit
- Policy limit is the maximum amount an insurer will pay for a covered loss. When a loss exceeds the policy limit, the insured bears the excess; under the common convention in which a limit u and deductible d both apply to the loss, the insurer payment is capped at u minus d.
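Under that convention the per-loss payment is min(x, u) − min(x, d); a sketch with illustrative figures:

```python
# Insurer payment per loss x with ordinary deductible d and a limit u
# applied to the loss, so the maximum payment is u - d.
def payment(x, d, u):
    return min(x, u) - min(x, d)

# d = 250, u = 1000: small loss pays 0, mid loss pays x - d, large loss caps at 750
pays = [payment(x, 250, 1000) for x in (100, 600, 5000)]
```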
- Probability Density Function
- Probability density function (PDF) is a function whose integral over an interval gives the probability that a continuous random variable falls within that interval. The PDF must be nonnegative and integrate to one over the entire real line.
- Probability Mass Function
- Probability mass function (PMF) is a function that gives the probability that a discrete random variable equals each of its possible values, with all probabilities summing to one.
S
- Standard Deviation
- Standard deviation is the positive square root of the variance of a random variable, measuring the typical distance of values from the mean in the same units as the variable.
- Survival Function
- Survival function is the complement of the cumulative distribution function, giving the probability that a random variable exceeds a specified value.
U
- Uniform Distribution
- Uniform distribution is a probability distribution in which all outcomes in a specified range are equally likely. For the continuous case on [a, b], the density is constant at 1/(b-a).
V
- Variance
- Variance is a measure of the dispersion of a random variable around its mean, defined as the expected value of the squared deviation from the mean.
W
- Weibull Distribution
- Weibull distribution is a continuous probability distribution used to model time-to-failure data, with a flexible hazard rate that can be increasing, decreasing, or constant depending on its shape parameter.