Last updated on July 9th, 2025
In probability, a discrete random variable is a variable that can take a countable set of values, such as whole numbers or integers. These values represent the possible outcomes of a random experiment.
Discrete random variables take on a countable set of distinct values, such as 0, 1, 2, 3, …, and their probability distribution is represented using a probability mass function.
Random variables can be classified into two types: discrete and continuous. Here is the difference between them.
| Discrete Random Variable | Continuous Random Variable |
| --- | --- |
| Takes a countable set of distinct values (e.g., 0, 1, 2, 3, …) | Takes any value within an interval of real numbers |
| Described by a probability mass function (PMF) | Described by a probability density function (PDF) |
| P(X = x) can be greater than 0 for an individual value | P(X = x) = 0 for any single value; probabilities are assigned to intervals |
| Example: the number of heads in a series of coin tosses | Example: height, weight, or time |
The mean and variance are used to describe the behavior of a discrete random variable. Here, we will discuss the mean and variance of a discrete random variable X; the mean, denoted E[X], is its expected value.
Mean of Discrete Random Variable:
The mean of a discrete random variable is the average value of a random variable. It is represented as E[X], where X is the random variable. The mean is also known as the expected value and weighted average.
The formula to calculate the mean of a discrete random variable is:
E[X] = Σ x P(X = x)
Where P(X = x) is the probability mass function.
Variance of a Discrete Random Variable:
The average of the squared deviations of the random variable from its mean is the variance of the discrete random variable. It can be represented as Var[X] or σ².
The variance of a discrete random variable is calculated using the formula:
Var[X] = Σ (x - µ)² P(X = x)
Here, µ is the mean, and P(X = x) is the probability of each value.
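The two formulas above can be checked with a few lines of Python. This is a minimal sketch using a made-up PMF (the dictionary of values and probabilities is hypothetical):

```python
# Sketch: computing the mean E[X] and variance Var[X] of a discrete
# random variable from its probability mass function.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}  # hypothetical PMF: maps value x -> P(X = x)

# E[X] = sum of x * P(X = x)
mean = sum(x * p for x, p in pmf.items())

# Var[X] = sum of (x - mean)^2 * P(X = x)
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())

print(round(mean, 4))      # 1.1
print(round(variance, 4))  # 0.49
```

Note that the probabilities in the dictionary must sum to 1 for the result to be a valid mean and variance.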
The discrete random variable is a variable that can take on specific countable values. It represents the possible outcomes of a random experiment and the probability of each outcome. The types of discrete random variables are:
The binomial random variables are the possible outcomes of a binomial experiment. A binomial random variable involves a fixed number of independent Bernoulli trials, where each result is either a success or a failure. It is represented as X ∼ Bin(n, p), where X is the binomial random variable.
For a binomial random variable, the probability mass function is:
P(X = x) = C(n, x) p^x (1 - p)^(n - x), x = 0, 1, …, n
where C(n, x) = n!/(x!(n - x)!) is the binomial coefficient.
The geometric random variables represent the number of trials needed to get the first success in a sequence of independent Bernoulli trials. In a geometric random variable, the probability of success is denoted by p, and the probability of failure is 1 - p.
The geometric variable is represented by: X ∼ Geom(p)
For a geometric random variable, the probability mass function is:
P(X = x) = (1 - p)^(x - 1) p, x = 1, 2, 3, …
A Bernoulli random variable represents an experiment with two possible outcomes, i.e., 1 for a success and 0 for a failure. It is represented by X ∼ Bern(p).
The probability mass function of a Bernoulli random variable is:
P(X = x) = p if x = 1, and P(X = x) = 1 - p if x = 0
A Poisson random variable represents the number of events that occur within a fixed interval of time or space, where the events occur independently and at a constant rate. It is denoted by X ∼ Poisson(λ), where λ is the parameter of the Poisson distribution, which is always greater than 0.
The probability mass function is: P(X = x) = (λ^x e^(-λ)) / x!, x = 0, 1, 2, ….
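The four probability mass functions above can be written directly with Python's standard library. This is a minimal sketch; the function names are my own:

```python
import math

# Sketch: the four probability mass functions described above,
# written as plain Python functions (standard library only).

def bernoulli_pmf(x, p):
    # P(X = 1) = p, P(X = 0) = 1 - p
    return p if x == 1 else 1 - p

def binomial_pmf(x, n, p):
    # P(X = x) = C(n, x) p^x (1 - p)^(n - x)
    return math.comb(n, x) * p**x * (1 - p) ** (n - x)

def geometric_pmf(x, p):
    # P(X = x) = (1 - p)^(x - 1) p, for x = 1, 2, 3, ...
    return (1 - p) ** (x - 1) * p

def poisson_pmf(x, lam):
    # P(X = x) = lambda^x e^(-lambda) / x!
    return lam**x * math.exp(-lam) / math.factorial(x)

print(round(binomial_pmf(2, n=4, p=0.5), 4))  # 0.375
print(round(geometric_pmf(3, p=0.4), 4))      # 0.144
print(round(poisson_pmf(3, lam=2), 4))        # 0.1804
```

`math.comb` computes the binomial coefficient C(n, x), so no factorials need to be written by hand for the binomial case.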
The probability distribution of a discrete random variable is the probability of each of its possible outcomes. When constructing a distribution table, the probabilities should follow these conditions: each probability must lie between 0 and 1 (0 ≤ P(X = x) ≤ 1), and the probabilities of all possible outcomes must sum to 1 (Σ P(X = x) = 1).
In fields like quality control, statistics, and finance, we use discrete random variables. In this section, we will learn some real-world applications of discrete random variables.
In probability, understanding the discrete random variable is important, but students often make errors. Here are a few common mistakes and ways to avoid them.
Let X be the number shown on a fair six-sided die. Find the probability mass function and expected value.
The probability mass function is P(X = k) = 1/6 for each k, and the expected value is 3.5.
The probability mass function is the ratio of the number of favorable outcomes to the total number of outcomes.
Here, k = 1, 2, 3, 4, 5, 6
The sample space is {1, 2, 3, 4, 5, 6}, so |S| = 6
P(X = k) = 1/6
The expected value: E[X] = Σ x P(X = x)
Here, P(X = x) = 1/6
So, E[X] = Σ x × 1/6
= (1 + 2 + 3 + 4 + 5 + 6)/6
= 21/6 = 3.5
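The calculation above can be verified with a short script. Using `Fraction` keeps the arithmetic exact instead of floating-point:

```python
from fractions import Fraction

# Sketch: verifying the worked example. For a fair six-sided die,
# P(X = k) = 1/6 for k = 1..6, so E[X] = (1 + 2 + ... + 6) / 6.
p = Fraction(1, 6)
expected = sum(k * p for k in range(1, 7))

print(expected)         # 7/2
print(float(expected))  # 3.5
```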
Let X be the number of heads when a fair coin is tossed twice. Find the probability distribution of X.
P(X = 0) = P(TT) = 1/4
P(X = 1) = P(HT or TH) = 1/2
P(X = 2) = P(HH) = 1/4
The sample space = {TT, TH, HT, HH}
Letting X be the number of heads observed,
Outcome TT → X = 0
Outcomes TH, HT → X = 1
Outcome HH → X = 2
The probability = favorable outcomes/total outcomes.
P(X = 0) = 1/4
P(X = 1) = 2/4 = 1/2
P(X = 2) = 1/4
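The same distribution can be built by enumerating the sample space in code; a minimal sketch:

```python
from itertools import product
from collections import Counter

# Sketch: enumerate the sample space of two fair coin tosses and
# count heads in each outcome to build the distribution of X.
sample_space = list(product("HT", repeat=2))  # ('H','H'), ('H','T'), ...
counts = Counter(outcome.count("H") for outcome in sample_space)

# Each of the 4 outcomes is equally likely, so divide each count by 4.
distribution = {x: c / len(sample_space) for x, c in sorted(counts.items())}
print(distribution)  # {0: 0.25, 1: 0.5, 2: 0.25}
```

Enumerating outcomes this way scales to any number of tosses by changing `repeat`.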
Let X be the outcome when a fair die is rolled. Find the expected value.
The expected value is 3.5
Here, P(X = k) = ⅙
k = 1, 2, 3, 4, 5, 6
The expected value is calculated using: E[X] = Σ x P(X = x)
= 1/6 (1 + 2 + 3 + 4 + 5 + 6)
= 1/6 × 21
= 3.5
A machine fails with probability p = 0.4 per test. Let X be the number of tests until the first failure. Find P(X = 3)
Here, P(X = 3) = 0.144
The formula to find the probability mass function of a geometric random variable is:
P(X = k) = (1 - p)^(k - 1) × p
= (1 - 0.4)^2 × 0.4
= (0.6)^2 × 0.4
= 0.36 × 0.4
= 0.144
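The geometric calculation above, checked in Python with p = 0.4 and k = 3:

```python
# Sketch: geometric PMF P(X = k) = (1 - p)^(k - 1) * p
# with p = 0.4 (failure probability per test) and k = 3.
p = 0.4
k = 3
prob = (1 - p) ** (k - 1) * p
print(round(prob, 3))  # 0.144
```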
A bookstore receives an average of 2 online orders per hour. Let X ∼ Poisson(2) be the number of orders in an hour. Find P(X = 3).
Here, P(X = 3) ≈ 0.18
The probability mass function of a Poisson random variable,
P(X = x) = (λ^x e^(-λ)) / x!
Here, λ = 2 and x = 3
P(X = 3) = (2)^3 e^(-2) / 3!
= (8 × e^(-2)) / (3 × 2 × 1), where e ≈ 2.718
= (8 × 0.1353) / 6
= 1.0824 / 6
≈ 0.18
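The Poisson calculation above, checked in Python with λ = 2 and x = 3:

```python
import math

# Sketch: Poisson PMF P(X = x) = lambda^x e^(-lambda) / x!
# with lambda = 2 orders per hour and x = 3.
lam = 2
x = 3
prob = lam**x * math.exp(-lam) / math.factorial(x)
print(round(prob, 4))  # 0.1804
```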
Jaskaran Singh Saluja is a math wizard with nearly three years of experience as a math teacher. His expertise is in algebra, so he can make algebra classes interesting by turning tricky equations into simple puzzles.
He loves to quiz kids on algebra to make them love the subject.