
A continuous random variable is a random variable that can take any value within a specific range. It is used to represent measurements such as weight, height, and time. In this article, we will explore the concepts and properties of continuous random variables.
A continuous random variable is a type of variable in probability and statistics that can take any value within a given interval. Unlike a discrete random variable, which can take only countably many values such as 1, 2, or 3, a continuous random variable can assume uncountably many values.
Continuous random variables are used to represent quantities that are measured rather than counted, such as a person's height, the time taken to finish a task, or the amount of rainfall during a day. These measurements vary continuously, meaning they do not move in fixed steps but can take any value between two points. This makes continuous random variables useful for modelling many real-world situations.
Continuous Random Variable Definition
A continuous random variable is a variable that can take any value within a specified interval, meaning its possible outcomes form an uncountably infinite set. It represents quantities that are measured, and its probability distribution is described by a probability density function (PDF).
Continuous Random Variable Examples
Continuous random variables are used to represent quantities that can take any value within a range. They appear in many real-world settings, such as a person's height, the time taken to complete a task, the temperature on a given day, or the amount of rainfall in a year.
The probabilities associated with a continuous random variable are described by the probability density function (PDF) and the cumulative distribution function (CDF). Here are some key formulas related to continuous random variables.
PDF of Continuous Random Variable
The PDF of a continuous random variable is a function that shows how likely the variable is to take values near a given point. It does not give probabilities for specific values but rather for intervals. If X is the continuous random variable, the formula for its PDF, f(x), is:
\(f(x) = \dfrac{dF(x)}{dx} = F'(x)\)
Here, F(x) is the cumulative distribution function.
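To see the relationship f(x) = F'(x) numerically, here is a minimal Python sketch. It assumes SciPy is available and uses the standard normal distribution purely for illustration, approximating the derivative of the CDF with a central difference.

```python
import numpy as np
from scipy.stats import norm  # standard normal, chosen only for illustration

x = 1.0
h = 1e-5

# Approximate f(x) = F'(x) with a central difference of the CDF
pdf_from_cdf = (norm.cdf(x + h) - norm.cdf(x - h)) / (2 * h)

print(pdf_from_cdf)  # ~0.24197
print(norm.pdf(x))   # 0.24197..., the closed-form PDF agrees
```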
CDF of Continuous Random Variable
The CDF describes the probability that the random variable X takes a value less than or equal to a specific value x. It is found by integrating the PDF:
\(F(x) = P(X \leq x) = \int_{-\infty}^{x} f(t)\,dt \)
The probability that X falls between two points, a and b, can then be obtained from either the PDF or the CDF:
\(P(a \leq X \leq b) = \int_{a}^{b} f(x)\,dx = F(b) - F(a) \)
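As a quick check of these formulas, the sketch below (assuming SciPy, with the standard normal as an illustrative distribution) computes P(a ≤ X ≤ b) both by integrating the PDF over [a, b] and as F(b) − F(a); the two agree.

```python
from scipy.integrate import quad
from scipy.stats import norm  # illustrative choice of distribution

a, b = 0.5, 2.0

# P(a <= X <= b) by integrating the PDF over [a, b]
p_integral, _ = quad(norm.pdf, a, b)

# The same probability from the CDF: F(b) - F(a)
p_cdf = norm.cdf(b) - norm.cdf(a)

print(p_integral, p_cdf)  # both ~0.2858
```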
The Mean of Continuous Random Variable
The mean of a continuous random variable X is its expected value. It is calculated as a weighted average of all possible values, where each value is weighted by the probability density function (PDF). The formula is:
\(E(X) = \int_{-\infty}^{\infty} x f(x)\,dx \)
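The sketch below evaluates this integral numerically, assuming SciPy and an illustrative exponential density f(x) = 2e^(−2x) for x ≥ 0 (an assumption chosen for the example, not a distribution from this section).

```python
import numpy as np
from scipy.integrate import quad

# Illustrative PDF: exponential with rate lambda = 2, f(x) = 2*exp(-2x) for x >= 0
lam = 2.0
f = lambda x: lam * np.exp(-lam * x)

# E(X) = integral of x * f(x) over the support
mean, _ = quad(lambda x: x * f(x), 0, np.inf)
print(mean)  # ~0.5, matching the known exponential mean 1/lambda
```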
The Variance of Continuous Random Variable
The variance measures how much the values of X deviate from the mean. It is the expected value of the squared difference between the variable and its mean. The formula is:
\(\operatorname{Var}(X) = E\!\left[(X - E(X))^{2}\right] = \int_{-\infty}^{\infty} (x - \mu)^{2} f(x)\,dx \)
Here, μ = E(X) is the mean.
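Continuing the same illustrative exponential example, a short sketch (again assuming SciPy) can evaluate the variance integral directly:

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0
f = lambda x: lam * np.exp(-lam * x)  # same illustrative exponential PDF

mu, _ = quad(lambda x: x * f(x), 0, np.inf)

# Var(X) = integral of (x - mu)^2 * f(x) over the support
var, _ = quad(lambda x: (x - mu) ** 2 * f(x), 0, np.inf)
print(var)  # ~0.25, matching 1/lambda^2 for the exponential
```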
To understand probability distributions and conduct statistical analysis, we need to know the key properties of a continuous random variable. Several important properties distinguish it from a discrete random variable. They include:
Probability Density Function
A probability density function f(x) defines a continuous random variable X. The value f(x) represents the relative likelihood of X being near the value x, not the probability of X equalling x. The PDF must meet two conditions. First, f(x) ≥ 0 for all x: the density must always be non-negative, although it can be zero for some values. Second, the total probability over all possible values of x must equal 1, meaning the area under the entire curve is 100%, or 1. This is written as:
\(\int_{-\infty}^{\infty} f(x)\, dx = 1 \)
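Both conditions can be checked numerically. The sketch below (assuming SciPy and NumPy) tests a hypothetical candidate density f(x) = (5 − x)/12.5 on [0, 5]: it verifies that the function is non-negative on its support and that its integral equals 1.

```python
import numpy as np
from scipy.integrate import quad

# Hypothetical candidate PDF on [0, 5]: f(x) = (5 - x) / 12.5, zero elsewhere
f = lambda x: (5 - x) / 12.5

total, _ = quad(f, 0, 5)                                       # total probability
non_negative = all(f(x) >= 0 for x in np.linspace(0, 5, 101))  # check f(x) >= 0

print(total)         # 1.0 -> the density integrates to one
print(non_negative)  # True -> the density is never negative on its support
```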
Cumulative Distribution Function (CDF)
The cumulative distribution function (CDF), represented as F(x), shows the likelihood that a continuous random variable X will take a value that is less than or equal to x:
\(F(x) = P(X \leq x) = \int_{-\infty}^{x} f(t)\,dt \)
The CDF is non-decreasing and continuous. It approaches 0 as x moves toward negative infinity:
\(\lim_{x \to -\infty} F(x) = 0\)
Likewise, it approaches 1 as x moves toward positive infinity:
\(\lim_{x \to \infty} F(x) = 1\)
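These limit properties are easy to observe numerically. A small sketch, assuming SciPy and using the standard normal as an illustrative distribution:

```python
from scipy.stats import norm  # illustrative distribution

# F(x) is non-decreasing: near 0 far to the left, near 1 far to the right
print(norm.cdf(-10))  # ~7.6e-24, effectively 0
print(norm.cdf(0))    # 0.5
print(norm.cdf(10))   # ~1.0
```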
Moment Generating Function (MGF)
The moment generating function (MGF) of a continuous random variable X is denoted \(M_X(t)\). It is defined as:
\(M_X(t) = E\!\left(e^{tX}\right) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx \)
If the MGF exists, it can be used to find all the moments of X, including its mean and variance.
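A rough numerical sketch of this idea, assuming SciPy and the same illustrative exponential density used above: the MGF is evaluated by integration, and the mean is recovered as the derivative of the MGF at t = 0 (approximated by a central difference).

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0
f = lambda x: lam * np.exp(-lam * x)  # illustrative exponential PDF

def mgf(t):
    """M_X(t) = E(e^{tX}); for this density it converges only for t < lambda."""
    value, _ = quad(lambda x: np.exp(t * x) * f(x), 0, np.inf)
    return value

# E(X) = M'_X(0), approximated by a central difference
h = 1e-5
mean = (mgf(h) - mgf(-h)) / (2 * h)
print(mean)  # ~0.5 = 1/lambda
```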
Characteristic Function
The characteristic function of a continuous random variable X is written as \(\varphi_X(t)\). It is the Fourier transform of the probability density function (PDF):
\(\varphi_X(t) = E\!\left(e^{itX}\right) = \int_{-\infty}^{\infty} e^{itx} f(x)\,dx\)
It determines the distribution of X uniquely.
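Since the integrand is complex-valued, a numerical sketch can integrate the real and imaginary parts separately. The example below assumes SciPy and the same illustrative exponential density, and compares the result with the known closed form λ/(λ − it) for that distribution.

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0
f = lambda x: lam * np.exp(-lam * x)  # illustrative exponential PDF

def char_fn(t):
    """phi_X(t) = E(e^{itX}), with real and imaginary parts integrated separately."""
    real, _ = quad(lambda x: np.cos(t * x) * f(x), 0, np.inf)
    imag, _ = quad(lambda x: np.sin(t * x) * f(x), 0, np.inf)
    return complex(real, imag)

t = 1.5
print(char_fn(t))            # ~(0.64+0.48j)
print(lam / (lam - 1j * t))  # closed form for the exponential; the two agree
```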
Mean and Variance of Continuous Random Variable
A continuous random variable X with PDF f(x) has the following expectation or mean:
\(E(X) = \int_{-\infty}^{\infty} x f(x)\,dx \)
The formula represents the expected value of X.
The variance of X measures the average of the squared deviations of the random variable from the mean. It is defined as:
\(\operatorname{Var}(X) = E\!\left[(X - E(X))^{2}\right] = \int_{-\infty}^{\infty} (x - \mu)^{2} f(x)\,dx \)
Here, μ = E(X) is the mean.
Expanding the square gives the equivalent form Var(X) = E(X²) − μ², where the second moment E(X²) is:
\(E(X^2) = \int_{-\infty}^{\infty} x^2 f(x)\,dx \)
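The identity Var(X) = E(X²) − μ² can be confirmed numerically with the same illustrative exponential density (assuming SciPy):

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0
f = lambda x: lam * np.exp(-lam * x)  # illustrative exponential PDF

mu, _ = quad(lambda x: x * f(x), 0, np.inf)                # E(X)
second_moment, _ = quad(lambda x: x**2 * f(x), 0, np.inf)  # E(X^2)

print(second_moment - mu**2)  # ~0.25 = Var(X) = E(X^2) - mu^2
```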


A random experiment’s numerical outcome is called a random variable. The two types of random variables are discrete random variables and continuous random variables. Let us look at the differences between them:
| Continuous Random Variable | Discrete Random Variable |
|---|---|
| Can have any value within a specified range | Can only have particular and separate values |
| The possible values are infinite within a certain range. Example: All real values between 1 and 2 | The possible values are finite or countably infinite. Example: 1, 2, 3, and so on. |
| A probability density function (PDF) describes a continuous random variable | A probability mass function describes a discrete random variable |
| The probability of a single value is zero (P(X = x) = 0) | The probability of a single value can be non-zero (P(X = x) > 0) |
| It is represented by a smooth curve. | It is represented by bar graphs |
| Examples include time, distance, temperature, height, and weight | Examples include the number of children, the number of flowers, and the results of a die roll |
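The row about single-value probabilities is worth seeing in code. The sketch below (assuming SciPy, with a standard normal and a binomial chosen only for illustration) shows that a continuous variable assigns vanishing probability to any exact value, while a discrete variable can assign positive mass to one.

```python
from scipy.stats import norm, binom

# Continuous case: the probability of one exact value shrinks to zero
X = norm()                              # standard normal, for illustration
eps = 1e-9
print(X.cdf(1 + eps) - X.cdf(1 - eps))  # ~4.8e-10, tending to 0 as eps -> 0

# Discrete case: a single value can carry positive probability mass
Y = binom(n=10, p=0.5)
print(Y.pmf(5))                         # ~0.246, i.e. P(Y = 5) > 0
```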
We use continuous random variables to model situations that involve measurements. For example, they are used when we want to model the possible amount of rainfall over a year or the temperature on a given day. The following are important continuous random variables linked to specific probability distributions:
Uniform Random Variable
A uniform random variable represents a uniform distribution, which describes events with equal chances of happening. A uniform random variable’s PDF is as follows:
\(f(x) = \begin{cases} \dfrac{1}{b - a}, & a \le x \le b, \\ 0, & \text{otherwise.} \end{cases}\)
Here, a and b are the lower and upper bounds of the distribution.
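A minimal sketch of this density, assuming SciPy and illustrative bounds a = 2, b = 10 (SciPy parameterizes the uniform distribution with loc = a and scale = b − a):

```python
from scipy.stats import uniform

a, b = 2, 10                     # illustrative bounds
X = uniform(loc=a, scale=b - a)  # U(a, b) in scipy's parameterization

print(X.pdf(5))    # 1/(b-a) = 0.125 anywhere inside [2, 10]
print(X.pdf(12))   # 0.0 outside the interval
print(X.mean())    # (a+b)/2 = 6.0
```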
Normal Random Variable
A normal random variable is a continuous random variable used to model a normal distribution. If X follows a normal distribution with parameters μ and σ², written X ~ N(μ, σ²), then the formula for the PDF is:
\(f(x) = \frac{1}{\sigma \sqrt{2\pi}} e^{-\frac{1}{2}\left(\frac{x - \mu}{\sigma}\right)^2}\)
Here, μ is the mean
σ is the standard deviation
σ² is the variance
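The sketch below evaluates this PDF by hand and compares it with SciPy's implementation, using illustrative parameters μ = 100 and σ = 15 (assumptions chosen for the example):

```python
import numpy as np
from scipy.stats import norm

mu, sigma = 100, 15           # illustrative parameters
X = norm(loc=mu, scale=sigma)

x = 110
by_hand = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
print(by_hand)   # ~0.02130, from the closed-form PDF
print(X.pdf(x))  # the same value from scipy
```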
Exponential Random Variable
An exponential distribution is a continuous probability distribution used to model the time between events in a process where events occur continuously and independently at a constant average rate λ, where λ > 0. An exponential random variable follows an exponential distribution. The following is the PDF of an exponential random variable:
\(f(x) = \begin{cases} \lambda e^{-\lambda x}, & x \ge 0, \\ 0, & x < 0. \end{cases} \)
Where,
\(\lambda > 0\) is the rate parameter
\(x \ge 0\), because the exponential distribution models non-negative values
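A short sketch assuming SciPy and an illustrative rate λ = 0.5 (note that SciPy parameterizes the exponential distribution with scale = 1/λ):

```python
from scipy.stats import expon

lam = 0.5                 # illustrative rate parameter
X = expon(scale=1 / lam)  # scipy uses scale = 1/lambda

print(X.pdf(0))    # lambda * e^0 = 0.5
print(X.pdf(-1))   # 0.0, the density is zero for x < 0
print(X.mean())    # 1/lambda = 2.0
```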
Learning about continuous random variables becomes much simpler when you understand how they behave and how their probabilities are calculated. With some practical tips and tricks, learners can build a strong foundation, and parents and teachers can ensure a better learning experience.
A continuous random variable is a random variable that can take any value within a specific range. Knowing its key properties and concepts helps us solve various mathematical problems and improves our problem-solving skills. Here are some common mistakes and helpful ways to avoid them.
Within a specified range, a continuous random variable can have an infinite number of values. This concept is widely used in situations where measurements are involved. Here are some real-world applications of continuous random variables.
A random variable X has the probability density function (PDF): f (x) = k(5−x), 0 ≤ x ≤5. Find the value of k.
0.08
A function must adhere to the following basic rule in order to be considered a valid probability density function (PDF):
\(\int_{-\infty}^{\infty} f(x)\,dx = 1 \)
Since the given PDF is defined only for 0 ≤ x ≤ 5, the integral is taken over this range:
\(\int_{0}^{5} f(x)\,dx = 1 \)
Next, we substitute f(x) = k(5 − x):
\(\int_{0}^{5} k(5 - x)\,dx = 1 \)
Now, we factor out the constant k:
\(k \int_{0}^{5} (5 - x)\,dx = 1 \)
Then, we have to integrate (5 − x) term by term:
∫ (5 − x) dx = ∫ 5 dx − ∫ x dx
Here, we have to use the basic integration rules:
∫ 5 dx = 5x
∫ x dx = x² / 2
So the antiderivative is:
\(\int (5 - x)\,dx = 5x - \frac{x^2}{2} \)
Next, we evaluate the definite integral from 0 to 5:
\(\begin{align*} \int_{0}^{5} (5 - x)\,dx &= \left[ 5x - \frac{x^2}{2} \right]_0^5 \\ &= \left( 5(5) - \frac{5^2}{2} \right) - \left( 5(0) - \frac{0^2}{2} \right) \\ &= (25 - 12.5) - 0 \\ &= 12.5 \end{align*} \)
Substitute back to solve for k:
So, k (12.5) = 1
k = 1 / 12.5
k = 0.08
Hence, the value of k is 0.08.
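The same normalization condition can be solved symbolically. A short sketch assuming SymPy:

```python
import sympy as sp

x, k = sp.symbols('x k', positive=True)

# Require the PDF k*(5 - x) on [0, 5] to integrate to 1, then solve for k
equation = sp.Eq(sp.integrate(k * (5 - x), (x, 0, 5)), 1)
print(sp.solve(equation, k))  # [2/25], i.e. k = 0.08
```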
For a continuous random variable with PDF f(x) = 3x², 0 ≤ x ≤ 1, find the value of E(X).
3/4
Given: f(x) = 3x², 0 ≤ x ≤ 1
We have to find E(X). Here, we use the formula:
\(E(X) = \int_{a}^{b} x f(x)\,dx\)
Where a = 0 and b = 1
Now, we have to substitute f (x)
\(E(X) = \int_{0}^{1} x (3x^2)\,dx = \int_{0}^{1} 3x^3\,dx\)
Integrating,
\(\int 3x^3\,dx = \frac{3x^4}{4} + C\)
Evaluating the definite integral from 0 to 1,
\(E(X) = \left[ \frac{3x^4}{4} \right]_0^1 = \frac{3}{4} - 0 = \frac{3}{4} \)
Hence, the expected value is E(X) = 3/4 = 0.75.
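This result can be verified symbolically. A sketch assuming SymPy, which also confirms that f integrates to 1 on [0, 1]:

```python
import sympy as sp

x = sp.symbols('x')
f = 3 * x**2  # the PDF from the example, defined on [0, 1]

print(sp.integrate(f, (x, 0, 1)))      # 1, so f is a valid PDF
print(sp.integrate(x * f, (x, 0, 1)))  # 3/4, the expected value E(X)
```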
Find the mean E(X) for the uniform distribution U (2,10).
6
The formula for the mean of a uniform distribution U (a, b) is:
E (X) = (a + b) / 2
From U (2, 10), we have a = 2 and b = 10, so,
E (X) = (2 + 10) / 2
= 12 / 2 = 6
Hence, the mean is 6.
Find the value m such that P(X ≤ m) = 0.6 for the uniform distribution U(0, 5).
m = 3
Here, to find the value of m, we use the CDF of the uniform distribution:
F(m) = (m − a) / (b − a)
From U (0,5), this becomes:
F(m) = (m − 0) / 5 = m / 5
Setting F(m) = 0.6
m / 5 = 0.6
Next, we can solve for m:
m = 0.6 × 5
m = 3
So, m = 3, the 60th percentile of the distribution.
Find the median m such that P (X ≤ m) = 0.5 for the uniform distribution U (2, 8).
m = 5
Using the CDF of the uniform distribution:
F(m) = (m − a) / (b − a)
From U (2,8), this becomes:
F(m) = (m − 2) / (8 − 2) = (m − 2) / 6
Setting F(m) = 0.5:
(m − 2) / 6 = 0.5
Next, we can solve for m:
m − 2 = 0.5 × 6
m − 2 = 3
m = 5
So, the median is 5.
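Both quantile calculations can be reproduced with SciPy's inverse CDF (`ppf`), assuming SciPy is available; recall that `uniform(loc=a, scale=b-a)` represents U(a, b):

```python
from scipy.stats import uniform

# U(0, 5): the value with 60% of the probability below it
print(uniform(loc=0, scale=5).ppf(0.6))  # 3.0

# U(2, 8): the median, i.e. the 0.5 quantile
print(uniform(loc=2, scale=6).ppf(0.5))  # 5.0
```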






