PTSP Lab Record
Ranga Harshith
SECTION A
21ECB0A47
Experiment 1
Problem Statement:
Write a MATLAB program to estimate the probabilities of the outcomes of tossing a coin and rolling a
die through a large number of trials.
Theory:
The process of tossing a coin and rolling a die are examples of simple random experiments, where
the outcome of each experiment is uncertain and can be one of several possible outcomes with
equal probability. The probability of each possible outcome can be calculated using the formula:
P(outcome) = (number of favorable outcomes) / (total number of equally likely outcomes)
For a coin toss, the possible outcomes are heads and tails, each with a probability of 0.5. For a roll of
a six-sided die, the possible outcomes are 1, 2, 3, 4, 5, and 6, each with a probability of 1/6.
To simulate these experiments, we can use the randi function in MATLAB to generate a random
integer within a specified range. By repeating this process many times, we can generate a large
number of random outcomes and estimate the probability of each possible outcome based on the
number of times it occurs.
In this experiment, we aim to simulate the process of tossing a coin and rolling a die through a large
number of experiments using MATLAB and to calculate the probability of each possible outcome. By
running a large number of experiments, we can estimate the probabilities of each outcome with
greater accuracy, and compare our results to the theoretical probabilities derived from the formula
above.
Code:
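% The record reproduces only the die-roll subplot below; a minimal sketch of
% the simulation and the coin subplot it relies on (number of trials N and
% variable names are assumptions) is:
N = 100000;                                        % number of trials
coin = randi([0 1], N, 1);                         % 0 = tails, 1 = heads
die = randi(6, N, 1);                              % die outcomes 1..6
coin_probabilities = [sum(coin == 0), sum(coin == 1)] / N;
die_probabilities = histcounts(die, 0.5:1:6.5) / N;
subplot(2, 1, 1);
bar([0 1], coin_probabilities);
title('Coin Toss Probabilities');
xlabel('Outcome (0 = tails, 1 = heads)');
ylabel('Probability');
ylim([0, 1]);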
subplot(2, 1, 2);
bar(1:6, die_probabilities);
title('Die Roll Probabilities');
xlabel('Number');
ylabel('Probability');
xticks(1:6);
ylim([0, 1]);
Output:
Conclusion:
In conclusion, this MATLAB program demonstrates how to simulate the probabilities of coin
tosses and die rolls using random number generation and how to visualize the results using
bar charts. By running a large number of trials, we can estimate the probabilities of each
outcome and see that they approach the theoretical probabilities. This approach can be
extended to other probability experiments and can be useful for analysing and visualizing
data in various fields.
Experiment 2
Problem Statement:
Generate Uniform, Gaussian and Exponential distributed random data for given mean and
variance using MATLAB programming.
Theory:
Uniform distribution: A uniform distribution is a probability distribution where all possible values are
equally likely to occur. The probability density function of a uniform distribution is given by:
f(x) = 1/(b - a) for a <= x <= b, and 0 otherwise
where a and b are the minimum and maximum values of the distribution, respectively.
Gaussian distribution: A Gaussian (normal) distribution is a bell-shaped probability distribution whose
probability density function is given by:
f(x) = (1/(sigma*sqrt(2*pi))) * exp(-(x - mu)^2 / (2*sigma^2))
where mu is the mean of the distribution, sigma is the standard deviation, and pi is the mathematical
constant pi.
Exponential distribution: An exponential distribution describes the time between events in a Poisson
process, and its probability density function is given by:
f(x) = lambda * exp(-lambda*x) for x >= 0, and 0 otherwise
where lambda is the rate parameter, which determines how quickly the probability density function
decreases.
Code:
% Set the desired mean and variance
mu = 10;
sigma2 = 5;
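% The record omits the data-generation step; a minimal sketch (N = 1000
% samples assumed; variable names match the plotting code below):
N = 1000;
a = mu - sqrt(3*sigma2);                           % uniform limits giving mean mu
b = mu + sqrt(3*sigma2);                           % and variance sigma2
uniform_data = a + (b - a) * rand(N, 1);
gaussian_data = mu + sqrt(sigma2) * randn(N, 1);   % mean mu, variance sigma2
exponential_data = exprnd(mu, N, 1);               % mean mu (variance mu^2)
subplot(3, 1, 1);
histogram(uniform_data, 'Normalization', 'probability');
title('Uniform Distribution');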
subplot(3, 1, 2);
histogram(gaussian_data, 'Normalization', 'probability');
title('Gaussian (Normal) Distribution');
subplot(3, 1, 3);
histogram(exponential_data, 'Normalization', 'probability');
title('Exponential Distribution');
Output:
Uniform data: mean = 10.030342, variance = 4.743445
Gaussian data: mean = 9.985977, variance = 5.120541
Exponential data: mean = 10.146577, variance = 101.128696
Conclusion:
In this MATLAB code, we have generated three types of random data with a specified mean
and variance: uniform, Gaussian, and exponential. We have then calculated and printed the
sample mean and variance for each distribution. The plots show that the Gaussian
distribution is symmetric and bell-shaped, the uniform distribution is flat and uniform, and
the exponential distribution is skewed to the right.
Experiment 3
Problem Statement:
Write a MATLAB program to generate M trials of a random experiment having specific
number of outcomes with specified probabilities.
Theory:
Suppose a random experiment has n possible outcomes, with probabilities p1, p2, ..., pn. To generate
M trials of this experiment, we can use the rand function to generate a uniformly distributed random
number between 0 and 1 for each trial. Then we can use the cumulative probabilities to determine
which outcome has been chosen.
In this code, we first get the number of outcomes n, the probabilities p for each outcome, and the
number of trials M to generate from the user. Then we compute the cumulative probabilities cp for
each outcome using the cumsum function. Finally, we generate M trials by generating a uniformly
distributed random number r between 0 and 1 using the rand function, and then using the
cumulative probabilities cp to determine which outcome was chosen. The outcomes for each trial are
stored in a 1xM vector and displayed using the disp function.
Code:
% Generate M trials
trials = rand(M, 1);
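% The record omits the remaining steps described in the theory; a minimal
% sketch (assuming n, p and M were read earlier with input()):
cp = cumsum(p(:)');                            % cumulative probabilities
cp(end) = 1;                                   % guard against rounding
outcomes = zeros(1, M);                        % outcomes of the M trials
for k = 1:M
    outcomes(k) = find(trials(k) <= cp, 1);    % first cumulative bin reached
end
disp(outcomes);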
Output:
Conclusion:
In conclusion, this MATLAB program provides a simple and flexible way to generate M trials
of a random experiment with a specified number of outcomes and probabilities. It can be
used for a wide variety of applications, including statistical simulations, modeling, and
analysis.
Experiment 4
Problem Statement:
Write a MATLAB program to find estimated and true mean of Uniform, Gaussian, and Exponential
distributed data.
Theory:
For each distribution, we will generate a sample of random data using the appropriate MATLAB
function. We will then compute the estimated mean of the sample using the formula:
estimated_mean = sum(data) / n
where n is the number of data points in the sample. We will also compute the true mean of the
distribution using the formula:
true_mean = integral of x * pdf(x) dx over the support of the distribution
where pdf(x) is the probability density function for the distribution. For the Uniform distribution, the
PDF is constant over the range [a, b], and for the Gaussian and Exponential distributions, we can use
the built-in normpdf and exppdf functions, respectively.
Code:
mu_gaussian = 10;
sigma2_gaussian = 5;
mu_exponential = 10;
lambda = 1/mu_exponential;
exponential_data = exprnd(1/lambda, [1000, 1]);
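% The record omits the uniform data, the Gaussian samples, the estimated
% means and the first subplot; a minimal sketch (uniform limits a and b assumed):
a = 5; b = 15;                                     % true uniform mean (a+b)/2 = 10
uniform_data = a + (b - a) * rand(1000, 1);
gaussian_data = normrnd(mu_gaussian, sqrt(sigma2_gaussian), [1000, 1]);
fprintf('Uniform: estimated mean = %f, true mean = %f\n', mean(uniform_data), (a + b)/2);
fprintf('Gaussian: estimated mean = %f, true mean = %f\n', mean(gaussian_data), mu_gaussian);
fprintf('Exponential: estimated mean = %f, true mean = %f\n', mean(exponential_data), mu_exponential);
subplot(3,1,1)
histogram(uniform_data);
title('Uniform Data');
grid on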
subplot(3,1,2)
histogram(gaussian_data);
title('Gaussian Data');
grid on
subplot(3,1,3)
histogram(exponential_data);
title('Exponential Data');
grid on
Output:
Conclusion:
In this MATLAB program, we generated random data with uniform, Gaussian, and
exponential distributions using specified mean and variance values. We then calculated the
sample mean for each distribution and compared it with the true mean value. We found that
the sample mean was close to the true mean for each distribution, demonstrating the
effectiveness of random sampling in estimating population parameters. The plot of the data
distributions and means provides a visual representation of the results.
Experiment 5
Problem Statement:
To find density and distribution function of a function of random variable Y = 2X + 1. Where
X is Gaussian R.V.
Theory:
Given a Gaussian random variable X with mean mu and standard deviation sigma, we can find the
probability density function (PDF) and cumulative distribution function (CDF) of a function of X, say Y
= 2X + 1, as follows:
PDF of Y: fY(y) = (1/2) * fX((y - 1)/2), where fX(x) is the PDF of X
CDF of Y: FY(y) = P(Y <= y) = P(2X + 1 <= y) = P(X <= (y-1)/2) = FX((y-1)/2) where FX(x) is the CDF of X
This code defines the mean and standard deviation of the Gaussian random variable X, and the
function Y = 2X + 1. It then defines the range of Y values for which we want to find the PDF and CDF,
and uses the formulae for the PDF and CDF of Y to find their values at these Y values. Finally, it plots
the PDF and CDF of Y using the MATLAB subplot function.
Code:
mu = 5;
sigma = 2;
% Define the function Y = 2X + 1
fun = @(x) 2*x + 1;
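% The record omits the evaluation and plotting of the PDF and CDF of Y
% described in the theory; a minimal sketch (range of y values assumed):
y = linspace(fun(mu - 4*sigma), fun(mu + 4*sigma), 500);
pdf_Y = (1/2) * normpdf((y - 1)/2, mu, sigma);   % fY(y) = (1/2) fX((y-1)/2)
cdf_Y = normcdf((y - 1)/2, mu, sigma);           % FY(y) = FX((y-1)/2)
subplot(2, 1, 1);
plot(y, pdf_Y);
title('PDF of Y = 2X + 1');
subplot(2, 1, 2);
plot(y, cdf_Y);
title('CDF of Y = 2X + 1');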
Output:
Conclusion:
Experiment 6
Problem Statement:
Estimate the mean and variance of a function of a random variable Y = 2X + 1, where X is a Gaussian
R.V., using MATLAB programming.
Theory:
Given a Gaussian random variable X with mean mu and standard deviation sigma, and a function of
X, say Y = 2X + 1, we can estimate the mean and variance of Y using the following formulas:
Mean of Y:
E(Y) = 2E(X) + 1
= 2mu + 1
Variance of Y:
Var(Y) = 4Var(X)
= 4sigma^2
We can estimate the mean and variance of Y using a large number of samples of X, say N, as follows:
Code:
% Define the Gaussian distribution for X
mu_X = 10;
sigma_X = 2;
X = normrnd(mu_X, sigma_X, [1000, 1]);
% Define Y = 2X + 1
Y = 2*X + 1;
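% The record omits the estimation and display steps; a minimal sketch:
true_mean_Y = 2*mu_X + 1;                 % E(Y) = 2*E(X) + 1
true_var_Y = 4*sigma_X^2;                 % Var(Y) = 4*Var(X)
fprintf('True mean of Y: %f\n', true_mean_Y);
fprintf('Estimated mean of Y: %f\n', mean(Y));
fprintf('True variance of Y: %f\n', true_var_Y);
fprintf('Estimated variance of Y: %f\n', var(Y));
histogram(Y, 'Normalization', 'pdf');
title('Distribution of Y = 2X + 1');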
Output:
True mean of Y: 21.000000
Estimated mean of Y: 21.263994
True variance of Y: 16.000000
Estimated variance of Y: 16.945843
Conclusion:
This code generates a Gaussian random variable X with mean 10 and standard deviation 2. It
then calculates Y = 2X + 1 and the true and estimated mean and variance of Y. It plots the
histogram of Y and overlays the true and estimated Gaussian distributions of Y. The resulting
plot shows that the estimated mean and variance of Y are close to the true values, and that
the distribution of Y is approximately Gaussian.
Experiment 7
Problem Statement:
Plot Joint density and distribution function of sum of two Gaussian random variable (Z = X +
Y) using MATLAB programming.
Theory:
The variables X and Y have means and variances of mu1, var1 and mu2, var2 respectively. The first
part of the code calculates the mean and variance of the sum Z when X and Y are independent and
dependent with correlation coefficient r=0.5.
The second part of the code generates a grid of X and Y values using the linspace function. Then, the
joint density function is calculated using the formula for independent X and Y. This is plotted using
the surf function to create a 3D plot.
Next, the function for the dependent X and Y is defined using the anonymous function handle fxyd,
which takes in the values of X and Y as input. The joint density function is then calculated and plotted
using the surf function to create another 3D plot.
Overall, this experiment demonstrates how to calculate and plot the joint density and distribution
function of the sum of two Gaussian random variables.
Code:
% Define the means and variances of the two Gaussian random variables
mu1 = 0;
mu2 = 0;
sigma1 = 1;
sigma2 = 2;
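% The record omits the grid of X and Y values described in the theory; a
% minimal sketch using linspace and meshgrid (range assumed):
x = linspace(-10, 10, 200);
y = linspace(-10, 10, 200);
[X, Y] = meshgrid(x, y);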
% Calculate the joint density function of the sum of the two Gaussian random variables
Z = X + Y;
mu = mu1 + mu2;
sigma = sqrt(sigma1^2 + sigma2^2);
pdf = (1/(sigma*sqrt(2*pi))) * exp(-(Z-mu).^2/(2*sigma^2));
% Calculate the joint distribution function of the sum of the two Gaussian random variables
cdf = 0.5 * erfc(-(Z-mu)/(sigma*sqrt(2)));
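% The record omits the plotting step; a minimal sketch using surf for the
% density and contour for the distribution (as mentioned in the conclusion):
figure;
subplot(2, 1, 1);
surf(X, Y, pdf, 'EdgeColor', 'none');
title('Density of Z = X + Y over the (X, Y) grid');
xlabel('X'); ylabel('Y');
subplot(2, 1, 2);
contour(X, Y, cdf);
title('Distribution of Z = X + Y over the (X, Y) grid');
xlabel('X'); ylabel('Y');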
Output:
Conclusion:
In this task, we plotted the joint probability density function and joint cumulative
distribution function of the sum of two independent Gaussian random variables X and Y. We
used the MATLAB functions 'meshgrid' and 'surf' to plot the joint PDF and 'contour' to plot
the joint CDF. From the plots, we can see that the joint PDF and CDF of the sum of two
Gaussian random variables are also Gaussian. Moreover, the spread of the PDF and CDF is
wider than the spread of the individual Gaussian random variables.
Experiment 8
Problem Statement:
Estimate the mean and variance of a R.V. Z = X+Y. Where X and Y are also random variables
using MATLAB programming.
Theory:
Let X and Y be two random variables. Then, the sum of these two random variables can be
represented as a new random variable Z = X + Y. The mean and variance of the sum of two random
variables can be found using the following formulas:
E(Z) = E(X) + E(Y)
Var(Z) = Var(X) + Var(Y) + 2*Cov(X, Y), which reduces to Var(X) + Var(Y) when X and Y are independent.
In this experiment, we will estimate the mean and variance of the random variable Z, where X and Y
are also random variables.
Code:
% Define the mean and standard deviation of X and Y
mu_X = 2;
sigma_X = 3;
mu_Y = 5;
sigma_Y = 2;
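% The record omits the sampling and estimation steps; a minimal sketch
% (1000 samples assumed; sigma_X and sigma_Y used as standard deviations):
X = normrnd(mu_X, sigma_X, [1000, 1]);
Y = normrnd(mu_Y, sigma_Y, [1000, 1]);
Z = X + Y;
fprintf('Mean of Z: %f\n', mean(Z));
fprintf('Variance of Z: %f\n', var(Z));
histogram(Z, 'Normalization', 'pdf');
title('Density of Z = X + Y');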
Output:
Mean of Z: 6.893635
Variance of Z: 12.398604
Conclusion:
In this code, we first define the mean and variance of the random variables X and Y. We then
generate 1000 random data points for X and Y using the normrnd function. Next, we
calculate the mean and variance of Z = X + Y using the generated data. We then print the
results. Finally, we plot the histogram of Z using the histogram function, with the option
'Normalization', 'pdf' to plot the probability density function instead of the frequency. The
resulting plot shows the joint density function of Z = X + Y.
Experiment 9
Problem Statement:
Simulation of Central Limit Theorem using MATLAB programming.
Theory:
The Central Limit Theorem states that the sum of a large number of independent, identically
distributed random variables tends towards a Gaussian distribution, regardless of the
underlying distribution of the individual random variables. In other words, as the sample size
increases, the sample mean converges to a Gaussian distribution, regardless of the
distribution of the original variables. This theorem is useful in statistics because it allows us
to make inferences about the population mean, even if we don't know the underlying
distribution of the population.
To simulate the Central Limit Theorem, we will generate a large number of samples of size n
from a non-Gaussian distribution, calculate the sample mean of each sample, and plot the
distribution of the sample means. We expect the distribution of sample means to converge
towards a Gaussian distribution as the sample size increases.
Code:
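% The record omits the sampling step described in the theory; a minimal
% sketch (Uniform(0,1) population; sample size n and number of samples
% num_samples are assumptions):
n = 30;                                      % sample size
num_samples = 10000;                         % number of repeated samples
data = rand(num_samples, n);                 % Uniform(0,1) population
sample_means = mean(data, 2);                % one sample mean per row
pop_mean = mean(data(:));                    % population mean (approx. 0.5)
pop_std = std(data(:));                      % population std (approx. 0.289)
figure;
histogram(sample_means, 'Normalization', 'pdf');
hold on;
xm = linspace(0, 1, 1000);
plot(xm, normpdf(xm, pop_mean, pop_std/sqrt(n)), 'LineWidth', 2);
title('Distribution of Sample Means vs. CLT Normal Approximation');
legend('Sample means', 'Normal approximation');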
% Plot the population and the normal distribution with the same mean and std
x = linspace(0,1,1000);
y_pop = unifpdf(x, 0, 1);
y_norm = normpdf(x, pop_mean, pop_std);
figure;
plot(x, y_pop, 'LineWidth', 2);
hold on;
plot(x, y_norm, 'LineWidth', 2);
xlabel('X');
ylabel('Density');
title('Population Distribution and Normal Approximation');
legend('Population', 'Normal Approximation');
Output:
Conclusion:
In this simulation of the Central Limit Theorem in MATLAB, we generated repeated samples from a
uniform (non-Gaussian) population on [0, 1]. As the sample size increased,
the distribution of the sample mean approached a normal distribution with the same mean
and a smaller standard deviation. This provides strong evidence for the Central Limit
Theorem and its ability to explain the behavior of sample means in large samples.
Experiment 10
Problem Statement:
Write two MATLAB functions to generate samples of stationary Gaussian processes using
MATLAB programming.
Theory:
This function generates samples of a stationary white Gaussian noise process. The number of
samples N and the variance of the Gaussian process s2 are specified. The output is a row vector of
Gaussian process samples y.
This function generates samples of a stationary autoregressive Gaussian process. The number of
samples N, the AR coefficients a (specified as a row vector of length p), and the variance of the
Gaussian process s2 are specified. The output is a row vector of Gaussian process samples y. Note
that the length of the output y is N, but p extra samples are generated to initialize the AR process.
A Gaussian process is a stochastic process where any finite subset of the process follows a
multivariate normal distribution. The stationary Gaussian process is a type of Gaussian process
where the statistical properties of the process remain constant over time.
There are several ways to generate samples of stationary Gaussian processes. Two commonly used
methods are:
1. Spectral representation method: In this method, a stationary Gaussian process can be represented
as a sum of sine and cosine waves with random coefficients.
2. Autoregressive (AR) method: In this method, a stationary Gaussian process is generated by filtering
white Gaussian noise through an autoregressive model with specified coefficients.
MATLAB provides built-in functions to generate samples of stationary Gaussian processes. However,
in this experiment, we will create two MATLAB functions to generate samples of stationary Gaussian
processes using the above two methods.
Code:
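% The first function described in the theory (stationary white Gaussian
% noise) is not reproduced in the record; a minimal sketch (function name
% white_gp assumed) is:
function y = white_gp(N, s2)
% Returns a 1xN row vector of zero-mean Gaussian samples with variance s2
y = sqrt(s2) * randn(1, N);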
function y = ar_gp(N, a, s2)   % function name ar_gp assumed
% Generates N samples of a stationary autoregressive Gaussian process with
% AR coefficients a (1xp row vector) and innovation variance s2
p = length(a);
e = sqrt(s2) * randn(1, N+p);           % white Gaussian innovations
y = zeros(1, N+p);
for n = p+1:N+p
    y(n) = a * y(n-1:-1:n-p).' + e(n);  % AR recursion (inner product)
end
y = y(p+1:end);                         % discard the p initialization samples
Conclusion:
In this task, we have provided two MATLAB functions to generate samples of stationary
Gaussian processes. The first function generates samples of a stationary white Gaussian
noise process by drawing samples from a normal distribution with zero mean and specified
variance. The second function generates samples of a stationary autoregressive Gaussian
process using an autoregressive model with specified coefficients and variance. These
functions can be used to simulate stationary Gaussian processes for various applications,
such as signal processing, communications, and control systems.
Experiment 11
Problem Statement:
Write three MATLAB functions to generate samples of non-stationary Gaussian processes
using MATLAB programming.
Theory:
1. This function generates samples of a non-stationary Gaussian process with a linearly-
varying mean function. The initial mean value m0 and final mean value m1 are
specified, as well as the total time duration T, the number of samples N, and the
variance of the Gaussian process s2. The output is a time vector t and Gaussian
process samples y.
2. This function generates samples of a non-stationary Gaussian process with an
exponentially decaying correlation function. The correlation time constant tau, the
total time duration T, the number of samples N, and the variance of the Gaussian
process s2 are specified. The output is a time vector t and Gaussian process samples
y.
3. This function generates samples of a non-stationary Gaussian process with a time-
varying correlation function. The correlation function is specified using a function
handle acf that takes a time vector as input and returns the corresponding
correlation function values. The total time duration T, the number of samples N, and
the variance of the Gaussian process s2 are also specified. The output is a time vector
t and Gaussian process samples y.
Code:
1. Linearly-varying mean function Gaussian process
function [t, y] = linear_gp(T, N, m0, m1, s2)
% Generates samples of a non-stationary Gaussian process with a linearly-varying mean function
% Inputs:
% T: total time duration (seconds)
% N: number of samples
% m0: initial mean value
% m1: final mean value
% s2: variance of the Gaussian process
% Outputs:
% t: time vector
% y: Gaussian process samples
t = linspace(0, T, N);
m = m0 + (m1 - m0) * t / T;
y = m + sqrt(s2) * randn(size(t));
2. Exponentially decaying correlation function Gaussian process:
function [t, y] = expcorr_gp(tau, T, N, s2)   % function name assumed; inputs as described in the theory
t = linspace(0, T, N);
acf = exp(-abs(t) / tau);
y = sqrt(s2) * randn(size(t));
y = filter(acf, 1, y);
3. Time-varying correlation function Gaussian process:
function [t, y] = tvcorr_gp(acf, T, N, s2)    % function name assumed; acf is a function handle
t = linspace(0, T, N);
acf_values = acf(t);
y = sqrt(s2) * randn(size(t));
y = filter(acf_values, 1, y);
Conclusion:
In conclusion, there are various ways to generate samples of non-stationary Gaussian processes
using MATLAB programming. These functions provide a way to generate realistic samples of non-
stationary Gaussian processes for various applications, such as time series analysis, signal processing,
and machine learning. However, it is important to carefully choose the appropriate function and
parameters based on the specific application and the properties of the underlying process.
Experiment 12
Problem Statement:
Verify relations between correlation and power spectral density using MATLAB simulation using
MATLAB programming.
Theory:
For a stationary random process x(t), the ACF and PSD are related by the Wiener-Khinchin theorem
as follows:
Rxx(tau) = E[x(t) * x(t + tau)]
Sxx(f) = integral of Rxx(tau) * exp(-j*2*pi*f*tau) dtau, taken over all tau
where E[.] represents the expected value, and j is the imaginary unit.
In words, the PSD is the Fourier transform of the ACF. This relationship holds for any stationary
random process.
To verify this relationship using MATLAB, we can generate a realization of a stationary random
process, estimate its ACF and PSD, and then compare them using the Wiener-Khinchin theorem.
In this code, we generate a realization of a stationary Gaussian random process x(t) with a sinusoidal
signal at frequency f0 = 10 Hz. We then compute the ACF of x(t) using the xcorr function, and
estimate its PSD using the Fourier transform of the ACF. Finally, we plot the ACF and PSD, and verify
the Wiener-Khinchin theorem by computing the inverse Fourier transform of the PSD and comparing
it to the original ACF.
Code:
% Parameters
N = 1000; % number of samples
fs = 100; % sampling frequency
f0 = 10; % signal frequency
sigma = 1; % standard deviation
% Generate signal
t = (0:N-1)/fs;
x = sigma * randn(size(t)) .* sin(2*pi*f0*t);
% Compute ACF
[rxx, lags] = xcorr(x, 'biased');
% Compute PSD (frequency axis matched to the 2N-1 ACF lags)
Nfft = length(rxx);
f = (-(Nfft-1)/2:(Nfft-1)/2) * fs / Nfft;
sxx = abs(fftshift(fft(rxx)));
% Plot results
subplot(2,1,1)
plot(lags/fs, rxx)
xlabel('Time lag (s)')
ylabel('ACF')
title('Autocorrelation function')
subplot(2,1,2)
plot(f, sxx)
xlabel('Frequency (Hz)')
ylabel('PSD')
title('Power spectral density')
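% The verification step described in the theory is not reproduced in the
% record; one way to check the Wiener-Khinchin relation numerically is to
% compare the transform of the ACF with the periodogram of x (an equivalent
% check to the inverse-transform comparison mentioned above):
pxx = fftshift(abs(fft(x)).^2 / N);          % periodogram of x
f_px = (-N/2:N/2-1) * fs / N;
figure;
plot(f, sxx, f_px, pxx, '--');
xlabel('Frequency (Hz)');
ylabel('PSD');
legend('Fourier transform of ACF', 'Periodogram');
title('Wiener-Khinchin verification');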
Output:
Conclusion:
In this experiment, we have verified the relationship between the ACF and PSD of a stationary
random process using MATLAB simulation. The Wiener-Khinchin theorem states that the PSD is the
Fourier transform of the ACF, and this relationship holds for any stationary random process. By
generating a realization of a stationary Gaussian random process and estimating its ACF and PSD, we
have demonstrated this relationship in practice using MATLAB.