CDS 110b: Random Processes

From Murray Wiki
WARNING: This page is for a previous year.
See current course homepage to find most recent page available.

This lecture presents an introduction to random processes.

Lecture Outline

  1. Quick review of random variables
  2. Random Processes
    • Definition
    • Properties
    • Example

Lecture Materials

References and Further Reading

  • Hoel, Port and Stone, Introduction to Probability Theory - this is a good reference for basic definitions of random variables
  • Apostol II, Chapter 14 - another reference for basic definitions in probability and random variables

Frequently Asked Questions

Q: Can you explain the jump from pdfs to correlations in more detail?

The probability density function (pdf), \(p(x; t)\) tells us how the value of a random process is distributed at a particular time:

\( P(a \leq x(t) \leq b) = \int_a^b p(x; t) dx. \)

You can interpret this by thinking of \(x(t)\) as a separate random variable for each fixed time \(t\).
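To make this concrete, here is a small Monte Carlo sketch for a hypothetical process in which \(x(t) \sim N(0, 1 + t)\) at each fixed time \(t\) (the process itself and the numbers chosen are illustrative, not from the lecture). Sampling \(x(t)\) many times and counting how often it lands in \([a, b]\) should reproduce the integral of the pdf:

```python
import numpy as np
from math import erf, sqrt

# Hypothetical process: for each fixed t, x(t) ~ N(0, 1 + t),
# so p(x; t) is a Gaussian density whose variance grows with time.
rng = np.random.default_rng(0)

def sample_x(t, n):
    """Draw n independent realizations of the random variable x(t)."""
    return rng.normal(0.0, np.sqrt(1.0 + t), size=n)

# Estimate P(a <= x(t) <= b) by Monte Carlo ...
t, a, b = 2.0, -1.0, 1.0
samples = sample_x(t, 200_000)
p_mc = np.mean((a <= samples) & (samples <= b))

# ... and compare with the integral of p(x; t) over [a, b],
# written in closed form via the error function.
sigma = sqrt(1.0 + t)
p_exact = 0.5 * (erf(b / (sigma * sqrt(2.0))) - erf(a / (sigma * sqrt(2.0))))

print(p_mc, p_exact)  # the two estimates agree to a few decimal places
```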

The correlation for a random process tells us how the value of a random process at one time, \(t_1\), is related to the value at a different time, \(t_2\). This relationship is probabilistic, so it is also described in terms of a distribution. In particular, we use the joint probability density function, \(p(x_1, x_2; t_1, t_2)\), to characterize this:

\( P(a_1 \leq x_1(t_1) \leq b_1, a_2 \leq x_2(t_2) \leq b_2) = \int_{a_1}^{b_1} \int_{a_2}^{b_2} p(x_1, x_2; t_1, t_2) dx_1 dx_2 \)

Given any random process, \(p(x_1, x_2; t_1, t_2)\) describes (as a density) how the value of the random variable at time \(t_1\) is related (or "correlated") with the value at time \(t_2\). We can thus describe a random process according to its joint probability density function.

In practice, we don't usually describe random processes in terms of their pdfs and joint pdfs. It is usually easier to describe them in terms of their statistics (mean, variance, etc.). In particular, we almost never describe the correlation in terms of joint pdfs, but instead use the correlation function:

\( \rho(t, \tau) = E\{x(t) x(\tau)\} = \int_{-\infty}^\infty \int_{-\infty}^\infty x_1 x_2 p(x_1, x_2; t, \tau) dx_1 dx_2 \)
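The expectation \(E\{x(t)x(\tau)\}\) can be estimated by averaging over many sample paths. As an illustrative case (my choice of example, not the lecture's), a discrete-time random walk built from independent unit-variance Gaussian increments has the known correlation \(\rho(k_1, k_2) = \min(k_1, k_2)\), which a Monte Carlo average should recover:

```python
import numpy as np

# Monte Carlo estimate of rho(t, tau) = E{x(t) x(tau)} for a discrete-time
# random walk x(k) = w(1) + ... + w(k), with w(i) iid N(0, 1).
# The exact correlation is rho(k1, k2) = min(k1, k2).
rng = np.random.default_rng(1)

n_paths, n_steps = 100_000, 20
increments = rng.normal(size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)    # paths[:, k-1] holds x(k)

k1, k2 = 5, 12                           # two sample times
rho_est = np.mean(paths[:, k1 - 1] * paths[:, k2 - 1])
rho_exact = min(k1, k2)                  # = 5

print(rho_est, rho_exact)
```

Averaging the product \(x(k_1)x(k_2)\) over 100,000 independent paths gives an estimate within a few percent of the exact value.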

The utility of this particular function is seen primarily through its application: if we know the correlation for one random process and we "filter" that random process through a linear system, we can compute the correlation for the corresponding output process (we'll see this in the next lecture).
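As a preview of that idea, here is a sketch (with parameters of my own choosing) that passes white noise through a simple first-order linear system \(y[k+1] = a\,y[k] + w[k]\). For unit-variance white noise, the output's steady-state correlation is known in closed form, \(E\{y[k]\,y[k+d]\} = a^{|d|}/(1 - a^2)\), and a time average over a long run should match it:

```python
import numpy as np

# "Filter" unit-variance white noise w[k] through the first-order linear
# system y[k+1] = a*y[k] + w[k], then check that the output correlation
# matches the steady-state formula  E{y[k] y[k+d]} = a^|d| / (1 - a^2).
rng = np.random.default_rng(2)
a = 0.8
n = 500_000

w = rng.normal(size=n)
y = np.zeros(n)
for k in range(n - 1):
    y[k + 1] = a * y[k] + w[k]

d = 3          # time separation between the two samples
burn = 1_000   # discard the initial transient so y is roughly stationary
corr_est = np.mean(y[burn:n - d] * y[burn + d:])
corr_exact = a**d / (1 - a**2)

print(corr_est, corr_exact)
```

The empirical time average converges to the closed-form output correlation, which is the kind of input-to-output correlation computation developed in the next lecture.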