ECE534 - Random Processes

Sample Sets and Sigma Algebras

Rolling a die: \(\Omega = \{1,2,3,4,5,6\}\)

We call \(\Omega\) the sample space (fundamental set)

Event: rolling an odd number - \(A = \{1, 3, 5\} \subseteq \Omega\)

A sigma-algebra \(\mathcal{F}\) is a collection of subsets of \(\Omega\) that satisfies the following properties

  • \(\emptyset \in \mathcal{F}\)
  • \(A \in \mathcal{F} \implies A^c \in \mathcal{F}\)
  • \(A_1, A_2, \dots \in \mathcal{F} \implies \bigcup_{i=1}^{\infty} A_i \in \mathcal{F}\)

Examples:

  1. \(\mathcal{F} = \{\emptyset, \Omega\}\)

  2. \(\mathcal{F} = \{\emptyset, \Omega, \{1, 3, 5\}, \{2, 4, 6\}\}\)

\(\sigma(A)\) is defined as the smallest sigma-algebra that contains \(A\)

Equivalently, \(\sigma(A)\) is the intersection of all sigma-algebras that contain \(A\)
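
For a finite \(\Omega\) this construction can be carried out directly: start from the generating sets and keep closing under complement and union until nothing new appears. A minimal Python sketch (the function name and the die example are illustrative, not from the notes):

```python
from itertools import combinations

def generate_sigma_algebra(omega, collection):
    """Close a collection of subsets of a finite omega under complement and
    union; for a finite omega the fixed point is sigma(collection)."""
    omega = frozenset(omega)
    sets = {frozenset(), omega} | {frozenset(s) for s in collection}
    changed = True
    while changed:
        new_sets = set(sets)
        for s in sets:
            new_sets.add(omega - s)              # closure under complement
        for s, t in combinations(sets, 2):
            new_sets.add(s | t)                  # closure under (finite) union
        changed = new_sets != sets
        sets = new_sets
    return sets

# sigma-algebra generated by the "odd roll" event of a fair die
omega = {1, 2, 3, 4, 5, 6}
for s in sorted(generate_sigma_algebra(omega, [{1, 3, 5}]), key=len):
    print(set(s))   # prints the four sets of example 2 above
```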

Cardinality

Cantor is considered the father of modern set theory. He used bijections to define when two sets have the same cardinality and showed that \(\text{card}([0, 1]) > \text{card}(\mathbb{N})\), where \(\mathbb{N} = \{0, 1, 2, \dots\}\).

  1. \(\text{card}(\mathbb{N}) = \text{card}(\mathbb{Q})\). We can map every \(p/q\) to a natural number by enumerating the \((p, q)\) grid along its diagonals (see the sketch after this list).

  2. \(\text{card}(\mathbb{N}) < \text{card}(2^{\mathbb{N}})\)
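
The \(p/q\)-to-natural-number idea can be made concrete by walking the anti-diagonals of the \((p, q)\) grid and skipping duplicates; every positive rational then appears at exactly one position in the enumeration. A small Python sketch (illustrative only, and restricted to positive rationals):

```python
from fractions import Fraction
from itertools import count, islice

def positive_rationals():
    """Enumerate every positive rational exactly once: walk the diagonals
    p + q = 2, 3, 4, ... and skip fractions already seen in lowest terms."""
    seen = set()
    for s in count(2):                 # s = p + q indexes the diagonal
        for p in range(1, s):
            q = s - p
            f = Fraction(p, q)
            if f not in seen:
                seen.add(f)
                yield f

# The n-th value emitted is the rational paired with the natural number n
print([str(f) for f in islice(positive_rationals(), 10)])
# ['1', '1/2', '2', '1/3', '3', '1/4', '2/3', '3/2', '4', '1/5']
```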

Random Variables

Let \(I\) be the set of all open intervals. Then \(\sigma(I)\) is called the Borel sigma-algebra, denoted \(B(\mathbb{R})\).

  • \(\text{card}(\mathbb{R}) = \text{card}(B(\mathbb{R}))\); this can be shown using transfinite induction.

  • \(B(\mathbb{R})\) contains singletons: \(\{x\} = \bigcap_{n=1}^{\infty} \left(x - \frac{1}{n}, x + \frac{1}{n} \right)\), a countable intersection of open intervals

  • \(B(\mathbb{R})\) contains all closed intervals: \((1, 2), \{1\}, \{2\} \in B(\mathbb{R}) \implies [1, 2] = \{1\} \cup (1, 2) \cup \{2\} \in B(\mathbb{R})\)

\(\sigma(I_1)\) is a sub-algebra of \(\sigma (I_2)\) if \(\sigma(I_1) \subseteq \sigma(I_2)\)

  • \(I_1 \subseteq I_2 \implies \sigma(I_1) \subseteq \sigma(I_2)\)

A random variable is a mapping \(X: \Omega \rightarrow \mathbb{R}\) such that \(X^{-1}(B) = \{\omega \in \Omega : X(\omega) \in B\} \in \mathcal{F}\) for every \(B \in B(\mathbb{R})\).

In words: take any open interval \(B \subseteq \mathbb{R}\) and collect every \(\omega \in \Omega\) that \(X\) maps into it. If this set of \(\omega\)'s is not in \(\mathcal{F}\) for some \(B\), then \(X\) is not a random variable.

\(\Omega = \{1, 2, 3, 4, 5, 6\}\). \(\mathcal{F} = \{\emptyset, \Omega, \{1, 3, 5\}, \{2, 4, 6\}\}\)

  1. \(X(\omega) = \omega\) is not a random variable over \((\Omega, \mathcal{F})\). Let \(B = (0, 2)\). Then \(X^{-1}(B) = \{1\}\), which is not in \(\mathcal{F}\).

If \(X\) is a random variable on \((\Omega, \mathcal{F})\), we say it is measurable over that space.
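
On a finite sample space the condition can be checked exhaustively. The sketch below (illustrative names, not from the notes) uses the fact that for finite \(\Omega\) it suffices to check the preimage of each value \(X\) takes, since every other preimage is a finite union of these:

```python
def is_measurable(X, omega, F):
    """Check whether X is measurable over (omega, F): every preimage
    X^{-1}({value}) must be an event in F."""
    F = {frozenset(s) for s in F}
    for value in {X(w) for w in omega}:
        preimage = frozenset(w for w in omega if X(w) == value)
        if preimage not in F:
            return False
    return True

omega = {1, 2, 3, 4, 5, 6}
F = [set(), omega, {1, 3, 5}, {2, 4, 6}]

print(is_measurable(lambda w: w, omega, F))      # False: example 1 above
print(is_measurable(lambda w: w % 2, omega, F))  # True: the odd/even indicator is measurable
```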

Probability Measure and Atoms

Let \(I\) be a set. A measure is a mapping \(\text{meas}(\cdot)\) defined on subsets of \(I\) and taking values in \(\mathbb{R}\), with the properties

  1. \(\text{meas}(I_i) \geq 0\) for every measurable \(I_i \subseteq I\)
  2. if \(I_i \cap I_j = \emptyset\) for all \(i \neq j\), then

    \(\text{meas}\left(\bigcup_i I_i\right) = \sum_i \text{meas}(I_i)\)

    Additivity - if the \(I_i\) are disjoint, then the measure of their union is the sum of their measures

Other properties

  • \(P(\bigcup_i I_i) \leq \sum_i P(I_i)\)
  • \(P(I^c) = 1 - P(I)\)
  • \(A \subseteq B \implies P(A) \leq P(B)\)
  • \(P(A \cup B) = P(A) + P(B) - P(A \cap B)\)
  • If \(I_1 \subseteq I_2 \subseteq \dots\), then \(P(\bigcup_i I_i) = \lim_{i \to \infty} P(I_i)\)
Proof sketch: write \(\bigcup_i I_i\) as the disjoint union of the sets \(I_i \setminus I_{i-1}\) (with \(I_0 = \emptyset\)) and apply countable additivity; the partial sums of the resulting series are exactly the \(P(I_i)\).
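
These properties are easy to sanity-check on a small discrete space. A quick sketch with the fair die and the uniform probability (the event choices are arbitrary):

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}
P = lambda A: Fraction(len(A), len(omega))    # fair die: uniform probability

A = {1, 3, 5}      # odd roll
B = {1, 2, 3}      # roll at most 3

assert P(A | B) <= P(A) + P(B)                # union bound
assert P(omega - A) == 1 - P(A)               # complement rule
assert P({1}) <= P(A)                         # monotonicity ({1} is a subset of A)
assert P(A | B) == P(A) + P(B) - P(A & B)     # inclusion-exclusion
print("all four properties hold on the fair-die example")
```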

A probability measure is a mapping \(P: \mathcal{F} \rightarrow [0, 1]\) with the following properties

  • \(P(A) \geq 0\), for all \(A \in \mathcal{F}\)
  • \(P(\Omega) = 1\)
  • If \(A_1, A_2, \dots \in \mathcal{F}\), where the \(A_i\) are pairwise disjoint, then \(P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i)\)

Atom

\(A\) is an atom if it has nonzero measure and all of its proper subsets have zero measure.

  1. What are the atoms in the probability space \(((0, 1), B((0, 1)), P)\), with \(P\) the uniform measure?

    There are none, because any open interval with nonzero measure has a subinterval also with nonzero measure.

Cantor Set

  1. \(\text{card}([0, 1])\) - any number in \([0, 1]\) can be represented as an infinite sequence of 0's and 1's (its binary expansion)

  2. \(\text{card}(C)\) - any number in the Cantor set \(C\) can be represented as an infinite sequence of 0's and 2's (its ternary expansion)

Mapping each digit 0 to 0 and each digit 1 to 2 gives a bijection between the two sets of sequences, so \([0, 1]\) and \(C\) have the same cardinality (see the sketch below).
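
The digit map behind this argument can be written out explicitly: double each binary digit of \(x \in [0, 1)\) to get a ternary expansion using only 0's and 2's, then read it in base 3. A small Python sketch using truncated expansions (so the output is only accurate to the chosen number of digits):

```python
def binary_digits(x, n):
    """First n binary digits of x in [0, 1)."""
    digits = []
    for _ in range(n):
        x *= 2
        bit = int(x)
        digits.append(bit)
        x -= bit
    return digits

def to_cantor(x, n=40):
    """Map x in [0, 1) into the Cantor set: replace each binary digit b
    by the ternary digit 2*b and evaluate the result in base 3."""
    return sum(2 * b * 3.0 ** -(k + 1) for k, b in enumerate(binary_digits(x, n)))

print(to_cantor(0.5))    # 0.5  = 0.1000... in binary -> 0.2000... in ternary = 2/3
print(to_cantor(0.25))   # 0.25 = 0.0100... in binary -> 0.0200... in ternary = 2/9
```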

Distribution of a Random Variable

The distribution \(\mu_X\) of an RV \(X\) is a mapping \(\mu_X: B(\mathbb{R}) \rightarrow [0, 1]\) such that

\(\mu_X(B) = P(X \in B) = P(X^{-1}(B))\), which is well defined because \(X^{-1}(B) \in \mathcal{F}\).

In words: \(\mu_X(B_1)\) is the probability of the set of outcomes \(\omega\) that \(X\) maps into \(B_1\)

Properties

  • if \(P\) is a probability measure and \(X\) is an RV, then \(\mu_X\) is also a probability measure, now on \((\mathbb{R}, B(\mathbb{R}))\) (see the sketch below)
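
On a finite probability space the distribution can be computed directly as the pushforward of \(P\) through \(X\). A short sketch with the fair die and \(X(\omega) = \omega \bmod 2\) (here \(\mathcal{F}\) is taken to be all subsets, so any \(X\) is measurable):

```python
from fractions import Fraction

omega = [1, 2, 3, 4, 5, 6]
P = lambda A: Fraction(len(A), len(omega))     # fair die, F = all subsets

def mu(X, B):
    """Distribution of X: mu_X(B) = P(X^{-1}(B)) = P({w : X(w) in B})."""
    return P({w for w in omega if X(w) in B})

X = lambda w: w % 2          # 1 for an odd roll, 0 for an even roll
print(mu(X, {1}))            # P(X = 1) = 1/2
print(mu(X, {0, 1}))         # total mass 1: mu_X is itself a probability measure
```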

Cumulative Distribution Function

Let \(X\) be a RV over \((\Omega, \mathcal{F}, P)\).

Then the CDF is \(F_X(t) = P(X \leq t) = P(\{\omega : X(\omega) \in (-\infty, t]\})\)

Properties

  • \(\lim_{t \rightarrow \infty} F_X(t) = 1\)

Proof sketch: the events \(\{X \leq n\}\) increase to \(\Omega\) as \(n \to \infty\), so continuity from below gives \(\lim_{n \to \infty} F_X(n) = P(\Omega) = 1\); monotonicity of \(F_X\) extends this to real \(t \to \infty\).
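
For the fair die the CDF is a step function and the limit is reached already at \(t = 6\), since \(\{X \leq 6\} = \Omega\). A tiny sketch:

```python
from fractions import Fraction

omega = [1, 2, 3, 4, 5, 6]

def F_X(t):
    """CDF of a fair die roll X(w) = w: F_X(t) = P({w : w <= t})."""
    return Fraction(sum(1 for w in omega if w <= t), len(omega))

print([str(F_X(t)) for t in (0.5, 1, 2.5, 6, 100)])
# ['0', '1/6', '1/3', '1', '1']   -> F_X(t) -> 1 as t -> infinity
```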

Random Process

Random process

\(\{X_t | t \in T\}\) is a random process. \(T\) is potentially uncountable.

  • a random process is a function of \(\omega\) and \(t\)
  • a discrete random process is the same as a sequence of random variables
  • all \(X_t\) are random variables over the same probability space

Sample path

\(\{X_t(\omega)\}_{t \in T}\), for a fixed \(\omega\), is a sample path

Essentially a realization of all the random variables in the set.
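
A short simulation makes the two roles of \(\omega\) and \(t\) visible: each call below fixes a fresh \(\omega\) (one draw of all the underlying randomness) and returns the whole path \(t \mapsto X_t(\omega)\). NumPy is assumed to be available; the process used is a discrete-time i.i.d. \(\pm 1\) process:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_path(T=10):
    """One sample path {X_t(omega)}_{t=0,...,T-1} of an i.i.d. +/-1 process:
    a single realization of every random variable in the collection."""
    return rng.choice([-1, 1], size=T)

print(sample_path())   # one sample path (one omega)
print(sample_path())   # a different omega gives a different path
```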

i.i.d. random process

A random process is i.i.d. iff

\(X_{t_1}\) and \(X_{t_2}\) are independent and identically distributed for any \(t_1, t_2 \in T\) where \(t_1 \neq t_2\)

  • The Bernoulli process and the i.i.d. Gaussian process are examples of this

Random Walk

Random Walk

\(X_0, X_1, \dots\) is a random walk if \(X_N = \sum_{n=1}^{N} Z_n\) with \(X_0 = 0\), where the steps \(Z_1, Z_2, \dots\) are i.i.d. with \(P(Z_1 = 1) = P(Z_1 = -1) = 0.5\).

  1. Compute \(P(\{X_n = k\})\)

    Let \(k = a - (n - a)\), where \(a\) is the number of positive steps and \(n - a\) is the number of negative steps.

    \(a = \frac{n + k}{2}\). We need to choose where to put these \(a\) positive steps.

    \(P(\{X_n = k\}) = \binom{n}{a} 0.5^n\) (valid when \(n + k\) is even and \(|k| \leq n\); otherwise the probability is 0)
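
The formula is easy to check by simulation: generate many walks, count how often \(X_n = k\), and compare with \(\binom{n}{a} 0.5^n\). A sketch assuming NumPy is available:

```python
import numpy as np
from math import comb

rng = np.random.default_rng(1)

n, k, trials = 10, 2, 200_000
steps = rng.choice([-1, 1], size=(trials, n))   # i.i.d. +/-1 steps Z_1, ..., Z_n
X_n = steps.sum(axis=1)                         # X_n = Z_1 + ... + Z_n for each trial

a = (n + k) // 2                                # number of +1 steps needed
theory = comb(n, a) * 0.5 ** n                  # C(n, a) * 0.5^n ~ 0.2051 here
empirical = np.mean(X_n == k)

print(theory, empirical)                        # the two values should be close
```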

Brownian motion

A random process is considered Brownian motion if it is

the limit of a random walk in which the step height \(h\) and the time step \(\tau\) both go to zero with \(\frac{h^2}{\tau}\) held constant (i.e., \(h \propto \sqrt{\tau}\)).
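
The scaling can be checked numerically: simulate walks with step height \(h = \sqrt{\tau}\) and verify that the variance at time \(t\) stays near \(t\) as \(\tau\) shrinks, which is the Brownian-motion behavior \(\mathrm{Var}(X_t) = t\). A rough sketch assuming NumPy is available:

```python
import numpy as np

rng = np.random.default_rng(2)

def scaled_walk_var(t=1.0, tau=1e-3, paths=20_000):
    """Variance at time t of a random walk with time step tau and
    step height h = sqrt(tau), so that h^2 / tau = 1 is held constant."""
    n = int(t / tau)                                  # number of steps up to time t
    h = np.sqrt(tau)
    steps = rng.choice([-1.0, 1.0], size=(paths, n)) * h
    return steps.sum(axis=1).var()

print(scaled_walk_var(tau=1e-2))   # ~1.0
print(scaled_walk_var(tau=1e-3))   # ~1.0, i.e. Var(X_t) stays near t = 1
```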

Markov process

A random process is a Markov process iff

\(P(X_{t_N} = x_N \mid X_{t_{N-1}} = x_{N-1}, \dots, X_{t_1} = x_1) = P(X_{t_N} = x_N \mid X_{t_{N-1}} = x_{N-1})\) for all \(t_1 < t_2 < \dots < t_N\)

In words: the conditional distribution of \(X_{t_N}\) given the past depends only on the most recent term \(X_{t_{N-1}}\).

  1. Is an i.i.d. process Markovian?

    Yes, because independence gives \(P(A \mid B) = P(A)\): conditioning on past values does not change the distribution.

  2. Is a random walk Markovian?