Rolling a die: \(\Omega = \{1,2,3,4,5,6\}\)
We call \(\Omega\) the fundamental set (sample space)
Event: rolling an odd number, \(A = \{1, 3, 5\} \subseteq \Omega\)
A sigma-algebra \(\mathcal{F}\) is a set of subsets of \(\Omega\) that satisfies the following properties
\(\Omega \in \mathcal{F}\)
if \(A \in \mathcal{F}\), then \(A^c \in \mathcal{F}\) (closed under complements)
if \(A_1, A_2, \ldots \in \mathcal{F}\), then \(\bigcup_i A_i \in \mathcal{F}\) (closed under countable unions)
Examples:
\(\mathcal{F} = \{\emptyset, \Omega\}\)
\(\mathcal{F} = \{\emptyset, \Omega, \{1, 3, 5\}, \{2, 4, 6\}\}\)
\(\sigma(A)\) is defined as the smallest sigma-algebra that contains \(A\)
\(\sigma(A)\) is the intersection of all sigma-algebras that contain \(A\)
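As an illustration (my own sketch, not from the notes; the helper name generate_sigma_algebra is made up), here is a small Python brute-force construction of \(\sigma(A)\) on a finite \(\Omega\), obtained by repeatedly closing under complements and unions until nothing new appears:

```python
# Brute-force construction of sigma(A) on a finite Omega (illustration only).
from itertools import combinations

def generate_sigma_algebra(omega, generators):
    """Close `generators` under complement and union until a fixed point is reached."""
    omega = frozenset(omega)
    sets = {frozenset(), omega} | {frozenset(g) for g in generators}
    while True:
        new = set(sets)
        new |= {omega - s for s in sets}                  # complements
        new |= {a | b for a, b in combinations(sets, 2)}  # pairwise unions
        if new == sets:                                   # closed: this is sigma(A)
            return sets
        sets = new

# Die example: sigma({1,3,5}) = {emptyset, {1,3,5}, {2,4,6}, Omega}
result = generate_sigma_algebra(range(1, 7), [{1, 3, 5}])
print(sorted(sorted(s) for s in result))
```

On a finite \(\Omega\), closing under complements and finite unions is enough, since any countable union of subsets of a finite set is already a finite union.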
Cantor is considered the father of modern set theory. He used bijections to define when sets have the same cardinality and showed that \(\text{card}([0, 1]) > \text{card}(\mathbb{N})\), where \(\mathbb{N} = \{0, 1, 2, \ldots\}\).
\(\text{card}(\mathbb{N}) = \text{card}(\mathbb{Q})\): we can map every fraction \(p/q\) to a natural number.
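One standard way to build such a map (added here; the notes do not spell it out) is the Cantor pairing function
\[\pi(p, q) = \frac{(p + q)(p + q + 1)}{2} + q,\]
which is a bijection \(\mathbb{N} \times \mathbb{N} \rightarrow \mathbb{N}\). Sending each reduced fraction \(p/q\) to \(\pi(p, q)\) gives an injection of the nonnegative rationals into \(\mathbb{N}\) (negatives are handled similarly), and \(\mathbb{N} \subseteq \mathbb{Q}\) gives the reverse injection.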
\(\text{card}(\mathbb{N}) < \text{card}(2^{\mathbb{N}})\)
Let \(I\) be the set of all open intervals. Then \(\sigma(I)\), written \(B(\mathbb{R})\), is called the Borel sigma-algebra.
\(\text{card}(\mathbb{R}) = \text{card}(B(\mathbb{R}))\); this can be shown using transfinite induction.
\(B(\mathbb{R})\) contains singletons: \[\{x\} = \bigcap_{n = 1}^{\infty} \left(x - \frac{1}{n}, x + \frac{1}{n} \right) \in B(\mathbb{R}),\] a countable intersection of open intervals.
\(B(\mathbb{R})\) contains all closed intervals: \((1, 2), \{1\}, \{2\} \in B(\mathbb{R}) \implies [1, 2] = \{1\} \cup (1, 2) \cup \{2\} \in B(\mathbb{R})\)
\(\sigma(I_1)\) is a sub-sigma-algebra of \(\sigma(I_2)\) if \(\sigma(I_1) \subseteq \sigma(I_2)\)
A random variable is a mapping \(X: \Omega \rightarrow \mathbb{R}\) such that \(X^{-1}(B) = \{\omega \in \Omega : X(\omega) \in B\} \in \mathcal{F}\) for all \(B \in B(\mathbb{R})\).
In words: take any Borel set \(B \subseteq \mathbb{R}\) (e.g. an open interval) and collect all the \(\omega \in \Omega\) that \(X\) maps into \(B\). If for some \(B\) this set of \(\omega\)'s is not in \(\mathcal{F}\), then \(X\) is not a random variable.
\(\Omega = \{1, 2, 3, 4, 5, 6\}\). \(\mathcal{F} = \{\emptyset, \Omega, \{1, 3, 5\}, \{2, 4, 6\}\}\)
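For instance (an added illustration): \(X(\omega) = 1\) if \(\omega\) is odd and \(X(\omega) = 0\) otherwise is a random variable on this space, since \(X^{-1}(B)\) is always one of \(\emptyset, \{1, 3, 5\}, \{2, 4, 6\}, \Omega\), all of which lie in \(\mathcal{F}\). By contrast, \(Y(\omega) = \omega\) is not, because e.g. \(Y^{-1}((0.5, 1.5)) = \{1\} \notin \mathcal{F}\).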
If \(X\) is a random variable on \((\Omega, \mathcal{F})\), we say it is measurable over that space.
Let \(I\) be a collection of sets (for example a sigma-algebra). A measure is a mapping \(\text{meas}(\cdot): I \rightarrow [0, \infty]\) with the properties
if \(I_i \cap I_j = \emptyset\) for all \(i \neq j\) (the \(I_i\) are pairwise disjoint), then
\(\text{meas}(\bigcup_i I_i) = \sum_i \text{meas}(I_i)\)
Additivity: if the \(I_i\) are disjoint, then the measure of their union is the sum of their measures
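A standard example (not named in the notes): the Lebesgue (length) measure on \(B(\mathbb{R})\), for which \(\text{meas}((a, b)) = b - a\).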
Other properties
A probability measure is a mapping \(P: \mathcal{F} \rightarrow [0, 1]\) with the following properties
\(P(\Omega) = 1\) (and hence \(P(\emptyset) = 0\))
countable additivity: \(P(\bigcup_i A_i) = \sum_i P(A_i)\) for pairwise disjoint \(A_i \in \mathcal{F}\)
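For instance (an added example reusing the die space above): with \(\mathcal{F} = \{\emptyset, \Omega, \{1, 3, 5\}, \{2, 4, 6\}\}\), the fair-die measure is \(P(\emptyset) = 0\), \(P(\{1, 3, 5\}) = P(\{2, 4, 6\}) = \frac{1}{2}\), \(P(\Omega) = 1\).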
Atom
\(A\) is an atom if \(P(A) > 0\) and every measurable subset \(B \subseteq A\) has either \(P(B) = 0\) or \(P(B) = P(A)\).
What are the atoms in probability space \(((0, 1), B((0, 1)), P)\)?
There are none (taking \(P\) to be the uniform/Lebesgue measure): any set of nonzero measure can be split into subsets of strictly smaller nonzero measure, e.g. any open interval of positive length contains a shorter subinterval of positive length.
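By contrast (an added example): in the die space with \(\mathcal{F} = \{\emptyset, \Omega, \{1, 3, 5\}, \{2, 4, 6\}\}\) and the fair-die measure, \(\{1, 3, 5\}\) is an atom: \(P(\{1, 3, 5\}) = \frac{1}{2} > 0\) and its only measurable subsets are \(\emptyset\) and \(\{1, 3, 5\}\) itself, so every measurable subset has measure \(0\) or the full \(\frac{1}{2}\).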
\(\text{card}([0, 1])\): any number in \([0, 1]\) can be represented as an infinite sequence of 0's and 1's (its binary expansion)
\(\text{card}(C)\), where \(C\) is the Cantor set: any number in \(C\) can be represented as an infinite sequence of 0's and 2's (its ternary expansion)
\([0, 1]\) and \(C\) have the same cardinality
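One way to make this explicit (added; the notes only sketch it): map each \(x = \sum_{n \geq 1} \frac{a_n}{3^n} \in C\) with digits \(a_n \in \{0, 2\}\) to \(\sum_{n \geq 1} \frac{a_n / 2}{2^n} \in [0, 1]\). This map is a surjection onto \([0, 1]\) (so \(\text{card}(C) \geq \text{card}([0, 1])\)), and \(C \subseteq [0, 1]\) gives the reverse inequality, hence equal cardinalities.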
The distribution \(\mu_X\) of an RV \(X\) is a mapping \(\mu_X: B(\mathbb{R}) \rightarrow [0, 1]\) such that
\(\mu_X(B) = P(X \in B) = P(X^{-1}(B))\), which is well defined because \(X^{-1}(B) \in \mathcal{F}\).
In words: \(\mu_X(B_1)\) is the probability of the set of all outcomes \(\omega\) that \(X\) maps into \(B_1\)
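Continuing the die example (added): if \(X(\omega) = 1\) for odd \(\omega\) and \(0\) otherwise, then \(\mu_X(\{1\}) = P(X^{-1}(\{1\})) = P(\{1, 3, 5\}) = \frac{1}{2}\) and likewise \(\mu_X(\{0\}) = \frac{1}{2}\).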
Properties: \(\mu_X\) is itself a probability measure on \((\mathbb{R}, B(\mathbb{R}))\), i.e. \(\mu_X(\mathbb{R}) = 1\) and \(\mu_X\) is countably additive.
Cumulative distribution function (CDF)
Let \(X\) be a RV over \((\Omega, \mathcal{F}, P)\).
Then the CDF is \(F_X(t) = P(X \leq t) = P(\{\omega : X(\omega) \in (-\infty, t]\}) = \mu_X((-\infty, t])\).
Properties: \(F_X\) is non-decreasing and right-continuous, with \(\lim_{t \rightarrow -\infty} F_X(t) = 0\) and \(\lim_{t \rightarrow +\infty} F_X(t) = 1\).
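Example (added, taking \(\mathcal{F} = 2^{\Omega}\) so that \(X(\omega) = \omega\) is measurable): for a fair die, \(F_X(t) = 0\) for \(t < 1\), \(F_X(t) = \lfloor t \rfloor / 6\) for \(1 \leq t < 6\), and \(F_X(t) = 1\) for \(t \geq 6\), a right-continuous step function.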
Random process
\(\{X_t \mid t \in T\}\) is a random process: a collection of random variables indexed by \(T\), which is potentially uncountable.
Sample path
\(\{X_t(\omega)\}_{t \in T}\), for a fixed \(\omega \in \Omega\), is a sample path
Essentially a realization of all the random variables in the set.
i.i.d. random process
A random process is i.i.d. iff
all the \(X_t\) have the same distribution and \(X_{t_1}, \ldots, X_{t_n}\) are independent for any finite set of distinct indices \(t_1, \ldots, t_n \in T\)
Random walk
\(\{X_N\}\) is a random walk if \(Z_1, Z_2, \ldots\) are i.i.d. with \(P(Z_1 = 1) = P(Z_1 = -1) = 0.5\) and \(X_N = \sum_{n=1}^N Z_n\), with \(X_0 = 0\).
Compute \(P(\{X_n = k\})\)
Write \(k = a - (n - a)\), where \(a\) is the number of \(+1\) steps and \(n - a\) the number of \(-1\) steps.
\(a = \frac{n + k}{2}\). We need to choose where to put these \(a\) positive steps.
\(P(\{X_n = k\}) = \binom{n}{a} 0.5^n\), provided \(n + k\) is even and \(|k| \leq n\) (otherwise the probability is 0).
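A quick numerical sanity check of this formula (a sketch I added; the parameters \(n = 10\), \(k = 4\) and the trial count are arbitrary choices):

```python
# Compare P({X_n = k}) = C(n, a) * 0.5**n, with a = (n + k)/2, against simulation.
import random
from math import comb

n, k = 10, 4
a = (n + k) // 2                      # number of +1 steps
exact = comb(n, a) * 0.5 ** n         # formula from the notes

trials = 200_000
hits = sum(
    sum(random.choice((-1, 1)) for _ in range(n)) == k  # simulate X_n once
    for _ in range(trials)
)
print(f"exact = {exact:.4f}, simulated ≈ {hits / trials:.4f}")
```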
Brownian motion
A random process is considered Brownian motion if it is
the limit of a random walk whose step height \(h\) and time step \(\tau\) both go to zero with \(\frac{h^2}{\tau}\) held constant (the diffusive scaling: \(h\) shrinks like \(\sqrt{\tau}\), not like \(\tau\)).
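A quick way to see why this is the right scaling (added reasoning): after time \(t\) the walk has taken \(t / \tau\) steps of size \(\pm h\), so \[\operatorname{Var}(X_t) = \frac{t}{\tau} \cdot h^2 = t \cdot \frac{h^2}{\tau},\] which stays finite and nonzero in the limit only if \(h^2 / \tau\) is held constant.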
Markov process
A random process is a Markov process iff
\(P(\{X_{t_N} = x_N\} \mid \{X_{t_{N-1}} = x_{N-1}\}, \ldots, \{X_{t_1} = x_1\}) = P(\{X_{t_N} = x_N\} \mid \{X_{t_{N-1}} = x_{N-1}\})\) for any \(t_1 < t_2 < \cdots < t_N\)
In words: the conditional distribution of \(X_{t_N}\) given the whole past depends only on the most recent value \(x_{N-1}\).
Is an i.i.d. process Markovian?
Yes, because \(P(A \mid B) = P(A)\) when \(A\) and \(B\) are independent, so conditioning on the past changes nothing.
Is a random walk Markovian?
Yes: \(X_{n+1} = X_n + Z_{n+1}\) and \(Z_{n+1}\) is independent of \(X_1, \ldots, X_n\), so given \(X_n\) the distribution of \(X_{n+1}\) does not depend on the earlier values.