Vectorspaces and Subspaces
vectorspace
A set of vectors V is a vectorspace if:
- closed under addition and scalar multiplication - \(\overrightarrow{v} + \overrightarrow{w} \in V\), \(c \overrightarrow{v} \in V\)
- scalar multiplication by 1 is the identity - \(1 \cdot \overrightarrow{v} = \overrightarrow{v}\)
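For example, \(\mathbb{R}^2\) with the usual componentwise operations is a vectorspace: \((v_1, v_2) + (w_1, w_2) = (v_1 + w_1, v_2 + w_2) \in \mathbb{R}^2\), \(c(v_1, v_2) = (cv_1, cv_2) \in \mathbb{R}^2\), and \(1 \cdot (v_1, v_2) = (v_1, v_2)\).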
Properties of Vectorspaces
- commutative - \(\overrightarrow{v} + \overrightarrow{w} = \overrightarrow{w} + \overrightarrow{v}\)
- distributive - \((a + b)\overrightarrow{v} = a\overrightarrow{v} + b \overrightarrow{v}\) and \(a(\overrightarrow{v} + \overrightarrow{w}) = a \overrightarrow{v} + a \overrightarrow{w}\)
- \(- \overrightarrow{v} + \overrightarrow{v} = 0\)
- \(0 + \overrightarrow{v} = \overrightarrow{v}\)
The zero vector \(0\) in a vectorspace is unique.
Proof
Suppose \(0,0' \in V\), such that \(0 + \overrightarrow{v} = \overrightarrow{v}\) and \(0' + \overrightarrow{v} = \overrightarrow{v}\)
Then \(0 + \overrightarrow{v} = 0' + \overrightarrow{v}\); adding \(-\overrightarrow{v}\) to both sides gives \(0 = 0'\)
Subspace
subspace
Let \(W,V\) be vectorspaces. If \(W \subseteq V\), then \(W\) is a subspace of \(V\).
Equivalently, \(W\) is a subspace of \(V\) if \(W \subseteq V\) and:
- \(0 \in W\)
- \(\overrightarrow{v} + \overrightarrow{w} \in W\) for \(\overrightarrow{v},\overrightarrow{w} \in W\)
- \(a \overrightarrow{v} \in W\) for \(\overrightarrow{v} \in W, a \in \mathbb{R}\)
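For example, \(W = \{(a, a, 0) \mid a \in \mathbb{R}\}\) is a subspace of \(\mathbb{R}^3\): \((0,0,0) \in W\), \((a,a,0) + (b,b,0) = (a+b, a+b, 0) \in W\), and \(c(a,a,0) = (ca, ca, 0) \in W\).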
The intersection of subspaces is a subspace.
Proof
Let \(W_1, W_2\) be subspaces of \(V\). Show \(W_1 \cap W_2\) is also a subspace of \(V\)
- \(0 \in W_1, W_2\), so \(0 \in W_1 \cap W_2\)
- Let \(\overrightarrow{v}, \overrightarrow{w} \in W_1 \cap W_2\). Then \(\overrightarrow{v}, \overrightarrow{w} \in W_1\) and \(\overrightarrow{v}, \overrightarrow{w} \in W_2\). By closure under addition in each subspace, \(\overrightarrow{v} + \overrightarrow{w} \in W_1\) and \(\overrightarrow{v} + \overrightarrow{w} \in W_2\), so \(\overrightarrow{v} + \overrightarrow{w} \in W_1 \cap W_2\)
- Let \(a \in \mathbb{R}\), \(\overrightarrow{v} \in W_1 \cap W_2\). Then \(\overrightarrow{v} \in W_1\) and \(\overrightarrow{v} \in W_2\). By closure under scalar multiplication in each subspace, \(a \overrightarrow{v} \in W_1\) and \(a \overrightarrow{v} \in W_2\), so \(a \overrightarrow{v} \in W_1 \cap W_2\)
The sum of subspaces is also a subspace.
\(W_1 + W_2 \equiv \{w_1 + w_2 | w_1 \in W_1, w_2 \in W_2\}\)
Proof
Let \(W_1, W_2\) be subspaces of \(V\). Show \(W_1 + W_2\) is also a subspace of \(V\).
- \(0 \in W_1\) and \(0 \in W_2\), so \(0 = 0 + 0 \in W_1 + W_2\)
Let \(\overrightarrow{v}, \overrightarrow{w} \in W_1 + W_2\),
where \(\overrightarrow{v} = \overrightarrow{v_1} + \overrightarrow{v_2}\), \(\overrightarrow{w} = \overrightarrow{w_1} + \overrightarrow{w_2}\), \(\overrightarrow{v_1}, \overrightarrow{w_1} \in W_1\), \(\overrightarrow{v_2}, \overrightarrow{w_2} \in W_2\)
Then \(\overrightarrow{v} + \overrightarrow{w} = \overrightarrow{v_1} + \overrightarrow{w_1} + \overrightarrow{v_2} + \overrightarrow{w_2}\).
Since \(\overrightarrow{v_1} + \overrightarrow{w_1} \in W_1\) and \(\overrightarrow{v_2} + \overrightarrow{w_2} \in W_2\),
then \(\overrightarrow{v} + \overrightarrow{w} \in W_1 + W_2\)
For \(a \in \mathbb{R}\), \(a \overrightarrow{v} = a \overrightarrow{v_1} + a \overrightarrow{v_2}\)
Since \(a \overrightarrow{v_1} \in W_1\) and \(a \overrightarrow{v_2} \in W_2\), \(a \overrightarrow{v} \in W_1 + W_2\)
Span and Linear Dependence
span
The span of a set \(S\) is the set of all linear combinations of its elements.
Let \(S = \{v_1, ..., v_n\}\)
\(\text{span} (S) = \{c_1 v_1 + ... + c_n v_n | c_1,...,c_n \in \mathbb{R}\}\)
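For example, \(\text{span} \{(1,0,0), (0,1,0)\} = \{(c_1, c_2, 0) \mid c_1, c_2 \in \mathbb{R}\}\), the \(xy\)-plane in \(\mathbb{R}^3\).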
Let \(S\) be a set, \(W\) be a vectorspace.
If \(S \subseteq W\), then \(\text{span} (S) \subseteq W\)
Proof
Let \(\overrightarrow{v} = c_1 s_1 + ... + c_n s_n \in \text{span} (S)\)
Since each \(s_i\) is in \(W\), by closure under addition and scalar multiplication, \(c_1 s_1 + ... + c_n s_n\) is also in \(W\)
For subspaces \(W_1, W_2\), \(\text{span} (W_1 \cup W_2) = W_1 + W_2\).
Proof
(\(\subseteq\)) Since \(W_1 \cup W_2 \subseteq W_1 + W_2\) and \(W_1 + W_2\) is a subspace, \(\text{span} (W_1 \cup W_2) \subseteq W_1 + W_2\) (a vectorspace containing a set \(S\) also contains \(\text{span} (S)\))
(\(\supseteq\)) Let \(\overrightarrow{v} \in W_1\), \(\overrightarrow{w} \in W_2\)
Since \(\overrightarrow{v}, \overrightarrow{w} \in W_1 \cup W_2\),
the combination \(\overrightarrow{v} + \overrightarrow{w} \in \text{span} (W_1 \cup W_2)\),
therefore \(W_1 + W_2 \subseteq \text{span} (W_1 \cup W_2)\)
linear independence
A set \(S = \{s_1, ..., s_n\}\) is linearly independent if \(c_1 = ... = c_n = 0\) is the only solution to \(c_1 s_1 + ... + c_n s_n = 0\)
Equivalently, a set is linearly independent if no element can be expressed as a linear combination of the other elements.
Is the set \(S = \{1 + x + x^2 + x^3, x + x^2 + x^3, x^2 + x^3, x^3\}\) linearly independent?
\[\begin{eqnarray*}
c_1(1 + x + x^2 + x^3) + c_2 ( x + x^2 + x^3) + c_3 ( x^2 + x^3) + c_4 x^3 = 0 \\
c_1 + (c_1 + c_2)x + (c_1 + c_2 + c_3)x^2 + (c_1 + c_2 + c_3 + c_4)x^3 = 0 \\
c_1 = 0 \\
c_1 + c_2 = 0 \\
c_1 + c_2 + c_3 = 0 \\
c_1 + c_2 + c_3 + c_4 = 0
\end{eqnarray*}
\]
Solving from the top down gives \(c_1 = c_2 = c_3 = c_4 = 0\), so \(S\) is linearly independent.
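As a quick sanity check, here is a small sketch (assuming the sympy library is available) that writes the four polynomials as coefficient vectors in the ordered basis \(\{1, x, x^2, x^3\}\); an empty null space of the resulting matrix means only the trivial combination gives zero.
#+begin_src python
import sympy as sp

# Columns are the coefficient vectors of
# 1 + x + x^2 + x^3,  x + x^2 + x^3,  x^2 + x^3,  x^3
# in the ordered basis {1, x, x^2, x^3}.
M = sp.Matrix([
    [1, 0, 0, 0],
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [1, 1, 1, 1],
])

# An empty null space means c_1 = c_2 = c_3 = c_4 = 0 is the only solution,
# i.e. the set is linearly independent.
print(M.nullspace())  # []
#+end_src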
Subset Dependence
If \(S_1 \subseteq S_2\) and \(S_1\) is dependent, then \(S_2\) is dependent.
Proof
Let \(S_1 = \{\overrightarrow{v_1}, ...,\overrightarrow{v_m}\}\) be dependent and \(S_1 \subseteq S_2 = \{\overrightarrow{v_1}, ..., \overrightarrow{v_n}\}\)
Then at least one \(c_i\) is nonzero in \(c_1 \overrightarrow{v_1} + ... + c_m \overrightarrow{v_m} = 0\).
Taking \(c_{m+1} = ... = c_n = 0\), at least one \(c_i\) is nonzero in \(c_1 \overrightarrow{v_1} + ... + c_m \overrightarrow{v_m} + ... + c_n \overrightarrow{v_n} = 0\),
so \(S_2\) is linearly dependent.
If \(S_1 \subseteq S_2\) and \(S_2\) is independent, then \(S_1\) is also independent.
Proof
Let \(S_2 = \{\overrightarrow{v_1}, ...,\overrightarrow{v_n}\}\) be independent and \(S_1 = \{\overrightarrow{v_1}, ..., \overrightarrow{v_m}\} \subseteq S_2\)
Suppose \(c_1 \overrightarrow{v_1} + ... + c_m \overrightarrow{v_m} = 0\). Taking \(c_{m+1} = ... = c_n = 0\), all \(c_i\) must be zero in \(c_1 \overrightarrow{v_1} + ... + c_m \overrightarrow{v_m} + ... + c_n \overrightarrow{v_n} = 0\) by independence of \(S_2\),
in particular \(c_1 = ... = c_m = 0\),
so \(S_1\) is linearly independent.
Basis and Dimension
A set \(B \subseteq V\) is called a basis of vectorspace \(V\) if
it is linearly independent and a generating set (\(V \subseteq \text{span} (B)\))
or
\(B\) is a minimal subset of \(V\) such that \(\text{span} (B) = V\)
or
every vector in \(V\) can be expressed uniquely from linear combinations of \(B\)
Proof
Show that every vector in \(V\) can be expressed uniquely in terms of the basis \(B\)
Let \(B = \{b_1, ..., b_n\}\) be a basis of \(V\) and \(\overrightarrow{v} \in V\)
Suppose \(\overrightarrow{v} = c_1 b_1 + ... + c_n b_n\) and \(\overrightarrow{v} = c_1' b_1 + ... + c_n' b_n\) for scalars \(c_1, ..., c_n, c_1', ..., c_n' \in \mathbb{R}\)
then \(c_1 b_1 + ... + c_n b_n = c_1' b_1 + ... + c_n' b_n\)
\(0 = (c_1 - c_1')b_1 + ... + (c_n - c_n')b_n\)
Since \(B\) is linearly independent, \(c_1 = c_1', ..., c_n = c_n'\).
Each vector is uniquely expressed in terms of \(B\)
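For example, \(\{1, x, ..., x^n\}\) is a basis of \(P_n(\mathbb{R})\): it is linearly independent, and every polynomial \(a_0 + a_1 x + ... + a_n x^n\) is a (unique) linear combination of it.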
Let \(L\) be an independent set of \(V\) with \(m\) elements. An independent set can be extended to create a generating set of \(V\).
Every basis of a finite vectorspace has the same number of elements.
Proof
Let \(B_1, B_2\) be bases of \(V\) where \(|B_1| = n\), \(|B_2| = m\)
Since \(B_1\) is a generating set and \(B_2\) is an independent set, \(|B_1| \geq |B_2|\)
Since \(B_2\) is a generating set and \(B_1\) is an independent set, \(|B_2| \geq |B_1|\)
So \(|B_1| = |B_2|\)
For every vectorspace of dimension \(n\):
\(|\text{generating set}| \geq n\)
\(|\text{independent set}| \leq n\)
Proof
Let \(G\) be a generating set and \(B\) be a basis of some vectorspace of dimension \(n\)
Since \(G\) is a generating set and \(B\) is independent, \(|G| \geq |B| = n\)
Let \(L\) be a linearly independent set and \(B\) be a basis of some vectorspace of dimension \(n\)
Since \(L\) is a linearly independent set and \(B\) is a generating set, \(|L| \leq |B| = n\)
For a vectorspace \(V\) of dimension \(n\)
A generating set with \(n\) elements is a basis of \(V\)
An independent set with \(n\) elements is a basis of \(V\)
For a vectorspace \(V\) of dimension \(n\)
A generating set with more than \(n\) elements can be reduced to a basis.
An independent set with fewer than \(n\) elements can be extended to a basis.
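For example, in \(\mathbb{R}^2\) the generating set \(\{(1,0), (0,1), (1,1)\}\) reduces to the basis \(\{(1,0), (0,1)\}\) by removing the dependent vector \((1,1)\), and the independent set \(\{(1,1)\}\) extends to the basis \(\{(1,1), (1,0)\}\).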
Dimension of Subspace
Let \(W\) be a subspace of \(V\). If \(\dim(W) = \dim(V)\), then \(W = V\). If \(\dim(W) = 0\), then \(W = \{0\}\)
Corollary
A basis \(B\) of \(W\) can be extended to be a basis of \(V\).
Properties
Sum
\(\dim(W_1 + W_2) = \dim (W_1) + \dim (W_2) - \dim (W_1 \cap W_2)\)
Line and coincident plane

\(\dim(\text{line} + \text{plane}) = 2 = \dim(\text{line}) + \dim(\text{plane}) - \dim(\text{line} \cap \text{plane})\)
where \(\dim(\text{line} \cap \text{plane}) = 1\) since the line lies in the plane
Line and noncoincident plane

\(\dim(\text{line} + \text{plane}) = 3 = \dim(\text{line}) + \dim(\text{plane}) - \dim(\text{line} \cap \text{plane})\)
where \(\dim(\text{line} \cap \text{plane}) = 0\) since the line meets the plane only at \(0\)
Direct Sum
\(W_1 + W_2 = W_1 \oplus W_2\) iff \(\text{dim} (W_1 + W_2) = \text{dim} (W_1) + \text{dim} (W_2)\)
Proof
\(\Rightarrow\) Assume \(W_1 + W_2 = W_1 \oplus W_2\).
Then \(W_1 \cap W_2 = \{0\}\), so \(\text{dim} (W_1 \cap W_2) = 0\)
Then by the sum property, \(\text{dim} (W_1 + W_2) = \text{dim} (W_1) + \text{dim} (W_2)\)
\(\Leftarrow\) Assume \(\text{dim} ( W_1 + W_2) = \text{dim} (W_1) + \text{dim} (W_2)\)
Then by the sum property, \(\text{dim} (W_1 \cap W_2) = 0\), so \(W_1 \cap W_2 = \{0\}\)
therefore \(W_1 + W_2 = W_1 \oplus W_2\)
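For example, in \(\mathbb{R}^3\) let \(W_1\) be the \(xy\)-plane and \(W_2\) the \(z\)-axis. Then \(\dim(W_1 + W_2) = 3 = 2 + 1 = \dim(W_1) + \dim(W_2)\), so \(\mathbb{R}^3 = W_1 \oplus W_2\).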
Linear Transformations
linear transformation
A linear transformation is a mapping \(T:V \rightarrow W\) from one vectorspace to another such that \(T(\overrightarrow{u} + \overrightarrow{v}) = T(\overrightarrow{u}) + T(\overrightarrow{v})\) and \(T(c \overrightarrow{v}) = c T(\overrightarrow{v})\) for \(\overrightarrow{u}, \overrightarrow{v} \in V\), \(c \in \mathbb{R}\)
Properties
- \(T(\overrightarrow{0}) = \overrightarrow{0}\)
- proof: \(T(\overrightarrow{0}) = T(0 \cdot \overrightarrow{v}) = 0 \cdot T(\overrightarrow{v}) = \overrightarrow{0}\)
- \(T(\overrightarrow{u} - \overrightarrow{v}) = T(\overrightarrow{u}) - T(\overrightarrow{v})\)
- proof: \(T(\overrightarrow{u} - \overrightarrow{v}) = T(\overrightarrow{u}) + T(-1 \overrightarrow{v}) = T(\overrightarrow{u}) - T(\overrightarrow{v})\)
Let \(T:P_n(\mathbb{R}) \rightarrow P_{n-1}(\mathbb{R})\) be defined by \(T(f) = f'\) for \(f \in P_n(\mathbb{R})\). Show \(T\) is linear.
Let \(a,b \in \mathbb{R}\), \(f,g \in P_n(\mathbb{R})\).
\(T(af + bg) = (af + bg)' = af' + bg' = aT(f) + bT(g)\)
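As a spot-check of this calculation (a sketch, assuming the sympy library is available), differentiate \(af + bg\) for two sample polynomials with symbolic scalars \(a, b\) and compare against \(a f' + b g'\):
#+begin_src python
import sympy as sp

x, a, b = sp.symbols('x a b')
f = 1 + 2*x + x**3   # sample polynomials; any f, g in P_n work the same way
g = x**2 - 4*x

lhs = sp.diff(a*f + b*g, x)                # T(af + bg) = (af + bg)'
rhs = a*sp.diff(f, x) + b*sp.diff(g, x)    # aT(f) + bT(g) = af' + bg'
print(sp.simplify(lhs - rhs))              # 0, as linearity predicts
#+end_src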
Identity Transformation
A mapping \(T:V \rightarrow V\) such that \(T(\overrightarrow{v}) = \overrightarrow{v}\) for \(\overrightarrow{v} \in V\)
Zero Transformation
A mapping \(T:V \rightarrow W\) such that \(T(\overrightarrow{v}) = \overrightarrow{0}\) for \(\overrightarrow{v} \in V\)
Kernel and Image
Kernel
The kernel (\(\ker(T)\) or \(N(T)\)) of a transformation \(T:V \rightarrow W\) is the set of vectors in \(V\) that map to \(\overrightarrow{0}\) in \(W\)

Image
The image (\(Im(T)\) or \(R(T)\)) of a transformation \(T:V \rightarrow W\) is the set of vectors in \(W\) mapped to by vectors in \(V\).

Prove the kernel of \(T\) is a subspace of \(V\). (Show \(0 \in \ker(T)\), \(\ker(T)\) is closed under addition, and \(\ker(T)\) is closed under scalar multiplication.)
By definition \(T(0) = 0\), so \(0 \in \ker(T)\)
For \(\overrightarrow{u}, \overrightarrow{v} \in \ker(T)\), we have \(T(\overrightarrow{u}) = T(\overrightarrow{v}) = 0\)
\(T(\overrightarrow{u} + \overrightarrow{v}) = T(\overrightarrow{u}) + T(\overrightarrow{v}) = 0\)
so \(\overrightarrow{u} + \overrightarrow{v} \in \ker(T)\)
For \(a \in \mathbb{R}\), \(\overrightarrow{v} \in \ker(T)\) we have \(T(\overrightarrow{v}) = 0\)
\(T(a\overrightarrow{v}) = aT(\overrightarrow{v}) = 0\), so \(a \overrightarrow{v} \in \ker(T)\)
Let \(T:\mathbb{R}^3 \to \mathbb{R}^3\), where \[T \left( \begin{matrix} a\\b\\c \end{matrix} \right) = \left( \begin{matrix} a-b \\ 0 \\ 2c \end{matrix} \right)\]. Find the kernel of \(T\).
\[\ker(T) = \left\{ \left( \begin{matrix} a\\b\\c \end{matrix} \right) \middle| T \left( \begin{matrix} a\\b\\c \end{matrix} \right) = 0 \right\}\]
\[\left( \begin{matrix} a-b \\ 0 \\ 2c \end{matrix} \right) = 0\], so \(a = b\) and \(c = 0\)
\[\ker(T) = \left\{ \left( \begin{matrix} a\\a\\0 \end{matrix} \right) \middle| a \in \mathbb{R} \right\}\]
Find the image of \(T\).
\[Im(T) = \left\{ T(\overrightarrow{v}) \middle| \overrightarrow{v} \in \mathbb{R}^3 \right\} = \left\{ \left( \begin{matrix} a-b \\ 0 \\ 2c \end{matrix} \right) \middle| a,b,c \in \mathbb{R} \right\} = \left\{ \left( \begin{matrix} a\\ 0 \\ b \end{matrix} \right) \middle| a,b \in \mathbb{R} \right\}\]
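The kernel and image above can also be read off from the matrix of \(T\) with respect to the standard basis; a small sketch, assuming the sympy library is available:
#+begin_src python
import sympy as sp

# Matrix of T(a, b, c) = (a - b, 0, 2c) in the standard basis
A = sp.Matrix([
    [1, -1, 0],
    [0,  0, 0],
    [0,  0, 2],
])

print(A.nullspace())    # [Matrix([[1], [1], [0]])]  -> ker(T) = {(a, a, 0)}
print(A.columnspace())  # [Matrix([[1], [0], [0]]), Matrix([[0], [0], [2]])]
                        # -> Im(T) = {(a, 0, b) : a, b in R}
#+end_src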