\documentclass{article}
\usepackage[left=2.5cm,right=2.5cm, top=2cm, bottom=2cm]{geometry}
\usepackage{enumerate,amsmath,amssymb,amsthm,graphicx}
\newenvironment{amatrix}[1]{%
\left(\begin{array}{@{}*{#1}{c}|c@{}}
}{%
\end{array}\right)
}
\newif\ifsinglecharacter
\def\lengthone#1{\isSingleCharacterHelper(#1\empty)!!end!!\ifsinglecharacter}
\def\isSingleCharacterHelper(#1#2)!!end!!{\ifx#2\empty\singlecharactertrue\else\singlecharacterfalse\fi}
\def\splitcommainto(#1#2)!!end!!#3{\ifx#1,#3\else #1\fi\ifx#2\empty\else\splitcommainto(#2)!!end!!{#3}\fi}
\def\det(#1){\operatorname{det}(#1)}
\def\trans#1{\lengthone{#1} #1^T\else\left(#1\right)^T\fi}
\def\inv#1{\lengthone{#1} #1^{-1}\else\left(#1\right)^{-1}\fi}
\def\dotprod#1#2{\vec{#1}\cdot\vec{#2}}
\def\vec#1{\mathbf{#1}}
\def\vecsub #1_#2{\mathbf{#1_{#2}}}
\def\norm#1{\left|\!\left|#1\right|\!\right|}
\def\R{\mathbb{R}}
\def\spn{\operatorname{span}}
\def\rank{\operatorname{rank}}
\def\nullity{\operatorname{nullity}}
\def\row{\operatorname{row}}
\def\col{\operatorname{col}}
\def\ker{\operatorname{null}}
\def\set#1{\left\{\,#1\,\right\}}
\def\s#1{\left\{#1\right\}}
\def\st{\text{\huge{.}}}
\def\colvec#1{\begin{pmatrix}\splitcommainto(#1\empty)!!end!!{\\}\end{pmatrix}}
\def\mlist#1_#2{#1_1, \ldots, #1_{#2}}
\def\veclist#1_#2{\mlist\vec{#1}_{#2}}
\def\lincomb#1#2_#3{#1_1\vec{#2_1} + \cdots + #1_{#3}\vec{#2_{#3}}}
\def\sbs{\subseteq}
\def\df#1{\textbf{#1}}
\def\theorem{\par\noindent{\bf \underline{Theorem}} }
\def\lemma{\par\noindent{\bf \underline{Lemma}} }
\def\proof{\par\noindent{\sl Proof.} }
\def\example{\par\noindent{\bf \underline{Example}} }
\def\soln{\par\noindent{\sl Solution.} }
\def\remark{\par\noindent{\bf \underline{Remark}} }
\title{Summary of Day 10}
\author{William Gunther}
\date{June 2, 2014}
\begin{document}
\maketitle
\section{Objectives}
\begin{itemize}
\item Talk about subspaces of $\R^n$.
\item Prove results about special subspaces from a matrix.
\item Define dimension and basis.
\end{itemize}
\begin{itemize}
\item We have actually already encountered some subspaces without mentioning it explicitly. Let $A$ be an $m\times n$ matrix.
\begin{itemize}
\item The \df{row space} of a matrix $A$ is the subspace (of $\R^n$) spanned by the rows of the matrix. We denote this subspace $\row(A)$.
\item The \df{column space} of a matrix $A$ is the subspace (of $\R^m$) spanned by the columns of the matrix. We denote this subspace $\col(A)$.
\end{itemize}
\example What is the row space and column space of $I$?
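\soln Both are all of $\R^n$. The rows of the $n\times n$ identity matrix (and likewise its columns) are the standard unit vectors $\veclist e_n$, which span $\R^n$ since any $[\veclist a_n]$ equals $\lincomb a e_n$.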
\example Is $[1,1]$ in the row space of
\[
\begin{pmatrix}
3 & 4\\
1 & 2
\end{pmatrix}?
\]
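\soln We ask whether there are scalars $a, b$ with $a[3,4] + b[1,2] = [1,1]$, i.e.\
\[
3a + b = 1 \qquad 4a + 2b = 1
\]
Solving, $a = 1/2$ and $b = -1/2$, and indeed $\tfrac12[3,4] - \tfrac12[1,2] = [1,1]$. So yes, $[1,1]$ is in the row space.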
\theorem Let $A$ be a matrix. Suppose that $A$ is row equivalent to a matrix $B$. Then \[\row(A) = \row(B)\]
\proof Bookkeep the row operations used to obtain $B$ from $A$. These show that every row of $B$ can be written as a linear combination of rows of $A$, and therefore (by a homework problem) $\row(B)\sbs\row(A)$. But row operations are reversible, so we can argue similarly starting with $B$ and obtaining $A$, from which $\row(A)\sbs\row(B)$ follows.
\item Let $A$ be an $m\times n$ matrix. The \df{null space} (also called the \df{kernel}) is the set of all solutions to the homogeneous equation represented by $A$. That is:
\[
\ker(A) = \set{ \vec x\in \R^n \mid A\vec x = \vec 0 }
\]
Here it is non-trivial that this is a subspace.
\theorem $\ker(A)$ is a subspace.
\proof This is theorem 3.21 in the book. We will prove it in class.
\item A \df{basis} for a subspace $V$ is a set of vectors which spans the set $V$ and is linearly independent. As it turns out, all vector spaces have a basis (in fact, most have infinitely many).
\example The standard unit vectors $\veclist e_n\sbs\R^n$ form a basis. They clearly span the set, as $[\veclist a_n] = \lincomb a e_n$. Moreover, they are all linearly independent (why?).
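To justify the independence claim: suppose $\lincomb c e_n = \vec 0$. Comparing components, the left-hand side is $[\mlist c_n]$, so every $c_i = 0$; hence there is no nontrivial dependency.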
\example The set of vectors $\s{[1,1], [2,2]}$ spans a line, but it is not a basis for the line since the vectors are not linearly independent.
\example If I wanted to find a basis for the line spanned by $\s{[1,1], [2,2]}$ I would just find a linear dependency (like $[1,1] = 1/2[2,2]$) and then remove the vector from the set. So the set $\s{[2,2]}$ is a basis for the line.
\example Suppose I wanted to find a basis for the row space of this matrix:
\[
\begin{pmatrix}
1 & 1 & 3 & 1 & 6\\
2 & -1 & 0 & 1 & -1\\
-3 & 2 & 1 & -2 & 1\\
4& 1& 6& 1& 3
\end{pmatrix}
\]
clearly, the row vectors span the row space, since the row space is defined to be the span of the row vectors. So we only need to determine whether the vectors are linearly dependent. If we row reduce the matrix, all the rows are still in the row space of the original matrix. Moreover, if we row reduce to rref, the nonzero rows are linearly independent (why?). Therefore, row reducing will give us a basis for the row space. Row reducing, we get:
\[
\begin{pmatrix}
1 & 0 & 1 & 0 & -1\\
0 & 1& 2& 0 & 3\\
0& 0& 0 & 1& 4\\
0& 0& 0& 0& 0
\end{pmatrix}
\]
Thus the following set is a basis for the row space:
\[
\s{ \colvec{1,0,1,0,-1}, \colvec{0,1,2,0,3}, \colvec{0,0,0,1,4} }
\]
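\remark Why are the nonzero rows of a matrix in rref linearly independent? Each nonzero row has a leading $1$ in a column where every other row has a $0$. So if a linear combination of the rows equals $\vec 0$, looking at each leading-entry column in turn forces the corresponding coefficient to be $0$.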
\remark So in order to find a basis for a subspace spanned by some set, you can put the vectors as the rows of a matrix and reduce it to (reduced) row echelon form. The nonzero rows of the result are then a basis.
Alternatively, you can iteratively write one vector as a linear combination of the rest and remove it, until you reach a linearly independent set.
\item In linear algebra, we often want to capture invariants. That is, things that stay the same even when you alter them. We know lots of them: for instance, the row space is invariant under row operations (so if you do row operations, the row space does not change).
Along with invariants come particular parameters, or characteristics. For instance, we know (sort of; we haven't proved it) that the number of nonzero rows in a matrix's row echelon form does not depend on which row echelon form you choose. Therefore, this is a parameter, which we called the rank.
Now we will learn a new parameter called dimension. The definition will not make sense (as with rank) until we prove a certain invariance.
\item The \df{dimension} of a subspace $V$ is the size of a basis for $V$. We denote it by $\dim V$.
\theorem If $V$ is a subspace of $\R^n$, then any two bases of $V$ have the same number of vectors.
\proof This is theorem 3.23 in the book. We will prove it in class.
\remark What should the dimension of the trivial subspace be?
\item Intuitively, dimension captures exactly what we want. $1$ dimensional subspaces of $\R^n$ are lines: their basis has size $1$, so they consist of all vectors $c\vec v$. $2$ dimensional subspaces are like planes, consisting of vectors $s\vec v + t\vec u$. $3$ dimensional subspaces are...well, whatever they are called ($3$ dimensional hyperplanes), but that's not so important. Just as a plane looks like a copy of $\R^2$ sitting inside $\R^3$, a $3$ dimensional subspace looks like a copy of $\R^3$ inside a higher dimension, like $\R^4$; it's just harder to picture.
\item
\theorem The row and column space of a matrix have the same dimension.
\proof Let $A$ be the matrix. Clearly $\dim(\row(A)) = \dim(\row(A'))$, where $A'$ is the rref of $A$. Now $\dim(\row(A'))$ is the number of nonzero rows of the matrix which, by the many connections we've made, is the same as the rank of the matrix (i.e.\ the number of nonzero rows is the same as the number of leading entries of $A'$).
Now, although $\col(A')\neq\col(A)$ in general, they have a fundamental relationship: a set of columns of $A$ is linearly independent if and only if the corresponding columns of $A'$ are, and moreover, any dependency is witnessed in the same way (e.g.\ if the first column is a multiple of the second in $A'$, then it is so in $A$ as well).
We now make a claim that a moment's thought makes clear: in $A'$, all the columns without leading entries can be written as linear combinations of the columns with leading entries. Why? Because the columns with leading entries are elements of the standard basis of $\R^m$, and the columns without leading entries have nonzero components only in positions corresponding to a leading entry!
Therefore, the dimension of $\col(A')$ (which, as noted, is the same as the dimension of $\col(A)$) is equal to the number of leading entries, which is the rank, which is the number of nonzero rows, which is the dimension of the row space! Wow, cool. \qed
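\example Let's see this on the $4\times 5$ matrix from the earlier example. The leading entries of its rref sit in columns $1$, $2$, and $4$, so columns $1$, $2$, and $4$ of the original matrix,
\[
\s{ \colvec{1,2,-3,4}, \colvec{1,-1,2,1}, \colvec{1,1,-2,1} },
\]
form a basis for the column space, and $\dim(\col(A)) = 3 = \dim(\row(A))$.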
\item We can then give an equivalent meaning for rank (which actually is not the last one we will see) which is more important when thinking about matrices rather than systems: The \df{rank} of a matrix is the dimension of the row and column spaces.
\item What can you say about the connection between $\rank(A)$ and $\rank(\trans{A})$?
\item With this new version of rank, one would expect that we could say more about The Rank Theorem, and rephrase it in this new matrix-centric way. You'd be right.
With systems there is a tug of war: the larger the rank, the fewer free variables there will be. What is the tug of war with regards to the dimensions of the column and row spaces?
\item The \df{nullity} of a matrix is the dimension of its null space.
\theorem If $A$ is $m\times n$ then:
\[
\rank(A) + \nullity(A) = n
\]
\proof View $A$ as a homogeneous system of equations. If the rank of $A$ is $r$, then there are $n-r$ free variables by the Rank Theorem. The solution set of this system is exactly the null space, and as there are $n-r$ free variables, it has dimension $n-r$. Therefore the nullity is $n-r$, and since the rank is $r$, the sum is $n$.
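\example Let's check this on the $4\times 5$ matrix row reduced earlier: its rref has three nonzero rows, so $\rank(A) = 3$, and the free variables correspond to the two columns without leading entries (columns $3$ and $5$). Hence $\nullity(A) = 5 - 3 = 2$, and indeed $3 + 2 = 5 = n$.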
\item We can now add some things to the fundamental theorem of invertible matrices:
\theorem (The fundamental theorem of invertible matrices version 2 (thm 3.27)) If $A$ is $n\times n$ then TFAE:
\begin{enumerate}
\item $A$ is invertible.
\item $\rank(A) = n$.
\item $\nullity(A) = 0$.
\item The columns of $A$ form a basis for $\R^n$.
\item The rows of $A$ form a basis for $\R^n$.
\end{enumerate}
\proof This is proved in the book with lots of other results filled in too. You should try to work out the proofs by yourself as practice.
\end{itemize}
\end{document}