\documentclass{article}
\usepackage[left=2.5cm,right=2.5cm, top=2cm, bottom=2cm]{geometry}
\usepackage{enumerate,amsmath,amssymb,amsthm,graphicx}
\newenvironment{amatrix}[1]{%
\left(\begin{array}{@{}*{#1}{c}|c@{}}
}{%
\end{array}\right)
}
\newif\ifsinglecharacter
\def\lengthone#1{\isSingleCharacterHelper(#1\empty)!!end!!\ifsinglecharacter}
\def\isSingleCharacterHelper(#1#2)!!end!!{\ifx#2\empty\singlecharactertrue\else\singlecharacterfalse\fi}
\def\splitcommainto(#1#2)!!end!!#3{\ifx#1,#3\else #1\fi\ifx#2\empty\else\splitcommainto(#2)!!end!!{#3}\fi}
\def\det(#1){\operatorname{det}(#1)}
\def\trans#1{\lengthone{#1} #1^T\else\left(#1\right)^T\fi}
\def\inv#1{\lengthone{#1} #1^{-1}\else\left(#1\right)^{-1}\fi}
\def\dotprod#1#2{\vec{#1}\cdot\vec{#2}}
\def\vec#1{\mathbf{#1}}
\def\vecsub #1_#2{\mathbf{#1_{#2}}}
\def\norm#1{\left|\!\left|#1\right|\!\right|}
\def\R{\mathbb{R}}
\def\spn{\operatorname{span}}
\def\rank{\operatorname{rank}}
\def\row{\operatorname{row}}
\def\col{\operatorname{col}}
\def\ker{\operatorname{null}}
\def\set#1{\left\{\,#1\,\right\}}
\def\s#1{\left\{#1\right\}}
\def\st{\text{\huge{.}}}
\def\colvec#1{\begin{pmatrix}\splitcommainto(#1\empty)!!end!!{\\}\end{pmatrix}}
\def\mlist#1_#2{#1_1, \ldots, #1_{#2}}
\def\veclist#1_#2{\mlist\vec{#1}_{#2}}
\def\lincomb#1#2_#3{#1_1\vec{#2_1} + \cdots + #1_{#3}\vec{#2_{#3}}}
\def\sbs{\subseteq}
\def\df#1{\textbf{#1}}
\def\theorem{\par\noindent{\bf \underline{Theorem}} }
\def\lemma{\par\noindent{\bf \underline{Lemma}} }
\def\proof{\par\noindent{\sl Proof.} }
\def\example{\par\noindent{\bf \underline{Example}} }
\def\soln{\par\noindent{\sl Solution.} }
\def\remark{\par\noindent{\bf \underline{Remark}} }
\title{Summary of Day 9}
\author{William Gunther}
\date{May 30, 2014}
\begin{document}
\maketitle
\section{Objectives}
\begin{itemize}
\item Prove the fundamental theorem of inverses.
\item Talk about subspaces of $\R^n$.
\item Define special subspaces obtained from a matrix, and prove that they are subspaces.
\item Define dimension and basis.
\end{itemize}
\section{Summary}
\begin{itemize}
\item Yesterday we stated this, but did not prove it. Today we will prove it:
\theorem (Fundamental Theorem on Invertible Matrices Version 1: Theorem 3.12 of book) Let $A$ be a square, $n\times n$ matrix. Then the following are equivalent:
\begin{enumerate}[a. ]
\item $A$ is invertible.
\item $A\vec x = \vec b$ has a unique solution for every $\vec b\in \R^n$.
\item $A\vec x = \vec 0$ has only the trivial solution.
\item The reduced row echelon form of $A$ is $I$.
\item $A$ is a product of elementary matrices.
\end{enumerate}
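As a numerical sanity check (a sketch only, using a made-up $2\times 2$ matrix; the theorem is of course about all square matrices), parts (a)--(c) can be spot-checked by hand via Cramer's rule:

```python
from fractions import Fraction

# Made-up 2x2 example: A = [[2, 1], [1, 1]], invertible since det = 1.
a11, a12, a21, a22 = (Fraction(n) for n in (2, 1, 1, 1))
det = a11 * a22 - a12 * a21
assert det != 0  # (a) A is invertible

def solve(b1, b2):
    """Unique solution of A x = (b1, b2) by Cramer's rule (valid since det != 0)."""
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# (b) A x = b has a (unique) solution for every b; spot-check one right-hand side.
x, y = solve(3, 2)
assert (a11 * x + a12 * y, a21 * x + a22 * y) == (3, 2)

# (c) A x = 0 has only the trivial solution: Cramer's rule gives the only
# solution, and it is (0, 0).
assert solve(0, 0) == (0, 0)
```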
\item Here we prove a few concepts that are fundamental to our study of vector spaces and matrices. Recall that $\R^n$ is an example of a vector space; we define a \df{subspace} to be a nonempty subset of a vector space that is \df{closed under vector addition and scalar multiplication}. That is: $V\sbs \R^n$ is a subspace if:
\begin{enumerate}
\item $V \neq \emptyset$
\item Closed under vector addition: For every $\vec v, \vec u\in V$ we have that $\vec v + \vec u \in V$.
\item Closed under scalar multiplication: For every $\vec v\in V$ and every scalar $c$ we have $c\vec v\in V$.
\end{enumerate}
\example A line through the origin is a subspace. Consider the line $\vec x = t\colvec{1, 2}$; or, written more formally as a set of vectors, \[ V = \set{ \vec x \in \R^2 \mid \exists t \st \vec x = t\colvec{1,2}} \] Let us verify it has the properties.
\begin{enumerate}
\item $V\neq \emptyset$:
We have that $[1,2]\in V$ when $t=1$, so it is nonempty.
\item $V$ closed under vector addition:
Take $\vecsub v_1, \vecsub v_2\in V$. We want to show that $\vecsub v_1 + \vecsub v_2 \in V$. Well, since $\vecsub v_1 \in V$ we have $\vecsub v_1 = t_1 [1,2]$ for some $t_1\in\R$. Similarly, $\vecsub v_2 = t_2[1,2]$. So $\vecsub v_1 + \vecsub v_2 = (t_1 + t_2)[1,2]$, so $\vecsub v_1 + \vecsub v_2 \in V$.
\item $V$ is closed under scalar multiplication.
Take $\vec v \in V$ and $c$ a scalar. As $\vec v \in V$, there is a $t\in\R$ such that $\vec v = t[1,2]$. Then $c\vec v = (ct)[1,2]$; as $ct\in\R$, this exhibits $c\vec v$ as a multiple of $[1,2]$, so $c\vec v \in V$.
\end{enumerate}
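The same closure checks can be mirrored numerically; a minimal sketch in Python (the helper `on_line` is our own, and sampling a few vectors is of course not a proof):

```python
# V = { t*(1, 2) : t real }, the line from the example above.
def on_line(v, direction=(1, 2)):
    """Check whether v is a scalar multiple of the direction vector."""
    x, y = v
    return y * direction[0] == x * direction[1]  # cross-multiply; avoids division

v1 = (3, 6)    # t = 3
v2 = (-2, -4)  # t = -2
assert on_line(v1) and on_line(v2)
# Closed under addition and scalar multiplication (sampled instances):
assert on_line((v1[0] + v2[0], v1[1] + v2[1]))
assert on_line((5 * v1[0], 5 * v1[1]))
```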
\example Consider the line $\vec x = t\colvec{1, 2} + \colvec{1,1}$; is this a subspace?
\example Consider $\set{\vec x\in \R^2 \mid \vec x = [x, y] \text{ and } xy\geq 0}$. Is this a subspace?
\remark What can you say about subspaces? Is there anything in common among all of them? What is the smallest subspace of $\R^n$? The largest?
\theorem Let $\veclist v_k\sbs \R^n$. Then $\spn(\veclist v_k)$ is a subspace of $\R^n$.
\proof This is 3.19 of the book. We will prove it in class. \qed
\item One can prove something is a subspace by showing it is the span of some set of vectors. If $V$ is a subspace of $\R^n$ and $V=\spn(\veclist v_k)$, then we say \df{$V$ is the subspace spanned by $\veclist v_k$}.
\example Consider the set of all $\vec x = [x,y,z]$ such that $y=2x$ and $z = y$. This sets up a homogeneous system of equations represented by this matrix:
\[
\begin{pmatrix}
-2 & 1 & 0\\
0 & -1 & 1
\end{pmatrix}
\]
Row reducing, we get:
\[
\begin{pmatrix}
1 & 0 & -1/2\\
0 & 1 & -1
\end{pmatrix}
\]
So, our solutions look like:
\[
\vec x = \colvec{z/2,z,z}
\]
(pretty obvious really), where $z$ can be any real number. Therefore, this is a subspace, since it is spanned by $[1/2,1,1]$.
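A quick computational check (a sketch, not part of the derivation) that every multiple of $[1/2,1,1]$ really satisfies both defining equations:

```python
from fractions import Fraction

v = (Fraction(1, 2), Fraction(1), Fraction(1))  # the spanning vector [1/2, 1, 1]
for t in (Fraction(0), Fraction(2), Fraction(-7, 3)):  # a few sample scalars
    x, y, z = (t * c for c in v)
    assert y == 2 * x  # first defining equation: y = 2x
    assert z == y      # second defining equation: z = y
```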
\item We have actually already encountered some subspaces without mentioning it explicitly. Let $A$ be an $m\times n$ matrix.
\begin{itemize}
\item The \df{row space} of a matrix $A$ is the subspace (of $\R^n$) spanned by the rows of the matrix. We denote this subspace $\row(A)$.
\item The \df{column space} of a matrix $A$ is the subspace (of $\R^m$) spanned by the columns of the matrix. We denote this subspace $\col(A)$.
\end{itemize}
\example What is the row space and column space of $I$?
\example Is $[1,1]$ in the row space of
\[
\begin{pmatrix}
3 & 4\\
1 & 2
\end{pmatrix}?
\]
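One way to probe such questions computationally: try to solve for coefficients $c_1, c_2$ with $c_1[3,4] + c_2[1,2] = [1,1]$. A sketch (the reduction to a $2\times 2$ linear system is our own setup):

```python
from fractions import Fraction

# Is [1, 1] = c1*[3, 4] + c2*[1, 2]?  Componentwise this is the system
#   3*c1 + 1*c2 = 1
#   4*c1 + 2*c2 = 1
# Its coefficient determinant is 3*2 - 1*4 = 2 != 0, so solve by Cramer's rule.
det = Fraction(3 * 2 - 1 * 4)
c1 = Fraction(1 * 2 - 1 * 1) / det  # first column replaced by the target
c2 = Fraction(3 * 1 - 4 * 1) / det  # second column replaced by the target
assert 3 * c1 + 1 * c2 == 1 and 4 * c1 + 2 * c2 == 1  # membership confirmed
```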
\theorem Let $A$ be a matrix. Suppose that $A$ is row equivalent to a matrix $B$. Then \[\row(A) = \row(B)\]
\proof Keep track of the row operations used to obtain $B$ from $A$. These show that every row of $B$ can be written as a linear combination of the rows of $A$, and therefore (by a homework problem) $\row(B)\sbs\row(A)$. But row operations are reversible, so we can do the same thing starting with $B$ and obtaining $A$, and $\row(A)\sbs\row(B)$ follows. \qed
\item Let $A$ be an $m\times n$ matrix. The \df{null space} (also called the \df{kernel}) is the set of all solutions to the homogeneous equation represented by $A$. That is:
\[
\ker(A) = \set{ \vec x\in \R^n \mid A\vec x = \vec 0 }
\]
Here it is not obvious that this is a subspace, but it is:
\theorem $\ker(A)$ is a subspace of $\R^n$.
\proof This is theorem 3.21 in the book. We will prove it in class.
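The heart of the proof is linearity of matrix multiplication: if $A\vec u = \vec 0$ and $A\vec v = \vec 0$, then $A(\vec u + \vec v) = \vec 0$ and $A(c\vec u) = \vec 0$. A numerical sketch with a made-up matrix (sampling, not a proof):

```python
# Made-up example: A = [[1, -2, 1]]; v1 and v2 below lie in its null space.
A = [[1, -2, 1]]

def matvec(M, v):
    """Matrix-vector product, row by row."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

v1 = [2, 1, 0]
v2 = [-1, 0, 1]
assert matvec(A, v1) == [0] and matvec(A, v2) == [0]
# By linearity, A(v1 + v2) = A v1 + A v2 = 0 and A(7 v1) = 7 (A v1) = 0:
assert matvec(A, [a + b for a, b in zip(v1, v2)]) == [0]
assert matvec(A, [7 * a for a in v1]) == [0]
```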
\item A \df{basis} for a subspace $V$ is a set of vectors which spans $V$ and is linearly independent. As it turns out, all vector spaces have a basis (in fact, most have infinitely many).
\example The standard unit vectors $\veclist e_n\sbs\R^n$ form a basis. They clearly span $\R^n$, as $[\veclist a_n] = \lincomb a e_n$. Moreover, they are linearly independent (why?).
\example The set of vectors $\s{[1,1], [2,2]}$ spans a line, but it is not a basis for the line, since the vectors are not linearly independent.
\example If I wanted to find a basis for the line spanned by $\s{[1,1], [2,2]}$, I would just find a linear dependence (like $[1,1] = 1/2[2,2]$) and then remove that vector from the set. So the set $\s{[2,2]}$ is a basis for the line.
\example Suppose I wanted to find a basis for the row space of this matrix:
\[
\begin{pmatrix}
1 & 1 & 3 & 1 & 6\\
2 & -1 & 0 & 1 & -1\\
-3 & 2 & 1 & -2 & 1\\
4& 1& 6& 1& 3
\end{pmatrix}
\]
Clearly, the row vectors span the row space, since the row space is defined to be the span of the rows. So we need only determine whether the vectors are linearly dependent. If we row reduce the matrix, all the rows remain in the row space of the original matrix. Moreover, if we row reduce to reduced row echelon form, the nonzero rows are linearly independent (why?). Therefore, row reducing will give us a basis for the row space. When we row reduce, we get:
\[
\begin{pmatrix}
1 & 0 & 1 & 0 & -1\\
0 & 1& 2& 0 & 3\\
0& 0& 0 & 1& 4\\
0& 0& 0& 0& 0
\end{pmatrix}
\]
Thus the following set is a basis for the row space:
\[
\s{ \colvec{1,0,1,0,-1}, \colvec{0,1,2,0,3}, \colvec{0,0,0,1,4} }
\]
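The row reduction above can be checked mechanically. Here is a sketch of Gauss--Jordan elimination in exact arithmetic (the `rref` helper is our own, not a library routine):

```python
from fractions import Fraction

def rref(M):
    """Reduced row echelon form via Gauss-Jordan elimination, exact arithmetic."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue  # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]           # swap pivot row up
        M[r] = [x / M[r][c] for x in M[r]]        # scale pivot to 1
        for i in range(rows):                     # clear the rest of the column
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        r += 1
    return M

A = [[1, 1, 3, 1, 6],
     [2, -1, 0, 1, -1],
     [-3, 2, 1, -2, 1],
     [4, 1, 6, 1, 3]]
# The nonzero rows of the RREF form a basis for the row space.
basis = [row for row in rref(A) if any(x != 0 for x in row)]
assert basis == [[1, 0, 1, 0, -1], [0, 1, 2, 0, 3], [0, 0, 0, 1, 4]]
```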
\remark So, in order to find a basis for a subspace spanned by some set, you can put the vectors in as the rows of a matrix and reduce to (reduced) row echelon form. The nonzero rows of the result are then a basis.
Alternatively, you can iteratively write one vector as a linear combination of the rest and remove it, until you reach a linearly independent set.
\item In linear algebra, we often want to capture invariants: things that stay the same even when you alter the object in certain ways. We know lots of them; for instance, the row space is invariant under row operations (so if you do row operations, the row space does not change).
Along with invariants come particular parameters, or characteristics. For instance, we know (sort of; we haven't proved it) that the number of nonzero rows in a matrix's row echelon form does not depend on which row echelon form you choose. Therefore, this is a well-defined parameter, which we called the rank.
Now we will learn a new parameter called dimension. As with rank, the definition will not make sense until we prove a certain invariance.
\item The \df{dimension} of a subspace $V$ is the size of a basis for $V$. We denote it by $\dim V$.
\theorem If $V$ is a subspace of $\R^n$, then any two bases of $V$ have the same number of vectors.
\proof This is theorem 3.23 in the book. We will prove it in class.
\remark What should the dimension of the trivial subspace be?
\end{itemize}
\end{document}