\documentclass{article}
\usepackage[left=4cm,right=4cm, top=2cm, bottom=2cm]{geometry}
\usepackage{enumerate,amsmath,amssymb,amsthm,graphicx}
\newenvironment{amatrix}[1]{%
\left(\begin{array}{@{}*{#1}{c}|c@{}}
}{%
\end{array}\right)
}
\newif\ifsinglecharacter
\def\lengthone#1{\isSingleCharacterHelper(#1\empty)!!end!!\ifsinglecharacter}
\def\isSingleCharacterHelper(#1#2)!!end!!{\ifx#2\empty\singlecharactertrue\else\singlecharacterfalse\fi}
\def\splitcommainto(#1#2)!!end!!#3{\ifx#1,#3\else #1\fi\ifx#2\empty\else\splitcommainto(#2)!!end!!{#3}\fi}
\def\det(#1){\operatorname{det}(#1)}
\def\trans#1{\lengthone{#1} #1^T\else\left(#1\right)^T\fi}
\def\inv#1{\lengthone{#1} #1^{-1}\else\left(#1\right)^{-1}\fi}
\def\dotprod#1#2{\vec{#1}\cdot\vec{#2}}
\def\vec#1{\mathbf{#1}}
\def\vecsub #1_#2{\mathbf{#1_{#2}}}
\def\coord #1_#2{\left[#1\right]_#2}
\def\norm#1{\left|\!\left|#1\right|\!\right|}
\def\R{\mathbb{R}}
\def\spn{\operatorname{span}}
\def\rank{\operatorname{rank}}
\def\nullity{\operatorname{nullity}}
\def\row{\operatorname{row}}
\def\col{\operatorname{col}}
\def\ker{\operatorname{null}}
\def\set#1{\left\{\,#1\,\right\}}
\def\s#1{\left\{#1\right\}}
\def\st{\text{\huge{.}}}
\def\colvec#1{\begin{pmatrix}\splitcommainto(#1\empty)!!end!!{\\}\end{pmatrix}}
\def\mlist#1_#2{#1_1, \ldots, #1_{#2}}
\def\veclist#1_#2{\mlist\vec{#1}_{#2}}
\def\lincomb#1#2_#3{#1_1\vec{#2_1} + \cdots + #1_{#3}\vec{#2_{#3}}}
\def\sbs{\subseteq}
\def\df#1{\textbf{#1}}
\def\theorem{\par\noindent{\bf \underline{Theorem}} }
\def\lemma{\par\noindent{\bf \underline{Lemma}} }
\def\proof{\par\noindent{\sl Proof.} }
\def\example{\par\noindent{\bf \underline{Example}} }
\def\soln{\par\noindent{\sl Solution.} }
\def\remark{\par\noindent{\bf \underline{Remark}} }
\title{Summary of Day 13}
\author{William Gunther}
\date{June 6, 2014}
\begin{document}
\maketitle
\section{Objectives}
\begin{itemize}
\item Talk about connection between column space and range of a linear transformation.
\item Talk about rotation matrices.
\item Define eigenvalues and eigenspaces.
\item Get a geometric understanding of eigenvalues and eigenspaces.
\end{itemize}
\section{Summary}
\begin{itemize}
\item Let's first pick up where we left off yesterday, connecting the range of a linear transformation to the column space. What does `plugging in' a vector look like in terms of matrix multiplication? It looks exactly like taking a linear combination of the columns. This is not a new realization; it's the same one that allowed us to connect a system of equations to a vector equation.
Therefore, the set of vectors you can get out of a matrix by plugging something in (which, if you interpret the matrix as a coefficient matrix, is the set of constant vectors for which the corresponding system has a solution) is exactly the set of linear combinations of the columns.
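\remark In symbols (a restatement for reference): if $A$ has columns $\veclist{a}_{n}$, then for any $\vec x \in \R^n$,
\[
A\vec x = \lincomb{x}{a}_{n},
\]
so the range of the transformation $\vec x \mapsto A\vec x$ is exactly $\spn\set{\veclist{a}_{n}} = \col(A)$.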
\item Let's quickly do an example: the linear transformation of $\R^2$ which rotates the plane by an angle $\theta$.
\vskip 2in
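\remark For reference (writing $R_\theta$ for the matrix of this rotation, a name introduced here): the rotation sends $\vec{e_1} \mapsto \colvec{\cos\theta,\sin\theta}$ and $\vec{e_2} \mapsto \colvec{-\sin\theta,\cos\theta}$, so its standard matrix is
\[
R_\theta = \begin{pmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{pmatrix}.
\]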
\item Often with functions, we are interested in characterizing their behavior; what else can we do with them? A particularly fruitful way of characterizing the behavior of linear transformations is the basis for {\bf spectral theory}. We will define what seems to be a rather artificial property of a matrix, and as we uncover it, it will tell us a lot about the behavior of the underlying linear transformation.
\item For the purposes of talking about eigenvalues, we are always going to be mapping from $\R^n$ to $\R^n$. For these types of linear transformations, it helps to view them as a distortion of space. Points get moved; space gets stretched, shrunk, rotated, reflected, sheared, etc.
Lots of those actions are really the same kind of thing. Stretching, shrinking, and reflecting all correspond to:
\[
\vec v \mapsto \lambda\vec v
\]
It's stretching if $\lambda>1$, shrinking if $0 \leq \lambda < 1$, and reflecting if $\lambda < 0$. Motivated by these seeming like important vectors, we give them a name. An \df{eigenvector} of a matrix/linear transformation $T$ is a nonzero vector $\vec v$ for which $T\vec v = \lambda \vec v$ for some scalar $\lambda$. Every eigenvector has a corresponding value of $\lambda$, which says what the linear transformation multiplies the vector by; we call $\lambda$ an \df{eigenvalue}. The set of eigenvalues is called the \df{spectrum} of the matrix.
Why a nonzero vector? Because $\vec 0$ is too boring to be an eigenvector, and it would break a lot of our theorems if it were an eigenvector.
\example What are the eigenvectors and eigenvalues to the `rotate by $90$ degrees' linear transformation?
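\remark (Sketch of the answer, for reference.) Rotation by $90$ degrees moves every nonzero vector off the line it spans, so no nonzero $\vec v$ satisfies $T\vec v = \lambda\vec v$ for a real scalar $\lambda$: this transformation has no real eigenvalues or eigenvectors.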
\example Verify that $\vec v$ is an eigenvector of $A$:
\[A = \begin{pmatrix}2 & 3 \\ 3 & 2\end{pmatrix}\quad \vec v =\colvec{2,-2} \]
\vskip 1in
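\remark For reference, the verification is a single matrix-vector multiplication:
\[
A\vec v = \begin{pmatrix}2 & 3\\ 3 & 2\end{pmatrix}\begin{pmatrix}2\\-2\end{pmatrix} = \begin{pmatrix}4-6\\6-4\end{pmatrix} = \begin{pmatrix}-2\\2\end{pmatrix} = -\vec v,
\]
so $\vec v$ is an eigenvector of $A$ with eigenvalue $\lambda = -1$.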
\example Consider this matrix:
\[
T=
\begin{pmatrix}
1 & 1 \\
0 & 1
\end{pmatrix}
\]
Let's analyze this map and see what it does:
\vskip 3in
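\remark For reference: $T$ is a shear. It fixes the $x$-axis and slides every other point horizontally by an amount equal to its height, since
\[
\begin{pmatrix}1 & 1\\ 0 & 1\end{pmatrix}\begin{pmatrix}x\\y\end{pmatrix} = \begin{pmatrix}x+y\\y\end{pmatrix}.
\]
In particular $T\vec{e_1} = \vec{e_1}$, so $\vec{e_1}$ is an eigenvector with eigenvalue $1$.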
\item Every eigenvector corresponds to exactly one eigenvalue (a vector can't be stretched by both a factor of $2$ and a factor of $3$). But an eigenvalue can (actually: will) correspond to many different eigenvectors. In fact, an eigenvalue has an entire subspace associated with it, called the \df{eigenspace corresponding to $\lambda$}.
\theorem Consider the set of all eigenvectors with eigenvalue $\lambda$ along with $\vec 0$. This is a subspace of $\R^n$.
\proof
\vskip 1in
\qed
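\remark (Proof sketch, for reference.) Closure is the main point: if $T\vec v = \lambda\vec v$ and $T\vec w = \lambda\vec w$, then
\[
T(\vec v + \vec w) = T\vec v + T\vec w = \lambda(\vec v + \vec w), \qquad T(c\vec v) = cT\vec v = \lambda(c\vec v),
\]
so $\vec v + \vec w$ and $c\vec v$ are again eigenvectors with eigenvalue $\lambda$ (or are $\vec 0$, which we included in the set). Finally $T\vec 0 = \vec 0 = \lambda\vec 0$, so the set contains $\vec 0$.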
\example In the example $T$ above, what are the eigenvalues and their corresponding eigenspaces?
\vskip 1in
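\remark (Sketch of the answer, for reference.) If $T\colvec{x,y} = \lambda\colvec{x,y}$, then $x + y = \lambda x$ and $y = \lambda y$. For $\lambda \neq 1$, the second equation forces $y = 0$, and then the first forces $x = 0$. So the only eigenvalue is $1$, and its eigenspace is the $x$-axis, $\spn\set{\colvec{1,0}}$.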
\item Let's do some examples of finding eigenvalues and eigenvectors.
\example Let's explore how you would find the eigenspace of the following matrix corresponding to the eigenvalue $\lambda = 6$.
\[
\begin{pmatrix}
7 & 1& -2\\
-3& 3& 6\\
2& 2& 2
\end{pmatrix}
\]
\vfill
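\remark For reference (writing $A$ for the matrix above), one way the computation can go: subtract $6I$ and row reduce.
\[
A - 6I = \begin{pmatrix}1 & 1 & -2\\ -3 & -3 & 6\\ 2 & 2 & -4\end{pmatrix} \longrightarrow \begin{pmatrix}1 & 1 & -2\\ 0 & 0 & 0\\ 0 & 0 & 0\end{pmatrix}
\]
(the second and third rows are $-3$ and $2$ times the first). The null space is given by $x = -y + 2z$ with $y$ and $z$ free, so the eigenspace corresponding to $\lambda = 6$ is
\[
\spn\set{\colvec{-1,1,0}, \colvec{2,0,1}}.
\]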
\example Let's now explore how we can find the eigenvalues and all the eigenvectors of this following matrix:
\[
\begin{pmatrix}
0 & 4\\
-1 & 5
\end{pmatrix}
\]
\vfill
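\remark For reference (writing $A$ for the matrix above): $\lambda$ is an eigenvalue exactly when $A - \lambda I$ fails to be invertible. Here
\[
A - \lambda I = \begin{pmatrix}-\lambda & 4\\ -1 & 5-\lambda\end{pmatrix},
\]
and using the $2\times 2$ determinant we already know, this fails to be invertible exactly when $(-\lambda)(5-\lambda) - (4)(-1) = \lambda^2 - 5\lambda + 4 = (\lambda-1)(\lambda-4) = 0$, i.e.\ $\lambda = 1$ or $\lambda = 4$. Solving $(A - \lambda I)\vec x = \vec 0$ for each value then gives the eigenvectors, e.g.\ $\colvec{4,1}$ for $\lambda = 1$ and $\colvec{1,1}$ for $\lambda = 4$.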
\newpage
\item In the last example, we used the idea of invertibility to help find eigenvalues. The big realization is:
\theorem $\lambda$ is an eigenvalue for $A$ if and only if $A-\lambda I$ has a non-trivial null space. Moreover, the eigenspace corresponding to $\lambda$ is the null space of this matrix.
\proof
\vskip 1in
\qed
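\remark (Proof sketch, for reference.) $A\vec v = \lambda\vec v$ if and only if $A\vec v - \lambda I\vec v = \vec 0$, i.e.\ $(A - \lambda I)\vec v = \vec 0$. So the eigenvectors with eigenvalue $\lambda$ are exactly the nonzero vectors in the null space of $A - \lambda I$; such a vector exists precisely when that null space is non-trivial.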
\example What are the eigenvalues and eigenvectors of $I$?
\vskip 1in
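\remark (Sketch of the answer, for reference.) Since $I\vec v = \vec v$ for every $\vec v$, every nonzero vector is an eigenvector with eigenvalue $1$, and $1$ is the only eigenvalue; the eigenspace corresponding to $\lambda = 1$ is all of $\R^n$.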
\item We will now talk about another way to determine invertibility: the \df{determinant}. This will be a nice tool, since we'll be able to use it to find the eigenvalues of a matrix fairly efficiently.
We already know how to do this for $2\times 2$ matrices. Let's do it for a larger one, and this will illustrate the technique:
\example Compute the determinant of:
\[
\begin{pmatrix}
5 & -3 & 2\\
1 & 0 & 2\\
2 & -1 & 3
\end{pmatrix}
\]
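\remark For reference, one standard technique (cofactor expansion), expanding along the second row since it contains a zero:
\[
\begin{vmatrix}5 & -3 & 2\\ 1 & 0 & 2\\ 2 & -1 & 3\end{vmatrix}
= -1\cdot\begin{vmatrix}-3 & 2\\ -1 & 3\end{vmatrix} + 0\cdot\begin{vmatrix}5 & 2\\ 2 & 3\end{vmatrix} - 2\cdot\begin{vmatrix}5 & -3\\ 2 & -1\end{vmatrix}
= -(-7) + 0 - 2(1) = 5.
\]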
\end{itemize}
\end{document}