# Linear Assignment

Problem 1.  (a) Let $A$ be an idempotent matrix.  Recall that this means

$A^2 = A$.

Prove that all of $A$'s eigenvalues are equal to either $0$ or $1$.

(b) Is the converse of part (a) true?  That is, suppose $A$ is a square matrix whose eigenvalues are all equal to $0$ or $1$.  Is $A$ necessarily idempotent?

(c) Prove that if $A$ is a nilpotent matrix (recall that this means $A^k = 0$ for some positive integer $k$), then all of $A$'s eigenvalues are equal to $0$.

(d) (Bonus) Is the converse to part (c) true?  That is, suppose $A$ is a square matrix whose eigenvalues are all equal to $0$.  Is $A$ necessarily nilpotent?

Problem 2.  In class we discussed how the $2\times 2$ matrix

$J = \left[ \begin{array}{cc} 0 & -1 \\ 1 & 0 \end{array} \right]$

is a “rotation matrix.”  Specifically, $J$ rotates vectors in $\mathbb{R}^2$ about the origin by $90^{\circ}$ in a counter-clockwise direction.

(a) Prove that any matrix of the form

$R_{\theta} = \left[ \begin{array}{cc} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{array}\right]$

where $\theta \in [0, 2\pi)$ is also a rotation matrix.  That is, explain why / prove that $R_{\theta}$ rotates vectors in $\mathbb{R}^2$ about the origin by $\theta$ radians in a counter-clockwise direction.  (Hint: since homomorphisms are determined by their action on a __________, one only needs to check what $R_{\theta}$ does to $\vec{e}_1$ and $\vec{e}_2$.)
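(If you want a numerical sanity check before writing your proof, here is an optional, stdlib-only Python sketch.  It only verifies the claimed action on $\vec{e}_1$ and $\vec{e}_2$ at a sample angle, so it is evidence, not a proof.)

```python
import math

def rotate(theta, v):
    """Apply R_theta = [[cos t, -sin t], [sin t, cos t]] to v = (x, y)."""
    x, y = v
    return (math.cos(theta) * x - math.sin(theta) * y,
            math.sin(theta) * x + math.cos(theta) * y)

theta = math.pi / 3  # any angle in [0, 2*pi) works here

# e1 = (1, 0) lands at angle theta on the unit circle ...
x1, y1 = rotate(theta, (1.0, 0.0))
assert math.isclose(x1, math.cos(theta)) and math.isclose(y1, math.sin(theta))

# ... and e2 = (0, 1) lands at angle theta + pi/2, keeping its 90-degree head start.
x2, y2 = rotate(theta, (0.0, 1.0))
assert math.isclose(x2, -math.sin(theta)) and math.isclose(y2, math.cos(theta))
```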

(b)  I claim that the $2\times 2$ matrices

$R_1 = \left[ \begin{array}{cc} 0 & 1 \\ 1 & 0 \end{array} \right] \text{ and } R_2 = \left[ \begin{array}{cc} -1 & 0 \\ 0 & 1 \end{array} \right]$

reflect vectors across different lines in $\mathbb{R}^2$.  What line does $R_1$ reflect vectors across?  What line does $R_2$ reflect across?  Lastly, what does the composed matrix $R = R_1R_2$ do to vectors?  (Pictures will likely make your answers clearer.)
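(Optional: a few lines of Python let you probe these matrices on sample vectors before drawing your pictures.  The helpers `apply` and `matmul` are just scratch-pad conventions, not anything from class, and the points probed below are arbitrary choices.)

```python
def apply(M, v):
    """Multiply a 2x2 matrix M (nested tuples) by a vector v = (x, y)."""
    (a, b), (c, d) = M
    x, y = v
    return (a * x + b * y, c * x + d * y)

def matmul(M, N):
    """Compose two 2x2 matrices, so that (M N) v = M (N v)."""
    (a, b), (c, d) = M
    (e, f), (g, h) = N
    return ((a * e + b * g, a * f + b * h),
            (c * e + d * g, c * f + d * h))

R1 = ((0, 1), (1, 0))
R2 = ((-1, 0), (0, 1))

# Vectors on a reflection's mirror line are fixed, so probing a few
# points reveals each line: (1, 1) is fixed by R1, (0, 1) by R2.
assert apply(R1, (1, 1)) == (1, 1)
assert apply(R2, (0, 1)) == (0, 1)

# Probe the composition R = R1 R2 on e1 and e2 to see what it does.
R = matmul(R1, R2)
assert apply(R, (1, 0)) == (0, -1)
assert apply(R, (0, 1)) == (1, 0)
```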

Problem 3. For this problem consider the matrix

$A = \left[ \begin{array}{ccc} 1 & 2 & 0 \\ 2 & 4 & 0 \\ 0 & 0 & 0 \end{array} \right]$

(a) Based on the first part of the Spectral Theorem, explain why we know that there exists an eigenbasis for $A$ that contains three vectors.

(b) Based on the fact that column 2 and column 3 of this matrix are multiples of column 1, explain how we know that $\lambda_1 = 0$ is an eigenvalue with geometric multiplicity $\mu_1 = 2$.

(c) Find the other eigenvalue for $A$.

Problem 4.  Prove that the characteristic polynomial for $A$ equals the characteristic polynomial for $A^T$.

Problem 5.  Suppose that $A$ is diagonalizable via the change-of-basis matrix $P$, so that $P^{-1}AP = D$ where $D$ is a diagonal matrix.  Prove or disprove: the matrix $A^2$ is diagonalizable via the change-of-basis matrix $P$ and it can be diagonalized to $D^2$.

Problem 6.  A basis for a vector space is a set of vectors that is __________________ and ______________________.

Problem 7.  For this problem we will use the infinite dimensional vector space $V = \{ \text{ differentiable functions } f:\mathbb{R}\to\mathbb{R} \}$ and the 4-dimensional subspace $W \subset V$ defined to be the span of the set $\mathcal{B} = \{\cos x + \sin x, \sin x - \cos x, e^x, 5-e^x \}$.  (Note that $\mathcal{B}$ is a basis for $W$.)

(a) Prove that the function $T:W \to W$ given by

$\displaystyle T(f(x)) = f'(x)$

is linear.  (Be sure to check that the image of $T$ actually is a subspace of $W$!)

(b) Compute the matrix representation

$\text{Rep}_{\mathcal{B}, \mathcal{B}}(T)$

(c) Find the eigenvalues of $T$ and determine if $T$ is diagonalizable.

Problem 8.  A square matrix is said to be an orthogonal matrix if its transpose is its inverse; i.e. $O$ is an orthogonal matrix if $OO^T = I = O^TO$.

(Opinionated note: really, these matrices should be called “orthonormal matrices.”  Parts (a) and (b) show why this opinion is sensible.)

(a) Prove that a matrix is orthogonal if and only if its columns are orthonormal vectors (with respect to the standard inner product on $\mathbb{R}^n$).

(b) Prove that a matrix is orthogonal if and only if its rows are orthonormal vectors (with respect to the standard inner product on $\mathbb{R}^n$).

(c) Let $R_{\theta}$ be one of the $2\times 2$ rotation matrices mentioned in Problem 2.  Prove that for every choice of $\theta$ the matrix $R_{\theta}$ is orthogonal.

(d) Prove that if $O$ is orthogonal then $O^{-1}$ exists and, moreover, $O^{-1}$ is also orthogonal.

Problem 9.  Prove that $A$ and $A^T$ always have the same eigenvalues.

Problem 10.  The matrix

$A = \left[ \begin{array}{ccc} 1 & 2 & 3 \\ 2 & 4 & 6 \\ 3 & 6 & 0 \end{array} \right]$

is diagonalizable.

(a) Give a one-sentence explanation as to how we know that $A$ is diagonalizable.

(b) Find all of the eigenvalues of $A$.

(c) Build a basis for $\mathbb{R}^3$ that consists of eigenvectors for $A$ — label this basis $\mathcal{B}$.

(d) Instead of using the basis $\mathcal{B}$ that you computed for part (c), replace the vectors in $\mathcal{B}$ with their “unit length” versions; that is, form the new basis $\mathcal{B}'$ given by

$\displaystyle \mathcal{B}' = \left\{ \frac{\vec{v}_1}{\|\vec{v}_1\|}, \frac{\vec{v}_2}{\|\vec{v}_2\|}, \frac{\vec{v}_3}{\|\vec{v}_3\|} \right\}$

Build the change-of-basis matrix, $P$, associated with this new eigenbasis $\mathcal{B}'$ so that $P^{-1}AP$ is diagonal.

(e) Lastly, check that the matrix $P^{-1}$ built in part (d) is, in fact, an orthogonal matrix.

Problem 11.  Diagonalize the matrix

$A = \left[ \begin{array}{cc} 5/2 & 3/2 \\ -1/2 & 1/2 \end{array} \right]$

or explain why $A$ cannot be diagonalized.

Problem 12.  Diagonalize the matrix

$A = \left[ \begin{array}{cc} 2 & 1 \\ 0 & 2 \end{array} \right]$

or explain why $A$ cannot be diagonalized.

Problem 13.  (a) Suppose $A$ is a $4\times 4$ matrix with characteristic polynomial

$p(\lambda) = (\lambda-1)(\lambda-2)(\lambda-3)(\lambda-4)$

What are the eigenvalues of $A$?  Is $A$ diagonalizable?  Explain your answers.

(b) Suppose $A$ is a $4\times 4$ matrix with characteristic polynomial

$p(\lambda) = (\lambda-1)^2(\lambda-2)^2$

What are the eigenvalues of $A$?  Is $A$ diagonalizable?  Explain your answers.

Problem 14.  Consider the (infinite dimensional) vector space $V$ defined by

$V = \{ \text{ all integrable functions } f:[0,1] \to \mathbb{R} \}$

Recall that a function is said to be integrable over $[0, 1]$ if the definite integral

$\int_0^1 \! f(x) \, dx$

is a finite number, and that this set was proven to be a vector space on a previous homework assignment.

Define the proposed inner product $\langle\, , \, \rangle$ on $V$ by

$\displaystyle \langle \, f, g \, \rangle = \int_0^1 f(x)g(x)\, dx$

(a) Prove that this defines an inner product on $V$.

(b) Check that the functions $f(x) = (x-1/2)^3$ and $g(x) = 1$ are orthogonal vectors in $V$.

Problem 15.  Write something nice about someone else here at St. Mary’s.  This other person can be another student, a staff member, a faculty member, or even an administrator.

I am just checking to see if anyone notices this “secret” text.  If you found it, congratulations!  Let me now take a moment to tell you something nice about you.  First, even though this has been a difficult semester for me in many respects, I have had a blast talking about linear algebra with you and everyone else in your class.  Thank you for your hard work and perseverance.  I genuinely appreciate it.

Second, thanks for being a caring and thoughtful member of our campus community.  To be completely honest, I sometimes question all of this talk about St. Mary’s being a “close knit community” that abides by “the St. Mary’s way.”  I don’t know that our school has to be this sort of community for everyone (or, in fact, that any school can actually be this for anyone); sometimes I think that claiming to provide that sort of atmosphere for students is nothing more than a marketing ploy, and that talking about “the St. Mary’s way” is only aggressive branding.  Indeed, there are likely some students who simply show up to class, work hard, learn a lot, and then call it a day.  There are also likely staff members who work at St. Mary’s so that they can support their families, and they may not have the time or desire to partake in any other aspects of our “community.”  I don’t think there is anything wrong or lacking with students and staff who do this, but I do want to point out that your hard work in our class (and presumably outside of it) does make a difference to me.  It does make St. Mary’s a special place.  Moreover, students’ responses to some of the nastier incidents on campus this semester (and last semester) have also helped me understand what it means to talk about our “community.”  It’s not that St. Mary’s is the most important place we’ll ever work and/or live, it’s not that St. Mary’s is a substitute for (or improvement on) our families and other homes, it’s that people here at St. Mary’s are decent and caring.  You are a decent, caring person who makes my job fantastic.  You are a decent, caring person who makes lots of other people’s jobs here fantastic.  In summary, thank you.

Third, let’s get some hints on this homework assignment.  You should know that problems 4 and 9 are essentially equivalent.  For the first of these: how do we show that the characteristic polynomials for a matrix and its transpose are the same?  Well, if we recall that the determinant of a matrix always equals the determinant of its transpose, then this result kind of falls right out.  More to the point: $\det(A-\lambda I) = \det\!\left( (A-\lambda I)^T \right)$, and the transpose of this matrix equals $(A-\lambda I)^T = A^T - (\lambda I)^T = A^T - \lambda I$.  Once you work this out for problem 4, problem 9 is super quick.  After all, what is an eigenvalue?  It is, by definition, a root of the characteristic polynomial.  So, if $A$ and $A^T$ have the same characteristic polynomial, then they have the same roots for this polynomial and so have the same eigenvalues.
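If you like, this identity can be spot-checked in Python before you prove it: two degree-3 polynomials that agree at four or more inputs must be identical, so evaluating $\det(A - tI)$ and $\det(A^T - tI)$ at several values of $t$ is a meaningful check.  (The matrix below is an arbitrary example I made up, not one from this assignment.)

```python
def det3(M):
    """Determinant of a 3x3 matrix (nested tuples) by cofactor expansion."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def transpose(M):
    return tuple(zip(*M))

def char_at(M, t):
    """Evaluate the characteristic polynomial det(M - t*I) at t."""
    return det3(tuple(tuple(M[r][c] - (t if r == c else 0) for c in range(3))
                      for r in range(3)))

A = ((1, 2, 3), (0, 4, 5), (7, 0, 6))  # arbitrary integer test matrix

# det(A - tI) = det((A - tI)^T) = det(A^T - tI), so the two
# characteristic polynomials agree at every input t.
for t in (-2, -1, 0, 1, 2):
    assert char_at(A, t) == char_at(transpose(A), t)
```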

Problem 8 may look long or tedious, but it can be done relatively quickly.  Think about the rows and columns of $O$ and $O^T$ being “dotted” to create the entries of $OO^T = I$.  The $(1,1)$ entry must be one, and this entry is obtained by dotting row 1 of $O$ with column 1 of $O^T$.  By definition of $O^T$, column 1 of $O^T$ is precisely row 1 of $O$… and so we have that $1 = (\text{row 1 of } O) \cdot (\text{row 1 of } O)$.  This shows that this row has unit length.  Similar considerations will work for the other rows, and then for the columns.
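Here is that dot-product bookkeeping carried out in Python on one sample orthogonal matrix.  Since part (c) of Problem 8 asserts that $R_\theta$ is orthogonal, a rotation matrix is a safe test subject; the angle below is an arbitrary choice.

```python
import math

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

theta = 0.9  # any angle; this is R_theta from Problem 2
O = ((math.cos(theta), -math.sin(theta)),
     (math.sin(theta),  math.cos(theta)))

rows = O
cols = tuple(zip(*O))  # the columns of O are the rows of O^T

# Rows are orthonormal: unit length and mutually perpendicular.
assert math.isclose(dot(rows[0], rows[0]), 1.0)
assert math.isclose(dot(rows[1], rows[1]), 1.0)
assert abs(dot(rows[0], rows[1])) < 1e-12

# The same bookkeeping works for the columns.
assert math.isclose(dot(cols[0], cols[0]), 1.0)
assert abs(dot(cols[0], cols[1])) < 1e-12
```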

Bye!

Problem 16.  Let $V$ be a real inner product space with a basis $\mathcal{B} = \{\vec{v}_1, \cdots, \vec{v}_n\}$.  Let $\vec{v} \in V$ be arbitrary.  Since $\mathcal{B}$ is a basis it follows that there exist (unique) scalars $c_i \in \mathbb{R}$ such that

$\vec{v} = c_1\vec{v}_1 + c_2\vec{v}_2 + \cdots + c_n\vec{v}_n$.

Prove that if $\mathcal{B}$ is an orthonormal basis then each scalar $c_i$ is given by the formula

$c_i = \langle \vec{v}, \vec{v}_i \rangle$

(Note: the “point” of this exercise is to convince you that orthonormal bases are super duper nice and easy / fun to work with.  Yeah, I ended that sentence in a preposition.  What are you going to do about it?)
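(Also optional: the formula is easy to test numerically.  The sketch below builds an orthonormal basis of $\mathbb{R}^2$ by rotating $\vec{e}_1, \vec{e}_2$, hides a pair of made-up coefficients, and recovers them with $c_i = \langle \vec{v}, \vec{v}_i \rangle$ — a check for one example, not a proof.)

```python
import math

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# An orthonormal basis of R^2 obtained by rotating e1, e2 (theta is arbitrary).
theta = 0.7
v1 = (math.cos(theta), math.sin(theta))
v2 = (-math.sin(theta), math.cos(theta))

# Build v = 3*v1 - 2*v2, then "forget" the coefficients ...
v = tuple(3 * a - 2 * b for a, b in zip(v1, v2))

# ... and recover them with the claimed formula c_i = <v, v_i>.
assert math.isclose(dot(v, v1), 3.0)
assert math.isclose(dot(v, v2), -2.0)
```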

Problem 17.  Suppose $h \in \mathcal{L}(\mathbb{R}^3, \mathbb{R}^3)$ has as its matrix representation the $3\times 3$ matrix (in Jordan Canonical form)

$\text{Rep}_{\mathcal{B}, \mathcal{B}}(h) = A = \left[ \begin{array}{ccc} 2 & 1 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{array} \right]$

The linear transformation $h$ has only one eigenvalue, $\lambda_1$; what is it?  What is its associated algebraic multiplicity?  What is its associated geometric multiplicity?

Name the elements in the basis $\mathcal{B}$ by $\mathcal{B} = \{\vec{v}_1, \vec{v}_2, \vec{v}_3\}$.  Explain why / verify that

$(A-\lambda_1 I)\vec{v}_1 = \vec{0} \text{ and } (A-\lambda_1 I) \vec{v}_3 = \vec{0}$

$(A-\lambda_1 I)\vec{v}_2 = \vec{v}_1$
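Since $\text{Rep}_{\mathcal{B}, \mathcal{B}}(h) = A$, each basis vector $\vec{v}_i$ is represented by the standard basis vector $\vec{e}_i$, so the three equations above can be checked directly on the matrix $A - \lambda_1 I$.  Here is an optional Python verification of that arithmetic (it uses the value of $\lambda_1$ you will have read off above).

```python
A = ((2, 1, 0), (0, 2, 0), (0, 0, 2))
lam = 2  # the lone eigenvalue, read off the diagonal of the Jordan form

# N = A - lam * I
N = tuple(tuple(A[r][c] - (lam if r == c else 0) for c in range(3))
          for r in range(3))

def apply(M, v):
    """Multiply a 3x3 matrix (nested tuples) by a length-3 vector."""
    return tuple(sum(M[r][c] * v[c] for c in range(3)) for r in range(3))

e1, e2, e3 = (1, 0, 0), (0, 1, 0), (0, 0, 1)

assert apply(N, e1) == (0, 0, 0)  # v1 is a genuine eigenvector
assert apply(N, e3) == (0, 0, 0)  # so is v3
assert apply(N, e2) == e1         # v2 is generalized: N v2 = v1, not 0
```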