Problem 1. For this problem use the matrix
(a) Are there values that we can assign and so that is not an isomorphism? Explain your answer, and, if your answer is yes, provide two such values.
(b) Are there values that we can assign and so that is an isomorphism? Explain your answer, and if your answer is yes, provide two such values.
Answer 1. (a) Note that if we choose and , then row 3 equals the sum of rows 1 and 2, which implies that is not an isomorphism.
(b) In general, we can compute that the determinant of is , so provided this expression is nonzero, is an isomorphism. In particular, we can choose so that
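Since the original matrix did not survive formatting, here is a minimal Python sketch with a hypothetical 3×3 matrix illustrating both parts: when the third row is the sum of the first two, the determinant vanishes (no isomorphism), while perturbing one entry makes it nonzero again.

```python
def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Hypothetical matrix: row 3 = row 1 + row 2, so the rows are dependent
# and the determinant vanishes -- not an isomorphism.
singular = [[1, 2, 3],
            [4, 5, 6],
            [5, 7, 9]]
print(det3(singular))  # 0

# Changing one entry breaks the dependence; the determinant is nonzero.
invertible = [[1, 2, 3],
              [4, 5, 6],
              [5, 7, 10]]
print(det3(invertible))  # -3
```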
Problem 2. For this problem let be a square matrix.
(a) Explain why there exists a non-zero vector such that .
(b) Explain why a non-zero vector is an eigenvector for (with eigenvalue ) if and only if .
Answer 2. (a) We have already shown that there exists a non-zero vector such that is not one-to-one. (By the Rank Nullity Theorem, this also implies that is not onto.)
We have also discussed that is singular, i.e. is not an isomorphism (from to ). Being an isomorphism requires that be linear, one-to-one, and onto. Since matrix multiplication is always linear, the determinant equals zero if and only if is either not one-to-one or not onto. Again, by the Rank Nullity Theorem, is not one-to-one if and only if is not onto.
Hence, is not one-to-one .
(b) By definition, an eigenvector must satisfy . Therefore, is an eigenvector for (with eigenvalue ) if and only if since, by our work in part (a), the matrix “kills” a non-zero vector if and only if its determinant is zero.
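The criterion in part (b) can be spot-checked numerically: a number is an eigenvalue exactly when the matrix minus that multiple of the identity has determinant zero. A short sketch with a hypothetical 2×2 matrix:

```python
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def char_poly(m, lam):
    """Evaluate det(m - lam*I) for a 2x2 matrix m."""
    return det2([[m[0][0] - lam, m[0][1]],
                 [m[1][0], m[1][1] - lam]])

# Hypothetical matrix; its eigenvalues turn out to be 1 and 3.
A = [[2, 1], [1, 2]]
print(char_poly(A, 1))  # 0  -> 1 is an eigenvalue
print(char_poly(A, 3))  # 0  -> 3 is an eigenvalue
print(char_poly(A, 2))  # -1 -> 2 is not an eigenvalue
```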
Problem 3. Let be the linear transformation
(a) Compute the matrix representation .
(b) Use your work in part (b) of Problem 2 to compute all of the possible eigenvalues of the matrix from part (a) of this problem. (Note: in this case you should find two distinct eigenvalues .)
(c) Find a change-of-basis matrix so that
(d) Take a moment to draw yourself a congratulatory picture (say, of a rainbow or a flower or something else nice), since you just diagonalized your first matrix! Yay you!
Answer 3. (a) This matrix representation is given by
(b) An eigenvalue will necessarily make the polynomial equal zero. Therefore we can compute these eigenvalues by solving
Factoring and/or the quadratic formula tells us that this polynomial has two roots, namely and .
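The quadratic-formula step can be sketched numerically. For any 2×2 matrix the characteristic polynomial is x² − tr·x + det, so its roots are the eigenvalues; the matrix below is a hypothetical stand-in for the one computed in part (a).

```python
import math

def char_roots_2x2(m):
    """Eigenvalues of a 2x2 matrix as roots of x^2 - tr(m)*x + det(m)."""
    tr = m[0][0] + m[1][1]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    disc = math.sqrt(tr * tr - 4 * det)  # assumes real, distinct roots
    return ((tr - disc) / 2, (tr + disc) / 2)

# Hypothetical matrix with two distinct real eigenvalues:
print(char_roots_2x2([[2, 1], [1, 2]]))  # (1.0, 3.0)
```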
(c) As discussed in class, we will have
where the vectors are an eigenbasis (with respect to the given matrix). Therefore, to construct this matrix we need to find the associated eigenbasis.
This can be accomplished by first finding a basis for the eigenspace and then finding a basis for the eigenspace . Finding these bases corresponds to solving two homogeneous systems:
Each eigenspace is one-dimensional, and so we may select a single basis vector from each. Our eigenbasis is, for instance,
As a result we have
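As a numerical check of this recipe, here is a sketch with a hypothetical 2×2 matrix whose eigenvalues are 3 and 1, with eigenvectors (1, 1) and (1, −1). Placing the eigenvectors as the columns of P makes P⁻¹AP diagonal, with the eigenvalues on the diagonal.

```python
def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def inv2(m):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b, c, d = m[0][0], m[0][1], m[1][0], m[1][1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Hypothetical matrix and its eigenbasis, written as the columns of P:
A = [[2, 1], [1, 2]]
P = [[1, 1], [1, -1]]
D = matmul(inv2(P), matmul(A, P))
print(D)  # [[3.0, 0.0], [0.0, 1.0]] -- diagonal, eigenvalues on the diagonal
```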
Problem 4. Compute the area of the parallelogram shown in the picture below.
(Note: unit lengths are marked on the axes above.)
Answer 4. This can be computed by writing down the two black vectors, and , as columns in a matrix
Then the desired area is .
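Such an area computation can be sketched as follows; the edge vectors below are hypothetical stand-ins, since the picture is not reproduced here. The area is the absolute value of the determinant of the matrix whose columns are the edge vectors.

```python
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# Hypothetical edge vectors of the parallelogram:
u, v = (3, 1), (1, 2)
M = [[u[0], v[0]],
     [u[1], v[1]]]  # u and v as columns
print(abs(det2(M)))  # 5
```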
Problem 5. (a) Find given
(b) Find given
If we want to interpret as a change-of-basis matrix, , what are the basis vectors ?
Answer 5. (a) The matrix is given by
(b) The matrix is given by
The basis contains the vectors
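A 2×2 inverse computation of this kind can be sketched directly from the adjugate formula; the matrix below is hypothetical. Reading the inverse as a change-of-basis matrix, the basis vectors are its columns.

```python
def inv2(m):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b, c, d = m[0][0], m[0][1], m[1][0], m[1][1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Hypothetical matrix (determinant 1, so the inverse has integer entries):
H = [[2, 1], [5, 3]]
Hinv = inv2(H)
print(Hinv)  # [[3.0, -1.0], [-5.0, 2.0]]

# The columns of the inverse are the basis vectors:
cols = [[Hinv[0][j], Hinv[1][j]] for j in range(2)]
print(cols)  # [[3.0, -5.0], [-1.0, 2.0]]
```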
Problem 6. Let denote an diagonal matrix:
Under what conditions is invertible? Write down the inverse matrix (assuming the conditions you specified are met).
Answer 6. is invertible provided each diagonal entry . When this condition is met, the inverse matrix is a diagonal matrix with entries
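This condition and formula can be sketched in a few lines, representing the diagonal matrix by its list of diagonal entries (a hypothetical encoding, chosen for brevity): the inverse simply reciprocates each entry, and fails exactly when some entry is zero.

```python
def diag_inv(entries):
    """Invert a diagonal matrix, represented by its list of diagonal entries."""
    if any(x == 0 for x in entries):
        raise ValueError("singular: a diagonal entry is zero")
    return [1 / x for x in entries]

# Hypothetical diagonal entries:
print(diag_inv([2, 4, 5]))  # [0.5, 0.25, 0.2]
```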
Problem 7. The trace of a square matrix is the sum of the diagonal entries of :
(a) Prove that the trace, regarded as a function is a homomorphism.
(b) A proof is written below. Write down the theorem this proof demonstrates.
Theorem: Given two matrices, .
Proof. Let .
Let us notate the -entry of by , the -entry of by , the -entry of the product by , and the -entry of the product by .
Then the traces of these matrices are given by the sum of their respective diagonal entries so that
We want to prove that , i.e. that
To prove this, we first note that
This lets us rewrite each expression appearing in the trace of in terms of the entries of and . In particular, we find
Similarly, the trace of the product works out as follows:
Note that in this last sum if we interchange the names of the indices and , we obtain the exact same expression for the . This implies that these two sums are equal, completing the proof.
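The identity this proof establishes can be spot-checked numerically. With two hypothetical matrices that do not commute, the traces of the two products nevertheless agree:

```python
def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def trace(m):
    """Sum of the diagonal entries."""
    return sum(m[i][i] for i in range(len(m)))

# Hypothetical non-commuting matrices: AB != BA, yet tr(AB) = tr(BA).
A = [[1, 2], [3, 4]]
B = [[0, 1], [5, 6]]
print(trace(matmul(A, B)), trace(matmul(B, A)))  # 37 37
print(matmul(A, B) == matmul(B, A))              # False
```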
(c) Suppose that a square matrix is diagonalizable. Explain why equals the product of ‘s eigenvalues. Using part (b) from this problem, explain why equals the sum of ‘s eigenvalues.
Answer 7. (a) Proving the trace is linear is straightforward: the diagonal entries of a sum of matrices are the sums of the corresponding diagonal entries, and a scalar multiple scales each diagonal entry, so the trace respects both addition and scalar multiplication.
(b) The Theorem claims that .
(c) First, we can use the fact that for determinants, . Suppose diagonalizes to so that . We then have
and since is a diagonal matrix, by our (many) formula(s) for computing the determinant we find that
where is an eigenvalue for .
For the trace claim, we can perform a similar bit of magic. Note that
and since the trace of is we find that the trace of is the sum of its eigenvalues.
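Both claims can be spot-checked with a hypothetical diagonalizable 2×2 matrix whose eigenvalues are 3 and 1:

```python
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def trace2(m):
    return m[0][0] + m[1][1]

# Hypothetical diagonalizable matrix with eigenvalues 3 and 1:
A = [[2, 1], [1, 2]]
print(det2(A))    # 3 = 3 * 1 (product of the eigenvalues)
print(trace2(A))  # 4 = 3 + 1 (sum of the eigenvalues)
```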
Problem 8. (a) Given two square matrices, , prove that .
(b) Suppose is a non-singular matrix. Prove that .
Answer 8. (a) Let , and note that the entry of is the number
so that the entry of is
We now need to argue that the entry of the product equals . Of course
(b) To prove the claim we simply use part (a) and multiply:
and by part (a) we have
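Both parts can be verified numerically on hypothetical matrices: the transpose of a product is the product of the transposes in reverse order, and the transpose of the inverse equals the inverse of the transpose.

```python
def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(m):
    return [list(row) for row in zip(*m)]

def inv2(m):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b, c, d = m[0][0], m[0][1], m[1][0], m[1][1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Hypothetical matrices:
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
# Part (a): transposing a product reverses the order of the factors.
print(transpose(matmul(A, B)) == matmul(transpose(B), transpose(A)))  # True
# Part (b): the transpose of the inverse is the inverse of the transpose.
print(transpose(inv2(A)) == inv2(transpose(A)))  # True
```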
Problem 9. (Bonus) For this problem, get help from a math student who has taken Algebra I.
Prove that the set is a group under matrix multiplication.
Answer 9. (definition) A group is a set, , with a binary operation satisfying three conditions.
(1) contains an identity element for the operation;
(2) every element of has an inverse in ;
(3) is associative; i.e. .
To prove that is a group under matrix multiplication, we must show these three conditions hold (where means ).
First, the identity matrix, , satisfies property (1) since for all square matrices .
Second, given an invertible matrix , we have that with .
Third, by the definition of matrix multiplication it follows that — in fact, the easiest way to argue this is to view matrix multiplication not as “rows times columns,” but as representing the composition of homomorphisms.
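A point the three axioms take for granted is closure: the product of two invertible matrices is again invertible, since det(AB) = det(A)det(B) ≠ 0. A quick sketch with hypothetical invertible matrices:

```python
def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# Hypothetical invertible matrices (each has determinant 1):
A = [[2, 1], [1, 1]]
B = [[3, 2], [4, 3]]
AB = matmul(A, B)
# Closure: det(AB) = det(A) * det(B) = 1 != 0, so AB is again in the set.
print(det2(AB))  # 1
```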