Linear Algebra Assignment 4

Hello Linear Algebraists!

Here is a link to a 12-question assignment.  The point of this assignment is to help us build a solid understanding of the following concepts: linear (in)dependence, spanning set, and basis.

Question 1 requires you to write down a definition, as well as some thoughts on it.  Questions 2-5, 8, and 9 are computational as they deal with specific vectors and vector spaces.  Questions 6, 11, and 12 are more conceptual, while questions 7 and 10 require you to write proofs.

As mentioned on the actual assignment, these problems correspond to the textbook’s sections on subspaces, span, linear (in)dependence, basis, and dimension.

Solutions

Problem 1.  A set of vectors S is a basis for a vector space V if (1) [S] = V, i.e. S spans all of V, and (2) S is linearly independent, i.e. the only linear combination of the vectors in S that “creates” the zero vector is the trivial combination.

The first condition (spanning) is important, as it means that every vector \vec{v} \in V can be “built out of” the vectors from S.  The second condition (linear independence) is important as it means that given any vector \vec{v} \in V, there is only one way to “build it” from the vectors in S.  (If there were two such ways, then one could subtract the expressions and violate the definition of linear independence; if there were no such ways, then S would not span V.)
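To make the parenthetical concrete: if some \vec{v} \in V could be written in two different ways, say

\vec{v} = a_1\vec{s}_1 + \cdots + a_n\vec{s}_n = b_1\vec{s}_1 + \cdots + b_n\vec{s}_n

where the \vec{s}_i denote vectors from S, then subtracting the two expressions gives (a_1 - b_1)\vec{s}_1 + \cdots + (a_n - b_n)\vec{s}_n = \vec{0}, and linear independence forces a_i = b_i for every i.  (Problem 10 carries this argument out in full.)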

Problem 2.  (a) To express (1, 2, 6)^T as a linear combination of the given vectors in S we need to solve the vector equation

c_1 \left( \begin{array}{c} 2 \\ 1 \\ -1 \end{array}\right) + c_2\left( \begin{array}{c}3 \\ 1 \\ 1 \end{array}\right) + c_3\left( \begin{array}{c} -4 \\ 0 \\ 6 \end{array}\right) = \left(\begin{array}{c} 1 \\ 2 \\ 6 \end{array}\right)

which turns into a linear system of equations.  One can represent the linear system by the augmented matrix

\left( \begin{array}{ccc|c} 2 & 3 & -4 & 1 \\ 1 & 1 & 0 & 2 \\ -1 & 1 & 6 & 6 \end{array}\right) \to \left( \begin{array}{ccc|c} 2 & 3 & -4 & 1 \\ 0 & 1 & -4 & -3\\ 0 & 5 & 8 & 13 \end{array}\right) \to \left( \begin{array}{ccc|c} 2 & 3 & -4 & 1 \\ 0 & 1 & -4 & -3 \\ 0 & 0 & -28 & -28 \end{array}\right)

This augmented matrix tells us that c_3 = 1 and so c_2 = 1.  Finally, we also learn that c_1 = 1.
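Explicitly, back-substituting from the bottom row up gives

-28c_3 = -28 \Rightarrow c_3 = 1, \qquad c_2 - 4c_3 = -3 \Rightarrow c_2 = 1, \qquad 2c_1 + 3c_2 - 4c_3 = 1 \Rightarrow c_1 = 1,

so that (1, 2, 6)^T = 1(2, 1, -1)^T + 1(3, 1, 1)^T + 1(-4, 0, 6)^T.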

(b) To check whether or not S is a basis we need to determine whether S both spans \mathbb{R}^3 and is linearly independent.

We can determine linear (in)dependence easily: we set up a vector equation just like the one in part (a), only with the vector (0, 0, 0)^T in place of (1, 2, 6)^T.  This yields a homogeneous system of equations with augmented matrix

\left( \begin{array}{ccc|c} 2 & 3 & -4 & 0 \\ 1 & 1 & 0 & 0 \\ -1 & 1 & 6 & 0 \end{array}\right) \to \left( \begin{array}{ccc|c} 2 & 3 & -4 & 0 \\ 0 & 1 & -4 & 0 \\ 0 & 0 & -28 & 0 \end{array}\right)

This augmented matrix implies that in order to combine S’s vectors to produce the zero vector we must use the coefficients c_1 = c_2 = c_3 = 0, i.e. that S is linearly independent.

We can determine whether or not S spans \mathbb{R}^3 by choosing an arbitrary vector \vec{v} = (x, y, z)^T \in \mathbb{R}^3 and solving the corresponding vector equation.  This will, again, lead to a linear system of equations that can be represented by the augmented matrix

\left( \begin{array}{ccc|c} 2 & 3 & -4 & x \\ 1 & 1 & 0 & y \\ -1 & 1 & 6 & z \end{array}\right).

One can then solve this system for the coefficients c_1, c_2 and c_3.  Note that these coefficients will depend on the values of x, y, and z.
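For the record, carrying this elimination all the way through (any consistent sequence of row operations gives the same answer) produces the coefficients

c_1 = \frac{-3x + 11y - 2z}{7}, \qquad c_2 = \frac{3x - 4y + 2z}{7}, \qquad c_3 = \frac{-2x + 5y + z}{14},

which exist for every choice of x, y, and z; this shows directly that S spans \mathbb{R}^3.  (As a check, plugging in (x, y, z) = (1, 2, 6) recovers c_1 = c_2 = c_3 = 1 from part (a).)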

Alternatively, one can note that the dimension of \mathbb{R}^3 is already known to be \text{dim}\mathbb{R}^3 = 3, and so since S consists of three linearly independent vectors, it must be a basis; that is, it must also span.  (For a contradiction, suppose these three linearly independent vectors did not span all of \mathbb{R}^3.  Then we could select a vector \vec{w} \in \mathbb{R}^3 outside their span and join it to S, producing a set of four linearly independent vectors in \mathbb{R}^3.  This would imply that \text{dim}\mathbb{R}^3 \geq 4, which is a contradiction.)

(c) At first glance, it is not clear what parts (a) and (b) have to do with the given set of polynomials.  However, given our discussion of representing vectors in finite-dimensional spaces with a basis, we can relate the given set of vectors S' \subset \mathcal{P}_2 to the original set of vectors S \subset \mathbb{R}^3.  In particular, if we represent vectors from \mathcal{P}_2 using the standard basis \mathcal{B} = \{1, x, x^2\} then it is clear that the vectors in S' correspond to the vectors in S:

2+x-x^2 \leftrightarrow \left(\begin{array}{c} 2 \\ 1 \\ -1 \end{array}\right)

3+x+x^2 \leftrightarrow \left(\begin{array}{c} 3 \\ 1 \\ 1 \end{array}\right)

-4 + 6x^2 \leftrightarrow \left(\begin{array}{c} -4 \\ 0 \\ 6 \end{array}\right)

Because S is a basis for \mathbb{R}^3, this correspondence establishes the fact that S' is a basis for \mathcal{P}_2.
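As a sanity check, the coefficients from part (a) carry over verbatim under this correspondence:

1(2 + x - x^2) + 1(3 + x + x^2) + 1(-4 + 6x^2) = 1 + 2x + 6x^2,

which is exactly the polynomial whose coordinate vector relative to \mathcal{B} is (1, 2, 6)^T.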

Problem 3.  (a) The following augmented matrix represents a linear system that, when solved, will tell us how to express (1, 5, 3, 0)^T as a linear combination of the given vectors:

\left( \begin{array}{ccc|c} 1 & 1 & 0 & 1 \\ 1 & 0 & 1 & 5 \\ 1 & 0 & 0 & 3 \\ 0 & 1 & 1 & 0 \end{array}\right).

Row-reducing this matrix tells us that c_1 = 3, c_2 = -2 and c_3 = 2 so that (1, 5, 3, 0)^T = 3(1, 1, 1, 0)^T - 2(1, 0, 0, 1)^T + 2(0, 1, 0, 1)^T.
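As a quick check:

3\left( \begin{array}{c} 1 \\ 1 \\ 1 \\ 0 \end{array}\right) - 2\left( \begin{array}{c} 1 \\ 0 \\ 0 \\ 1 \end{array}\right) + 2\left( \begin{array}{c} 0 \\ 1 \\ 0 \\ 1 \end{array}\right) = \left( \begin{array}{c} 3 - 2 \\ 3 + 2 \\ 3 \\ -2 + 2 \end{array}\right) = \left( \begin{array}{c} 1 \\ 5 \\ 3 \\ 0 \end{array}\right).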

(b) To determine whether or not (1, 1, 1, -3)^T is in the span of the given vectors, we end up analyzing the augmented matrix

\left( \begin{array}{ccc|c} 1 & 1 & 0 & 1 \\ 1 & 0 & 1 & 1 \\ 1 & 0 & 0 & 1 \\ 0 & 1 & 1 & -3 \end{array}\right).

Row-reducing this matrix produces a matrix with an “inconsistent row,” that is, a row of all zeros except for the augmented entry.  This means that it is impossible to solve the system, which in turn implies that (1, 1, 1, -3)^T cannot be expressed as a linear combination of the given vectors.
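One can also see the inconsistency directly from the original equations: the third equation forces c_1 = 1, the first equation then forces c_2 = 0, and the second forces c_3 = 0, but then the fourth equation

c_2 + c_3 = -3

would read 0 = -3, which is impossible.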

Problem 4.  (skipped)

Problem 5.  (skipped)

Problem 6.  (a) One such set is \{1, x, x^2, x^3, x+2\}.

(b)  One such set is \{1, x, x^2, x^3\}.

(c) It is impossible to produce a set of three vectors that spans \mathcal{P}_3.  We know that the dimension of \mathcal{P}_3 is 4, and so every basis must have exactly 4 elements.  If a set of three vectors could span \mathcal{P}_3 this would imply that \text{dim}\mathcal{P}_3 \leq 3 < 4, which is impossible.

(d) The set from part (b) is linearly independent.

Problem 7. (Proof)  Suppose \{\vec{v}, \vec{w}\} is a linearly independent set of vectors in V.  To determine whether or not \{\vec{v}, \vec{v}+\vec{w}\} is linearly independent we examine the equation

c_1\vec{v} + c_2(\vec{v}+\vec{w}) = \vec{0}.

If the only conclusion is that c_1 = c_2 = 0, then this set of vectors is, indeed, linearly independent.  If other solutions are possible, then this set is linearly dependent.

In order to make use of our hypothesis, we can rewrite the above equation as

(c_1+c_2)\vec{v} + c_2\vec{w} = \vec{0}

and since the set \{\vec{v}, \vec{w}\} is linearly independent, this means that

c_1+c_2 = 0 \text{ and } c_2 = 0.

From these two equations we learn that c_1 = 0 = c_2, and so the set \{\vec{v}, \vec{v}+\vec{w}\} is also linearly independent.  \square

Problem 8.  The two vectors are indeed linearly independent.  Two vectors are linearly independent if and only if neither is a scalar multiple of the other, and since the two given functions are never equal to zero, they are multiples of one another if and only if their ratio is constant, i.e. if and only if

\frac{f(x)}{g(x)} = e^{2x} = c

for some constant c \in \mathbb{R}.  However, from Calculus 1 we know that e^{2x} is not a constant function, hence these functions are not constant multiples of one another, and therefore the vectors f(x) and g(x) are, as Ted claimed, linearly independent.
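(For instance, evaluating the ratio at two points already shows it is not constant:

e^{2\cdot 0} = 1 \quad \text{while} \quad e^{2\cdot 1} = e^2 \neq 1,

so no single constant c can work.)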

Problem 9.  The standard basis for \mathcal{M}_{2\times 2} is given by the four vectors

\left(\begin{array}{cc} 1 & 0 \\ 0 & 0 \end{array}\right), \left(\begin{array}{cc} 0 & 1 \\ 0 & 0 \end{array}\right), \left(\begin{array}{cc} 0 & 0 \\ 1 & 0 \end{array}\right), \left(\begin{array}{cc} 0 & 0 \\ 0 & 1 \end{array}\right)

It is easy to argue that these matrices are linearly independent and that they span all of \mathcal{M}_{2\times 2}.  In particular, this shows that \text{dim}\mathcal{M}_{2\times 2} = 4.
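Indeed, an arbitrary matrix in \mathcal{M}_{2\times 2} decomposes as

\left(\begin{array}{cc} a & b \\ c & d \end{array}\right) = a\left(\begin{array}{cc} 1 & 0 \\ 0 & 0 \end{array}\right) + b\left(\begin{array}{cc} 0 & 1 \\ 0 & 0 \end{array}\right) + c\left(\begin{array}{cc} 0 & 0 \\ 1 & 0 \end{array}\right) + d\left(\begin{array}{cc} 0 & 0 \\ 0 & 1 \end{array}\right),

so the four matrices span \mathcal{M}_{2\times 2}, and the only such combination equal to the zero matrix is the one with a = b = c = d = 0, so they are linearly independent.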

More generally, the standard basis for \mathcal{M}_{m\times n} consists of mn matrices.  Each matrix in this basis has 0 in most of its entries, except for a 1 in a single entry.  (This shows that \text{dim}\mathcal{M}_{m\times n} = mn.)

Problem 10.  Suppose \{\vec{v}_1, \dots, \vec{v}_k\} is a basis for V.

(a) Since a basis is linearly independent, there is only one linear combination of the \vec{v}_i’s that produces the zero vector, namely the trivial combination:

0\vec{v}_1 + 0\vec{v}_2 + \cdots + 0\vec{v}_k = \vec{0}.

(b) Given any \vec{w} \in V, there exists at least one linear combination of the vectors \vec{v}_i that equals \vec{w} since, by definition of basis, these vectors span V.

(c) Let \vec{w} \in V and suppose there are two ways to express \vec{w} as a linear combination of the \vec{v}_i’s.  That is, suppose

\vec{w} = c_1\vec{v}_1 + \cdots + c_k\vec{v}_k = d_1\vec{v}_1 + \cdots + d_k\vec{v}_k.

The equations above imply that

(c_1-d_1)\vec{v}_1 + (c_2-d_2)\vec{v}_2 + \cdots + (c_k-d_k)\vec{v}_k = \vec{0}.

However, from part (a) we know that there is only one linear combination of these vectors that produces the zero vector.  Therefore, all of the coefficients in the equation above must individually equal the zero scalar; that is, c_i - d_i = 0 for all 1 \leq i \leq k.

This implies that c_i = d_i, so that the two expressions for \vec{w} were, in fact, the same expression!  This means that given any \vec{w} \in V, there is exactly one way to express \vec{w} in terms of a basis.

Problem 11.  (a) A subset (of a 4-dimensional space) with 5 vectors must always be linearly dependent.

(b) A subset (of a 4-dimensional space) with 4 vectors is not necessarily linearly independent.  If the set happens to be a basis for the space, then it will be linearly independent, but a set of four vectors need not form a basis.
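For a concrete illustration in \mathbb{R}^4 (writing \vec{e}_i for the i-th standard basis vector), the four vectors

\vec{e}_1, \quad \vec{e}_2, \quad \vec{e}_3, \quad \vec{e}_1 + \vec{e}_2

are linearly dependent, since 1\vec{e}_1 + 1\vec{e}_2 + 0\vec{e}_3 - 1(\vec{e}_1 + \vec{e}_2) = \vec{0} is a nontrivial combination producing the zero vector.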

Problem 12.  (a) Liam was proving that the set of vectors is linearly independent.

(b) Lucia was proving that the set of vectors spans all of V.

(c) Liam and Lucia’s combined proofs show that S is a basis for V.
