Hello Linear Algebraists!

Here is a link to a 12-question assignment. The point of this assignment is to help us build a solid understanding of the following concepts: *linear (in)dependence*, *spanning set*, and *basis*.

Question 1 requires you to write down a definition, as well as some thoughts on it. Questions 2-5, 8, and 9 are computational as they deal with specific vectors and vector spaces. Questions 6, 11, and 12 are more conceptual, while questions 7 and 10 require you to write proofs.

As mentioned on the actual assignment, these problems correspond to the textbook’s sections on subspaces, span, linear (in)dependence, basis, and dimension.

## Solutions

**Problem 1**. A set $B$ of vectors is a basis for a vector space $V$ if (1) $\operatorname{span}(B) = V$, i.e. $B$ spans all of $V$, and (2) $B$ is linearly independent, i.e. the only linear combination of the vectors in $B$ that “creates” the zero vector is the trivial combination.

The first condition (spanning) is important, as it means that every vector $v \in V$ can be “built out of” the vectors from $B$. The second condition (linear independence) is important, as it means that given any vector $v \in V$, there is only one way to “build it” from the vectors in $B$. (If there were two such ways, then one could subtract the two expressions and violate the definition of linear independence; if there were no such way, then $B$ would not span $V$.)

**Problem 2**. (a) To express the given vector as a linear combination of the other given vectors, we need to solve the vector equation

which turns into a linear system of equations. One can represent the linear system by the augmented matrix

Solving the system represented by this augmented matrix determines the coefficients one at a time, and hence the desired linear combination.
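Since the assignment’s specific vectors were stripped from this write-up, here is a sketch of the same computation in Python with made-up stand-in vectors: placing the given vectors in the columns of a matrix turns the vector equation into a linear system.

```python
import numpy as np

# Stand-in vectors (hypothetical; the assignment's actual vectors are
# not reproduced above).
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 0.0])
w = np.array([2.0, 3.0, 3.0])

# With the given vectors as the columns of A, the vector equation
# c1*v1 + c2*v2 + c3*v3 = w becomes the linear system A c = w.
A = np.column_stack([v1, v2, v3])
c = np.linalg.solve(A, w)

# The solution really does rebuild w as a linear combination.
assert np.allclose(c[0] * v1 + c[1] * v2 + c[2] * v3, w)
```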

(b) To check whether or not the given set is a basis, we need to determine whether it both spans the space and is linearly independent.

We can determine linear (in)dependence easily, since doing so requires us to set up a vector equation similar to the one in part (a), only instead of the vector from part (a) we use the zero vector. This yields a homogeneous system of equations with augmented matrix

This augmented matrix implies that in order to combine the set’s vectors to produce the zero vector we must use all-zero coefficients, i.e. that the set is linearly independent.

We can determine whether or not the set spans the space by choosing an arbitrary vector and solving the corresponding vector equation. This will, again, lead to a linear system of equations that can be represented by the augmented matrix


One can then solve this system for the coefficients. Note that these coefficients will depend on the entries of the arbitrary vector.

Alternatively, one can note that the dimension of the space is already known to be three, and so since our set consists of three linearly independent vectors, they have to be a basis — that is, they also have to span. (For a contradiction, suppose that these three linearly independent vectors did *not* span the whole space. Then we could select a fourth linearly independent vector to join to them. A set of four linearly independent vectors would imply that the dimension is at least four, which is a contradiction.)
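Numerically, this shortcut amounts to a rank (or determinant) check: three linearly independent columns make the 3-by-3 matrix invertible, so the same vectors automatically span. A sketch with hypothetical stand-in vectors:

```python
import numpy as np

# Hypothetical stand-in vectors (the assignment's actual vectors are
# not reproduced above), placed in the columns of a square matrix.
A = np.column_stack([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0],
                     [1.0, 1.0, 0.0]])

# Full column rank = linear independence; for a square matrix this is
# the same as invertibility, so the columns also span R^3.
assert np.linalg.matrix_rank(A) == 3
assert abs(np.linalg.det(A)) > 1e-12
```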

(c) At first glance, it is not clear what parts (a) and (b) have to do with the given set of polynomials. However, given our discussion of **representing vectors in finite-dimensional spaces with a basis**, we can relate the given set of polynomials to the original set of vectors. In particular, if we represent the polynomials using the standard basis, then it is clear that they correspond to the vectors from parts (a) and (b):

Because the original set is a basis, this correspondence establishes the fact that the set of polynomials is a basis as well.
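The correspondence can be made concrete: relative to the standard basis $\{1, x, x^2\}$ of $P_2$, a polynomial $a_0 + a_1x + a_2x^2$ has coordinate vector $(a_0, a_1, a_2)$, and independence of the polynomials reduces to independence of these coordinate vectors. A sketch with hypothetical coefficient lists:

```python
import numpy as np

# Coordinates relative to the standard basis {1, x, x^2} of P2:
# the polynomial a0 + a1*x + a2*x**2 becomes the vector (a0, a1, a2).
def coords(a0, a1, a2):
    return np.array([a0, a1, a2], dtype=float)

# Hypothetical polynomials (the assignment's are not reproduced above):
# 1 + x^2, x + x^2, and 1 + x.
P = np.column_stack([coords(1, 0, 1), coords(0, 1, 1), coords(1, 1, 0)])

# Independence of the coordinate vectors gives independence of the
# polynomials, and three independent vectors in R^3 form a basis.
assert np.linalg.matrix_rank(P) == 3
```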

**Problem 3**. (a) The following augmented matrix represents a linear system that, when solved, will tell us how to express the given vector as a linear combination of the other given vectors:


Row-reducing this matrix yields the coefficients, and hence the desired linear combination.

(b) To determine whether or not the given vector is in the span of the other given vectors, we end up analyzing the augmented matrix


Row-reducing this matrix produces a matrix with an “inconsistent row,” that is, a row of all zeros except for the augmented entry. This means that it is impossible to solve the system, which in turn implies that the given vector cannot be expressed as a linear combination of the other vectors.
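The rank criterion behind the “inconsistent row”: the system $A\mathbf{x} = \mathbf{b}$ is unsolvable exactly when the augmented matrix has larger rank than the coefficient matrix. A sketch with a hypothetical vector that clearly lies outside the span:

```python
import numpy as np

# Two spanning vectors for the xy-plane inside R^3 (hypothetical
# example; the assignment's vectors are not reproduced above).
A = np.column_stack([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0]])
b = np.array([0.0, 0.0, 1.0])   # sticks out of the plane

# A rank jump in the augmented matrix is the "inconsistent row":
# the system A x = b has no solution, so b is not in the span.
aug = np.column_stack([A, b])
assert np.linalg.matrix_rank(aug) > np.linalg.matrix_rank(A)
```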

**Problem 4**. (skipped)

**Problem 5**. (skipped)

**Problem 6**. (a) One such set is .

(b) One such set is .

(c) It is impossible to produce a set of three vectors that spans the space. We know that the dimension of the space is 4, and so every basis must have exactly 4 elements. If a set of three vectors could span the space, this would imply that its dimension is at most 3, which is impossible.

(d) The set from part (b) is linearly independent.
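The dimension count in part (c) can be checked numerically: three vectors in a 4-dimensional space fill the columns of a 4-by-3 matrix, whose rank is at most 3, so their span is never the whole space. A quick sketch:

```python
import numpy as np

# Any three vectors in R^4, even randomly chosen ones, sit in the
# columns of a 4x3 matrix, whose rank is at most 3 < 4 = dim R^4.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
assert np.linalg.matrix_rank(A) <= 3
```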

**Problem 7**. (Proof) Suppose we are given a linearly independent set of vectors, from which the new set in the problem statement is built. To determine whether or not the new set is linearly independent, we examine the equation that sets a linear combination of its vectors equal to the zero vector.

If the only conclusion is that every coefficient equals zero, then this set of vectors is, indeed, linearly independent. If other solutions are possible, then this set is linearly dependent.

In order to make use of our hypothesis, we can regroup the above equation into a linear combination of the original vectors. Since the original set is linearly independent, each of the regrouped coefficients must equal zero.

From these equations we learn that the coefficients in our original equation must all equal zero, and so the new set is also linearly independent.
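The original symbols were stripped from this write-up, but the style of argument can be illustrated numerically. Assuming, hypothetically, that the new set is $\{u+v,\ v+w,\ u+w\}$ built from an independent set $\{u, v, w\}$, a rank check confirms independence:

```python
import numpy as np

# Hypothetical instance of the argument: start from the independent
# set {u, v, w} (here the standard basis of R^3) and form the new set
# {u+v, v+w, u+w}.
u, v, w = np.eye(3)
B = np.column_stack([u + v, v + w, u + w])

# Full rank: the only combination of the new vectors producing the
# zero vector is the trivial one, so the new set is independent too.
assert np.linalg.matrix_rank(B) == 3
```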

**Problem 8**. The two vectors are indeed linearly independent. Two vectors are linearly independent if and only if they are not multiples of one another, and since the two given functions never equal zero, they are multiples of one another if and only if their ratio is constant, i.e. if and only if

for some constant. However, from Calculus 1 we know that this ratio is not constant, hence these functions are not constant multiples of one another and therefore the two vectors are, as Ted claimed, linearly independent.
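Numerically, the non-constancy of the ratio is easy to observe. Assuming, hypothetically, that the two functions are $e^x$ and $e^{2x}$ (the actual pair was stripped from this write-up), their ratio is $e^x$, which visibly varies:

```python
import numpy as np

# Hypothetical pair of functions: f(x) = e^x and g(x) = e^(2x).
# Their ratio g/f = e^x is not constant, so neither function is a
# scalar multiple of the other.
x = np.linspace(0.0, 1.0, 5)
ratio = np.exp(2 * x) / np.exp(x)
assert not np.allclose(ratio, ratio[0])
```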

**Problem 9**. The standard basis for $M_{2\times 2}$ is given by the four matrices

It is easy to argue that these matrices are linearly independent and that they span all of $M_{2\times 2}$. In particular, this shows that $\dim M_{2\times 2} = 4$.

More generally, the standard basis for $M_{m\times n}$ consists of $mn$ matrices. Each matrix in this basis has $0$ in most of its entries, except for a $1$ in a single entry. (This shows that $\dim M_{m\times n} = mn$.)
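The counting argument can be sketched in code: building all $mn$ matrices $E_{ij}$ (a single $1$ in entry $(i, j)$, zeros elsewhere) and flattening them shows they are linearly independent.

```python
import numpy as np

def standard_basis(m, n):
    """The mn standard basis matrices E_ij of the m-by-n matrix space:
    each has a single 1 in entry (i, j) and 0 everywhere else."""
    basis = []
    for i in range(m):
        for j in range(n):
            E = np.zeros((m, n))
            E[i, j] = 1.0
            basis.append(E)
    return basis

B = standard_basis(2, 2)
assert len(B) == 4  # the dimension of the 2x2 matrix space

# Flattened, the E_ij become the standard basis of R^(mn), which is
# linearly independent, so the matrices are independent as well.
flat = np.array([E.ravel() for E in B])
assert np.linalg.matrix_rank(flat) == 4
```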

**Problem 10**. Suppose $B = \{v_1, v_2, \dots, v_n\}$ is a basis for the vector space $V$.

(a) Since a basis is linearly independent, there is only one linear combination of the $v_i$’s that produces the zero vector, namely the trivial combination:

$$0v_1 + 0v_2 + \cdots + 0v_n = \mathbf{0}.$$

(b) Given any $v \in V$, there exists at least one linear combination of the basis vectors that equals $v$ since, by definition of basis, these vectors *span* $V$.

(c) Let $v \in V$ and suppose there are *two* ways to express $v$ as a linear combination of the $v_i$’s. That is, suppose

$$v = a_1v_1 + a_2v_2 + \cdots + a_nv_n \qquad\text{and}\qquad v = b_1v_1 + b_2v_2 + \cdots + b_nv_n.$$

The equations above imply that

$$(a_1 - b_1)v_1 + (a_2 - b_2)v_2 + \cdots + (a_n - b_n)v_n = \mathbf{0}.$$

However, from part (a) we know that there is only one linear combination of these vectors that produces the zero vector. Therefore, all of the coefficients in the equation above must individually equal the zero scalar; that is, $a_i - b_i = 0$ for all $i$.

This implies that $a_i = b_i$ for all $i$, so that the two expressions for $v$ were, in fact, the same expression! This means that given any $v \in V$, there is one and only one way to express $v$ in terms of a basis.

**Problem 11**. (a) A subset (of a 4-dimensional space) with more than four vectors must always be linearly dependent.

(b) A subset (of a 4-dimensional space) with four vectors is not necessarily linearly independent. If the set happens to be a basis for the space, then it will be linearly independent.
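Part (a) can be checked numerically: any five vectors in a 4-dimensional space fill the columns of a 4-by-5 matrix, whose rank is at most 4, so the homogeneous system has a nontrivial solution. A sketch:

```python
import numpy as np

# Five random vectors in R^4: the 4x5 matrix has rank at most 4,
# which is fewer than its 5 columns, so the columns are dependent.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 5))
assert np.linalg.matrix_rank(A) < A.shape[1]
```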

**Problem 12**. (a) Liam was proving that the set of vectors is linearly independent.

(b) Lucia was proving that the set of vectors spans all of the space.

(c) Liam and Lucia’s combined proofs show that the set is a basis for the space.