# Linear Assignment 2

## Preview

This week we will begin our discussion and exploration of abstract vector spaces.  As mentioned in class, these are “places” where linear combinations make sense.  More precisely, a vector space (over the real numbers) is a set, $V$, equipped with two operations, one called addition (or vector addition) and one called scalar multiplication.  We usually denote these operations as follows:

$\vec{v} + \vec{w} \text{ and } c\cdot\vec{v}$

where $\vec{v}, \vec{w} \in V$ and $c \in \mathbb{R}$.  For these operations to behave as meaningfully as they do in $\mathbb{R}^n$, we require that they obey ten rules, or axioms.  These are conveniently outlined in Chapter 2 of our textbook, which you should be reading thoroughly.
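Although a proof must handle arbitrary vectors, we can at least spot-check a few of the axioms numerically in the familiar space $\mathbb{R}^3$.  The sketch below (it uses NumPy, which is not part of the course, just for illustration) tests commutativity of addition and the two distributive laws on randomly chosen vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
v, w = rng.random(3), rng.random(3)
c, d = 2.0, -1.5

# Commutativity of addition: v + w = w + v
assert np.allclose(v + w, w + v)
# Distributivity of scalar multiplication over vector addition: c(v + w) = cv + cw
assert np.allclose(c * (v + w), c * v + c * w)
# Distributivity over scalar addition: (c + d)v = cv + dv
assert np.allclose((c + d) * v, c * v + d * v)
# Associativity of scalar multiplication: c(dv) = (cd)v
assert np.allclose(c * (d * v), (c * d) * v)
```

Passing these checks does not prove the axioms hold, of course, but a failure on even one example would show that a proposed operation is *not* a vector space operation.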

When we want to check that a proposed set $V$ with operations $+, \cdot$ is a vector space over $\mathbb{R}$, we must carefully check each and every one of the ten axioms, and so this can be a lengthy process.

If one starts with a rather arbitrary set and proposes arbitrary notions of $+$ and $\cdot$, it is quite likely that some of the vector space axioms will fail.  That being said, our world is rich with examples of vector spaces, ones that we have worked with for quite some time.

Example 1. (Differentiable functions)  In Calculus I we study functions $f(x)$ whose derivative, $f'(x)$, can be computed.  For this example, let’s consider the set

$V = \{ \text{ differentiable functions } f(x) \text{ with domain } \mathbb{R} \}$.

Recall from your Calculus courses that the sum of two differentiable functions is itself differentiable since

$\left( f(x) + g(x) \right)' = f'(x) + g'(x)$

and that a constant (or scalar) times a differentiable function is also differentiable since

$\left(c\cdot f(x) \right)' = c\cdot f'(x)$.

These two observations suggest that the set $V$ is a vector space under usual function addition and multiplication by constants.  We would need to check all axioms to make this conclusion official, but [spoiler alert], it works!  This set $V$ is a vector space over the reals.
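The two closure observations above can be illustrated numerically.  The following sketch (again using NumPy purely for illustration; the function choices $f = \sin$, $g = e^x$ are just examples) approximates derivatives by central differences and checks that the derivative of a sum matches the sum of the derivatives, and likewise for scalar multiples:

```python
import numpy as np

def deriv(f, x, h=1e-6):
    # Central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

f = np.sin
g = np.exp
c = 3.0
x = 0.7

# (f + g)'(x) = f'(x) + g'(x)
lhs = deriv(lambda t: f(t) + g(t), x)
rhs = deriv(f, x) + deriv(g, x)
assert abs(lhs - rhs) < 1e-6

# (c f)'(x) = c f'(x)
assert abs(deriv(lambda t: c * f(t), x) - c * deriv(f, x)) < 1e-6
```

Numerical checks like this only test single points, so they do not replace the axiom-by-axiom verification, but they match the Calculus facts quoted above.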

## Review

This week’s assignment will also reinforce some of the computations and ideas we discussed last week, so let’s mention some of those important ideas, too.

Recall that the solution set for a linear system can always be expressed in the following form:

$\text{ solution set } = \left\{ \vec{p} + c_1\vec{\beta}_1 + \cdots + c_k\vec{\beta}_k : c_i \in \mathbb{R} \right\}$

where the vector $\vec{p} \in \mathbb{R}^n$ is one particular solution to the system, and the vectors $\vec{\beta}_i$ are solutions to the associated homogeneous system.

This result or observation reminds us that, by using Gauss’ method to identify leading and free variables, we can always express the solution set for a homogeneous system using a fixed number of vectors.  We say that the “dimension” of the solution set equals the number of these vectors, which also equals the number of free variables associated to the system.
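We can see this count concretely with a small homogeneous system.  In the sketch below (NumPy again, for illustration only; the matrix is just an example), the number of free variables equals the number of columns minus the rank, which is the "dimension" of the solution set described above:

```python
import numpy as np

# Homogeneous system A x = 0:
#   x + y + z = 0
#   x - y + z = 0
A = np.array([[1.0,  1.0, 1.0],
              [1.0, -1.0, 1.0]])

n = A.shape[1]                       # number of variables
rank = np.linalg.matrix_rank(A)      # number of leading variables
free_vars = n - rank                 # number of free variables
print(free_vars)                     # → 1, the dimension of the solution set

# One homogeneous solution spanning that one-dimensional solution set:
beta = np.array([1.0, 0.0, -1.0])
assert np.allclose(A @ beta, 0)
```

Here Gauss' method would leave $z$ free, and every solution is a multiple of the single vector $\vec{\beta} = (1, 0, -1)^T$, consistent with the count `free_vars == 1`.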

From this form of the solution set we see that linear systems always have either no solutions (when no $\vec{p}$ exists), one solution (when $\vec{p}$ exists and all $\vec{\beta}_i = \vec{0}$), or infinitely many solutions (when $\vec{p}$ exists and at least one vector $\vec{\beta}_i \neq \vec{0}$).

As a result, we also see that every homogeneous system either has one unique solution (the zero vector $\vec{0}$) or infinitely many solutions.  We say that a coefficient matrix is non-singular if the associated homogeneous system has one solution, and we say that it is singular if it has infinitely many.
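The singular/non-singular distinction can be checked by computing the rank of the coefficient matrix: full rank means only the trivial solution, while a rank deficiency means infinitely many.  A small illustration (the matrices are made-up examples):

```python
import numpy as np

# Non-singular: only the trivial solution to A x = 0
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
assert np.linalg.matrix_rank(A) == 2   # full rank -> non-singular

# Singular: the second row is twice the first, so there is a free variable
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
assert np.linalg.matrix_rank(B) == 1   # rank deficient -> singular

# For example, (2, -1) is a nonzero solution to B x = 0
assert np.allclose(B @ np.array([2.0, -1.0]), 0)
```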

Finally, we also introduced the idea of a $\lambda$-weighted self-replicating solution to a linear system.  As mentioned in your second assignment, we will say that a vector $\vec{x} = (x_1, x_2, \dots, x_n)^T$ is such an object if

$A\vec{x} = \lambda\vec{x}$.

As an example of this, we can check that the vector $(\sqrt{3}, 1)^T$ is a $2$-weighted self-replicating solution for the matrix

$\left(\begin{array}{cc} 1 & \sqrt{3} \\ \sqrt{3} & -1 \end{array}\right).$

That is, we want to check that the following two equations are true:

$1\left(\sqrt{3}\right) + \sqrt{3}\left(1\right) = 2\left(\sqrt{3}\right)$

$\sqrt{3}\left(\sqrt{3}\right) - 1\left(1\right) = 2\left(1\right)$
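Both equations check out: $\sqrt{3} + \sqrt{3} = 2\sqrt{3}$ and $3 - 1 = 2$.  The same verification can be carried out numerically (once more with NumPy, just as a sanity check on the arithmetic above):

```python
import numpy as np

A = np.array([[1.0,        np.sqrt(3)],
              [np.sqrt(3), -1.0      ]])
x = np.array([np.sqrt(3), 1.0])

# Check the defining equation A x = 2 x componentwise
assert np.allclose(A @ x, 2 * x)
```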