In the study of elementary linear algebra, unwary novices are often inclined to think of a vector as an ordered list of real numbers; to them, linear algebra is then conceived of as the study of multiplying matrices with column vectors. But this is a horribly impoverished perspective; we can do so much better for ourselves with a bit of abstraction and generality.

You *can* think of arrows or lists of numbers if you want or if you must, but the true, ultimate meaning of a *vector space* is ... well, anything that satisfies the vector space axioms. If you have things that you can "add" (meaning that we have an associative, commutative binary operation with an identity element and inverse elements with respect to this operation), and you can "multiply" these things by other things that come from a field (the "vectors" in the space and the "scalars" from the field play nicely together in a way that is distributive *&c.*), then these things that you have are a vector space over that field, and any of the theorems that we prove about vector spaces in general apply in full force to the things you have, which don't have to be lists of real numbers; they could be matrices or polynomials or functions or whatever.
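To make that concrete (a small Python sketch of my own devising: polynomials over ℝ represented as coefficient lists, with illustrative helper names `add` and `scale`), we can check a few of the axioms numerically:

```python
# Polynomials with real coefficients form a vector space over R.
# Represent a polynomial by its list of coefficients, constant term first.

def add(p, q):
    """Add two polynomials coefficientwise (padding the shorter with zeros)."""
    n = max(len(p), len(q))
    p = p + [0.0] * (n - len(p))
    q = q + [0.0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def scale(c, p):
    """Multiply a polynomial by a scalar c from the field R."""
    return [c * a for a in p]

p = [1.0, 2.0]        # 1 + 2x
q = [0.0, 1.0, 3.0]   # x + 3x^2
zero = []             # the zero polynomial, the additive identity

assert add(p, q) == add(q, p)                    # addition commutes
assert add(p, zero) == p                         # identity element
assert add(p, scale(-1.0, p)) == [0.0, 0.0]      # additive inverses
assert scale(2.0, add(p, q)) == add(scale(2.0, p), scale(2.0, q))  # distributivity
```

The point isn't that these checks prove anything; it's that nothing here is a list-of-numbers-qua-geometric-arrow, and the axioms hold anyway.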

Okay, so it *turns out* that *n*-dimensional vector spaces are isomorphic to the space of lists of *n* numbers (elements of the appropriate field), but that's not part of our *fundamental* notion of vectorness; it's something we can *prove*—

Let's say a set of *n* elements {*v*_{j}}_{j} from a vector space *V* *spans* the space iff every element *v* in the space can be written as a linear combination of elements in the set: *v* = Σ_{j} *c*_{j}*v*_{j} for some coefficients *c*_{j}. Let's also say that a set of vector space elements is *linearly independent* iff the only way a linear combination of them can be the zero vector is if all the coefficients are zero: Σ_{j} *c*_{j}*v*_{j} = **0** implies ∀*j* *c*_{j} = 0. We say a set is a *basis* if and only if it spans the space and is linearly independent. Bases are important because of the following
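In the concrete case of ℝ^*n*, both conditions can be checked mechanically with a rank computation (a NumPy sketch; the helper names and the particular vectors are just my illustrations):

```python
import numpy as np

def is_linearly_independent(vectors):
    """n vectors are independent iff the only combination giving 0 has
    all-zero coefficients, i.e. iff the matrix with those vectors as
    columns has full column rank."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

def spans_R3(vectors):
    """A set spans R^3 iff the matrix of its vectors has rank 3."""
    return np.linalg.matrix_rank(np.column_stack(vectors)) == 3

e = [np.array([1., 0., 0.]), np.array([0., 1., 0.]), np.array([0., 0., 1.])]
assert is_linearly_independent(e) and spans_R3(e)   # the standard basis

dep = [np.array([1., 2., 3.]), np.array([2., 4., 6.])]  # second = 2 * first
assert not is_linearly_independent(dep)
```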

*Theorem*. Every element of a vector space can be written uniquely as a linear combination of basis elements.

*Proof.* Consider an arbitrary *v* in a vector space *V* with basis {*v*_{j}}_{j}. Because a basis is a spanning set, it follows trivially that we can write *v* as a linear combination of basis elements, but we want to show that such a representation is unique. But uniqueness follows from the linear independence of the basis: suppose *v* = Σ_{j} *c*_{j}*v*_{j} and that *v* = Σ_{j} *d*_{j}*v*_{j}. It turns out that the corresponding *c* and *d* coefficients have to be the same: Σ_{j} *c*_{j}*v*_{j} = Σ_{j} *d*_{j}*v*_{j} implies that Σ_{j} (*c*_{j}*v*_{j} – *d*_{j}*v*_{j}) equals the zero vector, which implies that Σ_{j} (*c*_{j} – *d*_{j})*v*_{j} equals the zero vector, which (from linear independence) implies that ∀*j* *c*_{j} – *d*_{j} = 0 and thus that ∀*j* *c*_{j} = *d*_{j}, yielding uniqueness, which is what I've been trying to tell you this entire time.

*Because* (given a particular basis) we have a unique representation of every *v* as a linear combination of basis elements, we can use the coefficients of that linear combination as coordinates and treat the vector as a list of numbers ... but that's just our convenience; the coordinates with respect to a different basis would be different, and the vector itself simply *is*.
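To see the basis-dependence concretely in ℝ² (another NumPy sketch; the second basis is an arbitrary choice of mine), finding the coordinates of *v* with respect to a basis *B* amounts to solving *Bc* = *v*, and the theorem above is what guarantees the solution *c* is unique:

```python
import numpy as np

v = np.array([3., 1.])  # a vector in R^2, which simply *is*

# Each basis is a matrix whose columns are the basis vectors.
standard = np.column_stack([[1., 0.], [0., 1.]])
skew     = np.column_stack([[1., 1.], [1., -1.]])

# Coordinates with respect to a basis B solve B @ c = v.
c_standard = np.linalg.solve(standard, v)
c_skew     = np.linalg.solve(skew, v)

print(c_standard)  # [3. 1.]  -- the same list of numbers as v itself
print(c_skew)      # [2. 1.]  -- a different list of numbers, same vector

# Reconstructing from either coordinate list recovers the same v:
assert np.allclose(skew @ c_skew, v)
```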
