In the study of elementary linear algebra, unwary novices are often inclined to think of a vector as an ordered list of real numbers; to them, linear algebra is then conceived of as the study of multiplying matrices with column vectors. But this is a horribly impoverished perspective; we can do so much better for ourselves with a bit of abstraction and generality.
You can think of arrows or lists of numbers if you want or if you must, but the true, ultimate meaning of a vector space is ... well, anything that satisfies the vector space axioms. If you have things that you can "add" (meaning that we have an associative, commutative binary operation and we have inverse elements and an identity element with respect to this operation), and you can "multiply" these things by other things that come from a field (the "vectors" in the space and the "scalars" from the field play nicely together in a way that is distributive &c.), then these things that you have are a vector space over that field, and any of the theorems that we prove about vector spaces in general apply in full force to the things you have, which don't have to be lists of real numbers; they could be matrices or polynomials or functions or whatever.
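(If you'd like to see the abstraction made concrete in code, here's a quick sketch in Python with NumPy, neither of which is remotely essential to the math: polynomials, represented as coefficient arrays, satisfy the axioms; the helper names poly_add and poly_scale are just made up for illustration, and we only spot-check a few axiom instances rather than prove anything.)

```python
import numpy as np

# A polynomial is represented by its array of coefficients, lowest degree first.

def poly_add(p, q):
    """Vector "addition": add coefficients termwise, padding the shorter with zeros."""
    n = max(len(p), len(q))
    return np.pad(p, (0, n - len(p))) + np.pad(q, (0, n - len(q)))

def poly_scale(c, p):
    """Scalar "multiplication": multiply every coefficient by the scalar c."""
    return c * np.asarray(p, dtype=float)

p = np.array([1.0, 2.0])       # 1 + 2x
q = np.array([0.0, 1.0, 3.0])  # x + 3x^2

# commutativity of addition: p + q = q + p
assert np.allclose(poly_add(p, q), poly_add(q, p))
# distributivity: c(p + q) = cp + cq
assert np.allclose(poly_scale(2.0, poly_add(p, q)),
                   poly_add(poly_scale(2.0, p), poly_scale(2.0, q)))
# additive identity and inverses: p + (-1)p = 0
assert np.allclose(poly_add(p, poly_scale(-1.0, p)), np.zeros(2))
```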
Okay, so it turns out that every n-dimensional vector space is isomorphic to the space of lists of n numbers (elements of the appropriate field), but that's not part of our fundamental notion of vectorness; it's something we can prove—
Let's say a set of n elements {vj}j from a vector space V spans the space iff every element v in the space can be written as a linear combination of elements in the set: v = Σj cjvj for some coefficients cj. Let's also say that a set of vector space elements is linearly independent iff the only way a linear combination of them can be the zero vector is if all the coefficients are zero: Σj cjvj = 0 implies ∀j cj = 0. We say a set is a basis if and only if it spans the space and is linearly independent. Bases are important because of the following
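(Another illustrative sketch, this time in R^n, where the standard fact that independence and spanning reduce to matrix rank lets NumPy do the checking; the helper names here are hypothetical, not any library's API.)

```python
import numpy as np

def is_linearly_independent(vectors):
    """Independent iff the matrix with these vectors as columns has full column rank."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

def spans_Rn(vectors, n):
    """Spans R^n iff the column space has dimension n."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == n

e = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # standard basis of R^2
print(is_linearly_independent(e), spans_Rn(e, 2))  # True True: a basis

bigger = e + [np.array([1.0, 1.0])]                # spans, but not independent
print(is_linearly_independent(bigger), spans_Rn(bigger, 2))  # False True
```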
Theorem. Every element of a vector space can be written uniquely as a linear combination of basis elements.
Proof. Consider an arbitrary v in a vector space V with basis {vj}j. Because a basis is a spanning set, it follows trivially that we can write v as a linear combination of basis elements, but we want to show that such a representation is unique. But uniqueness follows from the linear independence of the basis: suppose v = Σj cjvj and that v = Σj djvj. It turns out that the corresponding c and d coefficients have to be the same: Σj cjvj = Σj djvj implies that Σj (cjvj – djvj) equals the zero vector, which implies that Σj (cj – dj)vj equals the zero vector, which (from linear independence) implies that ∀j cj – dj = 0 and thus that ∀j cj = dj, yielding uniqueness, which is what I've been trying to tell you this entire time.
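(In coordinates, the theorem cashes out as something you can compute with: if the basis vectors are the columns of a matrix B, then the coefficients of v are the one and only solution of Bc = v, and uniqueness is exactly the invertibility of B. A sketch, with a made-up example basis of R^3:)

```python
import numpy as np

# The columns of B are a (made-up) basis of R^3.
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
v = np.array([2.0, 3.0, 1.0])

c = np.linalg.solve(B, v)     # the unique coefficients; B is invertible
assert np.allclose(B @ c, v)  # v really is the linear combination sum_j c_j v_j
print(c)                      # [0. 2. 1.]
```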
Because (given a particular basis) we have a unique representation of every v as a linear combination of basis elements, we can use the coefficients of that linear combination as coordinates and treat the vector as a list of numbers ... but that's just our convenience; the coordinates with respect to a different basis would be different, and the vector itself simply is.
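(One last sketch of that last point: the same vector of R^2, expressed with respect to two different bases, yields two different lists of numbers; both bases here are made-up examples.)

```python
import numpy as np

v = np.array([3.0, 1.0])

standard = np.eye(2)            # basis {(1,0), (0,1)} as columns
skew = np.array([[1.0, -1.0],
                 [1.0,  1.0]])  # basis {(1,1), (-1,1)} as columns

print(np.linalg.solve(standard, v))  # [3. 1.]
print(np.linalg.solve(skew, v))      # [2. -1.]: same vector, different list
```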