Theorem – Every linearly independent set of vectors in a finite dimensional vector space can be extended to a basis.

Statement

Let V be a finite dimensional vector space over a field F. Let Y_1 = \{y_1, ..., y_k\} be a linearly independent subset of V. If Y_1 is not already a basis of V, then there exists a subset Y_2 \subset V with Y_2 \cap Y_1 = \emptyset (so that Y_1 and Y_2 are disjoint) such that Y_1 \cup Y_2 is a basis of V.
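
As a concrete illustration (an example added here for orientation; the specific vectors are not part of the original statement), take V = \mathbb{R}^3 and the linearly independent set

Y_1 = \{(1, 0, 0), (0, 1, 0)\}.

This is not a basis of \mathbb{R}^3, but choosing

Y_2 = \{(0, 0, 1)\}

gives Y_1 \cup Y_2 = \{(1, 0, 0), (0, 1, 0), (0, 0, 1)\}, which is a basis of \mathbb{R}^3.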

Proof

We will argue by contradiction: given two bases for a finite dimensional vector space V, say X = \{x_1, ..., x_i\} and Y = \{y_1, ..., y_k\}, we show that the assumption i \ne k leads to a contradiction.

To do so, consider the set X. Since it is a basis, it is linearly independent, and every vector in V can be generated as a linear combination of the vectors in X. In particular, the vector y_k can be written as such a linear combination, so that the set

S = \{y_k, x_1, ..., x_i\}

is linearly dependent. Since S is linearly dependent, some vector in S can be written as a linear combination of the vectors preceding it; this vector cannot be y_k (it is a basis vector of Y, hence nonzero, and has no vectors preceding it), so it must be some x_j with j = 1, ..., i. Consider the set S' = S \setminus \{x_j\}. Since x_j is a linear combination of the elements of S' that precede it, any vector that can be generated from S can still be generated from S'; in particular, S' still spans V. Adjoining the next vector of Y now gives the set \{y_{k-1}\} \cup S', which is again a spanning set and is linearly dependent, since y_{k-1} lies in the span of S'.
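
To spell out the exchange step explicitly (the coefficients a_1, ..., a_i and b_0, ..., b_{j-1} below are introduced here purely for illustration): since X spans V, we may write

y_k = a_1 x_1 + ... + a_i x_i,

which yields the nontrivial dependence relation

(-1) y_k + a_1 x_1 + ... + a_i x_i = 0

among the elements of S. Likewise, writing

x_j = b_0 y_k + b_1 x_1 + ... + b_{j-1} x_{j-1}

for the removed vector shows that x_j, and hence any linear combination involving it, can be rewritten using the elements of S' alone, which is why S' still spans V.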

Note that we have appended the last vector of Y and removed one element of X, and the resulting set S' still spans V. If we continue this approach, at each step adjoining the next remaining vector of Y (which produces a linearly dependent spanning set) and removing one of the remaining x's, we remove exactly as many elements as we add. The removed vector is always one of the x's, since the vectors of Y are linearly independent, so none of them can be a linear combination of the y's preceding it. Thus, at the n-th step we will have removed n elements (x's) from X and added n elements of Y to the working set (at the 6th step, for instance, 6 x's have been removed and 6 y's added).
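
To record the shape of the working set (with the indices j_1, ..., j_n of the removed x's introduced here only for bookkeeping), after n such steps it can be written as

S_n = \{y_{k-n+1}, ..., y_k\} \cup \left( \{x_1, ..., x_i\} \setminus \{x_{j_1}, ..., x_{j_n}\} \right),

a set which still spans V and contains n vectors of Y together with i - n vectors of X.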

Now, we have two cases. Since we are assuming that i\ne k, either i < k or i > k. If