Theorem – Uniqueness of coefficients in a linear combination.

Theorem

Consider a vector space V over a field F. Let \mathbf{v}_1\in{V} be a particular vector in V, and let \{\mathbf{x}_1, ..., \mathbf{x}_n\} be a basis for V. Then the equation

\sum\limits_{i=1}^{n} \alpha_i \mathbf{x}_i = \mathbf{v}_1

where the left-hand side is a linear combination of the basis vectors, has a unique list of coefficients \alpha_1, ..., \alpha_n \in F for each vector \mathbf{v}_1\in{V}.
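As a concrete numerical sketch (the basis and target vector below are assumed for illustration, not taken from the theorem): in \mathbb{R}^2 with basis \mathbf{x}_1 = (1, 1), \mathbf{x}_2 = (1, -1), the coefficients for any \mathbf{v}_1 can be found by solving a 2×2 linear system, and the solution is unique whenever the determinant is nonzero, i.e. whenever the vectors really form a basis.

```python
# Illustration in R^2 (hypothetical concrete example, not part of the proof):
# basis x1 = (1, 1), x2 = (1, -1); target vector v = (3, 1).
# Solving a1*x1 + a2*x2 = v by Cramer's rule yields the unique coefficients.

def solve_2x2(x1, x2, v):
    """Solve a1*x1 + a2*x2 = v for (a1, a2) via Cramer's rule."""
    det = x1[0] * x2[1] - x2[0] * x1[1]
    if det == 0:
        # A zero determinant means x1, x2 are linearly dependent,
        # so they do not form a basis and coefficients need not be unique.
        raise ValueError("vectors are linearly dependent, not a basis")
    a1 = (v[0] * x2[1] - x2[0] * v[1]) / det
    a2 = (x1[0] * v[1] - v[0] * x1[1]) / det
    return a1, a2

a1, a2 = solve_2x2((1, 1), (1, -1), (3, 1))
print(a1, a2)  # → 2.0 1.0, since 2*(1,1) + 1*(1,-1) = (3,1)
```

The nonzero-determinant check is exactly the linear-independence condition the proof relies on.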

Proof

Suppose, for contradiction, that there are two distinct lists of coefficients corresponding to the same vector \mathbf{v}_1\in{V}. That is, there exist \{\alpha_1, ..., \alpha_n\} and \{\beta_1, ..., \beta_n\} such that

\sum\limits_{i=1}^{n} \alpha_i \mathbf{x}_i = \sum\limits_{i=1}^{n} \beta_i \mathbf{x}_i = \mathbf{v}_1

Since the two sums are equal, by axiom 1.4 of a vector space, their difference is the zero vector,

\begin{aligned} &\sum\limits_{i=1}^{n} \alpha_i \mathbf{x}_i - \sum\limits_{i=1}^{n} \beta_i \mathbf{x}_i = \mathbf{0} \\ \implies & \sum\limits_{i=1}^{n} (\alpha_i - \beta_i)\mathbf{x}_i = \mathbf{0} \end{aligned}

But the \mathbf{x}_i's form a basis for V, so they are linearly independent. It follows that

\sum\limits_{i=1}^{n} (\alpha_i - \beta_i)\mathbf{x}_i = \mathbf{0} \implies \alpha_i - \beta_i = 0, \quad \forall i = 1, ..., n

Hence \alpha_i = \beta_i for all i, which contradicts our assumption that the lists \{\alpha_1, ..., \alpha_n\} and \{\beta_1, ..., \beta_n\} are distinct. The coefficients are therefore unique. \square
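To see why linear independence is essential, here is a hypothetical counter-illustration (the vectors and coefficient lists below are assumed for the example): with a linearly dependent spanning set, the same vector can carry more than one list of coefficients.

```python
# Hypothetical illustration: with a linearly DEPENDENT set in R^2,
# one vector admits several coefficient lists.
# x1 = (1, 0), x2 = (0, 1), x3 = (1, 1); note x3 = x1 + x2, so the set
# is dependent and cannot be a basis.

def combine(coeffs, vectors):
    """Return the linear combination sum_i coeffs[i] * vectors[i]."""
    dim = len(vectors[0])
    return tuple(sum(c * v[k] for c, v in zip(coeffs, vectors)) for k in range(dim))

vectors = [(1, 0), (0, 1), (1, 1)]
v_a = combine([2, 3, 0], vectors)  # 2*x1 + 3*x2
v_b = combine([1, 2, 1], vectors)  # 1*x1 + 2*x2 + 1*x3
print(v_a, v_b)  # both equal (2, 3): two distinct coefficient lists, one vector
```

Uniqueness fails precisely because \mathbf{x}_3 - \mathbf{x}_1 - \mathbf{x}_2 = \mathbf{0} is a nontrivial dependence, which is the case the proof's final step rules out.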