FIGURE 2.4.1.
The Swiss mathematician Leonhard Euler (1707-1783) touched on all aspects of the mathematics and physics of his time. His complete works fill 85 large volumes; some were written after he became completely blind in 1771. He was immensely influential both in his research and in teaching: all elementary calculus and algebra textbooks are in some sense rewrites of Euler's books.
Euler spent much of his professional life in St. Petersburg. He and his wife had thirteen children, five of whom survived to adulthood. According to the Mathematics Genealogy Project, he also had (as of May 1, 2015) 87941 mathematical descendants.
These ideas apply in all linear settings, such as function spaces and integral and differential equations. Any time the notion of linear combination makes sense, we can talk about span, kernels, linear independence, and so forth.
In 1750, questioning the general assumption that every system of n linear equations in n unknowns has a unique solution, the great mathematician Leonhard Euler pointed out the case of the two equations 3x - 2y = 5 and 4y = 6x - 10. "We will see that it is not possible to determine the two unknowns x and y," he wrote, "since when one is eliminated, the other disappears by itself, and we are left with an identity from which we can determine nothing. The reason for this accident is immediately obvious, since the second equation can be changed to 6x - 4y = 10, which, being just the double of the first, is in no way different from it."
Euler concluded that when claiming that n equations are sufficient to determine n unknowns, "one must add the restriction that all the equations be different from each other, and that none of them is included in the others."
Euler's "descriptive and qualitative approach" represented the beginning of a new way of thinking. At the time, mathematicians were interested in solving individual systems of equations, not in analyzing them. Even Euler began his argument by pointing out that attempts to solve the system fail;
only then did he explain this failure by the obvious fact that 3x - 2y = 5 and 4y = 6x - 10 are really the same equation.
Today, linear algebra provides a systematic approach to both analyzing and solving systems of linear equations, which was unknown in Euler's time. We have already seen something of its power. Writing a system of equations as a matrix and row reducing it to echelon form makes it easy to analyze: Theorem 2.2.1 tells us how to read the row-reduced matrix to find out whether the system has no solution, infinitely many solutions, or a unique solution (and, in the latter case, what it is).
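For readers who want to experiment, here is a minimal sketch (assuming Python with the sympy library, which is not part of the text) that row reduces Euler's system and exhibits the row of zeros:

```python
from sympy import Matrix

# Euler's system, with the second equation rewritten as 6x - 4y = 10,
# entered as the augmented matrix [A | b].
augmented = Matrix([[3, -2, 5],
                    [6, -4, 10]])

rref_matrix, pivot_columns = augmented.rref()
print(rref_matrix)    # Matrix([[1, -2/3, 5/3], [0, 0, 0]])
print(pivot_columns)  # (0,): a single pivotal 1, in the first column

# The row of zeros shows that the second equation adds no information:
# effectively one equation in two unknowns, so no unique solution.
```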
Now we introduce vocabulary that describes concepts implicit in what we have done so far. Linear combinations, span, and linear independence give a precise way to answer the questions: given a collection of linear equations, how many genuinely different equations do we have? How many can be derived from the others?
Definition 2.4.1 (Linear combination). If $\vec v_1, \dots, \vec v_k$ is a collection of vectors in $\mathbb{R}^n$, then a linear combination of the $\vec v_i$ is a vector $\vec w$ of the form
$$\vec w = \sum_{i=1}^{k} a_i \vec v_i \quad \text{for any scalars } a_i. \tag{2.4.1}$$
The vector $\begin{bmatrix} 3 \\ 4 \end{bmatrix}$ is a linear combination of $\vec e_1$ and $\vec e_2$:
$$3\begin{bmatrix} 1 \\ 0 \end{bmatrix} + 4\begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 3 \\ 4 \end{bmatrix}.$$
The vectors $\vec e_1, \vec e_2 \in \mathbb{R}^2$ are linearly independent: there is only one way to write $\begin{bmatrix} 3 \\ 4 \end{bmatrix}$ in terms of $\vec e_1$ and $\vec e_2$, namely the combination above. But the vectors $\begin{bmatrix} 1 \\ 0 \end{bmatrix}$, $\begin{bmatrix} 0 \\ 1 \end{bmatrix}$, $\begin{bmatrix} 1 \\ 2 \end{bmatrix}$ are not linearly independent, since we can also write
$$\begin{bmatrix} 3 \\ 4 \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \end{bmatrix} + 2\begin{bmatrix} 1 \\ 2 \end{bmatrix}.$$
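In computational terms, forming a linear combination is just scaling and adding coordinate vectors. A minimal sketch (assuming Python with numpy, not part of the text) reproducing the computation in the margin:

```python
import numpy as np

e1 = np.array([1, 0])
e2 = np.array([0, 1])

# The linear combination of formula 2.4.1 with a1 = 3 and a2 = 4.
w = 3 * e1 + 4 * e2
print(w)  # [3 4]
```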
If you drive a car that can only move parallel to the vectors $\vec v_i$, the span of the $\vec v_i$ is the set of all accessible destinations.
Saying that the columns of [T] are linearly independent is equivalent to saying that the linear transformation T is one to one. Saying that the columns of [T], for $T : \mathbb{R}^n \to \mathbb{R}^m$, span $\mathbb{R}^m$ is equivalent to saying that T is onto; both statements mean that for every $\vec b \in \mathbb{R}^m$, there is at least one solution to the equation $T(\vec x) = \vec b$.
FIGURE 2.4.2.
The vectors $\vec u$ and $\vec v$ span the plane: any vector, such as $\vec a$, can be expressed as the sum of components in the directions of $\vec u$ and $\vec v$ (i.e., multiples of $\vec u$ and $\vec v$).
But a vector in $\mathbb{R}^3$ whose third entry is nonzero is not a linear combination of $\vec e_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}$ and $\vec e_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}$.
Linear independence and span
Linear independence is a way to talk about uniqueness of solutions to linear equations; span is a way to talk about the existence of solutions.
Definition 2.4.2 (Linear independence). A set of vectors $\vec v_1, \dots, \vec v_k$ is linearly independent if the only solution to
$$a_1 \vec v_1 + \cdots + a_k \vec v_k = \vec 0 \qquad \text{is} \qquad a_1 = a_2 = \cdots = a_k = 0. \tag{2.4.2}$$
An equivalent definition (as you are asked to confirm in Exercise 2.4.15) is that vectors $\vec v_1, \dots, \vec v_k \in \mathbb{R}^n$ are linearly independent if and only if a vector $\vec w \in \mathbb{R}^n$ can be written as a linear combination of those vectors in at most one way:
$$\sum_{i=1}^{k} x_i \vec v_i = \sum_{i=1}^{k} y_i \vec v_i \quad \text{implies} \quad x_1 = y_1, \ x_2 = y_2, \ \dots, \ x_k = y_k. \tag{2.4.3}$$
Yet another equivalent statement (as you are asked to show in Exercise 2.8) is to say that the $\vec v_i$ are linearly independent if none of the $\vec v_i$ is a linear combination of the others.
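Definition 2.4.2 can also be tested by a computer: put the vectors as the columns of a matrix and row reduce it (this anticipates Theorem 2.4.5 below). A hedged sketch, assuming Python with sympy; the helper name linearly_independent is ours, not the text's:

```python
from sympy import Matrix

def linearly_independent(vectors):
    """Return True if the given vectors (lists of numbers) are linearly independent.

    The vectors become the columns of a matrix; per Definition 2.4.2 they are
    independent exactly when the only solution of a1*v1 + ... + ak*vk = 0 is
    a1 = ... = ak = 0, i.e. when row reduction puts a pivotal 1 in every column.
    """
    A = Matrix.hstack(*[Matrix(v) for v in vectors])
    _, pivot_columns = A.rref()
    return len(pivot_columns) == A.cols

print(linearly_independent([[1, 0], [0, 1]]))          # True: e1, e2
print(linearly_independent([[1, 0], [0, 1], [1, 2]]))  # False: three vectors in R^2
```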
Definition 2.4.3 (Span). The span of $\vec v_1, \dots, \vec v_k$ is the set of linear combinations $a_1 \vec v_1 + \cdots + a_k \vec v_k$. It is denoted $\operatorname{Span}(\vec v_1, \dots, \vec v_k)$.
The word "span" is also used as a verb. For instance, the standard basis vectors e1 and e2 span IR.2 but not JR.3 . They span the plane, because any vector in the plane is a linear combination a1 e1 + a2 e2: any point in the (x, y)-plane can be written in terms of its x and y coordinates. The vectors 'ii and v shown in Figure 2.4.2 also span the plane. (We used the word informally in this sense in Section 1.4 when we defined the "line spanned by x" as the line formed of all multiples of x.)
Exercise 2.4.5 asks you to show that $\operatorname{Span}(\vec v_1, \dots, \vec v_k)$ is a subspace of $\mathbb{R}^n$ and is the smallest subspace containing $\vec v_1, \dots, \vec v_k$. (Recall Definition 1.1.5 of a subspace.)
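To check membership in a span computationally, ask whether the corresponding system of equations has a solution. A sketch under the same assumptions (Python with sympy; in_span is our own helper name):

```python
from sympy import Matrix

def in_span(w, vectors):
    """Return True if w is a linear combination of the given vectors.

    Row reduce the augmented matrix [v1 ... vk | w]: w lies in the span
    exactly when the last (augmented) column contains no pivotal 1.
    """
    A = Matrix.hstack(*[Matrix(v) for v in vectors])
    _, pivot_columns = A.row_join(Matrix(w)).rref()
    return A.cols not in pivot_columns

# e1 and e2 span the (x, y)-plane inside R^3, but not all of R^3:
print(in_span([2, 3, 0], [[1, 0, 0], [0, 1, 0]]))  # True
print(in_span([0, 0, 1], [[1, 0, 0], [0, 1, 0]]))  # False
```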
Examples 2.4.4 (Span: two easy cases). In simple cases it is possible to see immediately whether a given vector is in the span of a set of vectors.
1. Is a vector $\vec u$ whose second entry is 1 in the span of a vector $\vec w$ whose second entry is 0? Clearly not: the second entry of any multiple of $\vec w$ is still 0, so no multiple of $\vec w$ can give the 1 in the second position of $\vec u$.
Linear independence is not restricted to vectors in $\mathbb{R}^n$: it also applies to functions and matrices (and more generally, to elements of arbitrary vector spaces, discussed in Section 2.6). For example, the matrices A, B, and C are linearly independent if the only solution to
$$a_1 A + a_2 B + a_3 C = [0] \qquad \text{is} \qquad a_1 = a_2 = a_3 = 0.$$
(Recall that we denote by [0] the matrix all of whose entries are 0.) A computational version of this test is sketched after Example 2.4.4 below.
2. Given the vectors
2.4.4
is $\vec v_4$ in $\operatorname{Span}\{\vec v_1, \vec v_2, \vec v_3\}$? Check your answer in the footnote below.⁴ △
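Here is the computational version of the matrix-independence test mentioned in the margin above: flatten each matrix into a vector of its entries and apply the usual test. A sketch assuming Python with numpy; matrices_independent is our own helper name:

```python
import numpy as np

def matrices_independent(matrices):
    """Return True if the given matrices are linearly independent.

    The equation a1*A + a2*B + ... = [0] reads entry by entry, so flattening
    each matrix into a vector turns the question into linear independence of
    vectors, which we check by comparing the rank to the number of matrices.
    """
    columns = np.column_stack([M.reshape(-1) for M in matrices])
    return np.linalg.matrix_rank(columns) == len(matrices)

A = np.array([[1.0, 0.0], [0.0, 0.0]])
B = np.array([[0.0, 1.0], [0.0, 0.0]])
C = A + 2 * B  # a linear combination of A and B
print(matrices_independent([A, B]))     # True
print(matrices_independent([A, B, C]))  # False
```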
Theorem 2.4.5 translates Theorem 2.2.1 on solutions to linear equations into the language of linear independence and span.
Theorem 2.4.5 (Linear independence and span). Let $\vec v_1, \dots, \vec v_k$ be vectors in $\mathbb{R}^n$; let A be the $n \times k$ matrix $[\vec v_1, \dots, \vec v_k]$. Then

1. $\vec v_1, \dots, \vec v_k$ are linearly independent if and only if the row-reduced matrix A has a pivotal 1 in every column;

2. $\vec v_1, \dots, \vec v_k$ span $\mathbb{R}^n$ if and only if the row-reduced matrix A has a pivotal 1 in every row.
Proof. 1. The vectors $\vec v_1, \dots, \vec v_k$ are linearly independent if and only if the only solution to $A \vec x = \vec 0$ is $\vec x = \vec 0$. Thus part 1 follows immediately from part 2a of Theorem 2.2.1.
2. The vectors $\vec v_1, \dots, \vec v_k$ span $\mathbb{R}^n$ if and only if for any $\vec b \in \mathbb{R}^n$, the equation $A \vec x = \vec b$ has a solution. Write the row reduction of $[A \mid \vec b]$ as $[\tilde A \mid \tilde b]$. There exists a $\vec b$ for which $\tilde b$ contains a pivotal 1 if and only if $\tilde A$ contains a row of 0's (i.e., if and only if $\tilde A$ does not have a pivotal 1 in every row). Thus the equation $A \vec x = \vec b$ has a solution for every $\vec b$ if and only if the row-reduced matrix has a pivotal 1 in every row. □
It is clear from Theorem 2.4.5 that n linearly independent vectors in $\mathbb{R}^n$ span $\mathbb{R}^n$: the matrix A formed using those vectors as columns row reduces to the identity, so there is a pivotal 1 in every column and every row.
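A sketch of Theorem 2.4.5 in use (Python with sympy assumed; the helper name independent_and_span is ours): place the vectors as columns, row reduce, and read independence off the columns and span off the rows.

```python
from sympy import Matrix

def independent_and_span(vectors, n):
    """Apply Theorem 2.4.5 to vectors in R^n, each given as a list of n entries."""
    A = Matrix.hstack(*[Matrix(v) for v in vectors])
    _, pivot_columns = A.rref()
    independent = len(pivot_columns) == A.cols  # pivotal 1 in every column
    spans_rn = len(pivot_columns) == n          # pivotal 1 in every row
    return independent, spans_rn

# Two independent vectors in R^3 do not span R^3 ...
print(independent_and_span([[1, 0, 0], [0, 1, 0]], 3))             # (True, False)
# ... but three linearly independent vectors in R^3 do.
print(independent_and_span([[1, 0, 0], [1, 1, 0], [1, 1, 1]], 3))  # (True, True)
```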
Example 2.4.6 (Row reducing to check span). Given the vectors
is v in the span of the other three? We row reduce
⁴No. Since the second and third entries of $\vec v_1$ are 0, if $\vec v_4$ were in the span of $\{\vec v_1, \vec v_2, \vec v_3\}$, its second and third entries would depend only on $\vec v_2$ and $\vec v_3$. To achieve the 0 of the second position, we must give equal weights to $\vec v_2$ and $\vec v_3$, but then we would also have 0 in the third position, whereas we need 1.