Consider the set {a_1, ..., a_n} of m-dimensional vectors. Whether this set is linearly dependent depends on whether a set of numbers {x_1, ..., x_n}, not all of which are 0, can be found such that

$$
\sum_{j=1}^{n} x_j \mathbf{a}_j = \mathbf{0}. \tag{4.5.1}
$$

We can pose the problem in a slightly different manner by defining a matrix A whose column vectors are a_1, ..., a_n. In partitioned form, this is

$$
\mathbf{A} = [\mathbf{a}_1, \mathbf{a}_2, \ldots, \mathbf{a}_n]. \tag{4.5.2}
$$

Next, we define the vector x by

$$
\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}. \tag{4.5.3}
$$

Now we ask the question: Does the equation

$$
\mathbf{A}\mathbf{x} = \mathbf{0} \tag{4.5.4}
$$

have a nontrivial (x ≠ 0) solution? This equation can be expressed as

$$
[\mathbf{a}_1, \mathbf{a}_2, \ldots, \mathbf{a}_n]
\begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = \mathbf{0} \tag{4.5.5}
$$

or, carrying out the vector product,

$$
x_1\mathbf{a}_1 + x_2\mathbf{a}_2 + \cdots + x_n\mathbf{a}_n = \mathbf{0}, \tag{4.5.6}
$$

which is the same as Eq. (4.5.1). In component form, Eq. (4.5.4) is the homogeneous system

$$
\begin{aligned}
a_{11}x_1 + \cdots + a_{1n}x_n &= 0\\
&\;\;\vdots\\
a_{m1}x_1 + \cdots + a_{mn}x_n &= 0.
\end{aligned} \tag{4.5.7}
$$
The question of whether the vectors a_1, ..., a_n are linearly dependent is thus seen to be the question of whether the homogeneous equation Ax = 0 has a nontrivial solution, and the answer to this latter question depends on the rank of A.
From what we have learned from the solvability theory of Ax = b, we can immediately draw several conclusions: (1) If A is a square nonsingular matrix, then x = 0 is the only solution to Eq. (4.5.4). Thus, a set of n n-dimensional vectors a_i is linearly independent if and only if its matrix A has rank n, i.e., |A| ≠ 0. (2) This conclusion is a subcase of a more general one: a set of n m-dimensional vectors a_i is linearly independent if and only if its matrix A has rank n. This follows from the fact that Ax = 0 admits n − r linearly independent solutions. (3) A set of n m-dimensional vectors a_i is linearly dependent if and only if the rank r of its matrix A is less than n. This also follows from the fact that Ax = 0 admits n − r linearly independent solutions. (4) From this it follows that if n > m, then a set of n m-dimensional vectors a_i will always be linearly dependent since r ≤ min(m, n) = m < n.
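These conclusions are easy to check numerically. Below is a minimal sketch, assuming NumPy is available; the matrix is the one used in Example 4.5.1 at the end of this section. Because n = 4 > m = 3, conclusion (4) guarantees linear dependence, the computed rank satisfies r < n, and a nontrivial solution of Ax = 0 can be read off from the singular value decomposition.

```python
import numpy as np

# Columns are the vectors a_1, ..., a_4 of Example 4.5.1 below.
A = np.array([[1.0, 1.0, 2.0, 3.0],
              [2.0, 1.0, 3.0, 5.0],
              [3.0, 1.0, 4.0, 7.0]])

m, n = A.shape                        # m = 3, n = 4
r = np.linalg.matrix_rank(A)          # r = 2
print(r < n)                          # True: the columns are linearly dependent

# The right-singular vectors belonging to the zero singular values span the
# null space of A, giving n - r linearly independent solutions of Ax = 0.
_, _, Vt = np.linalg.svd(A)
x = Vt[r]                             # one nontrivial solution
print(np.allclose(A @ x, 0))          # True
```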
We are familiar with these properties from Euclidean vector space. We know that three coplanar vectors cannot be used as a basis set for arbitrary three-dimensional vectors. What is useful here is the fact that analysis of the rank of the matrix whose column vectors are the Cartesian components of a given set of three vectors will establish whether that set is coplanar. Coplanar means that one of the vectors is a linear combination of the other two (linear dependence).
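As a concrete illustration of this test, here is a short sketch, again assuming NumPy; the helper name coplanar and the sample vectors are chosen purely for illustration. Three Cartesian vectors are coplanar exactly when the 3 × 3 matrix of their components has rank less than 3, i.e., when its determinant vanishes.

```python
import numpy as np

def coplanar(u, v, w, tol=1e-12):
    """Return True if three 3-D vectors are coplanar (linearly dependent)."""
    M = np.column_stack([u, v, w])      # columns are the Cartesian components
    return abs(np.linalg.det(M)) < tol  # rank < 3  <=>  zero determinant

print(coplanar([1, 0, 0], [0, 1, 0], [1, 1, 0]))  # True: all lie in the xy-plane
print(coplanar([1, 0, 0], [0, 1, 0], [0, 0, 1]))  # False: a basis for 3-space
```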
From the solvability theory developed in this chapter, we can use the rank of [a_1, ..., a_n] not only to determine whether the set {a_1, ..., a_n} is linearly dependent but also to find how many of the a_i are linearly dependent on a subset of the vectors in the set.
The rank of A is equal to the number of linearly independent column vectors and to the number of linearly independent row vectors that A contains. To prove this, consider first an m × n matrix A of rank r in which the upper left corner of A contains an rth-order nonzero minor. Proof of this special case will be shown to suffice in establishing the general case. We can rearrange the homogeneous equation Ax = 0 into the form
$$
\begin{aligned}
a_{11}x_1 + \cdots + a_{1r}x_r &= -a_{1,r+1}x_{r+1} - \cdots - a_{1n}x_n\\
&\;\;\vdots\\
a_{m1}x_1 + \cdots + a_{mr}x_r &= -a_{m,r+1}x_{r+1} - \cdots - a_{mn}x_n.
\end{aligned} \tag{4.5.8}
$$

Since the minor

$$
\begin{vmatrix}
a_{11} & \cdots & a_{1r}\\
\vdots & & \vdots\\
a_{r1} & \cdots & a_{rr}
\end{vmatrix} \tag{4.5.9}
$$

is nonzero and the rank of A is r, the equations in Eq. (4.5.8) have a solution {x_1, ..., x_r} for arbitrary x_{r+1}, ..., x_n. Note that Ax = 0 always has a nontrivial solution for r < min(m, n) because the rank of [A, 0] is the same as the rank of A.
One solution to Eq. (4.5.8) is obtained by setting x_{r+1} = 1 and x_j = 0 for j > r + 1 and solving for x_1^{(1)}, ..., x_r^{(1)}. With this solution, Eq. (4.5.8) can be rearranged to get

$$
\begin{aligned}
a_{1,r+1} &= -\bigl(a_{11}x_1^{(1)} + a_{12}x_2^{(1)} + \cdots + a_{1r}x_r^{(1)}\bigr)\\
&\;\;\vdots\\
a_{m,r+1} &= -\bigl(a_{m1}x_1^{(1)} + a_{m2}x_2^{(1)} + \cdots + a_{mr}x_r^{(1)}\bigr)
\end{aligned} \tag{4.5.10}
$$

or, in vector notation,

$$
\mathbf{a}_{r+1} = -\sum_{j=1}^{r} x_j^{(1)} \mathbf{a}_j. \tag{4.5.11}
$$

This proves that the (r + 1)th column vector of A is a linear combination of the set {a_1, ..., a_r}. In general, if x_l = 1 for some l > r and x_j = 0 for j > r, j ≠ l, then the solution {x_1^{(l)}, ..., x_r^{(l)}} of Eq. (4.5.8) can be found, and so

$$
\begin{aligned}
a_{1l} &= -\bigl(a_{11}x_1^{(l)} + \cdots + a_{1r}x_r^{(l)}\bigr)\\
&\;\;\vdots\\
a_{ml} &= -\bigl(a_{m1}x_1^{(l)} + \cdots + a_{mr}x_r^{(l)}\bigr)
\end{aligned} \tag{4.5.12}
$$

or

$$
\mathbf{a}_l = -\sum_{j=1}^{r} x_j^{(l)} \mathbf{a}_j, \qquad l = r + 1, \ldots, n. \tag{4.5.13}
$$

Thus, we have proven that all the column vectors a_{r+1}, ..., a_n are linear combinations of the first r column vectors a_1, ..., a_r. The vectors a_1, ..., a_r are linearly independent because, otherwise, there would exist a set of numbers {c_1, ..., c_r}, not all 0, such that
$$
\sum_{j=1}^{r} c_j \mathbf{a}_j = \mathbf{0}
$$

or

$$
\begin{aligned}
a_{11}c_1 + \cdots + a_{1r}c_r &= 0\\
&\;\;\vdots\\
a_{m1}c_1 + \cdots + a_{mr}c_r &= 0.
\end{aligned} \tag{4.5.14}
$$

If this set of equations has a nontrivial solution, then the rank of the matrix [a_1, ..., a_r] has to be less than r, which contradicts our hypothesis.
In summary, for a matrix of rank r and of the form considered here, the last n − r column vectors are linearly dependent on the first r column vectors, which themselves are linearly independent. Since the rank of the transpose A^T of A is also r, it follows that the last m − r column vectors of A^T are linearly dependent on the first r column vectors, which themselves are linearly independent. But the column vectors of A^T are simply the row vectors of A, and so we conclude that the last m − r row vectors of A are linearly dependent on the first r row vectors, which are themselves linearly independent.
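The construction in Eqs. (4.5.8)–(4.5.13) can be carried out numerically. The sketch below, assuming NumPy and using an illustrative rank-2 matrix whose nonzero second-order minor sits in the upper left corner, solves the leading r × r block for x_1^{(l)}, ..., x_r^{(l)} for each l > r and verifies that Eq. (4.5.13) reproduces the corresponding column.

```python
import numpy as np

# Illustrative matrix of rank r = 2 whose leading 2 x 2 minor is nonzero.
A = np.array([[1.0, 1.0, 2.0, 3.0],
              [2.0, 1.0, 3.0, 5.0],
              [3.0, 1.0, 4.0, 7.0]])
r = 2
B = A[:r, :r]                         # leading r x r block; det(B) = -1, nonzero

# For each l > r: set x_l = 1, the other free variables to 0, and solve the
# first r equations of Eq. (4.5.8) for x_1^{(l)}, ..., x_r^{(l)}.
for l in range(r, A.shape[1]):
    x_dep = np.linalg.solve(B, -A[:r, l])
    # Eq. (4.5.13): a_l = -sum_{j=1}^{r} x_j^{(l)} a_j
    a_l = -(A[:, :r] @ x_dep)
    print(np.allclose(a_l, A[:, l]))  # True: column l is a combination of a_1, ..., a_r
```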
Next, consider the general case, i.e., an m × n matrix A of rank r, but in which the upper left corner does not contain a nonzero rth-order minor. By the interchange of columns and rows, however, a matrix A' can be obtained that does have a nonzero rth-order minor in the upper left corner. We have already shown that the rank of A' is also r, and so if
$$
\mathbf{A}' = [\mathbf{a}'_1, \mathbf{a}'_2, \ldots, \mathbf{a}'_n], \tag{4.5.15}
$$

then the r m-dimensional column vectors a'_1, ..., a'_r are linearly independent and the n − r column vectors a'_{r+1}, ..., a'_n are linear combinations of the first r column vectors. Also, from what we presented above, the r n-dimensional row vectors [a'_{i1}, ..., a'_{in}], i = 1, ..., r, are linearly independent and the remaining m − r row vectors are linear combinations of the first r row vectors.
To prove what was just stated, note first that the relationship between A and A' is

$$
\mathbf{A}' = \mathbf{Q}^{(1)}\mathbf{A}\mathbf{Q}^{(2)}, \tag{4.5.16}
$$

where the square matrices Q^{(1)} and Q^{(2)} are products of the I_{ij} matrices that accomplish the appropriate row interchanges and column interchanges, respectively. Since the determinants of Q^{(1)} and Q^{(2)} are ±1, it follows that the ranks of A' and A are the same. From the property I_{ij}I_{ij} = I, it follows that

$$
\mathbf{Q}^{(i)}\mathbf{Q}^{(i)} = \mathbf{I}, \qquad i = 1, 2, \tag{4.5.17}
$$

or that Q^{(i)} equals its own inverse. Consequently,

$$
\begin{aligned}
\mathbf{A} &= \mathbf{Q}^{(1)}\mathbf{A}'\mathbf{Q}^{(2)}\\
&= \mathbf{Q}^{(1)}[\mathbf{a}'_1, \ldots, \mathbf{a}'_n]\mathbf{Q}^{(2)}\\
&= [\mathbf{Q}^{(1)}\mathbf{a}'_1, \ldots, \mathbf{Q}^{(1)}\mathbf{a}'_n]\mathbf{Q}^{(2)}
\end{aligned} \tag{4.5.18}
$$

and, therefore,

$$
\mathbf{a}_i = \sum_{k=1}^{n} q^{(2)}_{ki}\, \mathbf{Q}^{(1)}\mathbf{a}'_k, \qquad i = 1, \ldots, n, \tag{4.5.19}
$$

where q^{(2)}_{ki} is the ki element of Q^{(2)}.
We proved already that each column vector of A' is a linear combination of its first r column vectors, i.e.,

$$
\mathbf{a}'_k = \sum_{j=1}^{r} \alpha_{kj}\, \mathbf{a}'_j, \qquad k = 1, \ldots, n, \tag{4.5.20}
$$

where, by Eq. (4.5.13), $\alpha_{kj} = -x_j^{(k)}$ for k > r and $\alpha_{kj} = \delta_{kj}$ for k ≤ r,
which, when inserted into Eq. (4.5.19), yields

$$
\mathbf{a}_i = \sum_{j=1}^{r} \beta_{ij}\, \mathbf{Q}^{(1)}\mathbf{a}'_j, \tag{4.5.21}
$$

where $\beta_{ij} = \sum_{k=1}^{n} q^{(2)}_{ki}\,\alpha_{kj}$. The vector a'_j is related to one of the set {a_1, ..., a_n}, say a_{l_j}, by the row interchange operation Q^{(1)}, i.e.,

$$
\mathbf{a}'_j = \mathbf{Q}^{(1)}\mathbf{a}_{l_j}, \tag{4.5.22}
$$

and it follows that

$$
\mathbf{Q}^{(1)}\mathbf{a}'_j = \mathbf{Q}^{(1)}\mathbf{Q}^{(1)}\mathbf{a}_{l_j} = \mathbf{a}_{l_j}. \tag{4.5.23}
$$

Thus, Eq. (4.5.21) reads

$$
\mathbf{a}_i = \sum_{j=1}^{r} \beta_{ij}\, \mathbf{a}_{l_j}, \tag{4.5.24}
$$

where {l_j} indicates the indices of the r column vectors of A that were moved to columns 1, ..., r in A' to put a nonzero rth-order minor in its upper left corner. This proves that any column vector of A is a linear combination of the r column vectors a_{l_1}, a_{l_2}, ..., a_{l_r}.
These r vectors are linearly independent. To prove this, assume the contrary; i.e., assume that there exists a set of numbers {c_1, ..., c_r}, not all 0, such that

$$
\sum_{j=1}^{r} c_j\, \mathbf{a}_{l_j} = \mathbf{0}. \tag{4.5.25}
$$

Multiplying Eq. (4.5.25) by Q^{(1)} and using Eq. (4.5.22), it follows that Eq. (4.5.25) implies

$$
\sum_{j=1}^{r} c_j\, \mathbf{a}'_j = \mathbf{0},
$$

or that the set {a'_1, ..., a'_r} is linearly dependent. However, this is a contradiction, and so the vectors a_{l_1}, ..., a_{l_r} must be linearly independent.
Similarly, by considering the transpose of A', we can prove that r of the row vectors of A are linearly independent and that the other m − r row vectors are linear combinations of these r row vectors.
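The permutation argument itself is easy to demonstrate numerically. The following sketch, assuming NumPy and with an illustrative matrix and interchanges chosen only for the example, forms A' = Q^{(1)}AQ^{(2)}, checks that a nonzero second-order minor now sits in the upper left corner, that the rank is unchanged, and that the columns of A that were moved to positions 1, ..., r are linearly independent.

```python
import numpy as np

# Illustrative rank-2 matrix whose leading 2 x 2 minor is zero.
A = np.array([[0.0, 0.0, 1.0, 1.0],
              [0.0, 0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0, 1.0]])

# Q1 (row interchanges) and Q2 (column interchanges) are products of I_ij matrices;
# here no row swaps are needed, while columns (1,3) and (2,4) are interchanged.
Q1 = np.eye(3)
Q2 = np.eye(4)[:, [2, 3, 0, 1]]

Ap = Q1 @ A @ Q2                                  # A' = Q1 A Q2, Eq. (4.5.16)
print(np.linalg.det(Ap[:2, :2]))                  # -1.0: nonzero 2nd-order minor up front
print(np.linalg.matrix_rank(Ap) == np.linalg.matrix_rank(A))  # True: rank preserved

# The columns of A moved to positions 1, ..., r of A' are a_{l_1}, ..., a_{l_r};
# here l_1 = 3 and l_2 = 4 (zero-based indices 2 and 3), and they are independent.
print(np.linalg.matrix_rank(A[:, [2, 3]]))        # 2
```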
The "take-home" lesson of this section is as follows:
THEOREM. If the rank of the m × n matrix A is r, then (a) there are r m-dimensional column vectors (and r n-dimensional row vectors) that are linearly independent and (b) the remaining n − r column vectors (and m − r row vectors) are linear combinations of the r linearly independent vectors.
EXAMPLE 4.5.1. How many of the vectors
$$
\mathbf{a}_1 = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}, \quad
\mathbf{a}_2 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \quad
\mathbf{a}_3 = \begin{bmatrix} 2 \\ 3 \\ 4 \end{bmatrix}, \quad
\mathbf{a}_4 = \begin{bmatrix} 3 \\ 5 \\ 7 \end{bmatrix} \tag{4.5.26}
$$

are linearly independent? Since the rank r of A,

$$
\mathbf{A} = [\mathbf{a}_1, \mathbf{a}_2, \mathbf{a}_3, \mathbf{a}_4] =
\begin{bmatrix}
1 & 1 & 2 & 3\\
2 & 1 & 3 & 5\\
3 & 1 & 4 & 7
\end{bmatrix}, \tag{4.5.27}
$$
is less than or equal to 3, we know at most three vectors are linearly independent.
By Gauss elimination, we transform A to
$$
\mathbf{A}_r =
\begin{bmatrix}
1 & 1 & 2 & 3\\
0 & -1 & -1 & -1\\
0 & 0 & 0 & 0
\end{bmatrix}. \tag{4.5.28}
$$
Thus, the rank of A is 2. Therefore, only two of the vectors are linearly independent. Indeed, the pair a_1 and a_2 are linearly independent and
$$
\mathbf{a}_3 = \mathbf{a}_1 + \mathbf{a}_2 \quad \text{and} \quad \mathbf{a}_4 = 2\mathbf{a}_1 + \mathbf{a}_2. \tag{4.5.29}
$$
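A quick numerical check of Eqs. (4.5.27)–(4.5.29), as a short sketch assuming NumPy:

```python
import numpy as np

a1 = np.array([1, 2, 3])
a2 = np.array([1, 1, 1])
a3 = np.array([2, 3, 4])
a4 = np.array([3, 5, 7])

A = np.column_stack([a1, a2, a3, a4])   # Eq. (4.5.27)
print(np.linalg.matrix_rank(A))         # 2, agreeing with the Gauss elimination above

print(np.array_equal(a3, a1 + a2))      # True, Eq. (4.5.29)
print(np.array_equal(a4, 2 * a1 + a2))  # True, Eq. (4.5.29)
```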