SYLVESTER'S THEOREM AND THE DETERMINANTS OF MATRIX PRODUCTS
Recall from Chapter 1 that the determinant $M^{(r)}$ of the $r \times r$ matrix formed by striking $m - r$ rows and $n - r$ columns of an $m \times n$ matrix $\mathbf{A}$ is called an $r$th-order minor of $\mathbf{A}$. For example, consider the matrix

$$\mathbf{A} = \begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} \\ a_{21} & a_{22} & a_{23} & a_{24} \\ a_{31} & a_{32} & a_{33} & a_{34} \end{bmatrix}. \tag{4.2.1}$$
If we strike the second row and the second and third columns,

$$\begin{bmatrix} a_{11} & a_{14} \\ a_{31} & a_{34} \end{bmatrix}, \tag{4.2.2}$$

we obtain the second-order minor

$$M^{(2)} = \begin{vmatrix} a_{11} & a_{14} \\ a_{31} & a_{34} \end{vmatrix}. \tag{4.2.3}$$

If we strike only the fourth column, we obtain the third-order minor

$$M^{(3)} = \begin{vmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{vmatrix}. \tag{4.2.4}$$
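The striking operations above are easy to reproduce numerically. The sketch below (Python with numpy; the numerical entries are an arbitrary stand-in for the symbolic matrix of Eq. (4.2.1)) forms the two minors just described by keeping the complementary rows and columns:

```python
import numpy as np

def minor(A, rows, cols):
    # Determinant of the submatrix of A formed by KEEPING the listed
    # (zero-based) rows and columns, i.e., striking all the others.
    return np.linalg.det(A[np.ix_(rows, cols)])

# Arbitrary 3 x 4 stand-in for the symbolic matrix A of Eq. (4.2.1).
A = np.array([[1., 2., 3., 4.],
              [5., 6., 7., 8.],
              [9., 1., 2., 6.]])

# Strike row 2 and columns 2 and 3 -> keep rows (1, 3) and columns (1, 4):
M2 = minor(A, [0, 2], [0, 3])        # second-order minor, cf. Eq. (4.2.3)

# Strike only column 4 -> keep all rows and columns (1, 2, 3):
M3 = minor(A, [0, 1, 2], [0, 1, 2])  # third-order minor, cf. Eq. (4.2.4)
```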
We previously defined the rank $r_A$ of a matrix $\mathbf{A}$ as the order of the highest-order minor of $\mathbf{A}$ that has a nonzero determinant. For instance, the rank of

$$\mathbf{A} = \begin{bmatrix} 1 & 2 & 1 \\ 1 & 1 & 1 \end{bmatrix} \tag{4.2.5}$$

is 2, since striking the third column gives the minor

$$M^{(2)} = \begin{vmatrix} 1 & 2 \\ 1 & 1 \end{vmatrix} = -1. \tag{4.2.6}$$

The rank of

$$\mathbf{A} = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix}, \tag{4.2.7}$$

on the other hand, is 1, since striking any one column gives

$$M^{(2)} = \begin{vmatrix} 1 & 1 \\ 1 & 1 \end{vmatrix} = 0, \tag{4.2.8}$$

while striking two columns and one row gives

$$M^{(1)} = |1| = 1 \neq 0. \tag{4.2.9}$$
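Both rank determinations can be confirmed numerically; in the sketch below (Python with numpy), `matrix_rank` plays the role of the minor-by-minor search:

```python
import numpy as np

A1 = np.array([[1., 2., 1.],
               [1., 1., 1.]])   # the matrix of Eq. (4.2.5)
A2 = np.ones((2, 3))            # the matrix of Eq. (4.2.7)

r1 = np.linalg.matrix_rank(A1)  # 2: some second-order minor is nonzero
r2 = np.linalg.matrix_rank(A2)  # 1: all second-order minors vanish

# The minors used in the text:
m1 = np.linalg.det(A1[:, :2])   # |1 2; 1 1| = -1, cf. Eq. (4.2.6)
m2 = np.linalg.det(A2[:, :2])   # |1 1; 1 1| = 0,  cf. Eq. (4.2.8)
```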
One of the objectives of this section is to relate the rank of the product of two matrices to the ranks of its factors; i.e., if $\mathbf{C} = \mathbf{A}\mathbf{B}$, what is the relationship among $r_A$, $r_B$, and $r_C$? If $\mathbf{A}$ is a $p \times n$ matrix with elements $a_{ij}$ and $\mathbf{B}$ is an $n \times q$ matrix with elements $b_{ij}$, then $\mathbf{C}$ is a $p \times q$ matrix with elements

$$c_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}, \qquad i = 1, \ldots, p, \quad j = 1, \ldots, q. \tag{4.2.10}$$

Since only a square matrix has a determinant, we know from the outset that $r_A \le \min(p, n)$, $r_B \le \min(n, q)$, and $r_C \le \min(p, q)$. There are, of course, cases where $r_C = r_A = r_B$;
e.g., if

$$\mathbf{A} = \begin{bmatrix} 2 & 0 \\ 0 & 1 \end{bmatrix} \quad\text{and}\quad \mathbf{B} = \begin{bmatrix} 1 & 0 \\ 0 & 2 \end{bmatrix}, \quad\text{then}\quad \mathbf{C} = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix} \tag{4.2.11}$$

and so $r_A = r_B = r_C = 2$. However, there are also cases where $r_C < r_A$ and $r_B$; e.g., if

$$\mathbf{A} = \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} \quad\text{and}\quad \mathbf{B} = \begin{bmatrix} 1 & 0 \\ -1 & 0 \end{bmatrix}, \quad\text{then}\quad \mathbf{C} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix} \tag{4.2.12}$$

and so $r_C = 0$, $r_A = r_B = 1$. Similarly, if

$$\mathbf{A} = \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} \quad\text{and}\quad \mathbf{B} = \begin{bmatrix} 1 & 1 & 1 \\ -1 & 0 & -1 \end{bmatrix}, \quad\text{then}\quad \mathbf{C} = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 0 \end{bmatrix} \tag{4.2.13}$$

and so $r_A = 1$, $r_B = 2$, and $r_C = 1$. The question is whether there are other possibilities. The answer is provided by
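The three products can be checked in a few lines; the sketch below (Python with numpy, restating the example matrices, with `matrix_rank` standing in for the minor-based rank definition) collects $(r_A, r_B, r_C)$ for each case:

```python
import numpy as np

cases = [
    (np.array([[2., 0.], [0., 1.]]),
     np.array([[1., 0.], [0., 2.]])),            # cf. Eq. (4.2.11)
    (np.array([[1., 1.], [0., 0.]]),
     np.array([[1., 0.], [-1., 0.]])),           # cf. Eq. (4.2.12)
    (np.array([[1., 1.], [0., 0.]]),
     np.array([[1., 1., 1.], [-1., 0., -1.]])),  # cf. Eq. (4.2.13)
]

ranks = []
for A, B in cases:
    C = A @ B
    ranks.append((np.linalg.matrix_rank(A),
                  np.linalg.matrix_rank(B),
                  np.linalg.matrix_rank(C)))
# ranks now holds (r_A, r_B, r_C) for the three cases
```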
CHAPTER 4 GENERAL THEORY OF SOLVABILITY OF LINEAR ALGEBRAIC EQUATIONS
SYLVESTER'S THEOREM. If the rank of $\mathbf{A}$ is $r_A$ and the rank of $\mathbf{B}$ is $r_B$, then the rank $r_C$ of $\mathbf{C} = \mathbf{A}\mathbf{B}$ obeys the inequality

$$r_C \le \min(r_A, r_B). \tag{4.2.14}$$

The proof of the theorem is straightforward though somewhat tedious. If $\mathbf{A}$ and $\mathbf{B}$ are $p \times n$ and $n \times q$ matrices, then
$$\mathbf{C} = \begin{bmatrix} c_{11} & \cdots & c_{1q} \\ \vdots & & \vdots \\ c_{p1} & \cdots & c_{pq} \end{bmatrix}, \tag{4.2.15}$$

where the elements $c_{ij}$ are given by Eq. (4.2.10). Let the indices $h_1, \ldots, h_r$ and $k_1, \ldots, k_r$ denote the rows and columns of $M_C^{(r)}$, an $r$th-order minor of $\mathbf{C}$ given as

$$M_C^{(r)} = \begin{vmatrix} c_{h_1 k_1} & c_{h_1 k_2} & \cdots & c_{h_1 k_r} \\ \vdots & \vdots & & \vdots \\ c_{h_r k_1} & c_{h_r k_2} & \cdots & c_{h_r k_r} \end{vmatrix}. \tag{4.2.16}$$

Since $c_{h_i k_1} = \sum_{j_1=1}^{n} a_{h_i j_1} b_{j_1 k_1}$, $M_C^{(r)}$ can be rewritten as
$$M_C^{(r)} = \begin{vmatrix} \sum_{j_1} a_{h_1 j_1} b_{j_1 k_1} & c_{h_1 k_2} & \cdots & c_{h_1 k_r} \\ \vdots & \vdots & & \vdots \\ \sum_{j_1} a_{h_r j_1} b_{j_1 k_1} & c_{h_r k_2} & \cdots & c_{h_r k_r} \end{vmatrix} = \sum_{j_1=1}^{n} b_{j_1 k_1} \begin{vmatrix} a_{h_1 j_1} & c_{h_1 k_2} & \cdots & c_{h_1 k_r} \\ \vdots & \vdots & & \vdots \\ a_{h_r j_1} & c_{h_r k_2} & \cdots & c_{h_r k_r} \end{vmatrix}. \tag{4.2.17}$$

In accomplishing the succession of equations in Eq. (4.2.17), we have utilized the elementary properties 6 and 4 of determinants discussed in Chapter 1. We can continue the process by setting $c_{h_i k_2} = \sum_{j_2=1}^{n} a_{h_i j_2} b_{j_2 k_2}$ and carrying out the same elementary operations to obtain
$$M_C^{(r)} = \sum_{j_1=1}^{n} \sum_{j_2=1}^{n} b_{j_1 k_1} b_{j_2 k_2} \begin{vmatrix} a_{h_1 j_1} & a_{h_1 j_2} & c_{h_1 k_3} & \cdots & c_{h_1 k_r} \\ \vdots & \vdots & \vdots & & \vdots \\ a_{h_r j_1} & a_{h_r j_2} & c_{h_r k_3} & \cdots & c_{h_r k_r} \end{vmatrix}. \tag{4.2.18}$$
Continuation of the process finally yields

$$M_C^{(r)} = \sum_{j_1=1}^{n} \cdots \sum_{j_r=1}^{n} b_{j_1 k_1} \cdots b_{j_r k_r} \begin{vmatrix} a_{h_1 j_1} & a_{h_1 j_2} & \cdots & a_{h_1 j_r} \\ \vdots & \vdots & & \vdots \\ a_{h_r j_1} & a_{h_r j_2} & \cdots & a_{h_r j_r} \end{vmatrix} = \sum_{j_1=1}^{n} \cdots \sum_{j_r=1}^{n} b_{j_1 k_1} \cdots b_{j_r k_r} M_A^{(r)}, \tag{4.2.19}$$

where we have identified the remaining determinant as an $r$th-order minor $M_A^{(r)}$ of $\mathbf{A}$. Note that if $r > r_A$, then the minor $M_A^{(r)}$ is 0 by the definition of $r_A$. Thus, Eq. (4.2.19) proves that
$$r_C \le r_A. \tag{4.2.20}$$

Beginning again with Eq. (4.2.16), we can express the elements of the first row of $M_C^{(r)}$ as

$$c_{h_1 k_i} = \sum_{j_1=1}^{n} a_{h_1 j_1} b_{j_1 k_i}, \qquad i = 1, \ldots, r, \tag{4.2.21}$$

and then use the elementary properties to obtain

$$M_C^{(r)} = \sum_{j_1=1}^{n} a_{h_1 j_1} \begin{vmatrix} b_{j_1 k_1} & b_{j_1 k_2} & \cdots & b_{j_1 k_r} \\ c_{h_2 k_1} & c_{h_2 k_2} & \cdots & c_{h_2 k_r} \\ \vdots & \vdots & & \vdots \\ c_{h_r k_1} & c_{h_r k_2} & \cdots & c_{h_r k_r} \end{vmatrix}. \tag{4.2.22}$$

Continuation of this process with the other rows eventually yields

$$M_C^{(r)} = \sum_{j_1=1}^{n} \cdots \sum_{j_r=1}^{n} a_{h_1 j_1} \cdots a_{h_r j_r} \begin{vmatrix} b_{j_1 k_1} & \cdots & b_{j_1 k_r} \\ \vdots & & \vdots \\ b_{j_r k_1} & \cdots & b_{j_r k_r} \end{vmatrix} = \sum_{j_1=1}^{n} \cdots \sum_{j_r=1}^{n} a_{h_1 j_1} \cdots a_{h_r j_r} M_B^{(r)}. \tag{4.2.23}$$

Again, the $r$th-order minor $M_B^{(r)}$ of $\mathbf{B}$ will be 0 if $r > r_B$, and so we conclude

$$r_C \le r_B. \tag{4.2.24}$$

The combination of Eqs. (4.2.20) and (4.2.24) implies Sylvester's theorem, Eq. (4.2.14).
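The theorem also lends itself to a brute-force numerical check. The sketch below (Python with numpy) builds random matrices of prescribed rank as products of full-rank factors and verifies that the inequality (4.2.14) is never violated:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_matrix_of_rank(m, n, r, rng):
    # An m x n matrix of rank r (with probability 1): the product of an
    # m x r factor and an r x n factor, each of full rank r.
    return rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

violations = 0
for _ in range(100):
    rA = int(rng.integers(1, 4))
    rB = int(rng.integers(1, 4))
    A = random_matrix_of_rank(5, 4, rA, rng)
    B = random_matrix_of_rank(4, 6, rB, rng)
    r_C = np.linalg.matrix_rank(A @ B)
    r_min = min(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))
    if r_C > r_min:
        violations += 1   # Sylvester's theorem says this never happens
```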
In the special case that $\mathbf{A}$ and $\mathbf{B}$ are square $n \times n$ matrices,

$$\mathbf{C} = \begin{bmatrix} \sum_{j_1} a_{1 j_1} b_{j_1 1} & \cdots & \sum_{j_n} a_{1 j_n} b_{j_n n} \\ \vdots & & \vdots \\ \sum_{j_1} a_{n j_1} b_{j_1 1} & \cdots & \sum_{j_n} a_{n j_n} b_{j_n n} \end{bmatrix}, \tag{4.2.25}$$

and elementary operations similar to those yielding Eq. (4.2.23) give

$$|\mathbf{C}| = \sum_{j_1=1}^{n} \cdots \sum_{j_n=1}^{n} a_{1 j_1} \cdots a_{n j_n} \begin{vmatrix} b_{j_1 1} & b_{j_1 2} & \cdots & b_{j_1 n} \\ \vdots & \vdots & & \vdots \\ b_{j_n 1} & b_{j_n 2} & \cdots & b_{j_n n} \end{vmatrix}. \tag{4.2.26}$$

The determinant above involving the $b_{ij}$ elements will be nonzero only if the integers $j_1, j_2, \ldots, j_n$ are a permutation of $1, 2, \ldots, n$; otherwise, two rows of the determinant would be the same. Moreover, the determinant in Eq. (4.2.26) is equal to $(-1)^P |\mathbf{B}|$, where $P$ is the number of transpositions necessary to reorder the integers $j_1, j_2, \ldots, j_n$ into the sequence $1, 2, \ldots, n$ (thereby rearranging the rows into the proper order to give $|\mathbf{B}|$). Thus, Eq. (4.2.26) can be rewritten as

$$|\mathbf{C}| = |\mathbf{B}| \sum_{j_1=1}^{n} \cdots \sum_{j_n=1}^{n} (-1)^P a_{1 j_1} \cdots a_{n j_n}. \tag{4.2.27}$$

Noting that the factor to the right of $|\mathbf{B}|$ is, by definition, the determinant $|\mathbf{A}|$, we have proved the following:
THEOREM. If $\mathbf{A}$ and $\mathbf{B}$ are square matrices, the determinant of their product $\mathbf{C} = \mathbf{A}\mathbf{B}$ is the product of their determinants; i.e.,

$$|\mathbf{C}| = |\mathbf{A}|\,|\mathbf{B}|. \tag{4.2.28}$$
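The permutation sum appearing in Eq. (4.2.27) is exactly the expansion of $|\mathbf{A}|$ over all permutations of the column indices. The sketch below (Python; the matrix entries are an arbitrary choice) evaluates a determinant this way, with the sign $(-1)^P$ computed by counting inversions:

```python
import numpy as np
from itertools import permutations

def det_by_permutation_sum(A):
    # Sum over all permutations (j_1, ..., j_n) of
    #   (-1)^P * a_{1 j_1} * a_{2 j_2} * ... * a_{n j_n},
    # where (-1)^P is the parity of the permutation (cf. Eq. (4.2.27)).
    n = A.shape[0]
    total = 0.0
    for perm in permutations(range(n)):
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if perm[i] > perm[j])
        term = (-1) ** inversions
        for i in range(n):
            term *= A[i, perm[i]]
        total += term
    return total

A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])   # arbitrary 3 x 3 example
```

For this matrix the permutation sum and `np.linalg.det` both give 8; the $n!$-term sum is, of course, practical only for small $n$.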
This result is not only useful in evaluating products of determinants, but it is also frequently employed in proving theorems concerning matrices and linear equations. For example, if $\mathbf{A}$ is nonsingular, i.e., $|\mathbf{A}| \neq 0$ and $\mathbf{A}^{-1}$ exists, then $\mathbf{A}\mathbf{A}^{-1} = \mathbf{I}$ and Eq. (4.2.28) implies $|\mathbf{A}\mathbf{A}^{-1}| = |\mathbf{A}|\,|\mathbf{A}^{-1}| = |\mathbf{I}| = 1$, or $|\mathbf{A}^{-1}| = 1/|\mathbf{A}|$. This greatly simplifies finding the determinant of the inverse of $\mathbf{A}$.
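Both consequences are easy to confirm numerically; a minimal sketch (Python with numpy; the two nonsingular matrices are arbitrary choices):

```python
import numpy as np

A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])
B = np.array([[1., 0., 2.],
              [0., 1., 1.],
              [3., 0., 1.]])

detA = np.linalg.det(A)
detB = np.linalg.det(B)

lhs = np.linalg.det(A @ B)                  # |AB|
rhs = detA * detB                           # |A| |B|, Eq. (4.2.28)

det_inv = np.linalg.det(np.linalg.inv(A))   # should equal 1 / |A|
```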
Before leaving this section, let us establish a corollary to Sylvester's theorem that is very useful in the theory of solvability of Ax = b.
COROLLARY. The multiplication of a matrix $\mathbf{B}$ by a square nonsingular matrix $\mathbf{A}$ does not change the rank of $\mathbf{B}$; i.e., if $\mathbf{C} = \mathbf{A}\mathbf{B}$, and if $\mathbf{A}$ is square and nonsingular, then

$$r_C = r_B. \tag{4.2.29}$$

The proof is simple. If $\mathbf{A}$ is a nonsingular $n \times n$ matrix, then its rank $r_A$ is $n$ since $|\mathbf{A}| \neq 0$, and so is the rank of $\mathbf{A}^{-1}$ since $|\mathbf{A}^{-1}| \neq 0$. According to Sylvester's theorem,

$$r_C \le \min(r_A, r_B) = \min(n, r_B). \tag{4.2.30}$$
But

$$\mathbf{B} = \mathbf{A}^{-1}\mathbf{C}, \tag{4.2.31}$$

for which Sylvester's theorem requires

$$r_B \le \min(r_{A^{-1}}, r_C) = \min(n, r_C). \tag{4.2.32}$$

Equations (4.2.30) and (4.2.32) combine to imply that $r_C \le r_B$ and $r_B \le r_C$, which can only be true if $r_C = r_B$, thus proving the corollary. A similar proof establishes that the rank of $\mathbf{B}\mathbf{A}$ is the same as that of $\mathbf{B}$ if $\mathbf{A}$ is a square nonsingular matrix.
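The corollary can likewise be checked by direct computation; a sketch (Python with numpy, using an arbitrarily chosen rank-1 matrix $\mathbf{B}$) multiplies $\mathbf{B}$ by random nonsingular matrices $\mathbf{A}$ on either side:

```python
import numpy as np

rng = np.random.default_rng(1)

B = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [3., 6., 9.]])   # rank 1: every row is a multiple of (1, 2, 3)

trials = 0
for _ in range(20):
    A = rng.standard_normal((3, 3))
    if abs(np.linalg.det(A)) < 1e-8:
        continue                                # skip the (measure-zero) singular case
    assert np.linalg.matrix_rank(A @ B) == 1    # r_AB = r_B
    assert np.linalg.matrix_rank(B @ A) == 1    # r_BA = r_B
    trials += 1
```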
EXAMPLE 4.2.1. Consider the matrices

$$\mathbf{A} = \begin{bmatrix} 1 & 2 \\ -1 & 1 \end{bmatrix} \quad\text{and}\quad \mathbf{B} = \begin{bmatrix} 1 & -2 \\ 0 & 0 \end{bmatrix}, \tag{4.2.33}$$

for which

$$\mathbf{A}\mathbf{B} = \begin{bmatrix} 1 & -2 \\ -1 & 2 \end{bmatrix}. \tag{4.2.34}$$

We note that since $|\mathbf{A}| = 3$, $r_A = 2$, and since the order of the only nonzero minor of $\mathbf{B}$ is 1, $r_B = 1$. The same is true of $\mathbf{A}\mathbf{B}$, and so $r_{AB} = 1$. This is admittedly a rather trivial example, but in the next sections we will flex the true muscle of the corollary.