Hence the new variables are


Fig. 3.4. The choice of unit vectors that establishes a decomposition of the representation into two irreducible representations.

From which we find the matrices

The reduction cannot be performed any further: there is no invariant axis in the plane ℛ₂.


Hence we have found a three-dimensional representation of the group 𝒮₃ and we have reduced it to two irreducible representations. One is the identity representation, consisting of the matrices 1, and the other a two-dimensional representation in the plane ℛ₂.

The group 𝒮₃ has a third irreducible representation. It is one-dimensional and can be found very easily. The alternating group 𝒜₃ is an invariant subgroup of 𝒮₃. It consists of the even permutations (1), (1 2 3) and (1 3 2), and we may write (compare § 3.2)

𝒮₃ = 𝒜₃ + (1 2)𝒜₃.

(1 2)𝒜₃ is the coset associated with 𝒜₃ and consists of the odd permutations (1 2), (2 3) and (3 1). If we let the number −1 correspond to the elements of the coset and the number +1 correspond to the elements of the subgroup, we obtain the antisymmetric representation of 𝒮₃. The identity representation could be called the symmetric representation.
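For readers who wish to check this by computation, here is a minimal sketch: it tabulates the antisymmetric representation and verifies that it respects the group multiplication. The encoding of permutations as tuples and the helper names are choices made for this illustration, not notation from the text.

```python
from itertools import permutations

def parity(p):
    """Sign of a permutation given as a tuple of images, e.g. (1, 0, 2) -> -1."""
    p, sign = list(p), 1
    for i in range(len(p)):
        while p[i] != i:            # sort by transpositions, flipping the sign each time
            j = p[i]
            p[i], p[j] = p[j], p[i]
            sign = -sign
    return sign

def compose(p, q):
    """Group product: apply q first, then p."""
    return tuple(p[q[i]] for i in range(len(q)))

elements = list(permutations(range(3)))            # the six elements of S3
# Even permutations (the subgroup A3) get +1, the coset of odd permutations gets -1,
# and the assignment is a homomorphism: parity(pq) = parity(p) * parity(q).
assert all(parity(compose(p, q)) == parity(p) * parity(q)
           for p in elements for q in elements)
```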

As a matter of fact any symmetric group 𝒮ₙ must have these two representations, as is obvious from the explanation in § 6.2. It is possible to prove that for 𝒮₃ only these three irreducible representations exist. As a last remark, only the two-dimensional representation is faithful (compare problem 3.9).

7.5. FINITE GROUPS

Theorem: The representations of a finite group can all be considered to be unitary and therefore completely reducible. Let 𝒢 be an arbitrary representation of order n of a finite group. Let us take an arbitrary Hermitian form, for instance the unit form F = x₁*x₁ + x₂*x₂ + ... + xₙ*xₙ, constructed with the variables of the representation 𝒢. We submit this form to all the substitutions of 𝒢 and add all the results. In this way we obtain a Hermitian form that stays invariant under all these substitutions, since they only permute the terms in the sum. By a convenient choice of coordinates we can bring this form on its principal axes and, by a "choice of units", into the unit form itself (compare problem 3.12).

The transformation just described is then used to transform all the matrices of the representation 𝒢 into 𝒢′ = S𝒢S⁻¹. This new representation, equivalent to the first one, is unitary, since it has the property that it leaves "the length of a vector", (x₁′)*x₁′ + (x₂′)*x₂′ + ... + (xₙ′)*xₙ′, invariant.

It is often convenient not to limit oneself to unitary representations, as in the example of § 7.4.
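The averaging argument of the theorem above is easy to try numerically. The sketch below is an illustration only: the small non-unitary two-dimensional representation of a group of order two is invented for the example, and the Hermitian square root of the summed form plays the role of the "choice of units".

```python
import numpy as np

# A deliberately non-unitary two-dimensional representation of the group {E, P}, P^2 = E.
E = np.eye(2)
P = np.array([[1.0,  1.0],
              [0.0, -1.0]])              # P @ P == E, but P is not unitary
rep = [E, P]

# Sum the unit Hermitian form over the group: H = sum_A A^dagger A is invariant.
H = sum(A.conj().T @ A for A in rep)

# Bring H to the unit form with its Hermitian square root S (eigen-decomposition of H).
w, V = np.linalg.eigh(H)
S = V @ np.diag(np.sqrt(w)) @ V.conj().T
S_inv = np.linalg.inv(S)

# The equivalent representation A' = S A S^-1 is unitary.
for A in rep:
    A_new = S @ A @ S_inv
    assert np.allclose(A_new.conj().T @ A_new, np.eye(2))
```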

8. Uniqueness Theorem

The decomposition of a given representation 𝒢 of a group into irreducible constituents is only possible in one way. More precisely, if one finds two decompositions

𝒢 = 𝒢_1 + 𝒢_2 + ... + 𝒢_p    and    𝒢 = 𝒢′_1 + 𝒢′_2 + ... + 𝒢′_p′,

we must have p = p′, and the two sequences are formed by irreducible representations which are equivalent one by one after changing the order in a proper way.

In modern algebra, which deals only indirectly with the notion of representations, this proposition is connected to a more abstract theorem due to Jordan, Hölder, and Noether.¹

¹ Compare for instance SPEISER [1937], section 11.

In order to give an idea of the proof of the theorem without going into details we will show it for only three dimensions. This gives the possibility of specifying the precise meaning of the process of decomposition with the help of a simple geometric example. In the case of a three-dimensional space two hypotheses are possible: either the irreducible invariant subspaces are a plane ℛ₂ = xOy and a line ℛ₁ = Oz (if the representation is unitary the line is perpendicular to the plane), or they are the three axes ℛ₁ = Oz, ℛ₁′ = Ox and ℛ₁″ = Oy.

1. Suppose the first assumption is true:

ℛ₃ = ℛ₁ + ℛ₂,

i.e. each vector of ℛ₃ that goes through the origin can be decomposed unambiguously into a component lying in ℛ₁ and a component in ℛ₂, and each of these will stay in its subspace under the transformations of the representation 𝒢 of the group. It is impossible to find an invariant plane ℛ₂′ that does not coincide with ℛ₂. Indeed, if it existed it would cut ℛ₂ along a line L. This line would be an invariant subspace of ℛ₂, since it is the intersection of two invariant subspaces. But ℛ₂ is irreducible, so the line L cannot exist and each invariant plane ℛ₂′ has to coincide with ℛ₂. In the same way it is impossible to find an invariant line ℛ₁′ outside the axis Oz = ℛ₁, because if it existed it would determine with this axis an invariant plane different from ℛ₂.

2. In the second case we have

ℛ₃ = ℛ₁ + ℛ₁′ + ℛ₁″;



the irreducible representations are one-dimensional, i.e. the matrices of 𝒢 are diagonal. In this case there is no irreducible invariant plane, since the intersection of such a plane with the plane ℛ₁ℛ₁′ (which is itself an invariant plane) is invariant. Finally there could be a line L different from the three coordinate axes that forms an invariant subspace by itself. In order that the vectors v on this line stay on it after the transformations of the representation 𝒢, it would be necessary that all components v_x, v_y and v_z of the vector be multiplied by the same number. In this case the representation would have one dimension instead of three. Hence the only invariant subspaces are the three coordinate planes, which are each reducible to their axes.

We see from this example the uniqueness of the decomposition of a three-dimensional representation, and a similar proof can be given in the general case of n dimensions.
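As a concrete companion to the geometric language used above, the following sketch decomposes the three-dimensional permutation representation of 𝒮₃ of § 7.4. The orthonormal basis chosen here is one convenient possibility and is not claimed to be the one of Fig. 3.4; its first vector spans the invariant line and the other two span the invariant plane.

```python
import numpy as np
from itertools import permutations

# Three-dimensional representation of S3 by permutation matrices (e_i -> e_{p(i)}).
def perm_matrix(p):
    m = np.zeros((3, 3))
    for i, j in enumerate(p):
        m[j, i] = 1.0
    return m

rep3 = [perm_matrix(p) for p in permutations(range(3))]

# Orthonormal basis: first column along (1, 1, 1), the other two spanning the plane R2.
B = np.array([[1.0,  1.0,  1.0],
              [1.0, -1.0,  1.0],
              [1.0,  0.0, -2.0]])
B /= np.linalg.norm(B, axis=0)

for M in rep3:
    M_new = B.T @ M @ B
    assert np.isclose(M_new[0, 0], 1.0)                  # the line is invariant
    assert np.allclose(M_new[0, 1:], 0.0)                # and does not mix with the plane
    assert np.allclose(M_new[1:, 0], 0.0)                # the plane is invariant as well
```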

9. Schur's Lemma and Related Theorems

The theorems discussed below are crucial in the theory of representations.

Consider two vector spaces ℛ and 𝔖, one with m, the other with n dimensions. Let:

1. 𝒢_R be a system of mappings or linear transformations of ℛ onto itself, consisting of the matrices A_R, B_R, ...;

2. 𝒢_S be a system of mappings of 𝔖 onto itself, consisting of the matrices A_S, B_S, ....

The matrices of the two systems correspond one by one to each other.

𝒢_R and 𝒢_S could be two representations of the same group, but it is not necessary to introduce the concept of a group at all.

3. Finally let there be a mapping T of ℛ into 𝔖. T is a rectangular matrix such that to every vector x of ℛ there corresponds a vector y of 𝔖:

y = Tx. (3.10)

The converse in general is not true. To the null-vector of 𝔖 corresponds a subspace ℛ′ in ℛ (this is an invariant subspace of the additive group ℛ, comp. § 4.4) and to a vector y which is different from zero corresponds a coset associated with ℛ′ (compare § 6.2).

Hence T establishes a homomorphism of ℛ into 𝔖, or at least between a part of the space 𝔖 and the space ℛ, since it may be possible that there exist vectors in 𝔖 which are not used at all in our homomorphism, that is, they do not correspond to any vector of ℛ.¹

Having supposed all this we assume:

1. that the system 𝒢_R is irreducible;

2. that the matrix T establishes between the vectors A_Rx and A_Sy, B_Rx and B_Sy, etc., the same correspondence as between x and y, i.e.

A_Sy = TA_Rx;   B_Sy = TB_Rx;   ...   (3.10a)

or, by taking into account (3.10) and leaving out the vector symbol x,

A_ST = TA_R;   B_ST = TB_R;   ....   (3.11)

From these two hypotheses we will show the following theorems (compare Fig. 3.5).

Fig. 3.5. Symbolic representation of the assumptions of Schur's lemma. The spaces ℛ and 𝔖 are represented by point sets, the mapping T by connecting lines. 𝔖′ is that part of 𝔖 actually used in the mapping T. Since Tx₀ is equal to zero, Tx_a, Tx_v, etc. are also equal to zero. We prove that either ℛ′ = 0 or ℛ′ = ℛ.

Theorem I. The relation T existing between ℛ and 𝔖 is either an isomorphism, with Det T ≠ 0, or T is identically zero. Indeed, if we consider the subspace ℛ′ of ℛ that corresponds to the null-vector of 𝔖 and we let x₀ be an arbitrary vector of this space, then by hypothesis we have y₀ = Tx₀ = 0.

¹ For instance the equations

y₁ = t₁₁x₁ + t₁₂x₂ + t₁₃x₃;
y₂ = t₂₁x₁ + t₂₂x₂ + t₂₃x₃,

establish a relation such that to each vector x of a three-dimensional space ℛ₃ there corresponds a vector y in the plane y₁Oy₂ of the space 𝔖, and if this space has more than two dimensions, its vectors lying outside this plane do not correspond to any vector in ℛ. The vectors in ℛ that correspond to the null-vector in 𝔖 are those which lie on a line whose equations are determined by putting the left-hand sides of the preceding equations equal to zero. To a given vector y ≠ 0 there corresponds a set of vectors in ℛ having their origins at O and their end points on a line parallel to the line mentioned above.


The matrix T makes the vectors A_Rx₀, B_Rx₀, ... correspond to A_Sy₀, B_Sy₀, ..., according to (3.10a). Because y₀ = 0 the latter are all zero.

Hence A_Rx₀, B_Rx₀, ... all belong to the subspace ℛ′, which appears to be invariant with respect to the transformations of the system 𝒢_R. But we have supposed that this system is irreducible and we are left with the following alternative: either ℛ′ = ℛ, and Tx₀ = 0 for every x₀, i.e. T ≡ 0; or ℛ′ = 0 and the null-vector of 𝔖 corresponds uniquely to the null-vector of ℛ.

In the latter case we know, as a result of the fundamental theorem of § 6.2, that the relation between ℛ and 𝔖 established by T is a one-to-one correspondence or isomorphism. The collection of vectors y of 𝔖 which correspond to vectors x in ℛ does not have to fill up the whole space 𝔖, but only a subspace

𝔖′ which is isomorphic to ℛ and invariant under the transformations of 𝒢_S. This isomorphism has as a consequence the reversibility of T, i.e. the existence of T⁻¹, the mapping of 𝔖′ upon ℛ.

Theorem II. If 𝒢_S is also irreducible, 𝔖′ is identical to 𝔖 and ℛ is isomorphic to 𝔖. They have the same number of dimensions and (3.11) can be written as

A_S = TA_RT⁻¹;   B_S = TB_RT⁻¹;   ...   𝒢_S = T𝒢_RT⁻¹.

The two systems 𝒢_S and 𝒢_R are equivalent. They transform into each other through a change of coordinates.

Theorem III. Suppose this identification is made, 𝒢_R = 𝒢_S = 𝒢; then (3.11) can be written

AT = TA;   BT = TB;   ....

The matrix T commutes with all the matrices of the irreducible system 𝒢, and we will show below that this matrix is necessarily a multiple of the unit matrix in n dimensions, or T = λI, λ being a number.

Let us consider the equation Det(T − λI) = 0. This equation will have at least one root. Let us use this value for λ.

The matrix T − λI will commute with all the matrices of the system 𝒢 for every value of λ, since T commutes with all of them. The preceding theorems confront us with the following alternative: either T − λI also establishes a one-to-one projection of ℛ on 𝔖 and Det(T − λI) ≠ 0, which is impossible, or T − λI = 0. This establishes the theorem.

Summarizing, we have: A matrix T which commutes with all the matrices of an irreducible representation of a group 𝒢 is necessarily a multiple of the unit matrix. If the matrix relates two non-equivalent irreducible representations as in the relations (3.11), it is identically zero, a statement of major importance in the theory of groups and quantum mechanics.

If the representation is reducible we can easily construct a matrix which will commute with all matrices of the representation and which is not a constant times the unit matrix.

Suppose the reducible representation is transformed by a similarity transformation S to a set of step-wise matrices. We construct a diagonal matrix T that has elements λ₁ at the places which correspond to the diagonal positions of the first box of the step-wise matrices, elements λ₂ at the places corresponding to the diagonal elements of the second box, etc. This matrix will, according to the previous theorem, commute with all matrices in the representation presumed above. If we bring the reducible representation back to its original form by the transformation S⁻¹ and if we transform T simultaneously, the commutation relation will be maintained and the matrix T will not be a constant times the unit matrix, provided of course we take λ₁ ≠ λ₂ ≠ ....

Hence we find that if matrices exist which commute with all the matrices of a certain representation and if they are not proportional to the unit matrix, then the representation is reducible. If all such matrices are proportional to the unit matrix, the representation is irreducible. This forms a simple criterion for irreducibility and it will finally lead to a prescription for finding the irreducible parts of a reducible matrix system (§ 12.3).
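The criterion lends itself to a numerical test. In the sketch below, an illustration only, the two-dimensional irreducible representation of 𝒮₃ is obtained by restricting the permutation matrices to the plane perpendicular to (1, 1, 1), in the spirit of § 7.4, and the matrices M that are averaged are arbitrary. The group average T = Σ A M A⁻¹ commutes with every matrix of the representation; for the irreducible representation it comes out as a multiple of the unit matrix, while for the reducible three-dimensional representation it does not.

```python
import numpy as np
from itertools import permutations

def perm_matrix(p):
    m = np.zeros((3, 3))
    for i, j in enumerate(p):
        m[j, i] = 1.0
    return m

rep3 = [perm_matrix(p) for p in permutations(range(3))]      # reducible, 3-dimensional

B = np.array([[1.0, 1.0, 1.0], [1.0, -1.0, 1.0], [1.0, 0.0, -2.0]])
B /= np.linalg.norm(B, axis=0)
rep2 = [(B.T @ M @ B)[1:, 1:] for M in rep3]                 # irreducible, 2-dimensional

def group_average(rep, M):
    """T = sum_A A M A^-1 commutes with every matrix of the representation."""
    return sum(A @ M @ np.linalg.inv(A) for A in rep)

T2 = group_average(rep2, np.array([[1.0, 2.0], [3.0, 4.0]]))
T3 = group_average(rep3, np.arange(9.0).reshape(3, 3))

assert all(np.allclose(A @ T2, T2 @ A) for A in rep2)        # commutes in both cases
assert all(np.allclose(A @ T3, T3 @ A) for A in rep3)
assert np.allclose(T2, T2[0, 0] * np.eye(2))                 # irreducible: a multiple of 1
assert not np.allclose(T3, T3[0, 0] * np.eye(3))             # reducible: it need not be
```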

10. Characters of a Representation

10.1. DEFINITION

Let A, B, C, ... be the matrices of a representation 𝒢 of a group. The characters of the representation are the traces of the matrices.

Equivalent representations have the same system of characters, i.e. (compare § 3.5) if

A′ = SAS⁻¹;   B′ = SBS⁻¹;   ...

then

Tr A′ = Tr A;   Tr B′ = Tr B;   ....

The trace of a product of two matrices is independent of the order in which the matrices appear. A similar statement for an arbitrary number of matrices holds only if the order is changed cyclically, which gives

Tr SAS⁻¹ = Tr S⁻¹SA = Tr A.

This proof is only valid for finite matrices; in the case of infinite matrices we have to consider the convergence of the sums.
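A quick numerical confirmation of the cyclic property, with matrices chosen at random purely for the illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
S = rng.standard_normal((4, 4))            # assumed invertible (true almost surely)

# Tr(S A S^-1) = Tr(S^-1 S A) = Tr(A): the trace is unchanged by a similarity transformation.
assert np.isclose(np.trace(S @ A @ np.linalg.inv(S)), np.trace(A))
```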


We will designate the character of the matrix A by χ(A). Its value depends on the particular matrix we have selected, just as a function depends on the chosen value of the variable x.

The characters of the matrices representing operations belonging to the same class (in the sense of § 4.2) are identical, as they can all be represented in the form SAS⁻¹ (S runs through all the representation matrices of the group) and Tr SAS⁻¹ = Tr A. For this reason the character is said to be a function of a class instead of a function of an element, as was suggested above.
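The class property is easy to verify for the two-dimensional representation of 𝒮₃, obtained here once more by restricting the permutation matrices to the plane perpendicular to (1, 1, 1) as in § 7.4. In this sketch the class of a permutation is identified simply by its number of fixed points, a labelling that is specific to 𝒮₃ and to this example.

```python
import numpy as np
from itertools import permutations

def perm_matrix(p):
    m = np.zeros((3, 3))
    for i, j in enumerate(p):
        m[j, i] = 1.0
    return m

B = np.array([[1.0, 1.0, 1.0], [1.0, -1.0, 1.0], [1.0, 0.0, -2.0]])
B /= np.linalg.norm(B, axis=0)

def class_label(p):
    """Number of fixed points: 3 for the identity, 1 for a transposition, 0 for a 3-cycle."""
    return sum(p[i] == i for i in range(3))

characters = {}
for p in permutations(range(3)):
    M2 = (B.T @ perm_matrix(p) @ B)[1:, 1:]            # matrix of the 2-dim representation
    chi = round(float(np.trace(M2)), 6) + 0.0          # + 0.0 merely normalises a possible -0.0
    characters.setdefault(class_label(p), set()).add(chi)

# One character value per class: 2 for the identity, 0 for transpositions, -1 for 3-cycles.
assert all(len(values) == 1 for values in characters.values())
print(characters)
```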

If the representation is irreducible the character is called primitive.¹

Let 𝒢 be a representation which is decomposed into its irreducible elements,

𝒢 = m₀𝒢₀ + m₁𝒢₁ + m₂𝒢₂ + ...,   (3.12)

where the integers mᵢ indicate how many times a particular representation is contained in 𝒢. The representations 𝒢₀, 𝒢₁, ... are not equivalent.

Obviously the characters have a similar relation,

χ = m₀χ₀ + m₁χ₁ + m₂χ₂ + ....   (3.12a)

The symbols χ₀, χ₁, ... designate the character systems of the irreducible representations, i.e. the sets of characters χ₀(A), χ₀(B), ..., χ₁(A), χ₁(B), ..., etc. They are different only if A and B are elements belonging to different classes. Therefore the number of distinct relations (3.12a) is the same as the number of classes. We will see below (§ 11) that these character sets satisfy a number of relations between themselves.

10.2. THE NUMBER OF IRREDUCIBLE REPRESENTATIONS OF A FINITE GROUP

We have seen the importance of the notion of irreducibility in the preceding paragraphs. (A crucial theorem which we will prove in § 11 brings this out very clearly.)

We know that an arbitrary representation of a given group can be decomposed according to (3.12) into its irreducible parts. These irreducible parts may differ from one arbitrary representation to another, and there seems to be no limit to the number of possible irreducible representations resulting from these decompositions. Actually this is not the case.

¹ Some authors, comp. e.g. VAN DER WAERDEN [1949], reserve the word character for the irreducible representations only and use trace or spur otherwise.

The number of non-equivalent irreducible representations of a finite group is equal to the number of classes into which we can divide its elements.

We establish this theorem by studying a special representation of the group, the regular representation, which is one of the most natural ways of representing a given abstract group.

10.3. REGULAR REPRESENTATION OF A GROUP

Before we introduce the regular representation in a more formal way, we want to point out that multiplication of all group elements with a certain fixed group element induces a permutation among the group elements. As we can see from the group table, all these permutations are distinct from each other.

An arbitrary permutation of elements can be represented by a matrix in the following way:

        (1 2 3 4)         0 1 0 0
    P = (2 3 4 1)   →     0 0 1 0
                          0 0 0 1
                          1 0 0 0

This matrix contains only one unit element per row and per column. The permutations induced by multiplication by the various group elements, each represented by a matrix of this type, form the regular representation.
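A sketch of this construction; the function name and the encoding of the permutation by its bottom row are conventions of this example only.

```python
import numpy as np

def permutation_matrix(bottom_row):
    """Matrix of a permutation given in two-line notation with top row 1, 2, ..., n.

    For P = (1 2 3 4 / 2 3 4 1) pass bottom_row = [2, 3, 4, 1]; row i (counted from 1)
    then carries its single 1 in column bottom_row[i - 1], as in the matrix shown above.
    """
    n = len(bottom_row)
    m = np.zeros((n, n), dtype=int)
    for i, image in enumerate(bottom_row):
        m[i, image - 1] = 1
    return m

print(permutation_matrix([2, 3, 4, 1]))
```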

In order to establish this idea in a more formal way we let a variable x_s and a basis vector s correspond to each operation S of a group. The collection of vectors s spans a new space, the so-called group space, which has g dimensions in the case of a finite group of order g. A vector ξ of this space can be written as follows:

ξ = Σ_s x_s s,   (3.13)

the sum being extended over the g symbols s.

We define the product of two vectors ξ and η = Σ_t y_t t by the rule

ξη = Σ_{s,t} x_s y_t (st),   (3.14)

where st represents the basis vector that corresponds with the operation ST of the group. The expression (3.14) may be read either as a double sum or, after replacing the products st by the proper group elements, as a single sum in which each element is repeated g times.

It is often practical to refrain from the vector description altogether. The ξ and η are then considered as hypercomplex numbers¹ and defined as linear combinations of group elements with (real or complex) coefficients x_s and y_t. Hence there is no longer any need to use bold face letters and we will refrain from doing so from now on. The symbols S, T, ... are the basis of the hypercomplex number system and all the quantities obtained this way form the group algebra.² The structure of the algebra is determined by the rules that define the products U = ST, i.e. by the multiplication table of the group (§ 2.4).

¹ Compare problem no. 3.
² The word algebra is suggested because both sum and product are defined.

The two expressions "group space" and "group algebra", as well as the notions of "vector in the group space" and "hypercomplex number of the algebra", are equivalent. We will use the latter expressions if we want to stress the multiplication rules (3.14).
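The multiplication rule (3.14) can also be phrased as a small computation: a hypercomplex number is a table of coefficients indexed by the group elements, and the product of two such tables collects the coefficient of every product ST. The sketch below does this for 𝒮₃; the tuple encoding of the elements and the sample coefficients are choices of the illustration.

```python
from itertools import permutations
from collections import defaultdict

def compose(p, q):
    """Group product of two permutations given as tuples: apply q first, then p."""
    return tuple(p[q[i]] for i in range(len(q)))

def algebra_product(xi, eta):
    """Product (3.14) of two group-algebra elements, each given as {element: coefficient}."""
    result = defaultdict(float)
    for s, x_s in xi.items():
        for t, y_t in eta.items():
            result[compose(s, t)] += x_s * y_t
    return dict(result)

# Example in the algebra of S3: (E + 2*(1 2)) * (3*(1 2 3)).
e, t12, c123 = (0, 1, 2), (1, 0, 2), (1, 2, 0)
print(algebra_product({e: 1.0, t12: 2.0}, {c123: 3.0}))
```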

From the last equation we find that an arbitrary operation A of the group generates a projection of the group space upon itself,

ξ → ξ′ = Aξ = Σ_s x_s (as) = Σ_t x_{a⁻¹t} t

(t = as corresponds with the product T of the operations A and S, and we have s = a⁻¹t, i.e. S = A⁻¹T).

The collection of projections or mappings A forms the regular representation of the group.

Let us designate the components of ξ′ by x′_t; then the preceding equation is equivalent with the set of substitutions

x′_t = x_{a⁻¹t},   t = a, b, c, ....

Hence the matrix A can be written

A = (a_{ts}) = (δ_{s,a⁻¹t}) = (δ_{a,ts⁻¹}),   (3.15)

in which the rows and columns are labeled with the help of the elements of the group themselves: E, A, B, ..., S, ..., T, ..., and δ_{a,ts⁻¹} is equal to 1 if A = TS⁻¹ and zero otherwise. The matrix contains mainly zeros and only a single 1 per row and per column.
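A sketch of (3.15) for a finite group given by its multiplication law, exercised here on 𝒮₃. The ordering of the elements, which fixes the labelling of rows and columns, is an arbitrary choice of this example, so the matrices produced need not agree entry for entry with the one printed below for the book's own labelling.

```python
import numpy as np
from itertools import permutations

elements = list(permutations(range(3)))          # a fixed ordering of the group elements

def compose(p, q):
    return tuple(p[q[i]] for i in range(len(q)))

def regular_matrix(a):
    """Matrix (3.15) of the element a: entry (t, s) is 1 if a = t s^-1, i.e. t = a s."""
    g = len(elements)
    m = np.zeros((g, g))
    for col, s in enumerate(elements):
        row = elements.index(compose(a, s))      # t = a s
        m[row, col] = 1.0
    return m

reg = {a: regular_matrix(a) for a in elements}
# The mapping is a representation: matrix(a) @ matrix(b) equals matrix(a b).
assert all(np.allclose(reg[a] @ reg[b], reg[compose(a, b)])
           for a in elements for b in elements)
```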

If we return to the example of the group 𝒮₃ of permutations of three objects, referring to its group table displayed in § 2.4 (second form), we see that the matrix A that represents the abstract element A is the following

arrangement of 0 and 1:

    A =   0 0 1 0 0 0
          1 0 0 0 0 0
          0 1 0 0 0 0
          0 0 0 0 1 0
          0 0 0 0 0 1
          0 0 0 1 0 0

The importance of the regular representation is due to the following theorem. All irreducible representations of a group 𝒢 can be obtained by reducing its regular representation. The number of times an irreducible representation appears in the regular representation is equal to the dimensionality of the matrices of that particular representation.

The proof of this theorem can be established directly from the result in § 11.2. [Compare problem 3.8, where it is explicitly indicated that one can obtain this proof from (3.23).]

Taking up again our example of 𝒮₃: if one were to reduce the regular representation of this group, one would find the unit representation once, the alternating one-dimensional representation once, and the irreducible two-dimensional representation mentioned in § 7.4 twice.
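This count is easy to verify with characters. The character values quoted in the comments below are the standard ones for the three irreducible representations of 𝒮₃, with the classes ordered as identity, transpositions, three-cycles; they are not taken from a table in this section.

```python
# Characters of the three irreducible representations of S3 on the classes
# {E}, {the three transpositions}, {the two 3-cycles}:
chi_sym  = [1,  1,  1]      # identity (symmetric) representation
chi_alt  = [1, -1,  1]      # antisymmetric representation
chi_2dim = [2,  0, -1]      # two-dimensional representation of section 7.4

# Regular representation: character 6 on the identity class, 0 on the others.
chi_reg  = [6,  0,  0]

# Each irreducible representation enters as often as its dimension: once, once and twice.
combined = [1 * a + 1 * b + 2 * c for a, b, c in zip(chi_sym, chi_alt, chi_2dim)]
assert combined == chi_reg

# Dimension count: 1^2 + 1^2 + 2^2 = 6, the order of the group.
assert 1**2 + 1**2 + 2**2 == 6
```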

11. Orthogonality Relations (Finite Groups)

11.1. GENERAL FORMULAS

Let 𝒢 and 𝒢′ be two irreducible representations of the finite group, and (a_{ik}), (a′_{ik}) their representation matrices; n and n′ the orders of these matrices. An important case is the one in which 𝒢′ is unitary, but first we will deal with the general case. Consider a rectangular matrix S = (s_{lμ}) with n rows and n′ columns and let us form the sum

T = ASA′⁻¹ + BSB′⁻¹ + ... = Σ_A ASA′⁻¹,   (i)

extended over all operations A of 𝒢. (Their number is of course equal to the order g of the group.) T will be, like S, a rectangular matrix with n rows and n′ columns. Let us write this expression explicitly:

t_{ik} = Σ_{l,μ} (a_{il} s_{lμ} a′⁻¹_{μk} + b_{il} s_{lμ} b′⁻¹_{μk} + ...).

Let C and C′ be two matrices of 𝒢 and 𝒢′ corresponding to an arbitrary
