Handbook of Mathematics for Engineers and Scientists, Part 35

7 262 0
Tài liệu đã được kiểm tra trùng lặp

Đang tải... (xem toàn văn)

THÔNG TIN TÀI LIỆU

Thông tin cơ bản

Tiêu đề Algebra
Trường học University of Mathematics and Science
Chuyên ngành Mathematics
Thể loại Thesis
Năm xuất bản 2023
Thành phố Hanoi
Định dạng
Số trang 7
Dung lượng 399,2 KB

Các công cụ chuyển đổi và chỉnh sửa cho tài liệu này

Nội dung



The rank of a linear operator A is the dimension of its range: rank (A) = dim (im A).

Properties of the rank of a linear operator:

rank (AB) ≤ min{rank (A), rank (B)},

rank (A) + rank (B) – n ≤ rank (AB),

where A and B are linear operators in L(V, V) and n = dim V.

Remark. If rank (A) = n then rank (AB) = rank (BA) = rank (B).
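These inequalities are easy to check numerically; a minimal sketch (the matrices, sizes, and NumPy calls are illustrative assumptions, not part of the handbook):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 5
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))
    B[:, -2:] = 0.0  # force B to be rank-deficient

    rA = np.linalg.matrix_rank(A)
    rB = np.linalg.matrix_rank(B)
    rAB = np.linalg.matrix_rank(A @ B)

    # rank(AB) <= min(rank A, rank B)  and  rank A + rank B - n <= rank(AB)
    assert rAB <= min(rA, rB)
    assert rA + rB - n <= rAB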

THEOREM. Let A : V → V be a linear operator. Then the following statements are equivalent:

1. A is invertible (i.e., there exists A^{-1}).

2. ker A = {0}.

3. im A = V.

4. rank (A) = dim V.

5.6.1-5 Notion of an adjoint operator. Hermitian operators

Let A ∈ L(V, V) be a bounded linear operator in a Hilbert space V. The operator A∗ in L(V, V) is called its adjoint operator if

(Ax) · y = x · (A∗y)   for all x and y in V.

THEOREM. Any bounded linear operator A in a Hilbert space has a unique adjoint operator.

Properties of adjoint operators:

(A + B)∗ = A∗ + B∗,   (λA)∗ = λ̄A∗,   (A∗)∗ = A,   (AB)∗ = B∗A∗,   O∗ = O,   I∗ = I,
(A^{-1})∗ = (A∗)^{-1},   ‖A∗‖ = ‖A‖,   ‖A∗A‖ = ‖A‖^2,

(Ax) · (By) = x · (A∗By) = (B∗Ax) · y   for all x and y in V,

where A and B are bounded linear operators in a Hilbert space V and λ̄ is the complex conjugate of the number λ.
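In C^n with the standard inner product (linear in the first argument, conjugate-linear in the second), the adjoint of a matrix operator is its conjugate transpose. A minimal numerical sketch; the matrices, the inner helper, and the NumPy calls are assumptions made for the illustration, not part of the handbook:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 4
    A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    y = rng.standard_normal(n) + 1j * rng.standard_normal(n)

    def inner(u, v):
        # inner product linear in the first argument, conjugate-linear in the second
        return np.vdot(v, u)

    A_adj = A.conj().T  # adjoint (conjugate transpose) of A

    # defining identity of the adjoint: (Ax) . y = x . (A* y)
    assert np.isclose(inner(A @ x, y), inner(x, A_adj @ y))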

A linear operator A ∈ L(V, V) in a Hilbert space V is said to be Hermitian (self-adjoint) if

A∗ = A,   or   (Ax) · y = x · (Ay).

A linear operator A ∈ L(V, V) in a Hilbert space V is said to be skew-Hermitian if

A∗ = –A,   or   (Ax) · y = –x · (Ay).

5.6.1-6 Unitary and normal operators

A linear operator U ∈ L(V, V) in a Hilbert space V is called a unitary operator if for all x and y in V, the following relation holds:

(Ux) · (Uy) = x · y.

This relation is called the unitarity condition.


Properties of a unitary operator U:

U∗ = U^{-1},   or   U∗U = UU∗ = I,

‖Ux‖ = ‖x‖   for all x in V.
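A sketch of the unitarity condition and its consequences in C^n; producing the unitary matrix from a QR factorization is an illustrative construction, not taken from the handbook:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 4
    Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    U, _ = np.linalg.qr(Z)  # Q factor of a nonsingular matrix is unitary

    x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    y = rng.standard_normal(n) + 1j * rng.standard_normal(n)

    # unitarity condition and its consequences
    assert np.isclose(np.vdot(U @ y, U @ x), np.vdot(y, x))        # (Ux).(Uy) = x.y
    assert np.allclose(U.conj().T @ U, np.eye(n))                  # U*U = I
    assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))    # ||Ux|| = ||x||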

A linear operator A in L(V, V) is said to be normal if

A∗A = AA∗.

THEOREM. A bounded linear operator A is normal if and only if ‖Ax‖ = ‖A∗x‖ for all x in V.

Remark. Any unitary or Hermitian operator is normal.
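The characterization above can be checked on a matrix that is normal but neither Hermitian nor unitary; here one is built as Q D Q∗ with Q unitary and D a complex diagonal matrix (an illustrative construction, not taken from the handbook):

    import numpy as np

    rng = np.random.default_rng(3)
    n = 4
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
    D = np.diag(rng.standard_normal(n) + 1j * rng.standard_normal(n))
    A = Q @ D @ Q.conj().T   # unitarily diagonalizable, hence normal

    x = rng.standard_normal(n) + 1j * rng.standard_normal(n)

    assert np.allclose(A.conj().T @ A, A @ A.conj().T)                        # A*A = AA*
    assert np.isclose(np.linalg.norm(A @ x), np.linalg.norm(A.conj().T @ x))  # ||Ax|| = ||A*x||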

5.6.1-7 Transpose, symmetric, and orthogonal operators

The transpose operator of a bounded linear operator A ∈ L(V, V) in a real Hilbert space V is the operator A^T ∈ L(V, V) such that for all x, y in V, the following relation holds:

(Ax) · y = x · (A^T y).

THEOREM. Any bounded linear operator A in a real Hilbert space has a unique transpose operator.

The properties of transpose operators in a real Hilbert space are similar to the properties of adjoint operators considered in Paragraph 5.6.1-5 if one takes A^T instead of A∗.

A linear operator A ∈ L(V, V) in a real Hilbert space V is said to be symmetric if

A^T = A,   or   (Ax) · y = x · (Ay).

A linear operator A ∈ L(V, V) in a real Hilbert space V is said to be skew-symmetric if

A^T = –A,   or   (Ax) · y = –x · (Ay).

The properties of symmetric linear operators in a real Hilbert space are similar to the properties of Hermitian operators considered in Paragraph 5.6.1-5 if one takes A^T instead of A∗.

A linear operator P ∈ L(V, V) in a real Hilbert space V is said to be orthogonal if for any x and y in V, the following relation holds:

(Px) · (Py) = x · y.

This relation is called the orthogonality condition.

Properties of an orthogonal operator P:

P^T = P^{-1},   or   P^TP = PP^T = I,

‖Px‖ = ‖x‖   for all x in V.

5.6.1-8 Positive operators. Roots of an operator

A Hermitian (symmetric, in the case of a real space) operator A is said to be

a) nonnegative (resp., nonpositive), and one writes A ≥ 0 (resp., A ≤ 0), if (Ax) · x ≥ 0 (resp., (Ax) · x ≤ 0) for any x in V;

b) positive or positive definite (resp., negative or negative definite), and one writes A > 0 (resp., A < 0), if (Ax) · x > 0 (resp., (Ax) · x < 0) for any x ≠ 0.

An mth root of an operator A is an operator B such that B^m = A.

THEOREM. If A is a nonnegative Hermitian (symmetric) operator, then for any positive integer m there exists a unique nonnegative Hermitian (symmetric) operator A^{1/m}.
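For a nonnegative Hermitian (symmetric) matrix, this root can be computed from an eigendecomposition by taking the mth root of each eigenvalue. A minimal sketch of that standard construction; the test matrix and the NumPy calls are assumptions of the example:

    import numpy as np

    rng = np.random.default_rng(4)
    n, m = 4, 3
    B = rng.standard_normal((n, n))
    A = B @ B.T                                 # nonnegative symmetric matrix

    w, V = np.linalg.eigh(A)                    # eigenvalues w >= 0, orthonormal eigenvectors
    root = V @ np.diag(w ** (1.0 / m)) @ V.T    # candidate for A^(1/m)

    assert np.allclose(np.linalg.matrix_power(root, m), A)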


5.6.1-9 Decomposition theorems.

THEOREM 1. For any bounded linear operator A in a Hilbert space V, the operator H_1 = (1/2)(A + A∗) is Hermitian and the operator H_2 = (1/2)(A – A∗) is skew-Hermitian. The representation of A as a sum of a Hermitian and a skew-Hermitian operator is unique: A = H_1 + H_2.

THEOREM 2. For any bounded linear operator A in a real Hilbert space, the operator S_1 = (1/2)(A + A^T) is symmetric and the operator S_2 = (1/2)(A – A^T) is skew-symmetric. The representation of A as a sum of a symmetric and a skew-symmetric operator is unique: A = S_1 + S_2.

THEOREM 3. For any bounded linear operator A in a Hilbert space, A∗A and AA∗ are nonnegative Hermitian operators.

THEOREM 4. For any linear operator A in a Hilbert space V, there exist polar decompositions

A = QU   and   A = U_1 Q_1,

where Q and Q_1 are nonnegative Hermitian operators, Q^2 = AA∗, Q_1^2 = A∗A, and U, U_1 are unitary operators. The operators Q and Q_1 are always unique, while the operators U and U_1 are unique only if A is nondegenerate.
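In the matrix case, a polar decomposition can be read off the singular value decomposition: if A = WSV∗, then Q = WSW∗ is nonnegative Hermitian, U = WV∗ is unitary, and A = QU. A sketch of this route; the SVD-based construction and the random test matrix are assumptions of the example, not the handbook's derivation:

    import numpy as np

    rng = np.random.default_rng(5)
    n = 4
    A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

    W, s, Vh = np.linalg.svd(A)       # A = W diag(s) Vh
    Q = W @ np.diag(s) @ W.conj().T   # nonnegative Hermitian factor, Q^2 = A A*
    U = W @ Vh                        # unitary factor

    assert np.allclose(Q @ U, A)
    assert np.allclose(Q @ Q, A @ A.conj().T)
    assert np.allclose(U @ U.conj().T, np.eye(n))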

5.6.2 Linear Operators in Matrix Form

5.6.2-1 Matrices associated with linear operators

Let A be a linear operator in an n-dimensional linear space V with a basis e_1, ..., e_n. Then there is a matrix [a_{ij}] such that

Ae_j = Σ_{i=1}^{n} a_{ij} e_i.

The coordinates y_i of the vector y = Ax in that basis can be represented in the form

y_i = Σ_{j=1}^{n} a_{ij} x_j   (i = 1, 2, ..., n),   (5.6.2.1)

where x_j are the coordinates of x in the same basis e_1, ..., e_n. The matrix A = [a_{ij}] of size n×n is called the matrix of the linear operator A in the given basis e_1, ..., e_n.

Thus, given a basis e_1, ..., e_n, any linear operator y = Ax can be associated with its matrix in that basis with the help of (5.6.2.1).

If A is the zero operator, then its matrix is the zero matrix in any basis. If A is the unit operator, then its matrix is the unit matrix in any basis.
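As a small illustration of (5.6.2.1): the jth column of the matrix holds the coordinates of Ae_j, and y = Ax is then an ordinary matrix-vector product. The rotation operator of R^2 used below is an arbitrary choice made for the example:

    import numpy as np

    theta = np.pi / 6
    # matrix of the rotation operator in the standard basis e_1, e_2 of R^2;
    # column j holds the coordinates of A e_j
    A = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    x = np.array([1.0, 2.0])
    y = A @ x                                        # y_i = sum_j a_ij x_j
    assert np.allclose(y, [A[i, :] @ x for i in range(2)])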

THEOREM 1. Let e_1, ..., e_n be a given basis in a linear space V and let A = [a_{ij}] be a given square matrix of size n×n. Then there exists a unique linear operator A : V → V whose matrix in that basis coincides with the matrix A.

THEOREM 2. The rank of a linear operator A is equal to the rank of its matrix A in any basis: rank (A) = rank (A).

THEOREM 3. A linear operator A : V → V is invertible if and only if rank (A) = dim V. In this case, the matrix of the operator A is invertible.


5.6.2-2 Transformation of the matrix of a linear operator.

Suppose that the transition from the basis e_1, ..., e_n to another basis ẽ_1, ..., ẽ_n is determined by a matrix U = [u_{ij}] of size n×n, i.e.,

ẽ_i = Σ_{j=1}^{n} u_{ij} e_j   (i = 1, 2, ..., n).

THEOREM. Let A and Ã be the matrices of a linear operator A in the basis e_1, ..., e_n and the basis ẽ_1, ..., ẽ_n, respectively. Then

A = U^{-1}ÃU   or   Ã = UAU^{-1}.

Note that the determinant of the matrix of a linear operator does not depend on the basis: det A = det Ã. Therefore, one can correctly define the determinant det A of a linear operator as the determinant of its matrix in any basis.

The trace of the matrix of a linear operator, Tr(A), is also independent of the basis. Therefore, one can correctly define the trace Tr(A) of a linear operator as the trace of its matrix in any basis.
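A quick numerical check of these invariance statements, using a random change of basis and the similarity transformation Ã = UAU^{-1} from the theorem above (the matrices are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(6)
    n = 4
    A = rng.standard_normal((n, n))      # matrix of the operator in the old basis
    U = rng.standard_normal((n, n))      # change-of-basis matrix (almost surely invertible)
    A_tilde = U @ A @ np.linalg.inv(U)   # matrix of the same operator in the new basis

    assert np.isclose(np.linalg.det(A), np.linalg.det(A_tilde))
    assert np.isclose(np.trace(A), np.trace(A_tilde))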

In the case of an orthonormal basis, a Hermitian, skew-Hermitian, normal, or unitary operator in a Hilbert space corresponds to a Hermitian, skew-Hermitian, normal, or unitary matrix; and a symmetric, skew-symmetric, or transpose operator in a real Hilbert space corresponds to a symmetric, skew-symmetric, or transpose matrix.

5.6.3 Eigenvectors and Eigenvalues of Linear Operators

5.6.3-1 Basic definitions

1. A scalar λ is called an eigenvalue of a linear operator A in a vector space V if there is a nonzero element x in V such that

Ax = λx.   (5.6.3.1)

A nonzero element x for which (5.6.3.1) holds is called an eigenvector of the operator A corresponding to the eigenvalue λ. Eigenvectors corresponding to distinct eigenvalues are linearly independent. For an eigenvalue λ ≠ 0, the inverse μ = 1/λ is called a characteristic value of the operator A.

THEOREM. If x_1, ..., x_k are eigenvectors of an operator A corresponding to its eigenvalue λ, then α_1 x_1 + ··· + α_k x_k (with α_1^2 + ··· + α_k^2 ≠ 0) is also an eigenvector of the operator A corresponding to the eigenvalue λ.

The geometric multiplicity m_i of an eigenvalue λ_i is the maximal number of linearly independent eigenvectors corresponding to the eigenvalue λ_i. Thus, the geometric multiplicity of λ_i is the dimension of the subspace formed by all eigenvectors corresponding to the eigenvalue λ_i.

The algebraic multiplicity m̂_i of an eigenvalue λ_i of an operator A is equal to the algebraic multiplicity of λ_i regarded as an eigenvalue of the corresponding matrix A.


The algebraic multiplicity m̂_i of an eigenvalue λ_i is always not less than the geometric multiplicity m_i of this eigenvalue.

The trace Tr(A) is equal to the sum of all eigenvalues of the operator A, each eigenvalue counted according to its multiplicity, i.e.,

Tr(A) = Σ_i m̂_i λ_i.

The determinant det A is equal to the product of all eigenvalues of the operator A, each eigenvalue entering the product according to its multiplicity, i.e.,

det A = ∏_i λ_i^{m̂_i}.
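Both identities are easy to confirm numerically; the test matrix below is an arbitrary assumption made for the illustration:

    import numpy as np

    rng = np.random.default_rng(7)
    A = rng.standard_normal((5, 5))
    lam = np.linalg.eigvals(A)   # eigenvalues, repeated according to algebraic multiplicity

    assert np.isclose(np.trace(A), lam.sum().real)
    assert np.isclose(np.linalg.det(A), lam.prod().real)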

5.6.3-2 Eigenvectors and eigenvalues of normal and Hermitian operators

Properties of eigenvalues and eigenvectors of a normal operator:

1. A normal operator A in a Hilbert space V and its adjoint operator A∗ have the same eigenvectors, and their eigenvalues are complex conjugates of each other.

2. For a normal operator A in a Hilbert space V, there is a basis {e_k} formed by eigenvectors of the operators A and A∗. Therefore, there is a basis in V in which the operator A has a diagonal matrix.

3. Eigenvectors corresponding to distinct eigenvalues of a normal operator are mutually orthogonal.

4. Any bounded normal operator A in a Hilbert space V is reducible. The space V can be represented as a direct sum of the subspace spanned by an orthonormal system of eigenvectors of A and the subspace consisting of vectors orthogonal to all eigenvectors of A. In the finite-dimensional case, an orthonormal system of eigenvectors of A is a basis of V.

5. The algebraic multiplicity of any eigenvalue λ of a normal operator is equal to its geometric multiplicity.

Properties of eigenvalues and eigenvectors of a Hermitian operator:

1. Since any Hermitian operator is normal, all properties of normal operators hold for Hermitian operators.

2. All eigenvalues of a Hermitian operator are real.

3. Any Hermitian operator A in an n-dimensional unitary space has n mutually orthogonal eigenvectors of unit length.

4. Any eigenvalue of a nonnegative (positive) operator is nonnegative (positive).

5. Minimax property. Let A be a Hermitian operator in an n-dimensional unitary space V, and let E_m be the set of all m-dimensional subspaces of V (m < n). Then the eigenvalues λ_1, ..., λ_n of the operator A (λ_1 ≥ ··· ≥ λ_n) can be defined by the formulas

λ_{m+1} = min_{Y ∈ E_m} max_{x ⊥ Y, x ≠ 0} [(Ax) · x] / (x · x).

6. Let i_1, ..., i_n be an orthonormal basis in an n-dimensional space V, and let all i_k be eigenvectors of a Hermitian operator A, i.e., Ai_k = λ_k i_k. Then the matrix of the operator A in the basis i_1, ..., i_n is diagonal and its diagonal elements have the form a_{kk} = λ_k.


7. Let i_1, ..., i_n be an arbitrary orthonormal basis in an n-dimensional Euclidean space V. Then the matrix of an operator A in the basis i_1, ..., i_n is symmetric if and only if the operator A is Hermitian.

8. In an orthonormal basis i_1, ..., i_n formed by eigenvectors of a nonnegative Hermitian operator A, the matrix of the operator A^{1/m} has the diagonal form

diag(λ_1^{1/m}, λ_2^{1/m}, ..., λ_n^{1/m}).

5.6.3-3 Characteristic polynomial of a linear operator

Consider the finite-dimensional case. The algebraic equation

f_A(λ) ≡ det(A – λI) = 0   (5.6.3.2)

of degree n is called the characteristic equation of the operator A, and f_A(λ) is called the characteristic polynomial of the operator A.

Since the value of the determinant det(A – λI) does not depend on the basis, the coefficients of λ^k (k = 0, 1, ..., n) in the characteristic polynomial f_A(λ) are invariants (i.e., quantities whose values do not depend on the basis). In particular, the coefficient of λ^{n-1} is equal to (–1)^{n-1} Tr(A).

In the finite-dimensional case, λ is an eigenvalue of a linear operator A if and only if λ is a root of the characteristic equation (5.6.3.2) of the operator A. Therefore, a linear operator always has eigenvalues.

In the case of a real space, a root of the characteristic equation can be an eigenvalue of a linear operator only if this root is real. In this connection, it would be natural to find a class of linear operators in a real Euclidean space for which all roots of the corresponding characteristic equations are real.

THEOREM. The matrix A of a linear operator A in a given basis i_1, ..., i_n is diagonal if and only if all i_k are eigenvectors of this operator.

5.6.3-4 Bounds for eigenvalues of linear operators

The modulus of any eigenvalue λ of a linear operator A in an n-dimensional unitary space satisfies the estimate

|λ| ≤ min(M_1, M_2),   M_1 = max_{1≤i≤n} Σ_{j=1}^{n} |a_{ij}|,   M_2 = max_{1≤j≤n} Σ_{i=1}^{n} |a_{ij}|,

where A = [a_{ij}] is the matrix of the operator A. The real and the imaginary parts of the eigenvalues satisfy the estimates

min_{1≤i≤n} (Re a_{ii} – P_i) ≤ Re λ ≤ max_{1≤i≤n} (Re a_{ii} + P_i),

min_{1≤i≤n} (Im a_{ii} – P_i) ≤ Im λ ≤ max_{1≤i≤n} (Im a_{ii} + P_i),


where P_i = Σ_{j=1, j≠i}^{n} |a_{ij}|, and P_i can be replaced by Q_i = Σ_{j=1, j≠i}^{n} |a_{ji}|.

The modulus of any eigenvalue λ of a Hermitian operator A in an n-dimensional unitary space satisfies the inequalities

|λ|^2 ≤ Σ_i Σ_j |a_{ij}|^2,   |λ| ≤ ‖A‖ = sup_{‖x‖=1} |(Ax) · x|,

and its smallest and largest eigenvalues, denoted respectively by m and M, can be found from the relations

m = inf_{‖x‖=1} [(Ax) · x],   M = sup_{‖x‖=1} [(Ax) · x].

5.6.3-5 Spectral decomposition of Hermitian operators

Let i_1, ..., i_n be a fixed orthonormal basis in an n-dimensional unitary space V. Then any element of V can be represented in the form (see Paragraph 5.4.2-2)

x = Σ_{j=1}^{n} (x · i_j) i_j.

The operator P_k (k = 1, 2, ..., n) defined by

P_k x = (x · i_k) i_k

is called the projection onto the one-dimensional subspace generated by the vector i_k. The projection P_k is a Hermitian operator.

Properties of the projection P_k:

P_k P_l = P_k for k = l,   P_k P_l = O for k ≠ l,   P_k^m = P_k (m = 1, 2, 3, ...),

Σ_{j=1}^{n} P_j = I, where I is the identity operator.

For a normal operator A, there is an orthonormal basis consisting of its eigenvectors, Ai_k = λ_k i_k. Then one obtains the spectral decomposition of a normal operator:

A^k = Σ_{j=1}^{n} λ_j^k P_j   (k = 1, 2, 3, ...).   (5.6.3.3)

Consider an arbitrary polynomial p(λ) = Σ_{j=1}^{m} c_j λ^j. By definition, p(A) = Σ_{j=1}^{m} c_j A^j. Then, using (5.6.3.3), we get

p(A) = Σ_{i=1}^{n} p(λ_i) P_i.
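A small sketch of (5.6.3.3) and of the polynomial identity for a real symmetric (Hermitian) matrix, building the projections P_k from an orthonormal eigenbasis; the test matrix and the polynomial p are assumptions of the example:

    import numpy as np

    rng = np.random.default_rng(9)
    n = 4
    B = rng.standard_normal((n, n))
    A = (B + B.T) / 2                                     # real symmetric test matrix

    lam, V = np.linalg.eigh(A)                            # A i_k = lam_k i_k
    P = [np.outer(V[:, k], V[:, k]) for k in range(n)]    # projections P_k x = (x . i_k) i_k

    # spectral decomposition A^k = sum_j lam_j^k P_j, checked for k = 2
    assert np.allclose(A @ A, sum(lam[j] ** 2 * P[j] for j in range(n)))

    # p(A) = sum_i p(lam_i) P_i for the polynomial p(t) = t + 3 t^2 (no constant term)
    pA = A + 3 * A @ A
    assert np.allclose(pA, sum((lam[i] + 3 * lam[i] ** 2) * P[i] for i in range(n)))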

CAYLEY-HAMILTON THEOREM. Every normal operator satisfies its own characteristic equation, i.e., f_A(A) = O.
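A numerical check for a normal matrix; note that numpy.poly returns the coefficients of det(λI – A), which differs from f_A(λ) = det(A – λI) only by the factor (–1)^n, so the check f_A(A) = O is unaffected. The construction of the normal test matrix is an assumption of this sketch:

    import numpy as np

    rng = np.random.default_rng(10)
    n = 4
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
    D = np.diag(rng.standard_normal(n) + 1j * rng.standard_normal(n))
    A = Q @ D @ Q.conj().T                     # normal matrix

    c = np.poly(A)                             # coefficients of det(lambda*I - A)
    fA = sum(c[k] * np.linalg.matrix_power(A, n - k) for k in range(n + 1))

    assert np.allclose(fA, np.zeros((n, n)))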
