

HANOI PEDAGOGICAL UNIVERSITY 2

DEPARTMENT OF MATHEMATICS

Dao Thi Thao

ON APPROXIMATE KARUSH-KUHN-TUCKER

OPTIMALITY CONDITIONS FOR SMOOTH CONSTRAINED OPTIMIZATION

BACHELOR THESIS

Hanoi – 2019


HANOI PEDAGOGICAL UNIVERSITY 2

DEPARTMENT OF MATHEMATICS

Dao Thi Thao

ON APPROXIMATE KARUSH-KUHN-TUCKER

OPTIMALITY CONDITIONS FOR SMOOTH CONSTRAINED OPTIMIZATION

BACHELOR THESIS Major: Analysis

SUPERVISOR

Dr NGUYEN VAN TUYEN

Hanoi – 2019


Thesis acknowledgment

I would like to express my gratitude to the teachers of the Department of Mathematics, Hanoi Pedagogical University 2, the teachers in the analysis group, as well as the other lecturers involved, who have imparted valuable knowledge and made it possible for me to complete the course and this thesis.

In particular, I would like to express my deep respect and gratitude to Dr. Nguyen Van Tuyen, whose direct guidance and help enabled me to complete this thesis.

Since time, capacity, and working conditions were limited, the thesis cannot avoid errors. I look forward to receiving valuable comments from teachers and friends.

Hanoi, May 2019
Student

Dao Thi Thao


Thesis assurance

I assure that the data and the results of this thesis are true and not identical to those of other topics. I also assure that all the help for this thesis has been acknowledged and that the results presented in the thesis have been identified clearly.

Hanoi, May 2019
Student

Dao Thi Thao


Preface 2

1 Preliminaries 3

1.1 Convex sets 3

1.2 Convex functions 5

1.3 Cones 5

1.4 Tangent cones 9

1.5 Optimality conditions for smooth problems 11

2 Approximate Karush–Kuhn–Tucker optimality conditions 16

2.1 Approximate-KKT conditions 17

2.1.1 AKKT(I) is an optimality condition 19

2.1.2 AKKT(I) is a strong optimality condition 21

2.2 Approximate gradient projection conditions 23

2.2.1 C-AGP condition 24

2.2.2 L-AGP condition 29

2.2.3 Remarks 30

Bibliography 31


Karush–Kuhn–Tucker (KKT) optimality conditions are among the most important results in optimization theory. However, KKT optimality conditions need not be fulfilled at local minimum points unless some constraint qualifications are satisfied. In other words, the usual first-order necessary optimality conditions are of the form "KKT or not-CQ".

We note here that a local minimizer might not be a KKT point, but it can always be approximated by a sequence of "approximate-KKT" points. This leads one to study a different type of optimality conditions.

In this thesis, based on the recent work by Andreani, Haeser, and Martínez [5], we study sequential first-order optimality conditions for nonlinear programming problems.

We first examine some sequential optimality conditions, such as the approximate KKT and approximate gradient projection conditions, which may be used as stopping criteria of optimization algorithms. Then, we investigate the relationships between these sequential optimality conditions and several necessary optimality conditions.

The thesis is organized as follows. In Chapter 1, we recall some basic definitions and preliminaries from convex analysis, which are widely used in the sequel. In Chapter 2, we introduce approximate KKT conditions for nonlinear programming problems. These optimality conditions must be satisfied by the minimizers of optimization problems. The relationships between the sequential optimality conditions and several necessary optimality conditions are also investigated.


αx1 + (1 − α)x2, 0 < α < 1.

The following lemma says that convexity is preserved by the operation of intersection.

Lemma 1.2. Let I be an arbitrary index set. If the sets Xi ⊂ Rn, i ∈ I, are convex, then the set X = ∩_{i∈I} Xi is convex.

The following operations also preserve convexity.

Lemma 1.3. Let X and Y be convex sets in Rn and let c and d be real numbers. Then the set Z = cX + dY is convex.

Definition 1.4. A point x is called a convex combination of points x1, …, xm if there exist α1 ≥ 0, …, αm ≥ 0 such that

x = α1x1 + α2x2 + ⋯ + αmxm

and

α1 + α2 + ⋯ + αm = 1.

Definition 1.5. The convex hull of the set X (denoted by conv X) is the intersection of all convex sets containing X.


The relation between these two concepts is the subject of the next lemma.

Lemma 1.6. The set conv X is the set of all convex combinations of points of X.

Lemma 1.7. If X ⊂ Rn, then every element of conv X is a convex combination of at most n + 1 points of X.

Lemma 1.8. If X is convex, then its interior int X and its closure X̄ are convex.

Lemma 1.9. Assume that the set X ⊂ Rn is convex. Then int X = ∅ if and only if X is contained in a linear manifold of dimension smaller than n.

Consider a convex closed set V ⊂ Rn and a point x ∈ Rn. We call the point in V that is closest to x the projection of x on V, and we denote it by PV(x). Obviously, if x ∈ V then PV(x) = x, but the projection is always well defined, as the following result shows.

Theorem 1.10. If the set V ⊂ Rn is nonempty, convex, and closed, then for every x ∈ Rn there exists exactly one point z ∈ V that is closest to x.

Lemma 1.11. Assume that V ⊂ Rn is a closed convex set and let x ∈ Rn. Then z = PV(x) if and only if z ∈ V and

⟨v − z, x − z⟩ ≤ 0 for all v ∈ V.

Theorem 1.12. Assume that V ⊂ Rn is a closed convex set. Then for all x ∈ Rn and y ∈ Rn we have

‖PV(x) − PV(y)‖ ≤ ‖x − y‖.
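These two facts are easy to check numerically. The sketch below is an illustration only, assuming Python with NumPy; the unit ball is chosen because its projection has a closed form. It verifies the characterization of Lemma 1.11 on sampled points and the nonexpansiveness of Theorem 1.12:

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the closed ball {z : ||z|| <= radius}."""
    n = np.linalg.norm(x)
    return x if n <= radius else (radius / n) * x

x = np.array([3.0, 4.0])              # a point outside the unit ball
z = project_ball(x)                   # its projection, (0.6, 0.8)

# Lemma 1.11: <v - z, x - z> <= 0 for every v in V (sampled points of the ball)
rng = np.random.default_rng(0)
for _ in range(1000):
    v = rng.normal(size=2)
    v = v / max(1.0, np.linalg.norm(v))   # pull v into the unit ball
    assert np.dot(v - z, x - z) <= 1e-12

# Theorem 1.12: the projection operator is nonexpansive
y = np.array([-2.0, 5.0])
assert np.linalg.norm(project_ball(x) - project_ball(y)) <= np.linalg.norm(x - y) + 1e-12
```

Note that equality in the check for Lemma 1.11 is attained at v = z, in agreement with the "if and only if" statement.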

A convex closed set and a point outside of it can be separated by a plane.

Theorem 1.13. Let X ⊂ Rn be a closed convex set and let x ∉ X. Then there exist a nonzero y ∈ Rn and ε > 0 such that

⟨y, v⟩ ≤ ⟨y, x⟩ − ε for all v ∈ X.

Theorem 1.14. Let X ⊂ Rn be a convex set and let x ∉ X. Then there exists a nonzero y ∈ Rn such that

⟨y, v⟩ ≤ ⟨y, x⟩ for all v ∈ X.


Theorem 1.15. Let X1 and X2 be closed convex sets in Rn. If X1 ∩ X2 = ∅, then there exists a nonzero y ∈ Rn such that

⟨y, x1⟩ ≤ ⟨y, x2⟩ for all x1 ∈ X1, x2 ∈ X2.

Theorem 1.16. Let X1 and X2 be closed convex sets in Rn and let X1 be bounded. If X1 ∩ X2 = ∅, then there exist a nonzero y ∈ Rn and ε > 0 such that

⟨y, x1⟩ ≤ ⟨y, x2⟩ − ε for all x1 ∈ X1 and all x2 ∈ X2.

Definition 1.17. A function f is called convex if epi f is a convex set.

Theorem 1.18. A function f is convex if and only if for all x1 and x2 and for all α ∈ (0, 1) we have

f(αx1 + (1 − α)x2) ≤ αf(x1) + (1 − α)f(x2).


Lemma 1.21 Assume that X is a convex set Then the set

In the last two equations we used the fact that X is a cone

Definition 1.23. Let X ⊂ Rm be a convex set. The set

X∞ = {d : X + d ⊂ X}

is called the recession cone of X.

We shall show that X∞ is a convex cone. We first note that for each d ∈ X∞ and for every m:

X + md ⊂ X + (m − 1)d ⊂ ⋯ ⊂ X + d ⊂ X.

Using the convexity of X we infer that X + τd ⊂ X for all τ ≥ 0. Hence τd ∈ X∞ for all τ ≥ 0. This means that X∞ is a cone.

The fact that X∞ is convex can be verified directly from the definition. Indeed, if d1 ∈ X∞ and d2 ∈ X∞, then

x + αd1 + (1 − α)d2 = α(x + d1) + (1 − α)(x + d2) ∈ X

for all x ∈ X and all α ∈ (0, 1).

Definition 1.24. Let K be a cone in Rn. The set

K◦ := {y ∈ Rn : ⟨y, x⟩ ≤ 0 for all x ∈ K}

is called the polar cone of K.

Example 1.25. Let K1, …, Km be cones in Rn and let K = K1 + K2 + ⋯ + Km. Clearly, K is a cone. We shall calculate its polar cone. If z ∈ K◦, then for every x1 ∈ K1, …, xm ∈ Km we have

⟨z, x1⟩ + ⋯ + ⟨z, xm⟩ ≤ 0.

Let us choose j ∈ {1, …, m}. Setting all xi = 0, except for i = j, we conclude that


⟨z, xj⟩ ≤ 0 for all xj ∈ Kj.

Consequently, z ∈ Kj◦. As j was arbitrary,

K◦ ⊂ K1◦ ∩ ⋯ ∩ Km◦.

On the other hand, for every element z of K1◦ ∩ ⋯ ∩ Km◦ the above inequality is satisfied, and thus z ∈ K◦. Therefore,

(K1 + ⋯ + Km)◦ = K1◦ ∩ ⋯ ∩ Km◦.

Lemma 1.26. For every convex cone K ⊂ Rn:

(i) The polar cone K◦ is convex and closed;

K◦ = {ATλ : λ ∈ C◦}.

Corollary 1.30. Let A be an m × n matrix and let

K = {x ∈ Rn : Ax ≤ 0}. Then

K◦ = {y ∈ Rn : y = ATλ, λ ∈ Rm, λ ≥ 0}.

The above fact is frequently formulated as an alternative: exactly one of the following two systems has a solution, either

(i) Ax ≤ 0 and ⟨c, x⟩ > 0; or

(ii) c = ATλ, λ ≥ 0.
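The alternative can be tested numerically: system (ii) asks whether c = ATλ has a nonnegative solution, i.e. whether a nonnegative least-squares problem has zero residual. A small sketch, assuming Python with NumPy; `nnls_pg` is a hand-rolled projected-gradient solver introduced here purely for illustration:

```python
import numpy as np

def nnls_pg(B, c, iters=2000):
    """Minimize ||B @ lam - c||^2 over lam >= 0 by projected gradient
    (a stand-in for a proper nonnegative least-squares solver)."""
    lam = np.zeros(B.shape[1])
    step = 1.0 / np.linalg.norm(B.T @ B, 2)   # 1 / spectral norm
    for _ in range(iters):
        lam = np.maximum(lam - step * (B.T @ (B @ lam - c)), 0.0)
    return lam

def farkas_side(A, c, tol=1e-6):
    """Return ('ii', lam) if c = A^T lam with lam >= 0, otherwise ('i', None)."""
    lam = nnls_pg(A.T, c)
    if np.linalg.norm(A.T @ lam - c) < tol:
        return 'ii', lam
    return 'i', None

A = np.eye(2)                                 # K = {x : x <= 0}, so K° = {y : y >= 0}
side1, lam1 = farkas_side(A, np.array([1.0, 1.0]))
assert side1 == 'ii' and np.allclose(A.T @ lam1, [1.0, 1.0])

side2, _ = farkas_side(A, np.array([1.0, -1.0]))
assert side2 == 'i'
# x = (0, -1) certifies system (i): Ax <= 0 and <c, x> = 1 > 0
xcert = np.array([0.0, -1.0])
assert np.all(A @ xcert <= 0) and np.dot([1.0, -1.0], xcert) > 0
```

The two test vectors c exercise both branches of the alternative for the same matrix A.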


Theorem 1.31. Let K1, K2, …, Km be convex cones in Rn. If K1 ∩ K2 ∩ ⋯ ∩ Km = ∅, then there exist yi ∈ Ki◦, i = 1, 2, …, m, not all equal to 0, such that

y1 + y2 + ⋯ + ym = 0.

Lemma 1.32. If x ∈ int K, then ⟨y, x⟩ < 0 for all nonzero y ∈ K◦.

Theorem 1.33. Let K1, …, Km be convex cones in Rn and let K = K1 ∩ ⋯ ∩ Km. If K1 ∩ int K2 ∩ ⋯ ∩ int Km ≠ ∅, then

K◦ = K1◦ + K2◦ + ⋯ + Km◦.

Theorem 1.34. Assume that K1 and K2 are closed convex cones, and K is defined by

K = {x ∈ K1 : Ax ∈ K2}. If

0 ∈ int {Ax − y : x ∈ K1, y ∈ K2},

then

K◦ = K1◦ + {ATλ : λ ∈ K2◦}.

Definition 1.35. Consider a convex closed set X ⊂ Rn and a point x ∈ X. The set

NX(x) := [cone (X − x)]◦

is called the normal cone to X at x.

As a polar cone, the normal cone is closed and convex. It follows from the definition that v ∈ NX(x) if and only if

⟨v, y − x⟩ ≤ 0 for all y ∈ X.

Lemma 1.36. Let X be a closed convex set and let x ∈ X. Then

NX(x) = {v ∈ Rn : PX(x + v) = x}.

Example 1.37. Suppose C is a closed convex cone in Rn and that z ∈ Rn. Consider the projection x = PC(z). Define y = z − x. It follows from Lemma 1.36 that

y ∈ NC(x) = [KC(x)]◦.
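Lemma 1.36 gives a purely computational membership test for the normal cone: v ∈ NX(x) exactly when the projection of x + v lands back at x. A quick sketch, assuming Python with NumPy, with X taken to be the unit box so that the projection is componentwise clipping:

```python
import numpy as np

def proj_box(z, lo=0.0, hi=1.0):
    """Projection onto the box [lo, hi]^n: componentwise clipping."""
    return np.clip(z, lo, hi)

x = np.array([1.0, 0.5])              # boundary point of X = [0,1]^2
# v = (2, 0) points outward along the active face x1 = 1, so P_X(x + v) = x
assert np.allclose(proj_box(x + np.array([2.0, 0.0])), x)
# v = (0, 1) moves along a coordinate that is not active, so the test fails
assert not np.allclose(proj_box(x + np.array([0.0, 1.0])), x)
```

At this x the normal cone is {(v1, 0) : v1 ≥ 0}, and the two test vectors sit inside and outside it, respectively.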


By Example 1.22, KC(x) = C + {τx : τ ∈ R}. Example 1.25 then yields

Lemma 1.38. Assume that X = X1 ∩ ⋯ ∩ Xm, where Xi are closed convex sets, i = 1, …, m, and let x ∈ X. If X1 ∩ int X2 ∩ ⋯ ∩ int Xm ≠ ∅, then

NX(x) = NX1(x) + ⋯ + NXm(x).

Example 1.39. Let

X1 = {x ∈ R2 : ‖x‖ ≤ 1}, X2 = {x ∈ R2 : x1 = 1},

and let x = (1, 0). We have

NX1(x) = {v ∈ R2 : v1 ≥ 0, v2 = 0}, NX2(x) = {v ∈ R2 : v2 = 0}.

On the other hand, X = X1 ∩ X2 contains just the point x, and

NX(x) = R2.

The operation x → NX(x) is upper semicontinuous in the following sense.

Lemma 1.40. Assume that X ⊂ Rn is a closed convex set and that a sequence xk of elements of X is convergent to a point x̂. Then for every convergent sequence of elements vk ∈ NX(xk), its limit v̂ is an element of NX(x̂).


Definition 1.41. A direction d is called tangent to the set X ⊂ Rn at the point x ∈ X if there exist sequences of points xk ∈ X and scalars τk > 0, k = 1, 2, …, such that τk ↓ 0 and d = lim_{k→∞} (xk − x)/τk.

In order to develop algebraic forms of tangent cones to such sets, it is convenient to consider an abstract system

g(x) ∈ Y0, x ∈ X0. (1.1)

Here g : Rn → Rm is continuously differentiable, Y0 is a closed convex set in Rm, and X0 is a closed convex set in Rn.

Definition 1.44. System (1.1) is called metrically regular at the point x0 ∈ X if there exist ε > 0 and C such that for all x̃ and ũ satisfying ‖x̃ − x0‖ ≤ ε and ‖ũ‖ ≤ ε we can find xR ∈ X0 satisfying the inclusion

g(xR) − ũ ∈ Y0,

and such that

‖xR − x̃‖ ≤ C (dist(x̃, X0) + dist(g(x̃) − ũ, Y0)).


The relevance of the concept of metric regularity is demonstrated in the following theorem.

Theorem 1.45. If system (1.1) is metrically regular, then

TX(x0) = {d ∈ Rn : d ∈ TX0(x0), g′(x0)d ∈ TY0(g(x0))}.

Lemma 1.46. Assume that there exists a point xMF ∈ int X0 such that

⟨∇gi(x0), xMF − x0⟩ < 0, i ∈ I0(x0),

⟨∇hi(x0), xMF − x0⟩ = 0, i = 1, …, p,

and that the gradients ∇hi(x0), i = 1, …, p, are linearly independent. Then system

In this case it is equivalent to metric regularity.

Lemma 1.48. System (1.2) with X0 = Rn satisfies the Mangasarian–Fromovitz constraint qualification (MFCQ) at a point x0 if and only if it is metrically regular at x0.

1.5 Optimality conditions for smooth problems

Consider the constrained optimization problem

minimize f(x) subject to x ∈ X, (1.3)

with a differentiable function f : Rn → R and a set X ⊂ Rn.

Theorem 1.49. Assume that x̂ is a local minimum of problem (1.3) and that f(·) is differentiable at x̂. Let TX(x̂) be the tangent cone to the set X at x̂. Then

−∇f(x̂) ∈ [TX(x̂)]◦. (1.4)

Conversely, if the function f(·) is convex, the set X is convex, and a point x̂ ∈ X satisfies relation (1.4), then x̂ is a global minimum of problem (1.3).


Proof. Suppose our assertion is false:

−∇f(x̂) ∉ [TX(x̂)]◦.

This means that there exists a direction d ∈ TX(x̂) such that ⟨∇f(x̂), d⟩ < 0, a contradiction. Therefore, relation (1.4) is valid.

Assume now that the function f(·) and the set X are convex, and that (1.4) is satisfied at a point x̂ ∈ X. Since the set X is convex, for every y ∈ X the direction

d = y − x̂


is a tangent direction for X at x̂. Thus, condition (1.4) implies that

⟨∇f(x̂), y − x̂⟩ ≥ 0.

Since the function f(·) is convex,

f(y) ≥ f(x̂) + ⟨∇f(x̂), y − x̂⟩.

Therefore f(y) ≥ f(x̂) for all y ∈ X, as required.
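For convex X, the proof shows that (1.4) amounts to the variational inequality ⟨∇f(x̂), y − x̂⟩ ≥ 0 for all y ∈ X, which can be sampled numerically. A sketch, assuming Python with NumPy, for f(x) = ‖x − a‖² minimized over the unit ball, whose minimizer is the projection of a:

```python
import numpy as np

# f(x) = ||x - a||^2 over the unit ball X; the minimizer is the projection of a.
a = np.array([3.0, 4.0])
xhat = a / np.linalg.norm(a)             # = (0.6, 0.8), projection of a onto X
grad = 2.0 * (xhat - a)                  # gradient of f at xhat

# Condition (1.4) with X convex: <grad f(xhat), y - xhat> >= 0 for all y in X
rng = np.random.default_rng(1)
for _ in range(1000):
    y = rng.normal(size=2)
    y = y / max(1.0, np.linalg.norm(y))  # pull y into the unit ball
    assert np.dot(grad, y - xhat) >= -1e-9
```

The inequality holds with equality only at y = x̂, since −∇f(x̂) points along x̂, the outward normal direction of the ball at that point.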

Consider the nonlinear optimization problem

minimize f(x)
subject to
gi(x) ≤ 0, i = 1, …, m,
hi(x) = 0, i = 1, …, p,
x ∈ X0. (1.7)

We assume that the functions f : Rn → R, gi : Rn → R, i = 1, …, m, and hi : Rn → R, i = 1, …, p, are continuously differentiable, and that the set X0 ⊂ Rn is convex and closed. The feasible set of this problem is denoted by X.

The optimality condition of Theorem 1.49 for problem (1.7) involves the tangent cone to the feasible set X at the optimal point x̂. We established that if the system of constraints of problem (1.7) is metrically regular at the point x̂, then the tangent cone to the feasible set X has the form (1.8). Then there exist multipliers λ̂i ≥ 0, i = 1, …, m, and μ̂i ∈ R, i = 1, …, p, satisfying (1.9) and

λ̂i gi(x̂) = 0, i = 1, …, m. (1.10)


Proof. By the constraint qualification condition, the cone TX(x̂) defined by (1.8) is the tangent cone to the feasible set X at x̂. Then, by virtue of Theorem 1.49,

−∇f(x̂) ∈ (TX(x̂))◦.

It remains to describe the polar to the tangent cone. Assume for simplicity that I0(x̂) = {1, …, m0}. Robinson's condition implies that the assumptions of Theorem 1.34 are satisfied with A the matrix with rows

(∇g1(x̂))T, …, (∇gm0(x̂))T, (∇h1(x̂))T, …, (∇hp(x̂))T.

Lemma 1.51. Let x̂ be a local minimum of problem (1.7) and let Λ̂(x̂) be the set of Lagrange multipliers λ̂ ∈ Rm+ and μ̂ ∈ Rp satisfying (1.9)–(1.10).

(i) The set Λ̂(x̂) is convex and closed.

(ii) If problem (1.7) satisfies Robinson's condition at x̂, then the set Λ̂(x̂) is also bounded.

Theorem 1.52. Assume that the functions f(·) and gi(·), i = 1, …, m, are convex and the functions hi(·), i = 1, …, p, are affine. If the point x̂ ∈ X and multipliers λ̂i ≥ 0, i = 1, …, m, and μ̂i ∈ R, i = 1, …, p, satisfy conditions (1.9)–(1.10), then x̂ is a global minimum of problem (1.7).


Proof. It follows from the assumptions that the Lagrangian L(x, λ̂, μ̂) is convex with respect to x. We have

L(x̂, λ̂, μ̂) ≤ L(x, λ̂, μ̂) for all x ∈ X0. (1.11)

At feasible points of problem (1.7) we have

L(x, λ̂, μ̂) ≤ f(x),

and at the point x̂ condition (1.10) implies that

f(x̂) = L(x̂, λ̂, μ̂).

Hence f(x̂) ≤ f(x) for all x ∈ X.
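Theorem 1.52 can be illustrated on a hypothetical instance of (1.7) small enough to check by hand: minimize x1² + x2² subject to the single affine inequality 1 − x1 − x2 ≤ 0 (no equality constraints, X0 = R²). The point x̂ = (0.5, 0.5) with multiplier λ̂ = 1 satisfies stationarity and complementarity, so by the theorem it is a global minimum. A sketch assuming Python with NumPy:

```python
import numpy as np

# min x1^2 + x2^2  s.t.  g(x) = 1 - x1 - x2 <= 0  (illustrative instance of (1.7))
xhat = np.array([0.5, 0.5])
lam = 1.0

grad_f = 2.0 * xhat                      # = (1, 1)
grad_g = np.array([-1.0, -1.0])          # gradient of the affine constraint
g = 1.0 - xhat.sum()                     # = 0: the constraint is active

assert np.allclose(grad_f + lam * grad_g, 0.0)    # stationarity, cf. (1.9)
assert lam >= 0 and abs(lam * g) < 1e-12          # complementarity (1.10)
# f is convex and g is affine, so Theorem 1.52 certifies a global minimum.
```

Because f and g are convex here, the multiplier check is not just necessary but sufficient, which is exactly the content of the theorem.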


Chapter 2

Approximate Karush–Kuhn–Tucker optimality conditions

• If v ∈ Rn, we denote v+ = (max{v1, 0}, …, max{vn, 0})T.

• If v ∈ Rn, we denote v− = (min{v1, 0}, …, min{vn, 0})T.

• A ⊂ B means that the set A is contained in B.

• B(x, δ) = {z ∈ Rn : ‖z − x‖ ≤ δ}.

• PΩ(x) is the Euclidean projection of x on Ω.


2.1 Approximate-KKT conditions

We consider the nonlinear programming problem in the form

minimize f(x) subject to h(x) = 0, g(x) ≤ 0, (2.1)

where f : Rn → R, h : Rn → Rm, g : Rn → Rp are smooth.

Definition 2.1. Let I ⊂ {1, …, p}. We say that I satisfies the sufficient interior property if for every feasible point x there exists a sequence of feasible points zk such that zk → x and gi(zk) < 0 for all i ∈ I.

Note that I = ∅ satisfies the sufficient interior property.

Definition 2.2. Let I satisfy the sufficient interior property. The feasible point x∗ fulfills the Approximate-KKT condition associated with I (AKKT(I)) if there exists a sequence xk that converges to x∗ and satisfies the following:

For all k ∈ N there exist λk ∈ Rm, μk ∈ Rp+ such that

lim_{k→∞} [∇f(xk) + ∇h(xk)λk + ∇g(xk)μk] = 0, (2.2)

μki = 0 for all i such that gi(x∗) < 0. (2.3)

Remark 2.3. We will write AKKT = AKKT(∅) from now on.
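A classical one-dimensional example shows why AKKT is genuinely weaker than KKT: for minimize f(x) = x subject to g(x) = x² ≤ 0, the unique feasible point x∗ = 0 is optimal but not KKT (no μ ≥ 0 gives 1 + 2μ·0 = 0), yet the sequence xk = −1/k with μk = k/2 drives the Lagrangian gradient in (2.2) exactly to zero. A numerical sketch, assuming Python:

```python
# min f(x) = x  s.t.  g(x) = x^2 <= 0; the only feasible point is x* = 0.
# KKT fails at 0 (grad f = 1, mu * grad g = 2*mu*0 = 0), but AKKT holds.
for k in (1, 10, 100, 1000):
    xk = -1.0 / k                     # x^k -> x* = 0 (not feasible, not required)
    muk = k / 2.0                     # mu^k >= 0
    residual = 1.0 + muk * 2.0 * xk   # grad f + mu^k * grad g(x^k), cf. (2.2)
    assert abs(residual) < 1e-12
# g(x*) = 0, so condition (2.3) places no restriction on mu^k here.
```

Note that the AKKT sequence is not required to be feasible and the multipliers μk are unbounded, which is exactly what allows the condition to hold where KKT fails.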

Lemma 2.4. A feasible point x∗ satisfies AKKT(I) if and only if there exist sequences

εk = max {‖∇f(xk) + ∇h(xk)λk + ∇g(xk)μk‖, −gj(xk), j ∈ J}. (2.8)
