Hybrid Conjugate Gradient Algorithms for Solving Unconstrained Optimization Problems
Published: 2018-06-09 14:20
Topics: unconstrained optimization + hybrid conjugate gradient method; Source: Southwest University, 2017 master's thesis
Abstract: The conjugate gradient method is a widely used and highly effective class of iterative algorithms for solving large-scale unconstrained optimization problems. Compared with Newton and quasi-Newton methods, its notable advantages are algorithmic simplicity and low storage requirements. It is well known that the classical conjugate gradient methods differ in both their global convergence properties and their numerical performance. Naturally, many researchers have tried to construct new algorithms that combine good global convergence with excellent numerical performance. One approach is to directly improve the classical conjugate parameter βk. Another is to effectively hybridize a conjugate parameter βk with good convergence properties and one with good numerical performance. This thesis mainly pursues the latter approach. Recently, researchers have proposed several hybrid conjugate gradient methods and obtained good results. Inspired by that work, this thesis proposes two new classes of hybrid conjugate gradient methods, analyzes their properties and global convergence, and reports extensive numerical results. The main results are as follows:
1. Inspired by Dai and Wen (Applied Mathematics and Computation, 2012, 218(14): 7421-7430), Jian et al. (Applied Mathematical Modelling, 2015, 39(3): 1281-1290), and Wei et al. (Applied Mathematics and Computation, 2006, 183(2): 1341-1350), this thesis proposes the NHC method and gives a formula for the new conjugate parameter βkNHC. The conjugate parameter βkNHC has the property (?). Regardless of the line search strategy used, the NHC method generates a sufficient descent direction at every iteration. Moreover, under the standard Wolfe line search, the proposed algorithm converges globally. Finally, extensive numerical experiments were carried out, and the results show that the proposed algorithm has good computational performance.
2. Inspired by Dai and Wen (Applied Mathematics and Computation, 2012, 218(14): 7421-7430) and Wei et al. (Applied Mathematics and Computation, 2006, 179(2): 407-430), we propose a new class of hybrid conjugate gradient methods for unconstrained optimization, the HZW method. The conjugate parameter βkHZW of the HZW method satisfies 0 ≤ βkHZW ≤ βkFR. Moreover, the HZW method generates a sufficient descent direction at every iteration. Under the standard Wolfe line search, the HZW method is globally convergent. In addition, numerical experiments show that our algorithm is effective and feasible.
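The abstract does not reproduce the actual formulas for βkNHC or βkHZW, so the following Python sketch is only an illustration of the general hybrid conjugate gradient framework the thesis works in: it clips the Polak-Ribière-Polyak parameter into the interval [0, βFR] (the same kind of interval that βkHZW is stated to satisfy), uses a standard Wolfe line search, and restarts with steepest descent whenever the generated direction fails to be a descent direction. The function names, the specific hybrid rule, and the test problem are illustrative assumptions, not the thesis's actual NHC or HZW algorithms.

```python
import numpy as np

def wolfe_search(f, grad, x, d, c1=1e-4, c2=0.4, max_iter=60):
    """Bisection-style search for a step satisfying the standard Wolfe conditions."""
    f0, slope0 = f(x), grad(x).dot(d)
    lo, hi, alpha = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * slope0:   # Armijo fails: step too long
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d).dot(d) < c2 * slope0:    # curvature fails: step too short
            lo = alpha
            alpha = 2.0 * alpha if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Hybrid conjugate gradient iteration with a clipped beta and a descent safeguard."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = g_new.dot(g_new) / g.dot(g)         # Fletcher-Reeves parameter
        beta_prp = g_new.dot(g_new - g) / g.dot(g)    # Polak-Ribiere-Polyak parameter
        beta = max(0.0, min(beta_prp, beta_fr))       # hybrid: clip PRP into [0, beta_FR]
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:                       # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Demo on a strictly convex quadratic f(x) = 0.5 x'Ax - b'x with minimizer A^{-1}b = (1, 0.1).
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A @ x) - b.dot(x)
grad = lambda x: A @ x - b
x_star = hybrid_cg(f, grad, np.zeros(2))
```

The clipping rule 0 ≤ β ≤ βFR is one classical way to inherit the global convergence theory of the FR method while retaining the typically better numerical behavior of PRP; the thesis's NHC and HZW parameters are refinements in this same spirit.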
Article ID: 2000027
Link: http://sikaile.net/kejilunwen/yysx/2000027.html