Modified Conjugate Gradient Algorithms Based on the Conjugate Gradient Parameters of Wei Zengxin et al.
Published: 2018-06-19 02:54
Topics: inexact line search; conjugate gradient method. Source: Master's thesis, Chongqing Normal University, 2015
[Abstract]: Based on the conjugate gradient parameters of Wei Zengxin et al., this thesis proposes several modified conjugate gradient algorithms, establishes convergence theorems for them, and verifies the effectiveness of the proposed algorithms through extensive numerical experiments. Chapter 1 reviews the basics of nonlinear conjugate gradient methods, some existing global convergence results for conjugate gradient methods, the main contributions of this thesis, and several key lemmas and assumptions. Chapter 2, building on the conjugate gradient parameters of Wei Zengxin et al., proposes four modified nonlinear conjugate gradient methods, named NVLS, NVPRP*, NVHS*, and NVLS*. The sufficient descent property and global convergence of each of the NVLS, NVPRP*, NVHS*, and NVLS* methods are proved under the strong Wolfe line search. Numerical results show that NVPRP* outperforms NVPRP, NVHS* outperforms NVHS, and NVLS* outperforms NVLS. Chapter 3 proposes a two-parameter family of conjugate gradient methods, called THCG*, which can be viewed as a convex combination of the NVPRP*, NVHS*, and NVLS* methods of Chapter 2. The sufficient descent property and global convergence of THCG* are proved under the strong Wolfe line search. Numerical results show that THCG* is slightly inferior to NVHS* but outperforms the PRP, NVPRP*, NVLS*, and THCG methods. Chapter 4 modifies the NVPRP*, NVHS*, and NVLS* methods of Chapter 2, yielding the MDPRP*, MDHS*, and MDLS* methods. When μ ≥ 0, the sufficient descent property and global convergence of each of the three methods are proved under the strong Wolfe line search; when μ satisfies a second condition (rendered only as "μ2" in the source), the same properties are proved under the Wolfe line search. Numerical results show that MDPRP* outperforms NVPRP*, MDHS* outperforms NVHS*, and MDLS* outperforms NVLS*. Chapter 5 proposes a modified Dai-Liao conjugate gradient method, called MDL*, whose sufficient descent property and global convergence are proved under the strong Wolfe line search. Numerical results show that MDL* outperforms the PRP, DL, MDL, NVHS*, and DY methods.
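Chapter 5 builds on the Dai-Liao method. For reference, the standard Dai-Liao parameter (the thesis's MDL* is a modification whose exact form the abstract does not give) is

$$\beta_k^{DL} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k} - t\,\frac{g_{k+1}^{\top} s_k}{d_k^{\top} y_k}, \qquad t > 0,$$

where $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and the search direction is updated as $d_{k+1} = -g_{k+1} + \beta_k^{DL} d_k$.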
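The abstract does not reproduce the modified parameter formulas themselves. As background on the framework the thesis works in, here is a minimal sketch of a nonlinear conjugate gradient method in the PRP family, with a simple Armijo backtracking search standing in for the strong Wolfe search assumed in the thesis. All names and constants here are illustrative assumptions, not the thesis's NVPRP* method:

```python
import numpy as np

def prp_beta(g_new, g_old):
    """Classical Polak-Ribiere-Polyak parameter (the thesis studies modified variants)."""
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def armijo_backtrack(f, x, d, fx, slope, c1=1e-4, alpha=1.0, max_halvings=40):
    # Backtracking enforcing only the Armijo (sufficient decrease) half of the
    # strong Wolfe conditions; a full strong Wolfe search would also bound the
    # curvature term |grad f(x + a d)^T d|.
    for _ in range(max_halvings):
        if f(x + alpha * d) <= fx + c1 * alpha * slope:
            break
        alpha *= 0.5
    return alpha

def ncg_prp_plus(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the PRP+ truncation beta = max(beta_PRP, 0)."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo_backtrack(f, x, d, f(x), g @ d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(prp_beta(g_new, g), 0.0)  # truncation helps preserve descent
        d = -g_new + beta * d
        if g_new @ d >= 0.0:                 # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a convex quadratic f(x) = ½xᵀAx − bᵀx this recovers the minimizer A⁻¹b; the "sufficient descent" property proved throughout the thesis asserts gₖᵀdₖ ≤ −c‖gₖ‖² for some constant c > 0, which the PRP+ truncation and restart above only approximate.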
Document ID: 2038065
Link: http://sikaile.net/kejilunwen/yysx/2038065.html