Research on Support Vector Machine Ensembles Based on the Artificial Bee Colony Algorithm
Keywords: artificial bee colony algorithm; feature selection; support vector machine; simultaneous optimization; ensemble learning  Source: Hubei University of Technology, 2017 master's thesis  Document type: degree thesis
【Abstract】: The support vector machine (SVM) is a machine learning technique built on statistical learning theory and well suited to small-sample problems; it has been widely applied across pattern recognition. The performance of an SVM classifier depends strongly on its own parameters and on the features it uses. Traditional approaches treat parameter tuning and feature selection as separate problems, which makes it difficult to obtain an SVM whose overall classification performance is optimal; as optimization techniques have spread through pattern recognition, jointly optimizing parameters and features has become a clear trend. In addition, because practical problems are complex, the generalization ability of the SVM needs further improvement. Ensemble learning offers a new route to better generalization: by training and combining multiple diverse classifiers it improves classifier performance, and although good progress has been made, the related work is not yet complete and deserves further study. Starting from this situation, this thesis studies the use of the artificial bee colony (ABC) algorithm for the simultaneous optimization of SVM parameters and features, and for SVM ensemble construction. First, the ABC algorithm is applied to feature selection and to SVM parameter optimization. The parameter-tuning and feature-selection problems are then treated as a single optimization problem and solved simultaneously, improving SVM classification accuracy while selecting as few features as possible, so as to obtain the SVM parameters and feature subset with the best overall performance. To further improve the generalization ability of the SVM classification system, weighted-voting ensemble learning is introduced on top of the simultaneous parameter-feature optimization: several SVM classifiers are constructed and trained to obtain a set of diverse SVM members, the voting weight of each member is set to the ratio of its classification accuracy to the total number of classifiers, and the members are combined under a weighted voting rule in the hope of achieving better ensemble classification performance. To verify the proposed methods, experiments are carried out on several UCI data sets, and the ABC algorithm is compared with the widely used genetic algorithm and particle swarm optimization algorithm. The experimental results show that ABC performs better than the genetic algorithm and particle swarm optimization when optimizing the SVM classifier; furthermore, the weighted-voting ensemble based on ABC-SVM shows good adaptability and classification accuracy, improving the performance of the base SVM classifiers while selecting fewer features and obtaining the SVM parameters and feature subset with the best overall performance.
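To make the two ideas in the abstract concrete, the following is a minimal, hypothetical Python sketch (not the thesis code): the candidate-solution encoding and penalised fitness used to tune the SVM parameters (C, gamma) and a feature mask simultaneously, and a weighted-voting combination whose member weights follow the stated rule (accuracy divided by the number of classifiers). The ABC employed-, onlooker- and scout-bee phases are replaced here by a plain random search, and the data set, penalty coefficient `alpha`, search bounds and bootstrap-based diversity are all illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)   # stand-in for a UCI data set
n_features = X.shape[1]

def decode(solution):
    """Split one candidate solution into SVM parameters and a feature mask."""
    C = 10.0 ** solution[0]        # solution[0] in [-2, 3]  ->  C in [0.01, 1000]
    gamma = 10.0 ** solution[1]    # solution[1] in [-4, 1]  ->  gamma in [1e-4, 10]
    mask = solution[2:] > 0.5      # one real value per feature, thresholded to on/off
    return C, gamma, mask

def fitness(solution, X, y, alpha=0.01):
    """Joint objective: maximise CV accuracy while penalising large feature subsets."""
    C, gamma, mask = decode(solution)
    if not mask.any():             # an empty feature subset is infeasible
        return 0.0
    acc = cross_val_score(SVC(C=C, gamma=gamma), X[:, mask], y, cv=5).mean()
    return acc - alpha * mask.sum() / n_features

def search(X, y, n_candidates=20):
    """Placeholder for the ABC food-source search: random candidates, keep the best."""
    best_sol, best_fit = None, -np.inf
    for _ in range(n_candidates):
        sol = np.concatenate([rng.uniform(-2, 3, 1),
                              rng.uniform(-4, 1, 1),
                              rng.uniform(0, 1, n_features)])
        fit = fitness(sol, X, y)
        if fit > best_fit:
            best_sol, best_fit = sol, fit
    return best_sol

# Build a small weighted-voting ensemble; each member is trained on a bootstrap
# sample for diversity, and its weight is accuracy / number of classifiers,
# matching the weighting rule quoted in the abstract.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
n_members = 3
members = []
for _ in range(n_members):
    idx = rng.choice(len(X_tr), size=len(X_tr), replace=True)
    C, gamma, mask = decode(search(X_tr[idx], y_tr[idx]))
    clf = SVC(C=C, gamma=gamma).fit(X_tr[idx][:, mask], y_tr[idx])
    weight = clf.score(X_tr[:, mask], y_tr) / n_members
    members.append((clf, mask, weight))

def ensemble_predict(X):
    """Accumulate weighted votes per class and return the highest-scoring class."""
    votes = np.zeros((len(X), 2))              # binary task: one column per class
    for clf, mask, weight in members:
        votes[np.arange(len(X)), clf.predict(X[:, mask])] += weight
    return votes.argmax(axis=1)

print("weighted-vote ensemble accuracy:", (ensemble_predict(X_te) == y_te).mean())
```

In the thesis itself the random `search` stage would be replaced by the ABC update loop over food sources, with the same decoding and fitness definition.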
【學(xué)位授予單位】:湖北工業(yè)大學(xué)
【學(xué)位級(jí)別】:碩士
【學(xué)位授予年份】:2017
【分類(lèi)號(hào)】:TP18