An Online SVM Algorithm with Stochastic Gradient Descent for Imbalanced Data
Published: 2018-09-14 17:30
【Abstract】: Stochastic gradient descent (SGD) has been applied to large-scale support vector machine (SVM) training. Because SGD selects training points at random, on imbalanced classification problems majority-class points are drawn far more often than minority-class points, which introduces a computational bias. To handle large-scale imbalanced classification, this thesis proposes a weighted stochastic gradient descent online SVM algorithm: samples from the majority class are assigned smaller weights and samples from the minority class are assigned larger weights, and the SVM primal problem is then solved with weighted SGD. This reduces the shift of the separating hyperplane toward the minority class and handles imbalanced data well in large-scale learning. In addition, a binary logistic loss function, alternative weighting schemes, and mini-batch gradient descent are adopted to further improve the accuracy and stability of the algorithm on imbalanced data.
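To make the method described in the abstract concrete, the following is a minimal Python sketch (not the thesis's actual implementation): the SVM primal is minimized with mini-batch SGD using a class-weighted hinge loss, so that minority-class samples receive larger weights. The inverse-frequency weighting, the Pegasos-style step size, the omission of a bias term, and the names class_weights / weighted_sgd_svm are all illustrative assumptions; the binary logistic loss and other weighting schemes mentioned in the abstract are not shown here.

import numpy as np

def class_weights(y):
    # Inverse-frequency weight per class (an illustrative choice,
    # not necessarily the weighting used in the thesis).
    classes, counts = np.unique(y, return_counts=True)
    return dict(zip(classes, len(y) / (len(classes) * counts)))

def weighted_sgd_svm(X, y, lam=0.01, epochs=20, batch_size=16, seed=0):
    # Mini-batch SGD on a class-weighted hinge-loss SVM primal:
    #   min_w  (lam/2)*||w||^2 + (1/n) * sum_i c_i * max(0, 1 - y_i * w.x_i),
    # where c_i is larger for minority-class samples. No bias term.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    cw = class_weights(y)
    c = np.array([cw[label] for label in y])
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for batch in np.array_split(rng.permutation(n), max(1, n // batch_size)):
            t += 1
            eta = 1.0 / (lam * t)                    # Pegasos-style step size
            margins = y[batch] * (X[batch] @ w)
            active = margins < 1                     # margin violators
            grad = lam * w
            if active.any():
                grad -= (c[batch][active, None] * y[batch][active, None]
                         * X[batch][active]).sum(axis=0) / len(batch)
            w -= eta * grad
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    # Toy imbalanced problem: 950 majority (-1) points vs. 50 minority (+1) points.
    X = np.vstack([rng.normal(-1.0, 1.0, (950, 2)), rng.normal(+1.0, 1.0, (50, 2))])
    y = np.concatenate([-np.ones(950), np.ones(50)])
    w = weighted_sgd_svm(X, y)
    pred = np.sign(X @ w)
    print("minority recall:", (pred[y == 1] == 1).mean())

Without the per-sample weights c_i, the same SGD loop tends to place the hyperplane so that minority-class points are misclassified more often, which is the imbalance effect the weighting is meant to counteract.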
【Degree-granting institution】: Hebei University
【Degree level】: Master's
【Year of degree conferral】: 2017
【Classification number】: TP181
Article ID: 2243391
Link to this article: http://sikaile.net/kejilunwen/zidonghuakongzhilunwen/2243391.html