

Research on Music Genre Classification Based on Acoustic Features and Musical Features


Topic: music genre classification + acoustic features; Source: Jiangnan University, master's thesis, 2014


【Abstract】: Automatic music genre classification is the process of having a computer recognise the genre of digital music samples using methods such as signal processing and pattern recognition; automatic music retrieval and genre classification have become research hotspots in recent years. Music samples are long and highly varied, consisting of multi-channel mixtures of different voices and instruments, so genre classification is a difficult pattern-recognition problem with considerable research and application value. Based on acoustic features and musical features, this thesis studies features for genre classification and their extraction methods. The specific work is as follows:

1. The extraction of the musical beat is studied, and a genre classification method combining beat semantic features with MFCC acoustic features is proposed. The strength, tempo, and duration of the beat reflect important semantic characteristics that distinguish genres, and the beat mostly lies in the low-frequency band produced by percussion instruments, so a 6-level wavelet decomposition of the music signal is used to extract low-frequency beat features. For genres whose beat features differ little, MFCC acoustic features describing the frequency-domain energy envelope are combined with the beat features, and, based on an analysis of genre mechanisms, an 8th-order MFCC is used instead of the commonly used 12th-order MFCC. Simulation experiments on 8 genres show that combining the semantic and acoustic features reaches an overall classification accuracy of 68.4%, while the increased feature dimensionality has little effect on classification time.

2. A genre classification method based on modulation spectrum features obtained from spectrogram separation is studied. Analysis of the time-frequency behaviour of the percussive components that form rhythm and the harmonic components that form melody shows that features extracted directly from the music signal suffer from the mutual interference of these two components. Exploiting their different regularities in the time-frequency plane, the spectrogram is filtered to separate the percussive and harmonic components; wavelet modulation is then applied to each separated spectrogram, yielding modulation spectrum features that capture the regularities of rhythm and melody and serve as long-term, mid-level descriptors of genre. Simulation results show that the separated percussive and harmonic spectrograms represent rhythmic and melodic structure more clearly; extracting percussive and harmonic modulation spectrum features for 8 genres, reducing dimensionality with LDA, and classifying with an SVM gives a classification accuracy of 73.5%.

3. A genre classification method based on multi-scale Gabor image texture features is studied. Typical genre classification systems rely on acoustic features extracted from multi-channel mixed signals, where the interaction of the musical elements degrades classification performance; at the same time, the musical semantic elements appear as clear visual texture in the time-frequency spectrogram, whose density and orientation indirectly reflect rhythm, melody, and other genre characteristics. From an image-processing viewpoint, multi-scale, multi-orientation two-dimensional Gabor texture features are therefore extracted from the spectrogram image to capture the time-frequency characteristics of the music signal from different angles. Experiments on 8 genres show that the Gabor texture features perform comparably to acoustic features, with an overall accuracy of 73.1% and a best of 83.3%.
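For item 1 above, a minimal sketch of the idea, assuming librosa and PyWavelets are available; the wavelet basis (db4), the beat statistics, and the mean/std summary of the MFCCs are illustrative choices, not the thesis's exact implementation:

```python
import numpy as np
import pywt        # PyWavelets, assumed available
import librosa     # assumed available

def beat_mfcc_features(path, sr=22050):
    """Combine low-frequency beat statistics (6-level wavelet
    decomposition) with 8 MFCCs, as outlined in item 1 of the abstract.
    Parameter choices are illustrative."""
    y, sr = librosa.load(path, sr=sr, mono=True)

    # 6-level discrete wavelet decomposition; the level-6 approximation
    # keeps roughly the 0-170 Hz band (at 22.05 kHz), where percussive
    # beats dominate.
    coeffs = pywt.wavedec(y, 'db4', level=6)
    low = coeffs[0]

    # Crude beat statistics from the low-frequency envelope: the first
    # autocorrelation peak gives a tempo-like lag and a strength value.
    env = np.abs(low) - np.abs(low).mean()
    ac = np.correlate(env, env, mode='full')[len(env) - 1:]
    ac /= ac[0] + 1e-12
    lag = int(np.argmax(ac[1:])) + 1
    beat_feats = np.array([lag / len(env), ac[lag], env.std()])

    # 8th-order MFCC instead of the usual 12, summarised by mean and std.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=8)
    return np.concatenate([beat_feats, mfcc.mean(axis=1), mfcc.std(axis=1)])
```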
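For item 2, a rough sketch under stated assumptions: librosa's median-filtering HPSS stands in for the thesis's spectrogram filtering, an FFT along the time axis replaces the wavelet modulation described in the abstract, and scikit-learn supplies the LDA-then-SVM pipeline; the STFT and mel parameters are illustrative.

```python
import numpy as np
import librosa
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def modulation_features(path, sr=22050, n_mels=40, n_mod_bins=64):
    """Percussive/harmonic modulation-spectrum features (rough sketch)."""
    y, _ = librosa.load(path, sr=sr, mono=True)
    S = np.abs(librosa.stft(y, n_fft=1024, hop_length=512))

    # Median-filtering HPSS as a stand-in for the spectrogram filtering
    # used to separate percussive and harmonic components.
    harmonic, percussive = librosa.decompose.hpss(S)

    feats = []
    for comp in (percussive, harmonic):
        mel = np.log1p(librosa.feature.melspectrogram(S=comp**2, sr=sr,
                                                      n_mels=n_mels))
        # Modulation spectrum: FFT along time for each mel band, then
        # average the magnitudes over bands (an FFT stand-in for the
        # wavelet modulation in the abstract). Assumes clips long enough
        # to yield at least n_mod_bins modulation bins.
        mod = np.abs(np.fft.rfft(mel, axis=1)).mean(axis=0)
        feats.append(mod[:n_mod_bins])
    return np.concatenate(feats)

# LDA dimensionality reduction followed by an SVM, matching the abstract's
# pipeline; with 8 genre classes LDA can keep at most 7 components.
clf = make_pipeline(StandardScaler(),
                    LinearDiscriminantAnalysis(n_components=7),
                    SVC(kernel='rbf'))
# clf.fit(X_train, y_train); accuracy = clf.score(X_test, y_test)
```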
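For item 3, a minimal sketch of multi-scale, multi-orientation Gabor texture extraction from a log-mel spectrogram image, assuming librosa, SciPy, and scikit-image; the filter frequencies, the four orientations, and the mean/variance texture statistics are illustrative, since the abstract does not specify the thesis's exact filter bank.

```python
import numpy as np
import librosa
from scipy import ndimage
from skimage.filters import gabor_kernel   # scikit-image, assumed available

def gabor_texture_features(path, sr=22050,
                           frequencies=(0.1, 0.2, 0.3, 0.4),
                           n_orientations=4):
    """Multi-scale, multi-orientation Gabor texture statistics of the
    log-mel spectrogram image; filter-bank parameters are illustrative."""
    y, _ = librosa.load(path, sr=sr, mono=True)
    img = np.log1p(librosa.feature.melspectrogram(y=y, sr=sr, n_mels=128))
    img = (img - img.mean()) / (img.std() + 1e-12)   # normalise the "image"

    feats = []
    for freq in frequencies:                    # scales
        for k in range(n_orientations):         # orientations
            theta = k * np.pi / n_orientations
            kern = gabor_kernel(frequency=freq, theta=theta)
            # Complex Gabor response magnitude over the spectrogram image.
            resp_r = ndimage.convolve(img, np.real(kern))
            resp_i = ndimage.convolve(img, np.imag(kern))
            mag = np.hypot(resp_r, resp_i)
            # Classic texture statistics per filter: mean and variance.
            feats.extend([mag.mean(), mag.var()])
    return np.asarray(feats)
```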
【學(xué)位授予單位】:江南大學(xué)
【學(xué)位級別】:碩士
【學(xué)位授予年份】:2014
【分類號】:J61;TP391.41



