
Pig Behavior Recognition Based on Spatio-Temporal Interest Points

Published: 2019-05-09 04:37
[Abstract]: Computer-vision-based analysis of animal behavior parameters has significant research value and broad application prospects, and it stands to have a far-reaching influence on traditional livestock and poultry farming. In most large-scale pig farms today, pig-rearing video is still monitored manually, an approach with poor real-time performance that is prone to false and missed detections caused by operator fatigue. To address this problem, this thesis takes pigs on large-scale farms as its research object and focuses on algorithms for describing and recognizing pig behavior based on spatio-temporal interest points and the bag-of-words model. Against the background of the Guangdong Province Science and Technology Program project "Intelligent identification and abnormal early warning system for pig feeding and excretion behavior (2012A020602043)" led by the author's supervisor, and in view of the practical conditions of large-scale pig farms, a pig monitoring method based on network cameras and a pig behavior description method based on local representation were designed, and recognition of several principal pig behaviors was achieved. For behavior description, experimental analysis showed that holistic-representation methods such as frame differencing, Gaussian mixture background modeling, and optical flow cannot segment pig contours accurately and are insufficiently robust to image noise and partial occlusion of the pigs, so a local-representation method was chosen to describe pig behavior. Drawing on pig biology and ethology and on the needs of current healthy farming practice, four behaviors were selected for recognition: huddling for warmth, feeding, exploring, and slow walking. An experimental comparison of the Harris and SUSAN corner detectors on pig images showed that the Harris detector finds interest points on pigs more effectively than SUSAN, so the Harris3D spatio-temporal interest point detector was adopted to detect strongly varying pixels in pig behavior video; comparative experiments, combined with the practical conditions of pig video surveillance, showed that detection works best when the video pyramid has three levels in total. Comparing the spatio-temporal interest point distributions of the huddling, feeding, exploring, and slow-walking behaviors showed that Harris3D interest points capture the moving body parts and the characteristic patterns of each of the four behaviors well. To accumulate gradient-orientation and optical-flow-orientation statistics in the neighborhood of each interest point, HOG/HOF descriptors computed over a (Δx, Δy, Δt) spatio-temporal volume centered on the interest point are used to describe the pigs' local spatio-temporal features. For behavior modeling and classification, the bag-of-words model was used to model the four behaviors, with K-means clustering of the HOG/HOF descriptors and the mapping of descriptors onto the visual dictionary implemented in Matlab. After clustering and mapping, average word-frequency histograms of the four behaviors were obtained; their analysis showed that the four behaviors differ markedly from one another, indicating that pig behavior recognition based on Harris3D spatio-temporal interest points and the bag-of-words model is feasible and that high recognition accuracy can be expected. The word-frequency histogram vectors modeling pig behavior were then used as feature vectors for an SVM in behavior classification experiments. Finally, video was collected over a total of five days at two different large-scale pig farms in Conghua and Tianhe, Guangzhou. The experimental results show that the proposed pig behavior recognition algorithm reaches an accuracy of 92.31% and recognizes pig behaviors well, with the best performance obtained at a dictionary size of 100.
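The thesis record itself contains no implementation code; purely as an illustration of the Harris3D response that underlies the interest point detection described above, a minimal MATLAB sketch might look as follows. The scale parameters, the constant k, and the use of imgaussfilt3 from the Image Processing Toolbox are assumptions for this sketch, not details taken from the thesis.

% Minimal sketch of a Harris3D (spatio-temporal Harris) response map.
% V is a grayscale video volume of size [rows, cols, frames], double in [0,1].
% sigma (spatial scale), tau (temporal scale) and k are illustrative values,
% not the settings used in the thesis.
function H = harris3d_response(V, sigma, tau, k)
    % Derivative-scale smoothing with separate spatial and temporal scales.
    L = imgaussfilt3(V, [sigma sigma tau]);

    % Spatio-temporal gradients: gradient() differentiates along
    % columns (x), rows (y) and the third dimension (t).
    [Lx, Ly, Lt] = gradient(L);

    % Second-moment (structure tensor) entries, smoothed at an
    % integration scale (here simply twice the derivative scale).
    s = 2 * [sigma sigma tau];
    A = imgaussfilt3(Lx.^2,  s);   % <Lx^2>
    B = imgaussfilt3(Lx.*Ly, s);   % <Lx*Ly>
    C = imgaussfilt3(Lx.*Lt, s);   % <Lx*Lt>
    D = imgaussfilt3(Ly.^2,  s);   % <Ly^2>
    E = imgaussfilt3(Ly.*Lt, s);   % <Ly*Lt>
    F = imgaussfilt3(Lt.^2,  s);   % <Lt^2>

    % Harris3D response: det(mu) - k * trace(mu)^3, computed voxel-wise.
    detMu   = A.*(D.*F - E.^2) - B.*(B.*F - C.*E) + C.*(B.*E - C.*D);
    traceMu = A + D + F;
    H = detMu - k * traceMu.^3;
end

Interest points would then be taken as local maxima of H above a threshold; per the abstract, in the thesis this detection is additionally run over a three-level video pyramid.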
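Similarly, the bag-of-words and SVM stages mentioned above can be sketched in MATLAB roughly as below. This is a hypothetical illustration assuming the Statistics and Machine Learning Toolbox; the descriptor cell arrays, labels, and K-means settings are placeholders, with the dictionary size k = 100 matching the value reported in the abstract.

% Sketch of the bag-of-words and SVM stage (illustrative, not the thesis code).
% trainDesc{i} / testDesc{i}: N_i-by-D HOG/HOF descriptor matrices, one cell per
% video clip; trainLabels / testLabels: numeric or categorical class labels per clip.

k = 100;                                         % visual dictionary size

% 1) Build the visual dictionary by K-means clustering of all training descriptors.
allDesc    = cell2mat(trainDesc(:));
[~, vocab] = kmeans(allDesc, k, 'MaxIter', 200, 'Replicates', 3);

% 2) Map each clip's descriptors to their nearest visual words and build a
%    normalized word-frequency histogram per clip.
toHist    = @(D) accumarray(knnsearch(vocab, D), 1, [k 1])' / size(D, 1);
trainHist = cell2mat(cellfun(toHist, trainDesc(:), 'UniformOutput', false));
testHist  = cell2mat(cellfun(toHist, testDesc(:),  'UniformOutput', false));

% 3) Train a multi-class SVM on the histograms and classify the test clips.
model     = fitcecoc(trainHist, trainLabels);    % one-vs-one linear SVMs by default
predicted = predict(model, testHist);
accuracy  = mean(predicted == testLabels);

The normalized word-frequency histograms play the role of the SVM feature vectors described in the abstract, which reports 92.31% recognition accuracy with a dictionary of 100 visual words.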
[Degree-granting institution]: South China Agricultural University
[Degree level]: Master's
[Year conferred]: 2016
[CLC classification]: TP391.41; S828





Link to this article: http://sikaile.net/yixuelunwen/dongwuyixue/2472479.html


