

全景視覺環(huán)境避障測距方法研究

Published: 2018-08-09 15:45
【Abstract】: In the field of autonomous obstacle-avoidance navigation for mobile robots, vision sensors offer many advantages for acquiring information about the surrounding environment: the image information is rich, and multiple vision sensors interfere little with one another when working together. Compared with the narrow field of view of conventional vision, the wide field of view of panoramic vision compensates for that limitation, and panoramic vision is therefore widely used in autonomous robot navigation, 3D reconstruction, video surveillance and other fields. Researchers at home and abroad have studied many applications of panoramic vision, but shortcomings remain. At present most panoramic cameras used for robot obstacle avoidance are single-viewpoint systems, whose images are often severely distorted. A multi-camera panoramic rig can capture a full 360-degree set of images simultaneously with little distortion, yet applications of multi-camera panoramic vision are rare. Finding a stable, effective and convenient ranging method suited to multi-camera panoramic rigs is therefore of real research value.

Our laboratory has also made progress on the Bug obstacle-avoidance algorithm, a simple algorithm that requires the sensor to cover a 360-degree detection range. Using a laser rangefinder with less than 360-degree coverage, the laboratory has implemented a Bug algorithm that bypasses obstacles along a smooth path and reaches the goal. Because full panoramic environment information is unavailable, however, the robot has to turn frequently while avoiding obstacles, which lowers efficiency. To address these problems, this thesis replaces the laser rangefinder with a 360-degree vision sensor and, on the Ladybug3 panoramic camera platform, proposes a panoramic ranging algorithm that fuses monocular and binocular measurement, studying in depth how panoramic ranging for obstacle avoidance can be realized.

To realize the ranging algorithm, the following work was done. (1) The fundamentals of panoramic camera ranging were studied. Camera calibration: after comparing several calibration methods, Zhang Zhengyou's calibration method was adopted. Ranging-image preprocessing: histogram equalization and median filtering are used to enhance contrast and remove noise. Stereo matching: several matching methods were analysed, and an improved SURF matching method is used to extract the coordinates of feature-point pairs, improving matching robustness (these steps are sketched below). (2) The principle of fused monocular-binocular panoramic ranging is set out. First, how to decide whether an obstacle lies in an overlapping or a non-overlapping region of adjacent views is discussed; second, a ranging scheme is fixed that uses binocular ranging in overlapping regions and monocular ranging in non-overlapping regions; third, binocular ranging is realized from the perspective (triangulation) principle and the ranging formula is derived in detail; fourth, monocular ranging is realized with a nonlinear regression model, and the procedure for building that model is described (see the second sketch below). (3) Monocular and binocular ranging experiments were completed. With the school playground as the test site, calibration, ranging-image preprocessing, stereo-matching, binocular ranging and monocular ranging experiments were carried out, and the results and errors were compared and analysed. The binocular ranging error lies between 1.08% and 4.48%, with the maximum error of 4.48% occurring at 4 m. The monocular ranging error lies between 0.27% and 12.57%, with the maximum error of 12.57% occurring at 1.4 m; the monocular error varies fairly randomly and does not grow with distance, but it is clearly larger than the binocular error. All of these errors are within an acceptable range and can be used for obstacle avoidance.
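To make the calibration, preprocessing and matching steps in item (1) concrete, here is a minimal OpenCV sketch, assuming a checkerboard calibration target and an opencv-contrib build with the non-free SURF module enabled. The board size, square size, Hessian threshold and ratio-test threshold are illustrative assumptions rather than the settings used in the thesis, and the panorama-specific details of the Ladybug3 pipeline are omitted.

```python
# Illustrative sketch only: calibration, preprocessing and SURF matching with OpenCV.
import glob

import cv2
import numpy as np


def calibrate_camera(image_glob, board_size=(9, 6), square_size=0.025):
    """Zhang-style calibration from checkerboard photos via cv2.calibrateCamera."""
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size
    obj_points, img_points, image_size = [], [], None
    for path in glob.glob(image_glob):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        image_size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            obj_points.append(objp)
            img_points.append(corners)
    # Returns the intrinsic matrix K and the lens distortion coefficients.
    _, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
    return K, dist


def preprocess(gray):
    """Histogram equalization for contrast, then a median filter to suppress noise."""
    return cv2.medianBlur(cv2.equalizeHist(gray), 5)


def match_surf(left_gray, right_gray, ratio=0.7):
    """SURF keypoints plus ratio-test matching (needs an opencv-contrib non-free build)."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp1, des1 = surf.detectAndCompute(left_gray, None)
    kp2, des2 = surf.detectAndCompute(right_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
            if m.distance < ratio * n.distance]
    pts_left = np.float32([kp1[m.queryIdx].pt for m in good])
    pts_right = np.float32([kp2[m.trainIdx].pt for m in good])
    return pts_left, pts_right
```

Under Zhang's method, several checkerboard views are enough for cv2.calibrateCamera to estimate the intrinsic matrix and distortion coefficients; the matched point pairs returned by match_surf are what the triangulation step in the next sketch consumes.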
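The two ranging branches in item (2) can be sketched as follows. The binocular branch uses the standard pinhole triangulation relation Z = f·B/d (focal length in pixels, baseline, disparity); the monocular branch fits a nonlinear regression model to samples collected at known distances. The power-law model form, the baseline value and every number in the usage example are assumptions for illustration only; the thesis does not state the exact regression form it adopts.

```python
# Illustrative sketch only: the binocular and monocular ranging branches.
import numpy as np
from scipy.optimize import curve_fit


def binocular_depth(disparity_px, focal_px, baseline_m):
    """Pinhole triangulation: Z = f * B / d, with f in pixels, B in metres, d in pixels."""
    return focal_px * baseline_m / disparity_px


def monocular_model(pixel_height, a, b):
    """Assumed power-law relation between an object's image size and its distance;
    the thesis only says a nonlinear regression model is used, not which form."""
    return a * np.power(pixel_height, b)


def fit_monocular_model(pixel_heights, distances_m):
    """Fit the regression model on samples collected at known ground-truth distances."""
    params, _ = curve_fit(monocular_model, pixel_heights, distances_m, p0=(200.0, -1.0))
    return params


if __name__ == "__main__":
    # All numbers below are made up for illustration, not data from the thesis.
    print(binocular_depth(disparity_px=25.0, focal_px=800.0, baseline_m=0.12))  # ~3.84 m
    heights = np.array([220.0, 150.0, 110.0, 90.0, 75.0])  # apparent height in pixels
    dists = np.array([1.0, 1.5, 2.0, 2.5, 3.0])            # true distance in metres
    a, b = fit_monocular_model(heights, dists)
    print(monocular_model(120.0, a, b))                    # predicted distance at 120 px
```

A binocular reading is preferred wherever a feature is seen by two adjacent lenses (the overlapping region); the fitted monocular model covers obstacles visible to only one lens.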
【Degree-granting institution】: South China Agricultural University
【Degree level】: Master's
【Year conferred】: 2016
【CLC classification number】: TP391.41; TP242



Document ID: 2174570


Link to this article: http://sikaile.net/kejilunwen/zidonghuakongzhilunwen/2174570.html

