

Research on Key Technologies of a Visual Pose Measurement System for Non-cooperative Targets over Large Viewing Distances

Published: 2019-01-17 09:21
[Abstract]: Missions such as on-orbit servicing of spacecraft, attack-and-defense confrontation, and space-debris removal require measuring the relative position and attitude (pose) of two objects. Because the measurement distance spans a wide range from far to near and no cooperative marker can be mounted on the measured object, pose measurement of a non-cooperative target over a large viewing distance is needed. Vision-based pose measurement systems have attracted wide attention because they are compact, consume little power, and provide real-time images at the same time. However, the scarce prior knowledge available for non-cooperative targets and the loss of accuracy of visual pose measurement at long range prevent such systems from meeting practical requirements. The research difficulties are therefore how to measure pose when little prior knowledge of the measured target is available, and how to improve the measurement accuracy over a large viewing distance. This thesis addresses these difficulties in the following respects.

To enable pose measurement with little prior knowledge of the non-cooperative target, a binocular-vision pose measurement algorithm based on perpendicular line features is proposed. The algorithm only requires knowing that two lines on the measured target are mutually perpendicular; it solves the relative pose of the target using a quaternion formulation together with the perpendicularity constraint of the lines. Simulations show that, under image noise with standard deviations of 0.1 and 0.5 pixels, the algorithm achieves good accuracy over measurement distances of 0.3 m to 25 m.
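The geometric core of this algorithm is that two non-parallel (here, mutually perpendicular) directions known in both the target frame and the camera frame determine the relative rotation. The thesis solves for it with a quaternion formulation; the sketch below is only a minimal TRIAD-style Python illustration of the same fact, with hypothetical function names, and it assumes the two line directions have already been reconstructed in 3-D by the stereo rig.

import numpy as np

def triad_basis(u, v):
    # Orthonormal frame built from two (nominally perpendicular) directions.
    e1 = u / np.linalg.norm(u)
    e3 = np.cross(u, v)
    e3 /= np.linalg.norm(e3)
    e2 = np.cross(e3, e1)
    return np.column_stack([e1, e2, e3])

def rotation_from_perpendicular_lines(d1_cam, d2_cam, d1_tgt, d2_tgt):
    # Rotation R such that d_cam ~ R @ d_tgt for both line directions.
    B_cam = triad_basis(d1_cam, d2_cam)
    B_tgt = triad_basis(d1_tgt, d2_tgt)
    return B_cam @ B_tgt.T

Once the rotation is fixed, the relative translation follows from any 3-D point on the target reconstructed by the stereo pair.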
For the pose measurement accuracy to meet the long-range requirement, the parameters of the pose measurement model must be calibrated with high precision. To handle the large distortion of the wide-angle lens used over large viewing distances, a vanishing-point collinearity constrained optimization is adopted to improve the accuracy of the point-by-point distortion correction based on cross-ratio invariance. The method first uses image points in the weakly distorted central region to compute, point by point via cross-ratio invariance, the ideal positions of the surrounding image points, and then refines these points with the vanishing-point collinearity constraint, which improves the distortion-correction accuracy.
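The point-by-point correction rests on the projective invariance of the cross ratio of four collinear points: three points whose positions are trusted (taken from the weakly distorted central region) fix where a fourth point on the same line ought to lie. Below is a minimal one-dimensional sketch of that bookkeeping in Python; the names are hypothetical, and the thesis works in 2-D image coordinates and adds the vanishing-point collinearity refinement, which is not reproduced here.

def cross_ratio(a, b, c, d):
    # Cross ratio (A, B; C, D) of four collinear points given as signed
    # 1-D coordinates along their common line.
    return ((c - a) * (d - b)) / ((c - b) * (d - a))

def solve_fourth_point(a, b, c, cr):
    # Coordinate d on the same line such that cross_ratio(a, b, c, d) == cr.
    k = cr * (c - b)
    return (k * a - (c - a) * b) / (k - (c - a))

# Example: cross_ratio(0.0, 1.0, 2.0, 3.0) == 4/3,
# and solve_fourth_point(0.0, 1.0, 2.0, 4/3) recovers 3.0.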
The coupling between the calibration errors of the camera's intrinsic and extrinsic parameters makes the calibration accuracy depend on the camera's attitude and position relative to the calibration target. To improve the calibration accuracy of the intrinsic and extrinsic parameters, the influence of this error coupling on the calibrated parameters must be reduced. A decoupling algorithm for the intrinsic and extrinsic parameters is therefore proposed: exploiting the linear coupling relationship between the intrinsic and extrinsic parameter errors, it uses multiple calibration results obtained after translating the target to correct the calibration errors of the principal point, the focal length, and the translation vector. Experiments show that, under noise and distortion, the decoupling algorithm reduces the influence of the intrinsic-extrinsic error coupling on parameter accuracy and improves the calibration accuracy.

The proposed algorithms were verified on the hardware platform of a pre-research program for an experimental satellite. The platform consists of two cameras, the measured target, embedded image-processing hardware, and ground verification equipment; the proposed pose measurement algorithms run on the embedded hardware. The ground verification equipment uses an optical darkroom to simulate the space environment and a six-degree-of-freedom motion system to evaluate the measurement accuracy of the vision system. The final experimental data show that the visual pose measurement accuracy for the non-cooperative target meets the specified requirements, which also confirms the effectiveness of the proposed algorithms.
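To illustrate the redundancy that the decoupling step exploits: if the calibration target is translated by known amounts between otherwise identical calibrations, the recovered extrinsic translations should differ by exactly those amounts, and any systematic residual can be attributed to coupled intrinsic errors. The sketch below, with hypothetical names, only shows this consistency check on the translation vector and assumes the known shifts are expressed in the camera frame; the thesis's actual correction of the principal point and focal length through the linear coupling relation is not reproduced.

import numpy as np

def translation_consistency(calibrated_t, known_shifts):
    # calibrated_t: (K, 3) translations from K calibrations of the same target,
    # which was moved by known_shifts (K, 3) between runs (camera frame).
    # If calibration were error-free, calibrated_t[k] - known_shifts[k] would be
    # identical for all k; the mean gives a corrected base translation and the
    # spread exposes the coupled intrinsic-extrinsic errors.
    base = np.asarray(calibrated_t, dtype=float) - np.asarray(known_shifts, dtype=float)
    return base.mean(axis=0), base.std(axis=0)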
【Degree-granting institution】: Harbin Institute of Technology
【Degree level】: Doctorate (PhD)
【Year of award】: 2016
【CLC number】: TP391.41


Document No.: 2409886



Link to this document: http://sikaile.net/shoufeilunwen/xxkjbs/2409886.html

