
基于手勢(shì)理解的UR機(jī)器人視覺(jué)引導(dǎo)與控制方法研究

發(fā)布時(shí)間:2018-01-11 05:30

  本文關(guān)鍵詞:基于手勢(shì)理解的UR機(jī)器人視覺(jué)引導(dǎo)與控制方法研究 出處:《中國(guó)科學(xué)院長(zhǎng)春光學(xué)精密機(jī)械與物理研究所》2016年碩士論文 論文類型:學(xué)位論文


  更多相關(guān)文章: 視覺(jué)引導(dǎo)控制 UR機(jī)器人 手勢(shì)識(shí)別 手勢(shì)跟蹤 Shi-Tomasi算法 KLT算法


【Abstract】: As robot technology finds wide application in industrial manufacturing, military operations, and medicine, human-robot interaction is increasingly moving toward a "human-centered" paradigm. Traditional interaction through hardware such as the mouse, keyboard, and operator panel can no longer meet users' needs, so control based on gesture understanding has gradually become a new trend toward natural human-computer interaction. This thesis studies visual guidance and control of a UR robot based on gesture understanding: it describes and validates gesture recognition, gesture tracking, and remote control of the UR robot in detail, builds a human-computer interaction system based on gesture recognition and tracking, and achieves vision-guided control of the manipulator's end-effector motion.

For gesture recognition, a method combining skin-color segmentation with the Viola-Jones algorithm is proposed. A skin-color segmentation module first removes most non-skin regions from the background of the input image; a gesture detector trained offline with the Viola-Jones algorithm then completes the recognition. The segmentation module builds a skin-color model in the YCbCr color space to separate skin regions from the background, then removes noise with morphological filtering. The Viola-Jones stage uses Haar features, the integral-image strategy, and a cascade structure to train detectors for three target gestures. Comparative experiments on the three gestures show that the proposed method recognizes gestures well and satisfies the requirements of the human-computer interaction system.

For gesture tracking, an improved Shi-Tomasi feature-point extraction algorithm is combined with a two-module tracker built from the KLT tracking algorithm and a Kalman filter. The improved Shi-Tomasi algorithm rejects feature points that do not lie on the gesture target or that are sensitive to noise, and feeds the reliable, stable points to the tracker; the tracker's KLT module matches the feature points to locate the gesture target. When feature points are lost, the tracker's Kalman filter module predicts the gesture position and narrows the detector's search region, enabling efficient detection and continuous tracking and resolving the tracking failures and discontinuous guidance signals caused by occlusion or overlap.

For remote control of the UR robot, the robot's motion-control mechanism is analyzed in depth and a remote motion-control method is designed. The method is verified on the MATLAB platform: the robot's end-effector tracks a sinusoidal trajectory and the robot's state is monitored, confirming the method's correctness and effectiveness. Finally, building on the above, gesture detection, gesture tracking, and remote robot control are combined into a vision-guided robot control system: a human-computer interaction system that takes the user's gestures as input and controls the manipulator through gesture recognition and tracking, achieving real-time, friendly interaction between the operator and the UR manipulator platform and realizing visual guidance and control of the UR robot based on gesture understanding.
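The skin-color stage described in the abstract can be sketched with the standard BT.601 RGB-to-YCbCr conversion and the Cb/Cr box thresholds common in the skin-segmentation literature. The abstract does not give the thesis's exact model parameters, so the ranges below are assumptions for illustration:

```python
import numpy as np

# Common literature skin thresholds in YCbCr (assumed, not the thesis's values).
CB_RANGE = (77, 127)
CR_RANGE = (133, 173)

def rgb_to_ycbcr(rgb):
    """Convert an HxWx3 RGB image to YCbCr using the ITU-R BT.601 matrix."""
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(rgb):
    """Boolean mask: True where a pixel falls inside the Cb/Cr skin box."""
    ycbcr = rgb_to_ycbcr(rgb)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return ((CB_RANGE[0] <= cb) & (cb <= CB_RANGE[1]) &
            (CR_RANGE[0] <= cr) & (cr <= CR_RANGE[1]))

# Sanity check: one skin-like pixel and one green (non-skin) pixel.
img = np.array([[[200, 130, 100], [0, 255, 0]]], dtype=np.uint8)
print(skin_mask(img))   # [[ True False]]
```

In a full pipeline the mask would then be cleaned by the morphological filtering the abstract mentions (e.g. opening/closing with OpenCV's `cv2.morphologyEx`) before the Viola-Jones detector runs on the remaining regions.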
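The Shi-Tomasi criterion that the tracking stage builds on scores each pixel by the smaller eigenvalue of the local gradient structure matrix. A minimal NumPy sketch follows (the thesis's specific improvements for rejecting off-target and noise-sensitive points are not reproduced here); the window sums reuse the integral-image trick also cited for Viola-Jones:

```python
import numpy as np

def shi_tomasi_response(gray, radius=2):
    """Min-eigenvalue corner response over a (2*radius+1)^2 window."""
    iy, ix = np.gradient(gray.astype(np.float64))

    def box_sum(a):
        # Integral image: window sums in O(1) per pixel after one pass.
        ii = np.pad(a, ((1, 0), (1, 0))).cumsum(0).cumsum(1)
        h, w = a.shape
        out = np.empty_like(a)
        for y in range(h):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            for x in range(w):
                x0, x1 = max(0, x - radius), min(w, x + radius + 1)
                out[y, x] = ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
        return out

    sxx, syy, sxy = box_sum(ix * ix), box_sum(iy * iy), box_sum(ix * iy)
    # Smaller eigenvalue of the 2x2 symmetric matrix [[sxx, sxy], [sxy, syy]].
    half_trace = (sxx + syy) / 2.0
    return half_trace - np.sqrt(((sxx - syy) / 2.0) ** 2 + sxy ** 2)

# Synthetic scene: a bright square. Corners score high because the window
# contains gradients in both directions; straight edges score near zero.
img = np.zeros((30, 30))
img[10:20, 10:20] = 1.0
resp = shi_tomasi_response(img)
```

Only points whose response exceeds a quality threshold would be handed to the KLT module, which is why rejecting weak or off-target points before tracking matters.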
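The Kalman module's role, predicting the gesture position when KLT loses its feature points so the detector can search a smaller region, can be illustrated with a textbook constant-velocity filter over image coordinates. The noise covariances below are illustrative assumptions, not the thesis's tuned values:

```python
import numpy as np

class GestureKalman:
    """Constant-velocity Kalman filter over image coordinates (x, y)."""
    def __init__(self):
        self.x = np.zeros(4)                 # state: [x, y, vx, vy]
        self.P = np.eye(4) * 1000.0          # large initial uncertainty
        self.F = np.array([[1., 0., 1., 0.], # transition, dt = 1 frame
                           [0., 1., 0., 1.],
                           [0., 0., 1., 0.],
                           [0., 0., 0., 1.]])
        self.H = np.array([[1., 0., 0., 0.], # we only measure position
                           [0., 1., 0., 0.]])
        self.Q = np.eye(4) * 1e-4            # process noise (assumed)
        self.R = np.eye(2) * 1e-2            # measurement noise (assumed)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2].copy()

    def update(self, z):
        y = np.asarray(z, float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

kf = GestureKalman()
for z in [(0, 0), (1, 1), (2, 2), (3, 3)]:   # gesture centroid per frame
    kf.predict()
    kf.update(z)
guess = kf.predict()   # if KLT drops out now, centre the detector search here
print(guess)
```

After a few frames of steady motion the predicted position extrapolates the trajectory (here, approximately (4, 4)), which is what lets the detector scan a narrow window instead of the whole image after an occlusion.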
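For the remote-control stage, UR controllers accept URScript commands over TCP (the controller's script interfaces listen on ports 30001-30003). The thesis drives the robot from MATLAB; the hypothetical Python sketch below shows the same idea of formatting and sending a Cartesian `movel` command. The IP address is a placeholder, and the acceleration/velocity defaults are illustrative:

```python
import socket

def movel_command(pose, a=1.2, v=0.25):
    """Build a URScript movel command for a Cartesian target pose.

    pose = (x, y, z, rx, ry, rz): position in metres and orientation as a
    rotation vector in radians, matching URScript's p[...] pose literal.
    """
    return "movel(p[%.4f,%.4f,%.4f,%.4f,%.4f,%.4f], a=%.3f, v=%.3f)\n" % (*pose, a, v)

def send_urscript(command, host, port=30002):
    """Send one URScript line to the controller's secondary interface."""
    with socket.create_connection((host, port), timeout=2.0) as s:
        s.sendall(command.encode("ascii"))

cmd = movel_command((0.30, 0.20, 0.50, 0.0, 3.1416, 0.0))
print(cmd)
# send_urscript(cmd, "192.168.1.10")   # hypothetical robot IP; needs a live controller
```

Streaming a sequence of such targets, each computed from the tracked gesture position, is one straightforward way to realize the sinusoidal trajectory tracking and gesture-driven end-effector control the abstract reports.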

【學(xué)位授予單位】:中國(guó)科學(xué)院長(zhǎng)春光學(xué)精密機(jī)械與物理研究所
【學(xué)位級(jí)別】:碩士
【學(xué)位授予年份】:2016
【分類號(hào)】:TP391.41;TP242

【參考文獻(xiàn)】

相關(guān)期刊論文 前10條

1 張毅;姚圓圓;羅元;張?zhí)?;一種改進(jìn)的TLD動(dòng)態(tài)手勢(shì)跟蹤算法[J];機(jī)器人;2015年06期

2 扈立超;史再峰;龐科;劉江明;曹清潔;;用于圖像匹配的改進(jìn)Harris特征點(diǎn)檢測(cè)算法[J];計(jì)算機(jī)工程;2015年10期

3 艾斯本·奧斯特加;;工業(yè)機(jī)器人的未來(lái)之路[J];辦公自動(dòng)化;2015年11期

4 劉真;白韜韜;盧鵬;;一種解密圖像無(wú)背景噪聲的加密全息數(shù)字水印技術(shù)[J];光學(xué)學(xué)報(bào);2015年02期

5 畢國(guó)玲;趙建;續(xù)志軍;孫強(qiáng);;基于角點(diǎn)和局部特征描述子的快速匹配算法[J];光電工程;2014年09期

6 吳江梅;張瑜慧;孫瑩;劉海朦;;一種基于單目視覺(jué)的人手檢測(cè)與識(shí)別方法[J];計(jì)算機(jī)與數(shù)字工程;2014年07期

7 劉彥妤;;UR機(jī)器人助力精密工程公司降低成本優(yōu)化生產(chǎn)[J];工程機(jī)械文摘;2014年03期

8 丁雄飛;張春燕;;基于Moravec算子和改進(jìn)的SIFT算法的圖像匹配[J];合肥學(xué)院學(xué)報(bào)(自然科學(xué)版);2013年03期

9 譚民;王碩;;機(jī)器人技術(shù)研究進(jìn)展[J];自動(dòng)化學(xué)報(bào);2013年07期

10 王曉華;李才順;胡敏;朱弘;;服務(wù)機(jī)器人手勢(shì)識(shí)別系統(tǒng)研究[J];電子測(cè)量與儀器學(xué)報(bào);2013年04期

相關(guān)碩士學(xué)位論文 前3條

1 韋慧怡;基于形狀特征的手勢(shì)識(shí)別方法研究[D];蘭州理工大學(xué);2014年

2 孟祥媛;基于FPGA的KLT算法設(shè)計(jì)與實(shí)現(xiàn)[D];長(zhǎng)春理工大學(xué);2013年

3 蘇功宴;內(nèi)容與形式相統(tǒng)一的智能交互空間[D];上海交通大學(xué);2007年


