Research on Machine-Vision-Based Recognition of Hooks on a Catenary Conveyor Line (基于機器視覺的懸鏈線上吊鉤的識別研究)
[Abstract]: Catenary (overhead chain) conveyor lines are widely used in industrial production, for example in painting, shot blasting, drying, and livestock slaughtering. At present, workpieces are still loaded onto and unloaded from catenary lines manually, relying on workers' physical strength to lift and remove them; the labor intensity is high, production efficiency is low, and operations such as painting also pose serious hazards to human health. Only by equipping catenary lines with automatic loading and unloading equipment can the requirements for product quality and high-speed, high-efficiency production be met. To load and unload workpieces automatically, the hooks on the catenary line and the position of the workpiece must be located so that the workpiece can be placed on the hooks. For this reason, this thesis introduces machine vision into the recognition of hooks on catenary lines: machine vision determines the position of a hook, and the automatic loading and unloading equipment adjusts its motion trajectory accordingly to place the workpiece accurately.

The research covers the following aspects. First, the hook-image acquisition hardware system is designed, including the lighting system, camera, image acquisition card, and other hardware. A Bumblebee2 parallel binocular camera from Point Grey (Canada) is used to capture images of the hooks on the catenary line; the intrinsic and extrinsic parameters of the binocular camera are calibrated with Zhang Zhengyou's planar calibration method, and an error analysis of the obtained parameters is given.

Second, taking the hooks on a poultry slaughtering line as the research object and considering the characteristics of this kind of hook, a median filter is first applied to denoise the images. After comparing several feature matching methods, matching points between the two images are obtained with the SIFT feature matching algorithm, and the correspondence between matching points is established by computing a projective transformation matrix. Finally, the three-dimensional spatial information of the hooks is recovered by triangulation, providing a reliable basis for a robot to recognize the hooks automatically.

Third, taking the hooks on the catenary line of an investment casting workshop as the research object, the collected images are pre-processed and the hook recognition area is narrowed to the target range with the Hough transform. The target area is then located by horizontal and vertical grayscale projection. By matching the two images, the spatial information of the hook's target recognition area is obtained and compared with the actual dimensions of the hook.
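The abstract states that the Bumblebee2 binocular camera's intrinsic and extrinsic parameters were calibrated with Zhang Zhengyou's planar calibration method. The following is a minimal sketch of that calibration flow using OpenCV (whose calibrateCamera/stereoCalibrate functions implement Zhang's planar method); the chessboard dimensions, square size, and image file names are illustrative assumptions, not values taken from the thesis.

```python
# Sketch of binocular calibration with Zhang's planar method (OpenCV).
# Assumptions: a 9x6 inner-corner chessboard with 25 mm squares, and
# synchronized image pairs named left_00.png / right_00.png, etc.
import glob
import cv2
import numpy as np

PATTERN = (9, 6)     # inner corners per chessboard row/column (assumed)
SQUARE_MM = 25.0     # chessboard square size in millimetres (assumed)

# 3-D coordinates of the chessboard corners in the board plane (Z = 0)
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_pts, left_pts, right_pts = [], [], []
size = None
for lf, rf in zip(sorted(glob.glob("left_*.png")), sorted(glob.glob("right_*.png"))):
    gl = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    gr = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    okl, cl = cv2.findChessboardCorners(gl, PATTERN)
    okr, cr = cv2.findChessboardCorners(gr, PATTERN)
    if okl and okr:
        obj_pts.append(objp)
        left_pts.append(cl)
        right_pts.append(cr)
        size = gl.shape[::-1]          # (width, height)

# Per-camera intrinsics and distortion via Zhang's method
_, M1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
_, M2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)

# Extrinsics: rotation R and translation T between the two cameras
_, M1, d1, M2, d2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, M1, d1, M2, d2, size,
    flags=cv2.CALIB_FIX_INTRINSIC)

print("Left intrinsic matrix:\n", M1)
print("Baseline (mm):", np.linalg.norm(T))
```

The reprojection errors returned by calibrateCamera/stereoCalibrate can serve as the basis for the parameter error analysis mentioned in the abstract.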
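For the hooks on the poultry slaughtering line, the abstract describes median filtering, SIFT feature matching between the left and right images, a projective transformation matrix to constrain the correspondences, and triangulation to recover 3-D coordinates. The sketch below shows one plausible way to chain these steps with OpenCV; the image names are hypothetical, and the projection matrices P1/P2 are assumed to be built from the calibration results above.

```python
# Sketch: denoise, match SIFT features, filter matches with a homography,
# then triangulate the surviving correspondences into 3-D hook points.
import cv2
import numpy as np

def hook_points_3d(left_path, right_path, P1, P2):
    """P1, P2: 3x4 projection matrices of the left/right cameras."""
    l = cv2.medianBlur(cv2.imread(left_path, cv2.IMREAD_GRAYSCALE), 5)
    r = cv2.medianBlur(cv2.imread(right_path, cv2.IMREAD_GRAYSCALE), 5)

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(l, None)
    kp2, des2 = sift.detectAndCompute(r, None)

    # Ratio-test matching of SIFT descriptors
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
            if m.distance < 0.7 * n.distance]

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # Projective transformation (homography) with RANSAC rejects outliers
    H, mask = cv2.findHomography(pts1, pts2, cv2.RANSAC, 5.0)
    inliers = mask.ravel().astype(bool)
    pts1, pts2 = pts1[inliers], pts2[inliers]

    # Triangulation: result is 4xN homogeneous, convert to Euclidean
    X = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    return (X[:3] / X[3]).T            # N x 3 array of 3-D points

# Hypothetical usage, with P1/P2 assembled from the calibration results:
# P1 = M1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
# P2 = M2 @ np.hstack([R, T])
# xyz = hook_points_3d("hook_left.png", "hook_right.png", P1, P2)
```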
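For the hooks in the investment casting workshop, the abstract reports narrowing the search region with the Hough transform and then locating the target with horizontal and vertical grayscale projections. A minimal sketch of those two steps follows; the Canny/Hough thresholds, the file name, and the assumption that the hook appears darker than its background are illustrative choices, not details from the thesis.

```python
# Sketch: narrow the hook region with a probabilistic Hough transform,
# then locate the target inside it by row/column grayscale projection.
import cv2
import numpy as np

img = cv2.imread("casting_hook.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
img = cv2.medianBlur(img, 5)

# Line segments found by the Hough transform bound the candidate hook region
edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                        minLineLength=40, maxLineGap=5)
xs, ys = [], []
for x1, y1, x2, y2 in lines[:, 0]:
    xs += [x1, x2]
    ys += [y1, y2]
left, right = min(xs), max(xs)
top, bottom = min(ys), max(ys)
roi = img[top:bottom, left:right]

# Grayscale projections: sums of intensities along rows and columns.
# Extrema in these profiles give the hook's horizontal and vertical
# position inside the region of interest.
col_proj = roi.sum(axis=0).astype(np.float64)   # vertical projection
row_proj = roi.sum(axis=1).astype(np.float64)   # horizontal projection
hook_col = int(np.argmin(col_proj))             # assumes a dark hook
hook_row = int(np.argmin(row_proj))
print("Hook centred near column", left + hook_col, "row", top + hook_row)
```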
【Degree-granting institution】: University of Jinan (濟南大學)
【Degree level】: Master's
【Year conferred】: 2012
【CLC classification number】: TH22