Binocular Stereo Vision Hand-Eye Calibration Method for the LAIWR System
Master's Thesis, Shanghai Jiao Tong University
ABSTRACT

At present, teach-and-playback arc welding robots cannot autonomously acquire positioning information about the work-piece or the spatial position of the weld seam. Research that imitates a skilled welder's observation, analysis and execution of welding operations, such as recognizing the welding environment, the joint type of the work-piece and the initial welding position from visual information, guiding the robot to the initial welding position, and performing vision-based seam tracking directly during welding, is therefore of considerable practical value for improving welding quality and productivity and for raising the autonomy and intelligence of welding robots.

Welding takes place in a rather complex environment and places high demands on visual feedback. A hand-eye coordination system that imitates the human combination of eye and hand therefore usually mounts the cameras, together with the tool, on the end of the robot arm. This avoids occlusion of the line of sight, and because a change of the robot pose moves the cameras over a wide range, local details of the environment can be observed, multiple targets can be recognized, and their complex relative positions can be described. However, this arrangement often lets the target leave the cameras' field of view, creating visual blind areas. More importantly, it couples the camera motion to the arm motion, so that camera-observation control designed from a visual-feedback strategy may conflict with the manipulation task, greatly increasing the difficulty of robot control and task planning. To eliminate the blind areas and the coupling, the relationship between the camera coordinate frame and the gripper coordinate frame must be determined, that is, hand-eye calibration must be performed.
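The transform sought by hand-eye calibration can be illustrated with a few lines of MATLAB. The variable names and numbers below are purely illustrative and are not taken from the LAIWR system; the sketch only shows why a camera measurement becomes usable in the robot base frame once the camera-to-flange transform is known.

% Minimal sketch (illustrative values only): the eye-in-hand measurement chain.
T_base_tool = [eye(3), [800; 0; 600]; 0 0 0 1];  % tool (flange) pose reported by the robot
X_tool_cam  = [eye(3), [ 30; 0;  50]; 0 0 0 1];  % camera frame w.r.t. the flange: the unknown sought by hand-eye calibration
p_cam  = [0; 0; 250; 1];                         % a point measured by the stereo sensor, in the camera frame
p_base = T_base_tool * X_tool_cam * p_cam;       % only with X_tool_cam known can the measurement be used in the base frame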
The Intelligent Welding Robot Laboratory of the Welding Engineering Institute, Shanghai Jiao Tong University, has developed a Local Autonomous Intelligent Welding Robot (LAIWR) system on the mechanical body of the domestically produced XINSONG RH6 robot. This thesis studies the calibration of the binocular vision sensor mounted on this system. The work is also an important part of the first phase of the cooperative project between the laboratory and the ABB China Research Center, "Research on Seam Recognition and Tracking and Autonomous Weld-Shaping Control Based on Visual Sensing for ABB Arc Welding Robots".

First, following the planar calibration method proposed by Zhang and the principle of binocular vision sensor calibration, MATLAB programs are used to extract the image positions of the feature points on the calibration template and obtain their image coordinates. From these coordinates, the intrinsic parameters of each CCD camera and its extrinsic parameters with respect to the calibration plane are computed, and from the two cameras' extrinsic parameters relative to the same calibration plane the relative pose between the two CCD cameras of the binocular sensor is obtained. Further MATLAB functions, combined with the programs above, take the image coordinates of a feature point observed by the left and right cameras at the same robot position and compute its three-dimensional coordinates in the left camera coordinate frame. The relationship between the structural parameters of a symmetric binocular vision sensor and its measurement error is analysed in detail, and a general design method is summarized, which provides useful guidance for sensor design.
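A minimal MATLAB sketch of these two computations is given below. All numerical values are placeholders rather than calibration results of the LAIWR sensor, and the midpoint intersection of the two viewing rays is one common triangulation scheme; the thesis's own routines may differ in detail.

% --- placeholder calibration results for the two CCD cameras (not the thesis's values) ---
K_l = [1200 0 320; 0 1200 240; 0 0 1];       % left  intrinsic matrix
K_r = [1200 0 320; 0 1200 240; 0 0 1];       % right intrinsic matrix
R_l = eye(3);          t_l = [0; 0; 0];      % left  extrinsics w.r.t. the calibration plane
ang = 5*pi/180;                              % small convergence angle (example)
R_r = [cos(ang) 0 sin(ang); 0 1 0; -sin(ang) 0 cos(ang)];
t_r = [-120; 0; 10];                         % right extrinsics w.r.t. the calibration plane

% --- relative pose between the cameras: P_r = R_rel*P_l + t_rel ---
R_rel = R_r * R_l';
t_rel = t_r - R_rel * t_l;

% --- triangulate one matched feature point into the LEFT camera frame ---
u_l = 350; v_l = 260;   u_r = 300; v_r = 258;   % matched pixel pair (example)
d_l = K_l \ [u_l; v_l; 1];                      % viewing ray through the left pixel, left frame
d_r = R_rel' * (K_r \ [u_r; v_r; 1]);           % right viewing ray rotated into the left frame
C_r = -R_rel' * t_rel;                          % right camera centre expressed in the left frame
st  = [d_l, -d_r] \ C_r;                        % least-squares ray parameters (closest approach)
P_left = (st(1)*d_l + C_r + st(2)*d_r) / 2;     % midpoint of the two rays: 3-D point in the left camera frame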
Because the controller of the XINSONG robot provides only the T6 matrix, a program was written to convert the T6 matrix into the homogeneous transformation expression of the robot pose required for the hand-eye computation. On the basis of the laboratory's LAIWR system platform, a hand-eye calibration algorithm suited to the XINSONG robot is proposed. Finally, the object-image relationship of the binocular vision sensor of the LAIWR system is established, and the mapping from the two-dimensional image coordinates of a point to its three-dimensional coordinates in the robot base frame is derived. Experiments were designed to verify the accuracy of the preceding calibration procedures and the sources of the remaining errors are analysed; the calibration method proposed in this thesis nevertheless shows good realizability and portability.
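A minimal sketch of the pose-conversion and coordinate-mapping steps follows. The actual readout format of the XINSONG RH6 controller is not reproduced here; purely for illustration, the pose is assumed to be given as a position plus Z-Y-X Euler angles, and X_tool_cam and P_left stand in for the hand-eye calibration result and a point triangulated in the left camera frame.

% Sketch (illustrative values and an assumed pose format, not the controller's actual T6 readout).
px = 850; py = -40; pz = 520;          % tool position (example values)
yaw = 0.10; pitch = -0.05; roll = 3.05;% assumed Z-Y-X Euler angles in radians (example values)

Rz = [cos(yaw) -sin(yaw) 0; sin(yaw) cos(yaw) 0; 0 0 1];
Ry = [cos(pitch) 0 sin(pitch); 0 1 0; -sin(pitch) 0 cos(pitch)];
Rx = [1 0 0; 0 cos(roll) -sin(roll); 0 sin(roll) cos(roll)];
T_base_tool = [Rz*Ry*Rx, [px; py; pz]; 0 0 0 1];       % homogeneous tool pose used in the hand-eye computation

X_tool_cam = [eye(3), [30; 0; 50]; 0 0 0 1];           % hand-eye calibration result (placeholder)
P_left     = [12; -8; 260];                            % point triangulated in the left camera frame (placeholder)
P_base     = T_base_tool * X_tool_cam * [P_left; 1];   % the same point expressed in the robot base frame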
Keywords: machine vision, local autonomous welding robot, binocular vision, hand-eye calibration

CALIBRATION TECHNOLOGY FOR BINOCULAR VISION SYSTEM ON LAIWR SYSTEM

ABSTRACT

At present, teach-and-playback welding robots are not equipped to obtain 3-D information about the work-piece and the weld seam during the welding process. In order to eliminate the influence of uncertain factors on weld quality, technologies are being developed that imitate, to a certain degree, the actions of skilled welders: observing the welding environment, analysing the welding process, and using visual sensors to recognize the joint shape of the work-piece, locate the initial weld position and track the seam. These key technologies are very useful and will improve the intelligence level and reliability of welding robots.

The welding process is very complicated and places high demands on visual feedback. Wrist-mounted sensors are therefore often used, much as a human uses the eyes to observe and the hands to manipulate. In this way it is much easier to observe the whole environment; moreover, by changing the pose of the welding robot it is convenient to enlarge the field of view, acquire detailed information about local areas, recognize multiple objects and establish the relationships between them. However, wrist-mounted sensors sometimes lose their targets because of the limited field of view of the CCD cameras, which produces a blind area for the visual sensors. More importantly, the cameras' motion becomes coupled with the gripper's motion, so that camera control designed on the basis of visual feedback may conflict with the manipulation task, and the difficulty of control and task planning increases greatly. To avoid the blind area and the coupling between camera motion and gripper motion, hand-eye calibration is necessary.

The LAIWR (Local Autonomous Intelligent Welding Robot) system was developed by the Intelligent Welding Robot Laboratory of Shanghai Jiao Tong University on the basis of the domestically produced XINSONG RH6 welding robot. This thesis presents research on the hand-eye calibration of the binocular vision sensors of the LAIWR system; it is also an important part of the cooperative project "Research on Recognition and Tracking of Seams, and Autonomous Control of Weld Shaping Based on Visual Sensing for ABB Arc Welding Robots" between the Intelligent Welding Robot Laboratory and the ABB China Research Center.

Following Zhang's method, the calibration model of the binocular vision sensors is analysed. A MATLAB program (the Calibration Toolbox) is used to extract the feature points in the images of the calibration template and obtain their coordinates. The intrinsic and extrinsic parameters are then computed from these point coordinates, and the relative position and orientation of the two CCD cameras is obtained from the two cameras' extrinsic parameters with respect to the calibration plane.

Using triangulation, MATLAB programs compute the 3-D coordinates of the feature points in the left camera's coordinate frame. A parameter model of the symmetric binocular vision sensor is investigated; the error analysis explicitly shows how the accuracy of the vision sensor depends on its structural parameters. Based on a comprehensive consideration of what makes an ideal vision sensor, the principle and method of sensor design are derived.

Because the control box of the RH6 robot can provide only the T6 matrix, a transformation program converts the T6 matrix into the homogeneous matrix that expresses the pose of the welding robot. A hand-eye calibration method based on the LAIWR system platform and suited to the XINSONG robot is then proposed.





