image coordinates to robot coordinates
Hello, community members.
Mapping 2D image coordinates to 3D robot coordinates is an old topic with many existing threads, but there is one point of confusion I need to ask about.
I am working on a project to guide a 6-DOF ABB industrial robot arm to a designated location using a Full HD webcam. Almost all of the previous threads I have gone through discuss calibration and transformation for smaller robots, where the relationship between the image and the robot base can be measured with a small ruler.
Second, the camera is calibrated using the checkerboard method, which is recommended to be done with 20 or more checkerboard images. After calibration, this gives a 3×3 rotation matrix and a 1×3 translation vector for each of the 20 images.
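For reference, here is a minimal sketch of the calibration step I am describing (Computer Vision Toolbox); the image folder and square size are placeholders for my actual setup, and this is where the 20 rotation matrices and 20 translation vectors come from:

```matlab
% Checkerboard calibration sketch. The image folder and squareSize are
% placeholders for my actual setup (~20 Full HD checkerboard shots).
imds = imageDatastore('calib_images');
[imagePoints, boardSize] = detectCheckerboardPoints(imds.Files);

squareSize = 25;  % checkerboard square size in millimetres (placeholder)
worldPoints = generateCheckerboardPoints(boardSize, squareSize);

I = readimage(imds, 1);
cameraParams = estimateCameraParameters(imagePoints, worldPoints, ...
    'ImageSize', [size(I, 1), size(I, 2)]);

% One extrinsic pose of the checkerboard is returned per calibration image:
size(cameraParams.RotationMatrices)    % 3-by-3-by-20
size(cameraParams.TranslationVectors)  % 20-by-3
```

As far as I understand, each of these poses describes where the checkerboard sat relative to the camera in that particular image.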
Since the R and T matrices are what the actual process guide uses for the camera extrinsics, which rotation matrix and translation vector among these 20 should be used to compute the base transformation in the actual process? Sorry for a childish question, but I would be thankful for a decent guide.
Can somebody share their experience in detail on how to transform or map 2D image coordinates from a calibrated fisheye camera to 3D robot coordinates (for a 6-DOF industrial robot), so that the robot arm moves to the exact position detected by the webcam? In other words, how do I relate, in real time, the image coordinates (u, v) to the robot's 3D coordinates (x, y, z) in the base or TCP frame?
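To make the question concrete, this is roughly the mapping I think I need, assuming a planar work surface; the example pixel, the choice of view 1, and the final base transform are placeholders, and this is exactly the part I am unsure about:

```matlab
% Sketch of the image-to-world mapping I think I need (lens/fisheye
% distortion handling omitted here for simplicity).
uv = [960, 540];  % example pixel detected in the Full HD image (placeholder)

% Extrinsics of ONE view -- this is my question: which of the 20 R/T pairs
% (or a freshly estimated extrinsic) should be used here?
R = cameraParams.RotationMatrices(:, :, 1);
t = cameraParams.TranslationVectors(1, :);

% Back-project the pixel onto the Z = 0 plane of that checkerboard frame.
xyWorld = pointsToWorld(cameraParams, R, t, uv);   % [X Y] in checkerboard mm

% I would still need a checkerboard/world-to-robot-base transform to turn
% this into (x, y, z) for the robot, which is where I am stuck.
```

The choice of view 1 above is arbitrary, which is exactly why I am asking which R/T pair (if any) is the right one to carry into the robot-base transformation.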
I am not asking for a complete solution; rather, I need some useful hints to point me in the right direction. (The camera will be mounted on the robot arm.)
Regards