JOURNAL ARTICLE

3D object pose estimation using stereo vision for object manipulation system

Abstract

Object pose estimation is a crucial part of vision-based object manipulation systems built on a standard industrial robot manipulator: the pose of the target object must be estimated so that the end effector of the robot arm can grasp it. This paper presents the use of a stereo vision system to estimate the 3D (three-dimensional) position and orientation of a target object so that it can be picked up from an arbitrary location within the workspace and placed elsewhere. To accomplish this task, a calibrated stereo camera in the eye-to-hand configuration captures images of the object from the left and right cameras. A specific object feature is then extracted, and the 3D position and orientation of the object are computed with an image processing algorithm. Finally, the gripper-equipped end effector of the robot arm picks up the target object according to the estimated pose and places it at the desired location. Experimental results on a 6-DOF robot arm demonstrate the effectiveness of the proposed approach with good performance.
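The core geometric step the abstract describes, recovering a 3D point from a matched feature in the left and right images of a calibrated stereo pair, can be sketched as follows. This is a minimal illustration of standard rectified-stereo triangulation, not the paper's implementation; the focal length `f` (pixels), baseline `B` (metres), and principal point `(cx, cy)` are assumed example values.

```python
# Minimal sketch of triangulation for a rectified, calibrated stereo pair.
# After rectification a matched feature lies on the same image row in both
# views, so depth follows from the horizontal disparity alone.

def triangulate(xl: float, yl: float, xr: float,
                f: float, B: float, cx: float, cy: float):
    """Recover the 3D point (X, Y, Z) in the left-camera frame from a
    feature at (xl, yl) in the left image and column xr in the right image.
    f: focal length in pixels, B: baseline in metres, (cx, cy): principal
    point. All intrinsics here are illustrative, not from the paper."""
    d = xl - xr                      # disparity in pixels
    if d <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    Z = f * B / d                    # depth by similar triangles
    X = (xl - cx) * Z / f            # back-project the pixel ray
    Y = (yl - cy) * Z / f
    return X, Y, Z

# Example with f = 800 px, B = 0.10 m, principal point (320, 240):
print(triangulate(420.0, 260.0, 380.0, f=800.0, B=0.10, cx=320.0, cy=240.0))
# → (0.25, 0.05, 2.0)
```

Repeating this over several object features (e.g. corners of a known marker) yields a set of 3D points from which the object's orientation can also be fitted.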

Keywords:
Computer vision, Artificial intelligence, Computer science, Pose estimation, Grasping, Robot end effector, 3D pose estimation, Orientation, Workspace, Robotic arm, Stereopsis, Stereo camera, Position, Feature extraction, Object detection, Computer stereo vision, Machine vision, Pattern recognition, Mathematics

Metrics

Cited by: 21
FWCI (Field-Weighted Citation Impact): 2.51
References: 6
Citation Normalized Percentile: 0.90 (top 10%)

Topics

Industrial Vision Systems and Defect Detection
Physical Sciences →  Engineering →  Industrial and Manufacturing Engineering
Image Processing Techniques and Applications
Physical Sciences →  Engineering →  Media Technology
Robotics and Sensor-Based Localization
Physical Sciences →  Engineering →  Aerospace Engineering