Implementation of an Object-Grasping Robot Arm Using Stereo Vision Measurement and Fuzzy Control

International Journal of Fuzzy Systems (2015) 17(2):193–205
DOI 10.1007/s40815-015-0019-2

Jun-Wei Chang ([email protected]) · Wen-June Wang ([email protected]) · Rong-Jyue Wang ([email protected], corresponding author) · Cheng-Hao Huang³ ([email protected])

¹ Department of Electrical Engineering, National Central University, No. 300, Jhongda Rd., Jhongli Dist., Taoyuan City 32001, Taiwan (ROC)
² Department of Electronic Engineering, National Formosa University, No. 64, Wunhua Rd., Huwei Township, Yunlin County 632, Taiwan (ROC)
³ Robotic Automation Dept., Delta Electronics, Inc., 5F., No. 46, Keya Rd., Daya Dist., Taichung City 428, Taiwan (ROC)

Received: 23 October 2014 / Revised: 16 February 2015 / Accepted: 6 March 2015 / Published online: 22 March 2015
© Taiwan Fuzzy Systems Association and Springer-Verlag Berlin Heidelberg 2015

Abstract  In this paper, a method using a stereo vision device and fuzzy control to guide a robot arm to grasp a target object is proposed. The robot arm has five degrees of freedom, including a gripper and four joints. The stereo vision device located beside the arm captures images of the target and the gripper. Image processing techniques such as color space transformation, morphologic operation, and 3-D position measurement are used to identify the target object and the gripper from the captured images and to estimate their relative positions. Based on the estimated positions of the gripper and the target, the gripper can approach and grasp the target using inverse kinematics. However, since the robot arm's accuracy of movement may be affected by gearbox backlash or hardware uncertainty, the gripper might not approach the desired position with precision using only inverse kinematics. Therefore, a fuzzy compensation method is added to correct any position errors between the gripper and target such that the gripper can grasp the target. Using the proposed method, the stereo vision device can not only locate the target object but also trace the position of the robot arm until the target object is grasped. Finally, some experiments are conducted to demonstrate successful implementation of the proposed method on the robot arm control.

Keywords  Robot arm · Fuzzy control · Inverse kinematics · Image processing

1 Introduction

To control a robot arm, the positions of the controlled robot arm must be known at all times. External sensors such as accelerometers and resolvers are needed in order to estimate the positions of the robot arm and its gripper. These sensors may be installed on every actuator to measure the angles of the degrees of freedom (DOFs) as feedback signals. From these feedback signals, the pose of the robot arm and the position of the gripper can be estimated. However, if the robot arm has many DOFs, the number of sensors must be equal to or even greater than the number of DOFs. In other words, if the robot arm has many DOFs, the cost of installing those external sensors increases accordingly.

On the other hand, many studies regarding robot arm platforms use cameras to identify their targets and to monitor their workspaces. The paper [1] used a depth image sensor called the Kinect to identify target objects and control a humanoid robot arm to grasp the target objects. The paper [2] identified an object using a SIFT algorithm and a monocular vision device mounted on the gripper. The paper [3] identified elevator buttons and made a robot arm
to operate an elevator. Supposing that the monitoring camera can recognize the position of the gripper and track it, there is no need to install sensors on the robot arm, thus reducing the total cost of the robot arm platform. Considering this cost reduction benefit, this study proposes a method which uses a stereo vision device to identify and locate the gripper and the target object, and to estimate the pose of the robot arm.

There have been many studies on robot arm control using many different methods. The paper [4] used interactive teaching to approximate a space of knowledge-based grasps. The paper [5] presented an off-line trajectory generation algorithm to solve the contouring problem of industrial robot arms. Using the Mitsubishi PA-10 robot arm platform, the paper [6] proposed a harmonic drive transmission model to investigate the influence of gravity and material on the robot arm; the robot arm can then be controlled to track a desired trajectory, and the motion error can be further analyzed. The paper [7] applied a self-configuration fuzzy system to find inverse kinematics solutions for a robot arm. The paper [8] employed an inverse-kinematics-based two-nested control-loop scheme to control the tip position using joint position and tip acceleration feedback. The paper [9] proposed an analytical methodology of inverse kinematics computation for a seven-DOF redundant robot arm with joint limits. Using the inverse kinematics technique, the robot arm in [10] was designed to push the buttons of an elevator. On the other hand, studies relating to position measurement using vision are described as follows. The paper [11] combined a 2-D vision camera and an ultrasonic range sensor to estimate the position of the target object for the robot gripper. The paper [12] used two cameras and one laser to identify the elevator door and to determine its depth distance. The papers [13–15] proposed photogrammetric methods to measure the distances of objects using their features. The paper [16] effectively utilized color images to achieve 3-D measurement using an RGB color histogram. The paper [17] proposed an image-based 3-D measuring system to measure distance and area using a CCD camera and two laser projectors. The paper [18] adopted two cameras and a laser projector to measure the edge of an object regardless of its position.

In this paper, we propose an object-grasping method using a stereo vision device and fuzzy control so that a robot arm can accomplish an object-grasping task. The robot arm has no sensors installed on it; instead, the stereo vision device is set up beside the robot arm and plays the important role of perceiving the position of the robot arm as the feedback signal. Firstly, the stereo vision device is applied to identify the position of the target object. Subsequently, the stereo vision device traces the position of the robot arm and estimates the angle of each joint on the robot arm using the inverse kinematics method. However, because the design and assembly of the robot arm are not that precise, there is visible backlash in the mechanism. This means that position errors caused by backlash or hardware uncertainty should be considered when the robot arm moves. Herein, a fuzzy compensation method is used to deal with these position errors. The concept of the fuzzy compensation method is to adjust the amount of robot arm movement using fuzzy logic: when the robot arm is close to the target, the compensation value is low; on the contrary, when the robot arm is far from the target, the compensation value is high. Fusing the position value of the robot arm and the compensation values, we can obtain the new angle of each joint of the robot arm by inverse kinematics and drive the motors to the calculated angles. Hence, using stereo vision to estimate the configuration of the robot arm and the fuzzy compensation method to reduce the position errors, the robot arm can accurately grasp its target object.

This paper is organized as follows. Section 2 introduces the experimental platform of this study. The principal stereo vision techniques, robot arm inverse kinematics analysis, and fuzzy compensation are explained in Sects. 3, 4, and 5, respectively. The results of practical experiments and discussion are given in Sect. 6. Finally, the conclusion is given in Sect. 7.

2 Description of Experimental Platform

In this paper, in order to implement an object-grasping task, a platform for a robot arm with a stereo vision device is designed, which contains a PC (laptop computer), a robot arm, a stereo vision device, and batteries, as shown in Fig. 1. The utilized devices and their purposes are described below.

2.1 The Robot Arm

The robot arm consists of the main body (four DOFs) and a gripper (one DOF) as shown in Fig. 2. Its main components are four SmartMotors, a planetary gearbox, three harmonic drivers, two AX-12 motors, and some metal components. The specifications of the SmartMotors and the AX-12 are shown in Tables 1 and 2, respectively. The SmartMotors and the AX-12 motors communicate with the PC through serial ports (RS-232). The motor with the largest torque, the SM3416D_PLUS, together with a planetary gearbox, is installed at axis 1 to provide larger output torque, thus enabling the robot arm to lift not only itself but also the target object. Based on many experiments, the robot arm can lift objects weighing up to about 2 kg. Since the load of axes 2–4 is much smaller than that of axis 1, each remaining axis is
composed of a SM2315D motor and a harmonic driver. The gripper is composed of two AX-12 motors as shown in Fig. 3. The PC receives the digitized feedback values of the AX-12 motors, which include position, speed, and torque, through the serial ports (RS-232). Therefore, we can use those status values to ascertain whether the gripper has successfully grasped an object or not. For instance, when the robot arm recognizes the target object and the gripper starts to close, the ML motor's torque value is positive and the MR motor's torque value is negative. If the signs of these values are reversed, it means the target object has been grasped. To prevent objects from falling out of the gripper, two pieces of non-slip material are pasted inside the gripper fingers. A green mark is placed on each of the two sides of the gripper in order to ease the identification of the gripper by the stereo vision device.

Fig. 2 Structure of the robot arm (axes 1–4)

Fig. 3 Structure of the gripper

Table 1 Specifications of the SmartMotors

|                      | SM3416D_PLUS | SM2315D |
|----------------------|--------------|---------|
| Input voltage (VDC)  | 20–48        | 20–48   |
| Maximum torque (N m) | 1.6          | 0.3     |
| Torque (N m)         | 1.09         | 0.19    |
| Speed (RPM)          | 3100         | 9000    |
| Communication        | RS-232       | RS-232  |
| Weight (kg)          | 2.27         | 0.45    |

2.2 The Stereo Vision Device

The stereo vision device consists of two Logitech QuickCam Pro webcams as shown in Fig. 4. It is used to capture stereo images with a resolution of 320 × 240 pixels at a capture rate of 30 frames per second. The captured images are transmitted to the laptop computer via USB ports. The images are then used to identify, and calculate the positions of, the object and the gripper in 3-D space.
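The torque-sign grasp test described above can be sketched as follows; the function name and the strict sign test (with no dead-band threshold) are illustrative assumptions, not the authors' code:

```python
def object_grasped(ml_torque: float, mr_torque: float) -> bool:
    """Grasp check from the AX-12 torque feedback (Sect. 2.1).

    While the gripper closes freely, the ML motor reports positive
    torque and the MR motor reports negative torque; once both signs
    reverse, the fingers are pressing against the object, i.e. the
    target has been grasped.
    """
    return ml_torque < 0.0 and mr_torque > 0.0
```

In use, the PC would poll the motors' digitized torque values over RS-232 each control cycle and stop closing the gripper once this predicate holds.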
Fig. 7 The center of gravity of the target object from the images of the webcams WL and WR

After the transformation, the position (x_t, y_t, z_t) of the target in the stereo vision coordinate can be transformed into the world coordinate.

Fig. 8 Geometry of the stereo vision measurement: the object point O_t projects to (x_L, y_L) on the WL image plane and (x_R, y_R) on the WR image plane through the lens centers C_L and C_R with focal lengths f_L and f_R; O_W denotes the world coordinate origin.
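The measurement geometry of Fig. 8 is only partially recoverable from this copy, so the sketch below uses the standard parallel-axis stereo triangulation model rather than the paper's exact derivation; the function and its rectified-camera assumptions (a common focal length f in pixels, baseline b between the lens centers, principal points at the image origins) are illustrative:

```python
def triangulate(x_left: float, y_left: float, x_right: float,
                f: float, b: float):
    """Recover a 3-D point from a matched pixel pair in rectified
    stereo images: depth is inversely proportional to disparity."""
    disparity = x_left - x_right       # horizontal pixel shift between views
    z = f * b / disparity              # depth along the optical axis
    x = x_left * z / f                 # lateral position
    y = y_left * z / f                 # vertical position
    return x, y, z

# A matched point with 20 px of disparity, f = 100 px, baseline b = 0.2 m
# lies 1 m in front of the cameras:
x, y, z = triangulate(10.0, 5.0, -10.0, 100.0, 0.2)
```

With the 320 × 240 webcam images described in Sect. 2.2, such a computation would be run on the centers of gravity found in the WL and WR images (Fig. 7).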
Fig. 9 The different rotary directions between the stereo vision and world coordinates (panels a and b)

To make the robot arm and its gripper move to a desired position, we must first perform the inverse kinematics analysis. Figure 10 shows the linking structure of the robot arm, in which the shoulder joint $O_r$ is the origin of the inverse kinematics coordinate, $Q$ is the elbow joint, and $G_r(x_r, y_r, z_r)$ is the position of the gripper. $\overline{O_rQ}$ and $\overline{QG_r}$ indicate links 1 and 2 of the robot arm, with lengths $d_1$ and $d_2$, respectively. In addition, there are three rotation angles $\theta_1$, $\theta_2$, and $\theta_3$ at the $O_r$ and $Q$ points. For further geometric derivations, links 1 and 2 are projected onto the $Y_r$–$Z_r$ plane (see the green line in Fig. 10), where $R$ and $S$ denote the projective points of $G_r$ and $Q$, respectively. The axis $W$ is a reference axis which extends from the projected link 1 on the $Y_r$–$Z_r$ plane.

Fig. 10 Geometry of the robot arm

Three figures introduce the derivations of the kinematics of the robot arm. Figure 11a shows the two-link arm plane, where $L = \sqrt{x_r^2 + y_r^2 + z_r^2}$ is the distance between the gripper $G_r$ and $O_r$. The elbow joint angle $\theta_3$ is obtained as follows:

$$\theta_3 = \pi - \alpha, \quad (8)$$

where

$$\alpha = \cos^{-1}\!\left(\frac{d_1^2 + d_2^2 - L^2}{2 d_1 d_2}\right). \quad (9)$$

Fig. 11 Further geometric analysis for the robot arm. a Links 1 and 2 of the robot arm, b $X_r$–$W$ plane and c $Y_r$–$Z_r$ plane

On the $X_r$–$W$ plane as shown in Fig. 11b, the lifting movement joint angle $\theta_2$ at the $O_r$ point can be derived as

$$\theta_2 = \sin^{-1}\!\left(\frac{x_r}{d_1 + d_2 \cos\theta_3}\right). \quad (10)$$

Furthermore, Fig. 11c depicts the projected arm on the $Y_r$–$Z_r$ plane, where $L' = \sqrt{y_r^2 + z_r^2}$. Consequently, the shoulder rotation joint angle $\theta_1$ is obtained:

$$\theta_1 = \beta + \gamma, \quad (11)$$

where

$$\beta = \cos^{-1}\!\left(\frac{L'^2 + \big((d_1 + d_2\cos\theta_3)\cos\theta_2\big)^2 - (d_2\sin\theta_2)^2}{2 L' (d_1 + d_2\cos\theta_3)\cos\theta_2}\right). \quad (12)$$
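Equations (8)–(10) can be checked numerically with the short sketch below; the function name and argument order are illustrative, and since the expression for $\gamma$ in Eq. (11) falls on a page not recovered in this copy, only $\theta_2$ and $\theta_3$ are computed:

```python
import math

def elbow_and_lift_angles(xr: float, yr: float, zr: float,
                          d1: float, d2: float):
    """Joint angles theta2 and theta3 for a gripper at Gr = (xr, yr, zr),
    following Eqs. (8)-(10): law of cosines at the elbow, then the
    lifting angle on the Xr-W plane."""
    L = math.sqrt(xr**2 + yr**2 + zr**2)                       # |Or Gr|
    alpha = math.acos((d1**2 + d2**2 - L**2) / (2 * d1 * d2))  # Eq. (9)
    theta3 = math.pi - alpha                                   # Eq. (8)
    theta2 = math.asin(xr / (d1 + d2 * math.cos(theta3)))      # Eq. (10)
    return theta2, theta3

# Unit links with the gripper on the Yr-Zr plane at distance sqrt(2):
# the elbow must bend 90 degrees and the lifting angle is zero.
theta2, theta3 = elbow_and_lift_angles(0.0, 1.0, 1.0, 1.0, 1.0)
```

Note that `acos` requires the target to be within reach ($|d_1 - d_2| \le L \le d_1 + d_2$); a practical implementation would clamp its argument to $[-1, 1]$ before calling it.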
Table 4 Parameters of the premise parts

| i       | 1    | 2    | 3 | 4   | 5   |
|---------|------|------|---|-----|-----|
| $a_x^i$ | −5   | −3   | 0 | 3   | 5   |
| $a_y^i$ | −5.5 | −1.5 | 0 | 1.5 | 5.5 |
| $a_z^i$ | −5   | −2   | 0 | 2   | 5   |

Table 5 Parameters of the consequent parts

| i       | 1     | 2    | 3 | 4   | 5    |
|---------|-------|------|---|-----|------|
| $u_x^i$ | −0.6  | −0.4 | 0 | 0.4 | 0.6  |
| $u_y^i$ | −0.65 | −0.5 | 0 | 0.5 | 0.65 |
| $u_z^i$ | −0.65 | −0.5 | 0 | 0.5 | 0.65 |

After the parameters are assigned, the center average defuzzification method is used to obtain the position compensation factor $r_x$ as follows:

$$r_x = \frac{\sum_{i=1}^{5} u_x^i B^i(e_x)}{\sum_{i=1}^{5} B^i(e_x)}, \quad (17)$$

where $u_x^i$ is the center value of the consequent part and $B^i(e_x)$ is the membership degree of the premise part. In the same manner, the compensation factors $r_y$ and $r_z$ can be found. Finally, the compensation factors $r_i$ ($i = x, y, z$) are used to adjust the error as follows:

$$u_i = r_i e_i, \quad (18)$$

where the $u_i$ are the compensation values which are used to compensate for the position error as in Eq. (19):

$$G_{\mathrm{com}} = G_{\mathrm{act}} + (u_x, u_y, u_z)^{T}, \quad (19)$$

where $G_{\mathrm{com}}$ is the new position of the gripper for approaching the desired position.

Overall, the error compensation for the robot arm with the fuzzy controller is shown in Fig. 15. The stereo vision device captures the actual position of the robot arm ($G_{\mathrm{act}}$) and calculates the deviation values ($D$). These deviation values are the input to the fuzzy controller, which yields the compensation factors for the robot arm ($r_x$, $r_y$, and $r_z$). Subsequently, the compensation values are used to obtain the new position of the robot arm ($G_{\mathrm{com}}$) by inverse kinematics, and then the gripper moves to the new position. The compensation process terminates once $|e_i| \le \rho_i$ ($i = x, y, z$) is satisfied, where $\rho_i$ is an acceptable error threshold for the gripper position.
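The center-average defuzzification of Eq. (17) can be sketched as follows. The shape and width of the premise-part membership functions $B^i$ are not recoverable from this copy, so a triangular membership with an assumed width is used purely for illustration; the centers come from Tables 4 and 5:

```python
def tri(e: float, center: float, width: float) -> float:
    """Hypothetical triangular membership function (the paper's exact
    premise-part shapes are not given here)."""
    return max(0.0, 1.0 - abs(e - center) / width)

def compensation_factor(e, premise_centers, consequent_centers, width=2.5):
    """Center-average defuzzification, Eq. (17):
    r = sum_i u_i * B_i(e) / sum_i B_i(e)."""
    B = [tri(e, a, width) for a in premise_centers]
    total = sum(B)
    if total == 0.0:
        return 0.0                          # no rule fires: no compensation
    return sum(u * b for u, b in zip(consequent_centers, B)) / total

ax = [-5.0, -3.0, 0.0, 3.0, 5.0]            # premise centers for e_x (Table 4)
ux = [-0.6, -0.4, 0.0, 0.4, 0.6]            # consequent centers (Table 5)

e_x = 3.0                                   # position error along x
r_x = compensation_factor(e_x, ax, ux)      # factor grows with the error
u_x = r_x * e_x                             # compensation value, Eq. (18)
```

This reproduces the qualitative behavior described in Sect. 1: a small error yields a small compensation factor, a large error a large one, and a zero error yields no compensation at all.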
Fig. 18 Position errors when the thermos is at location A
Fig. 19 Position errors when the thermos is at location B
Fig. 21 Position errors when the thermos is at location D
Fig. 22 Position errors when the thermos is at location E
Furthermore, the proposed method has two advantages: low cost and high efficiency. It is low cost because only a stereo vision device is needed as the sensor, and it is efficient because fuzzy control compensates for the error in the final step. Therefore, any uncertainty and backlash problems in the movement of the robot arm are easily dealt with.

7 Conclusion

This study has described a method to achieve an object-grasping task using only a stereo vision device to trace and guide the motion of the robot arm. The stereo vision device identifies the positions of the target object and the gripper using color space transformation, morphologic operation, and 3-D position measurement. After obtaining those positions, there is usually a position error between the gripper and the target, due to recoil from the gearbox and inertia produced by the movement of the robot arm. In order to compensate for these errors, the fuzzy compensation method is proposed to generate compensation values for each axis. The method is designed according to the principle that when the error in position is small, the movement required to compensate for that error is also relatively small. Then, the fuzzy compensation method integrates the compensation values and inverse kinematics to estimate and drive the gripper to a new, compensated position.

Several experiments are given to demonstrate the implementation of the proposed object-grasping method. The fuzzy error compensation regulates the position of the gripper until the position error satisfies acceptable error thresholds. Therefore, in all the experiments, the robot arm can successfully approach the target object and raise it under the guidance of the stereo vision device.

The benefit of the proposed method is that the robot arm does not need external sensors such as accelerometers or resolvers to measure the degree of rotation on each axis. Thus, the cost of building the robot arm platform can be reduced.

Acknowledgments The authors would like to thank the Ministry of Science and Technology of Taiwan for its support under Contract MOST 103-2221-E-008-001-.

References

1. Song, K.-T., Tsai, S.-C.: Vision-based adaptive grasping of a humanoid robot arm. In: Proceedings of the IEEE International Conference on Automation and Logistics, Zhengzhou, China, August 2012, pp. 155–160
2. Yang, Y., Cao, Q.-X.: Monocular vision based 6D object localization for service robot's intelligent grasping. Comput. Math. Appl. 64(5), 1235–1241 (2012)
3. Kim, H.-H., Kim, D.-J., Park, K.-H.: Robust elevator button recognition in the presence of partial occlusion and clutter by specular reflections. IEEE Trans. Ind. Electron. 59(3), 1597–1611 (2012)
4. Aleotti, J., Caselli, S.: Interactive teaching of task-oriented robot grasps. Robot. Auton. Syst. 58(5), 539–550 (2010)
5. Munashnghe, S.R., Nakamura, M., Goto, S., Kyura, N.: Optimum contouring of industrial robot arms under assigned velocity and torque constraints. IEEE Trans. Syst. Man Cybern. C 31(2), 159–167 (2001)
6. Kennedy, C.W., Desai, J.P.: Modeling and control of the Mitsubishi PA-10 robot arm harmonic drive system. IEEE/ASME Trans. Mechatron. 10(3), 263–274 (2005)
7. Shen, W., Gu, J., Milios, E.E.: Self-configuration fuzzy system for inverse kinematics of robot manipulators. In: Proceedings of the Annual Meeting of the North American Fuzzy Information Processing Society, Montreal, QC, Canada, June 2006, pp. 41–45
8. Feliu, V., Somolinos, J.A., Garcia, A.: Inverse dynamics based control system for a three-degree-of-freedom flexible arm. IEEE Trans. Robot. Autom. 19(6), 1007–1014 (2003)
9. Shimizu, M., Kakuya, H., Yoon, W.-K., Kitagaki, K., Kosuge, K.: Analytical inverse kinematics computation for 7-DOF redundant manipulators with joint limits and its application to redundancy resolution. IEEE Trans. Robot. 24(5), 1131–1142 (2008)
10. Wang, W.-J., Huang, C.-H., Lai, I.-H., Chen, H.-C.: A robot arm for pushing elevator buttons. In: Proceedings of SICE Annual Conference, Taipei, Taiwan, August 2010, pp. 1844–1848
11. Nilsson, A., Holmberg, P.: Combining a stable 2-D vision camera and an ultrasonic range detector for 3-D position estimation. IEEE Trans. Instrum. Meas. 43(2), 272–276 (1994)
12. Baek, J.-Y., Lee, M.-C.: A study on detecting elevator entrance door using stereo vision in multi floor environment. In: Proceedings of ICROS-SICE International Joint Conference, Fukuoka, Japan, August 2009, pp. 1370–1373
13. Fraser, C.S., Cronk, S.: A hybrid measurement approach for close-range photogrammetry. ISPRS J. Photogramm. Remote Sens. 64(3), 328–333 (2009)
14. van den Heuvel, F.A.: 3D reconstruction from a single image using geometric constraints. ISPRS J. Photogramm. Remote Sens. 53(6), 354–368 (1998)
15. Zhang, D.-H., Liang, J., Guo, C.: Photogrammetric 3D measurement method applying to automobile panel. In: Proceedings of the 2nd International Conference on Computer and Automation Engineering (ICCAE), Singapore, February 2010, pp. 70–74
16. Egami, T., Oe, S., Terada, K., Kashiwagi, T.: Three dimensional measurement using color image and movable CCD system. In: Proceedings of the 27th Annual Conference of the IEEE Industrial Electronics Society, Denver, Colorado, USA, November 2001, pp. 1932–1936
17. Hsu, C.-C., Lu, M.-C., Wang, W.-Y., Lu, Y.-Y.: Three-dimensional measurement of distant objects based on laser-projected CCD images. IET Sci. Meas. Technol. 3(3), 197–207 (2009)
18. Aguilar, J.J., Torres, F., Lope, M.A.: Stereo vision for 3D measurement: accuracy analysis, calibration and industrial applications. Measurement 18(4), 193–200 (1996)
19. Feng, L., Xiaoyu, L., Yi, C.: An efficient detection method for rare colored capsule based on RGB and HSV color space. In: IEEE International Conference on Granular Computing, Noboribetsu, Japan, October 2014, pp. 175–175
20. Laganière, R.: OpenCV 2 Computer Vision Application Programming Cookbook. Packt Publishing, Birmingham (2011)
21. Jain, R., Kasturi, R., Schunck, B.G.: Machine Vision. McGraw-Hill, New York (1995)
22. Kim, B.S., Lee, S.H., Cho, N.I.: Real-time panorama canvas of natural images. IEEE Trans. Consum. Electron. 57(4), 1961–1968 (2011)
23. Su, J., Zhang, Y.: Integration of a plug-and-play desktop robotic system. Robotica 27, 403–409 (2009)
24. Nguyen, H.-N., Zhou, J., Kang, H.-J.: A calibration method for enhancing robot accuracy through integration of an extended Kalman filter algorithm and an artificial neural network. Neurocomputing 151, 996–1005 (2015)
25. DiCicco, M., Bajracharya, M., Nickels, K., Backes, P.: The EPEC algorithm for vision guided manipulation: analysis and validation. In: Proceedings of the IEEE Aerospace Conference, Big Sky, Montana, March 2007, pp. 1–11