
2016 13th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)

August 19-22, 2016 at Sofitel Xian on Renmin Square, Xian, China

A Tele-operation Interface with a Motion Capture System and a Haptic Glove
Sungman Park, Yeongtae Jung and Joonbum Bae
Department of Mechanical Engineering, UNIST, Ulsan, 44919, Korea
(Tel : +82-52-217-2335; E-mail: [email protected])

Abstract- Tele-operation systems have been developed to perform tasks in extreme environments which cannot be easily accessed by humans. However, non-intuitive control interfaces using only a keyboard or a joystick and wireless communication issues have prohibited the wide application of tele-operation systems. In this work, an intuitive operation interface using inertial measurement units (IMUs) and a haptic glove was proposed to control a six degrees of freedom (DOF) robot arm in a remote place. Using the joint angles measured by the IMUs, the wrist position of the human arm was obtained by forward kinematics, which was used to calculate the robot joint angles by inverse kinematics. Considering the workspaces of the robot and human arms, the robot joint angles were selected from the many feasible solutions of the inverse kinematics. Also, the robot hand was controlled by the finger joint angles measured by the haptic glove, which also delivered vibration to the fingers according to the grasp force of the robot hand. As a tele-communication method, the 3GPP (3rd Generation Partnership Project) Long Term Evolution (LTE) network and a virtual private network (VPN) were utilized. The performance of the proposed system was verified by experiments.

Keywords-Tele-operation, Motion capture, Human-robot interaction, Robot arm

1. INTRODUCTION

Robotic manipulators have been researched for industrial, educational and medical applications [1], [2]. By integrating tele-communication technologies with the robotic manipulator, there have been many trials to apply tele-operation systems to unpredictable, hazardous and inhospitable circumstances where humans cannot access, e.g., tasks in chemical or nuclear reactors [3], [4]. In a tele-operation system, controlling the robot manipulator with high accuracy through a tele-communication network is very important. However, application in such fields is hard to achieve due to non-intuitive control interfaces and tele-communication issues. Control interfaces using a keyboard and joystick have been used for the control of manipulators, but those interfaces require a long training time and make it hard to perform precise tasks [5], [6]. Wi-Fi based tele-communication networks are usually used to connect the user and the robot arm, but they cannot cover a large area.

In this paper, an intuitive operation interface using inertial measurement units (IMUs) and a haptic glove was proposed to control a six-DOF robot arm in a remote place. As a tele-communication method, the 3GPP (3rd Generation Partnership Project) Long Term Evolution (LTE) network and a virtual private network (VPN) were utilized. The performance of the proposed system was verified by experiments.

The remainder of this paper is organized as follows. The configuration of the tele-operation system is introduced in Section 2. Section 3 details the analysis of the inverse and forward kinematics of the robot arm and hand. Experiments on the performance of the proposed robot manipulator are introduced in Section 4. Conclusions and future works are presented in Section 5.

2. CONFIGURATION OF THE TELE-OPERATION SYSTEM

In this section, the concept of the proposed tele-operation system shown in Fig. 1 is introduced, which consists of the robot part and the control interface part.

2.1. A Tele-operated Robot

To mimic human-like manipulation ability, two robot arms and one omnidirectional camera were placed on top of the balancing robot. Two 6-DOF robot arms with 3-DOF robot hands were equipped to manipulate objects in the field. Geared DC motors (Dynamixel PRO and Dynamixel [7]) were used for the robot arms and hands. Two 200 W actuators were used at the shoulder, a 100 W actuator was used at the elbow and three 20 W actuators were used at the wrist. The motors were connected to the robot-side laptop with the RS-485 protocol. The position and torque of each actuator are measured in real time (a minimal read-out sketch is given at the end of Section 2). The maximum payload of the robot arm is about 3 kg and each actuator has a maximum speed of 180 deg/sec. The robot hand has three fingers and can exert 20 N of force at the fingertip.

To deliver the vision information, six cameras were combined to capture the entire surroundings of the robot with high-definition images. The omnidirectional camera monitors around the robot, and the captured images are stitched to remove overlapped parts and make a spherical image around the robot. The stitched image is sent to the user over the wireless network in real time. For the locomotion of the robot, a two-wheeled balancing system was applied.

2.2. A Tele-operation Interface

For intuitive control of the robot and monitoring of the situation around the robot, wearable devices were used. The IMUs were used to measure the movement of the human arm [8]. They were attached on each segment (the upper arm, forearm and wrist) and measure the movement of the segments.
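The actuator read-out described in Section 2.1 could look like the following minimal sketch, assuming the ROBOTIS Dynamixel SDK for Python. The serial device, baud rate and control-table address are placeholders that depend on the actual wiring and actuator model, which the paper does not specify.

```python
# Illustrative sketch: reading an actuator state over RS-485 with the ROBOTIS Dynamixel SDK.
# PORT_NAME, BAUD_RATE and ADDR_PRESENT_POSITION are assumptions; check the e-Manual of the
# Dynamixel PRO / Dynamixel model actually used.
from dynamixel_sdk import PortHandler, PacketHandler

PORT_NAME = "/dev/ttyUSB0"        # placeholder serial device of the RS-485 adapter
BAUD_RATE = 57600                 # placeholder baud rate
ADDR_PRESENT_POSITION = 611       # placeholder control-table address (model dependent)

port = PortHandler(PORT_NAME)
packet = PacketHandler(2.0)       # Dynamixel protocol 2.0
port.openPort()
port.setBaudRate(BAUD_RATE)

def read_present_position(dxl_id):
    """Read the 4-byte present-position register of one actuator."""
    value, comm_result, error = packet.read4ByteTxRx(port, dxl_id, ADDR_PRESENT_POSITION)
    if comm_result != 0 or error != 0:
        raise IOError("Dynamixel read failed (id=%d)" % dxl_id)
    return value
```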



Fig. 1: Concept of the tele-operation system.

Fig. 2: The wearable sensing glove for the finger position measurement and haptic feedback.

Fig. 3: Configuration of the human arm for derivation of the end-effector position.

Fig. 4: Configuration of the robot arm for the inverse kinematics.
To measure the finger motion, a wearable sensing glove was used, as shown in Fig. 2 [9]. The joint angles of the fingers are measured by linear potentiometers, flexible wires and linear springs. The glove also has vibrators at the fingertips so that feedback by stimulation is possible when the robot hand grasps objects. To see the view from the omnidirectional camera, a head mounted display (HMD) was used. The user can see the view from the robot as if looking around.

3. KINEMATIC ANALYSIS OF THE HUMAN ARM AND ROBOT ARM

In the proposed tele-operation system, the robot arm is controlled to follow the position of the human arm. In this section, the kinematic analysis of the human and robot arm is presented.

3.1. Wrist Position of the Human Arm

The wrist position of the human arm was calculated from the rotation matrices given by the IMU sensors and the arm model. Figure 3 shows the simplified kinematic model of a human arm. Using the rotation matrices, the vectors from the shoulder to the elbow and from the elbow to the wrist were calculated as follows:

p_se = R_s [0, 0, l_1]^T    (1)

p_ew = R_e [0, 0, l_2]^T    (2)

Then, the end-effector position of the human arm is derived as follows:

p = p_se + p_ew = (p_wx, p_wy, p_wz)    (3)

where p_se is the vector from the shoulder to the elbow seen from the shoulder, p_ew is the vector from the elbow to the wrist seen from the shoulder, R_s is the rotation matrix of the shoulder, R_e is the rotation matrix of the elbow, l_1 is the length of the upper arm and l_2 is the length of the forearm.
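To make the wrist-position computation of Eqs. (1)-(3) concrete, the sketch below converts the IMU orientations (reported as unit quaternions, as discussed in Section 3.2) into rotation matrices and sums the two segment vectors. The quaternion order (w, x, y, z) and the numeric segment lengths are assumptions for illustration; the paper does not list them.

```python
# Sketch of Eqs. (1)-(3): wrist position of the human arm from two IMU orientations.
import numpy as np

def quat_to_rot(q):
    """Rotation matrix of a unit quaternion q = (w, x, y, z) (assumed ordering)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def wrist_position(q_shoulder, q_elbow, l1=0.30, l2=0.25):
    """p = R_s [0, 0, l1]^T + R_e [0, 0, l2]^T; l1, l2 are placeholder segment lengths [m]."""
    p_se = quat_to_rot(q_shoulder) @ np.array([0.0, 0.0, l1])   # shoulder -> elbow, Eq. (1)
    p_ew = quat_to_rot(q_elbow) @ np.array([0.0, 0.0, l2])      # elbow -> wrist, Eq. (2)
    return p_se + p_ew                                          # Eq. (3)
```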
TABLE 1: DH parameters of the robot arm.

i      α_{i-1}   a_{i-1}   d_i   φ_i
T01    0         0         L1    θ_1
T12    π/2       0         0     θ_2
T23    0         L2        0     π/2 + θ_3
T34    π/2       0         L3    π/2 + θ_4
T45    π/2       0         0     π/2 + θ_5
T56    π/2       0         0     π/2 + θ_6

TABLE 2: The range of movement of the human arm.

Elbow (degree)     Flexion            0-140
Forearm (degree)   Pronation          0-90
                   Supination         0-90
Wrist (degree)     Extension          0-70
                   Flexion            0-80
                   Radial deviation   0-20
                   Ulnar deviation    0-30

3.2. Robot Arm Kinematics

The 6-DOF robot arm is analyzed by inverse and forward kinematics in this section. The joint angles of the robot arm for a desired position were calculated using the geometry of the robot arm, and a feasible solution was adopted among four solutions. The forward kinematics was developed to calculate the present position of the robot arm, so that verifying that the robot follows the human trajectory is possible.

A. Inverse Kinematics: Using the inverse kinematics with the end-effector position and orientation, the joint angles of the shoulder, elbow and wrist are derived [10], [11]. Figure 4 shows the configuration of the robot arm for the inverse kinematics. The position of the wrist is related to the joint angles of the shoulder and elbow, whereas the orientation of the hand is related to all joint angles. From the geometry of the robot arm, the components of the position vector are expressed as follows:

p_x = c_1 (a_2 c_2 + a_3 c_23)    (4)

p_y = s_1 (a_2 c_2 + a_3 c_23)    (5)

p_z = a_2 s_2 + a_3 s_23    (6)

By combining Eqs. (4), (5) and (6),

θ_3 = Atan2(s_3, c_3)    (7)

With the determined θ_3, it is possible to compute θ_2 by squaring and summing Eqs. (4) and (5) as follows:

(8)

With the determined θ_3 and θ_2, θ_1 is obtained as follows:

(9)

The joint angles of the robot arm were calculated through the inverse kinematics analysis with a given end-effector position. Four solutions are available according to the sign and direction of the angles: [θ_{3,I}, θ_{2,I}, θ_{1,I}], [θ_{3,I}, θ_{2,II}, θ_{1,II}], [θ_{3,II}, θ_{2,III}, θ_{1,I}], [θ_{3,II}, θ_{2,IV}, θ_{1,II}] (Fig. 5). Considering the range of movement of a human elbow (Table 2), two solutions ([θ_{3,II}, θ_{2,III}, θ_{1,I}] and [θ_{3,II}, θ_{2,IV}, θ_{1,II}]) were filtered out. The robot arm was designed to be longer than the human arm to cover the whole workspace of the human arm, which makes the robot arm always bent. The elbow of the robot arm is located in the opposite direction for the two remaining solutions, as shown in the upper part of Fig. 5. That is, to reach the end-effector position, the robot arm would have to pass through the body of the robot with one of the solutions. Thus, by calculating the positions of the robot elbow for the two solutions and comparing them with the position of the body, only the one solution which avoids the collision between the robot arm and the body was adopted.

Fig. 5: Four solutions from the inverse kinematics.

The IMUs give their orientation as a quaternion and as Euler angles, but Euler angles cause sign convention problems and alteration of the axes. For that reason, the quaternion was used in the motion capture system. However, the range of movement of the human wrist is less than 90 degrees, as shown in Table 2, so it can be expressed by Euler angles. Thus, the rotation matrix measured by the quaternion of the motion capture is the same as the rotation matrix obtained from the human wrist by Euler angles.

B. Forward Kinematics: To compare the measured position of the robot arm with the human arm position, the forward kinematics of the robot arm was analyzed. In this analysis, we used Denavit-Hartenberg (DH) parameters. Figure 6 shows the configuration of the robot arm and the coordinate axes of each joint for the DH parameters. Table 1 shows the DH parameters of each joint. Since all joint angles of the robot arm are measurable, the forward kinematics solution is fully defined by the homogeneous transformation matrix, which defines the position and orientation of the end-effector of the robot arm by Eqs. (10) and (11).

Fig. 6: Configuration of the robot arm for the forward kinematics.

T_{(i-1)i} = Rot(x, α_{i-1}) Trans(x, a_{i-1}) Trans(z, d_i) Rot(z, φ_i)    (10)

T_06 = T_01 T_12 T_23 T_34 T_45 T_56    (11)

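Since Table 1 and Eqs. (10)-(11) fully specify the forward kinematics, they translate almost directly into code. The sketch below builds each link transform in the order of Eq. (10) and chains them as in Eq. (11); the link lengths L1-L3 are left as symbolic parameters because their numeric values are not given in the paper.

```python
# Sketch of the forward kinematics of Eqs. (10)-(11) using the DH parameters of Table 1.
import numpy as np

def dh_transform(alpha, a, d, phi):
    """T_(i-1)i = Rot(x, alpha) Trans(x, a) Trans(z, d) Rot(z, phi), as in Eq. (10)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cp, sp = np.cos(phi), np.sin(phi)
    rot_x   = np.array([[1, 0, 0, 0], [0, ca, -sa, 0], [0, sa, ca, 0], [0, 0, 0, 1]])
    trans_x = np.eye(4); trans_x[0, 3] = a
    trans_z = np.eye(4); trans_z[2, 3] = d
    rot_z   = np.array([[cp, -sp, 0, 0], [sp, cp, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])
    return rot_x @ trans_x @ trans_z @ rot_z

def forward_kinematics(theta, L1, L2, L3):
    """End-effector pose T_06 from the six measured joint angles, following Table 1."""
    h = np.pi / 2
    rows = [  # (alpha_{i-1}, a_{i-1}, d_i, phi_i) for T01 ... T56
        (0, 0,  L1, theta[0]),
        (h, 0,  0,  theta[1]),
        (0, L2, 0,  h + theta[2]),
        (h, 0,  L3, h + theta[3]),
        (h, 0,  0,  h + theta[4]),
        (h, 0,  0,  h + theta[5]),
    ]
    T = np.eye(4)
    for alpha, a, d, phi in rows:
        T = T @ dh_transform(alpha, a, d, phi)
    return T  # T[:3, 3] is the end-effector position, T[:3, :3] its orientation
```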
Fig. 8: Least square fitting of the IP joint angle to a function of the MCP joint angle.

Fig. 7: Configurations of the robot hand. (a) Fully stretched robot hand. (b) Configuration of the robot hand for the analysis of the kinematics. (c) The four-bar linkage system of the robot hand.
C. Kinematics of the Robot Hand: Figure 7 shows the hand of the robot arm. The robot hand has three fingers, and each finger of the robot hand has only two joints (metacarpophalangeal (MCP) and interphalangeal (IP)), but the IP joint is coupled with the MCP joint by four-bar linkages (Fig. 7(c)). Because human fingers have more DOFs than the robot hand, DOF reduction is required in actuating the robot hand and for stable grasping movement. In this research, the distance between the thumb and the index finger was used, since it is the most important quantity for grasping objects.

All joint angles of the human fingers were measured by the sensing glove; thus, the distance between the thumb and the index finger, d, could be derived. Using this distance, the angles of the robot fingers were calculated. When the robot hand grasps objects, the user can feel vibration from the glove and can estimate the size of the object with the image from the camera.

Since the robot fingers are in a plane, the distance between the fingertips is the distance in the x-direction. Thus, only the x-elements of the position of each finger were considered in the derivation of the MCP joint angle of the robot hand. The distance between the fingertips is as follows:

d = 2(a sin(θ_MCP,max - θ_MCP) + a sin(θ_MCP,max - θ_MCP - θ_IP)) + h    (12)

where θ_MCP,max is 60 degrees, l is 10 cm, a is 5 cm and h is 4 cm. Since the two joint angles of each finger are coupled as shown in Fig. 7(c), the IP joint angle was expressed as a function of the MCP joint angle. To express the relation as a simple function, their relationship was linearized with the least square method as in Fig. 8, and the fitted equation is as follows:

d = -35.33 θ_MCP + 22.08    (13)

Fig. 9: Schematic of the wireless communication network.

3.3. Wireless Communication for the Tele-operation

Constructing a wireless communication network between a user and a robot at a distance is a challenging task. A private wireless network with Wi-Fi devices is preferred in many applications because of its fast data transfer rate and low latency. However, the coverage of a Wi-Fi router is less than 100 m even without any obstacle. In this research, a 3GPP LTE network, which provides a high data rate and a stable connection in most living spaces, was used for the wireless communication rather than constructing a private wireless network (Fig. 9). Since most tele-communication companies do not provide a registered internet protocol (IP) address to an LTE device, a VPN was introduced. As shown in Fig. 9, the robot-side computer accesses a VPN constructed on the human-side computer, and thereby data transfer is available in both directions using a private IP address assigned in the VPN. The user datagram protocol (UDP) was used for fast data transfer with low latency. The commands from the human user side were sent to the robot side every 10 ms, while the captured images from the robot side were sent to the user side every 40 ms after being compressed with the JPEG method. Some of the image data are often lost during the tele-communication when their size is larger than the maximum transfer unit (MTU). The lost images were substituted with the previous image.
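A minimal sketch of this UDP exchange, under assumed addresses, ports and packet layout (none of which are stated in the paper), could look as follows: the user side streams the desired joint angles every 10 ms, and the image receiver keeps showing the previous frame when a JPEG payload does not arrive intact.

```python
# Illustrative sketch of the UDP command / image streams of Section 3.3.
import socket
import struct
import time

ROBOT_ADDR = ("10.8.0.2", 5005)   # placeholder: private IP assigned inside the VPN, arbitrary port

def stream_commands(get_desired_joint_angles):
    """Send the six desired joint angles to the robot side every 10 ms."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        angles = get_desired_joint_angles()              # six joint angles [rad]
        sock.sendto(struct.pack("<6d", *angles), ROBOT_ADDR)
        time.sleep(0.01)                                 # 10 ms command period

def receive_images(handle_jpeg, port=5006):
    """Receive JPEG frames (sent every 40 ms); keep the previous frame if one is incomplete."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    last_frame = None
    while True:
        data, _ = sock.recvfrom(65535)
        # Update only on a complete JPEG payload (SOI/EOI markers present); otherwise keep
        # handing the previously received image to the display, as described in the paper.
        if data.startswith(b"\xff\xd8") and data.endswith(b"\xff\xd9"):
            last_frame = data
        if last_frame is not None:
            handle_jpeg(last_frame)
```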

4. TELE-OPERATION EXPERIMENTS

The robot arm is required to follow the user's motion well so that the user can control the robot arm as desired. Therefore, an experiment was set up to verify that the robot arm can follow the human motion without a large delay or a difference in the trajectory. In this experiment, the robot arm was controlled to follow the position of the user's wrist. The position of the user's wrist was measured at the human-side PC, and the data were sent to the robot-side PC over the LTE network. The data were converted into the joint angles of the robot arm at the robot-side PC using the inverse kinematics, and the robot arm was controlled to follow the desired joint angles. After that, the present joint angles of the robot arm were measured and converted into the present end-effector position by the forward kinematics. This position was transmitted to the human-side PC to compare with the present human wrist position.

The wrist positions were measured while the arm drew a circle, as shown in Fig. 10. The robot arm follows the human motion with no difference in position but with about 0.4 sec of delay, as shown in the figure. Since this delay contains the delays of two transmissions, the actual delay between the position of the user's arm and the robot arm was about 0.2 sec. A simple task was carried out: grasping a water bottle and pouring water into a cup. Calibration of the initial position, grasping the water bottle, pouring water into the cup and coming back to the initial position were performed sequentially (Fig. 11). As a result, the robot arm followed the user's motion well with haptic interaction, that is, vibration.

Fig. 10: Position tracking performance of the tele-operation system.
5. CONCLUSION

In this paper, an intuitive tele-operation interface for robot manipulators using IMUs and a haptic glove was introduced. The human and robot arms were kinematically analyzed for converting the motion of the human to the robot. In the experiment, the robot manipulator followed the movement of the human successfully through the 3GPP LTE-based tele-communication network. The user moved the arm using the transmitted image from the camera and the vibration feedback from the glove.

For future work, to improve the control of the robot arm, an immersive control interface, force control of the robot arm and force feedback will be researched. Also, an omnidirectional camera and a two-wheeled balancing system as the mobile platform will be integrated.

Fig. 11: Tele-operation experiments: grasping a water bottle and pouring water into a cup.

ACKNOWLEDGEMENT

This work was supported by the 2016 Research Fund (1.160005.01) of UNIST (Ulsan National Institute of Science and Technology).

REFERENCES

[1] M. Qassem, I. Abuhadrous, and H. Elaydi, "Modeling and simulation of 5 DOF educational robot arm," 2010 2nd International Conference on Advanced Computer Control (ICACC), vol. 5, pp. 569-574.
[2] R. H. Taylor, A. Menciassi, G. Fichtinger, and P. Dario, Springer Handbook of Robotics. Springer Berlin Heidelberg, 2008.
[3] I. R. Nourbakhsh, K. Sycara, M. Koes, M. Yong, M. Lewis, and S. Burion, "Human-robot teaming for search and rescue," IEEE Pervasive Computing, vol. 4, pp. 72-79.
[4] X. Z. Zheng, K. Tsuchiya, T. Sawaragi, K. Osuka, K. Tsujita, Y. Horiguchi, and S. Aoi, "Development of human-machine interface in disaster-purposed search robot systems that serve as surrogates for human," IEEE International Conference on Robotics and Automation, 2004, vol. 1, pp. 225-230.

[5] M. Schwarz, S. Behnke, and T. Rodehutskors, "Intuitive bimanual telemanipulation under communication restrictions by immersive 3D visualization and motion tracking," in Humanoid Robots (Humanoids), 2015, pp. 276-283.
[6] S. Kawamura and K. Ito, "A new type of master robot for teleoperation using a radial wire drive system," Intelligent Robots and Systems '93 (IROS '93), vol. 1, pp. 55-60.
[7] Robotis. (2016) Dynamixel. [Online]. Available: http://www.robotis.com/
[8] E2box. (2016) EBMotion. [Online]. Available: http://e2box.co.kr/
[9] Y. Park, J. Lee, and J. Bae, "Development of a wearable sensing glove for measuring the motion of fingers using linear potentiometers and flexible wires," IEEE Transactions on Industrial Informatics, vol. 11, pp. 198-206, 2015.
[10] B. Siciliano, L. Sciavicco, L. Villani, and G. Oriolo, Robotics: Modelling, Planning and Control. Springer, 2009.
[11] J. J. Craig, Introduction to Robotics: Mechanics and Control. Prentice Hall, 2004.

