Vision-Based Virtual Touch Screen Interface: Kohl2
I. INTRODUCTION
It is inarguable that augmented reality (AR) enriches the real world by adding virtual objects, and its application areas are expanding rapidly. This has led to intensive research in human-computer interaction (HCI). However, existing approaches have intrinsic limitations in ubiquitous service environments, which demand convenience and mobility. Examples include restricting AR to a particular setting, i.e. a certain table, requiring too many markers, and the need for markers inconveniently attached to the user's body [1]-[3].
Fine-grained interaction for controlling mobile equipment likewise requires convenience, accuracy, and naturalness. Although much recent research has been devoted to fulfilling these requirements, the results are still unsatisfactory for mobile environments. For example, [4], [5] use mechanical devices that directly measure the motions and spatial positions of a hand. They require wearing glove-type devices that are wire-connected and constrain the naturalness and comfort with which the user can interact with the computer.
To resolve these problems, we propose a virtual touch screen interface that uses a see-through head-mounted display (HMD) equipped with stereo cameras and an adaptive gesture recognition algorithm. Since it provides a natural way to interact with machines or computers through vision-based select, click, and drag-and-drop operations on the virtual touch screen, it ensures mobility and convenience. It allows users to control portable devices such as personal digital assistants (PDAs) or ultra-mobile personal computers (UMPCs) with one hand during their daily life.
Fig. 1. A see-through HMD equipped with stereo cameras and the virtual touch screen display algorithm
Second, the interface offers natural and comfortable HCI without requiring any cumbersome interaction devices. Because the system recognizes the color, shape, and movement of the hand, users need no additional devices.
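As a rough illustration of the color-based hand detection mentioned above, the sketch below classifies pixels by thresholding in HSV space. The hue and saturation bounds are assumptions chosen for illustration; the paper does not state its actual segmentation thresholds.

```python
import colorsys

# Hypothetical HSV skin-color bounds (illustrative only; not from the paper).
HUE_MAX = 50 / 360.0           # skin hues cluster near red/orange
SAT_MIN, SAT_MAX = 0.15, 0.70  # exclude gray and over-saturated pixels

def is_skin(r, g, b):
    """Classify one RGB pixel (0-255 channels) as skin-colored."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h <= HUE_MAX and SAT_MIN <= s <= SAT_MAX and v > 0.2

def hand_mask(frame):
    """Binary mask over a frame given as rows of (r, g, b) tuples."""
    return [[is_skin(*px) for px in row] for row in frame]

# Toy 2x2 frame: skin-toned pixels in the left column, background on the right.
frame = [
    [(210, 160, 120), (20, 40, 200)],
    [(200, 150, 110), (15, 200, 30)],
]
mask = hand_mask(frame)
```

In a real system the resulting mask would be cleaned with morphological filtering before the fingertip is located, and the thresholds would be tuned (or adapted online) to lighting conditions.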
In future work, we plan to employ differential-image and optical-flow techniques to improve gesture recognition accuracy, especially against complex backgrounds. We further intend to realize a vision-based interface that lets users use both hands without any real patch images. This will naturally reinforce the mobility and convenience of our existing interface.
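The differential-image technique mentioned above can be sketched as simple frame differencing; the grayscale toy frames and the change threshold below are illustrative assumptions, not values from the paper:

```python
def frame_difference(prev, curr, threshold=25):
    """Mark pixels whose grayscale intensity (0-255) changed by more
    than `threshold` between two frames given as lists of rows."""
    return [
        [abs(c - p) > threshold for p, c in zip(prow, crow)]
        for prow, crow in zip(prev, curr)
    ]

# Toy 2x3 frame pair: the bright "hand" column moves one pixel right.
prev = [[200, 50, 50],
        [200, 50, 50]]
curr = [[50, 200, 50],
        [50, 200, 50]]
motion = frame_difference(prev, curr)
# Only the columns the hand left and entered register motion.
```

Restricting the color-based hand search to the motion mask is one plausible way such differencing could suppress skin-like clutter in a complex background.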
REFERENCES
[1] H. Kato, M. Billinghurst, I. Poupyrev, K. Imamoto, and K. Tachibana,
Fig. 3. Concept (a) and example (b) of interaction with the virtual touch screen
III. IMPLEMENTATION
We use a UMPC, a VAIO VGN-UX50, for the implementation. This device has a 1.06 GHz Intel Core Solo CPU and 512 MB of main memory.
The implemented system detects and tracks the user's hand position and locates the index fingertip. It offers the select, click, and drag-and-drop operations. If the distance between the fingertip and the HMD matches the distance between the virtual touch screen and the HMD for a short time (shorter than 1 second), this is regarded as a select operation. If the selected location is on an icon, the UMPC executes the
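The select operation above depends on the fingertip depth recovered from the stereo cameras. A minimal sketch follows, assuming standard stereo triangulation (Z = fB/d) and a simple dwell timer; the focal length, baseline, depth tolerance, and dwell duration are illustrative assumptions, not the paper's calibrated values.

```python
def fingertip_depth(focal_px, baseline_m, x_left, x_right):
    """Triangulate fingertip depth from stereo disparity: Z = f * B / d.
    Units: pixels for the focal length and image x-coordinates,
    metres for the baseline and the returned depth."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("fingertip must be in front of both cameras")
    return focal_px * baseline_m / disparity

class DwellSelector:
    """Fire a 'select' once the fingertip depth stays within
    `tolerance` of the virtual-screen depth for `dwell_s` seconds."""
    def __init__(self, screen_depth_m, tolerance=0.02, dwell_s=1.0):
        self.screen_depth = screen_depth_m
        self.tolerance = tolerance
        self.dwell = dwell_s
        self._enter_t = None  # time the fingertip reached the screen plane

    def update(self, depth_m, t):
        """Feed one depth sample at time t (seconds); return True on select."""
        on_plane = abs(depth_m - self.screen_depth) <= self.tolerance
        if not on_plane:
            self._enter_t = None
            return False
        if self._enter_t is None:
            self._enter_t = t
        return t - self._enter_t >= self.dwell

# Example: 700 px focal length, 6 cm baseline, 84 px disparity -> 0.5 m.
depth = fingertip_depth(focal_px=700, baseline_m=0.06, x_left=300, x_right=216)
selector = DwellSelector(screen_depth_m=0.5, dwell_s=1.0)
```

A click or drag-and-drop gesture could be layered on the same state machine by tracking whether the fingertip stays on the plane while its (x, y) position moves.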