Abstract—In this work we study a followable user interface
(FUI): a user interface whose screen follows the motion of the
user's hand. The system projects a UI screen that tracks the
position of the user's palm. As a means of implementing such a
system, we examine a method that combines an LCD projector with a
depth sensor. In this paper we discuss methods for using the
depth sensor to detect the position of the user's hand and for
controlling the projection of the UI screen accordingly. We also
discuss a method for detecting gestures in which the user places
one hand on top of the other and makes sliding motions.
Experiments with a prototype built on these methods show that the
system can correctly project the UI to follow the motion of the
user's hand, identify gestures, and switch the UI based on this
input.
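As an illustration only (the paper's actual algorithm and parameters are not given in this abstract), the sketch below shows one plausible way a depth frame could be segmented to estimate palm position: keep pixels in a working depth range, isolate a thin band around the nearest surface, and take its centroid. The frame layout, the 400–1200 mm range, and the 80 mm band are all assumptions.

import numpy as np

def detect_palm_position(depth_frame, near_mm=400, far_mm=1200):
    """Estimate palm position as the centroid of the nearest depth
    region within a working range. All thresholds are assumed values,
    not parameters from the paper."""
    valid = (depth_frame > near_mm) & (depth_frame < far_mm)
    if not valid.any():
        return None  # no hand candidate in range
    # Keep only pixels close to the nearest valid depth: a crude
    # stand-in for hand segmentation.
    nearest = depth_frame[valid].min()
    hand = valid & (depth_frame < nearest + 80)  # 80 mm band, assumed
    ys, xs = np.nonzero(hand)
    return xs.mean(), ys.mean()  # palm centroid in pixel coordinates

The resulting pixel coordinates would then need to be mapped into projector coordinates (e.g., via a calibration homography) so that the projected UI screen lands on the detected palm.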
Index Terms—Hand tracking, projection mapping, depth
sensor, followable user interface.
Takuya Yamaguchi was with the Department of Human Information Systems,
Faculty of Science and Engineering, Teikyo University, Tochigi, Japan
(e-mail: yamaguchi07t@gmail.com).
Kozo Mizutani and Masayuki Arai are with the Department of
Information and Electronic Engineering, Faculty of Science and
Engineering, Teikyo University, Tochigi, Japan (e-mail:
mizutani@ics.teikyo-u.ac.jp, arai@ics.teikyo-u.ac.jp).
Cite: Takuya Yamaguchi, Kozo Mizutani, and Masayuki Arai, "A Study of Followable User Interface to Hand Behavior," International Journal of Knowledge Engineering vol. 1, no. 3, pp. 240-243, 2015.