Reach Prediction Using Finger Motion Dynamics

Valkov, Dimitar; Kockwelp, Pascal; Daiber, Florian; Krüger, Antonio

Abstract in edited proceedings (conference) | Peer reviewed

Abstract

The ability to predict which object the user intends to grasp, or to recognize the one they are already holding, offers essential contextual information and may help to mitigate the effects of point-to-point latency in interactive environments. This paper investigates the feasibility and accuracy of recognizing un-instrumented objects based on hand kinematics during reach-to-grasp and transport actions. In a data collection study, we recorded the hand motions of 16 participants while they reached out to grasp and then moved real and synthetic objects. Our results demonstrate that even a simple LSTM network can predict the time point at which the user grasps an object with 23 ms precision, and the current distance to it with a precision better than 1 cm. The target's size can be determined in advance with an accuracy better than 97%. Our results have implications for the design of adaptive, fine-grained interactive user interfaces in ubiquitous and mixed-reality environments.
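The abstract describes feeding per-frame hand kinematics through a simple LSTM to regress quantities such as distance-to-target or time-to-grasp. As an illustration only (the paper's actual feature set, network size, and training setup are not given here), the following dependency-free sketch shows the core LSTM recurrence over a sequence of invented hand-motion features, with a trivial linear readout standing in for the regression head:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step.

    x: input features at this frame (length d)
    h, c: previous hidden and cell state (length n)
    W (4n x d), U (4n x n), b (4n): stacked gate parameters,
    in the order input, forget, cell-candidate, output.
    """
    n = len(h)
    z = [sum(W[r][j] * x[j] for j in range(len(x)))
         + sum(U[r][j] * h[j] for j in range(n))
         + b[r]
         for r in range(4 * n)]
    i = [sigmoid(v) for v in z[:n]]           # input gate
    f = [sigmoid(v) for v in z[n:2 * n]]      # forget gate
    g = [math.tanh(v) for v in z[2 * n:3 * n]]  # cell candidate
    o = [sigmoid(v) for v in z[3 * n:]]       # output gate
    c_new = [f[k] * c[k] + i[k] * g[k] for k in range(n)]
    h_new = [o[k] * math.tanh(c_new[k]) for k in range(n)]
    return h_new, c_new

# Hypothetical per-frame features (e.g. fingertip aperture, wrist speed,
# wrist height) -- random stand-ins here, with untrained random weights.
random.seed(0)
d, n, T = 3, 4, 10
W = [[random.uniform(-0.5, 0.5) for _ in range(d)] for _ in range(4 * n)]
U = [[random.uniform(-0.5, 0.5) for _ in range(n)] for _ in range(4 * n)]
b = [0.0] * (4 * n)

h, c = [0.0] * n, [0.0] * n
for t in range(T):
    x = [random.uniform(-1.0, 1.0) for _ in range(d)]
    h, c = lstm_step(x, h, c, W, U, b)

# In a trained model, a learned linear readout on h would produce the
# distance or time-to-grasp estimate; a plain average stands in here.
distance_estimate = sum(h) / n
```

In practice such a model would be trained with a framework like PyTorch or Keras on the recorded motion sequences; this sketch only makes the recurrence itself concrete.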

Details about the publication

Publisher: ACM
Book title: Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems
Page range: 1–8
Article number: 242
Publishing company: ACM Press
Place of publication: New York, NY, USA
Title of series: CHI EA '23
Status: Published
Release year: 2023
Language in which the publication is written: English
Conference: CHI Conference on Human Factors in Computing Systems, Hamburg, Germany
ISBN: 9781450394222
DOI: 10.1145/3544549.3585773
Link to the full text: https://doi.org/10.1145/3544549.3585773
Keywords: hand gesture; grasp prediction; datasets; neural networks

Authors from the University of Münster

Kockwelp, Pascal
Institute for Geoinformatics (ifgi)