
Dynamic Hand Gesture Recognition for Mobile Systems Using Deep LSTM

  • We present a pipeline for recognizing dynamic freehand gestures on mobile devices, based on depth information extracted from a single Time-of-Flight sensor. Hand gestures are recorded with a mobile 3D sensor, transformed frame by frame into an appropriate 3D descriptor, and fed into a deep LSTM network for classification. As a recurrent neural model, the LSTM is particularly well suited to classifying explicitly time-dependent data such as hand gestures. For training and testing, we create a small database of four hand gesture classes, each comprising 40 × 150 3D frames. We conduct experiments on execution speed on a mobile device, on generalization capability as a function of network topology, and on classification ability ‘ahead of time’, i.e., before the gesture is completed. Recognition rates are high (>95%) and maintainable in real time, as a single classification step requires less than 1 ms of computation time, making freehand gesture interfaces viable on mobile systems.
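The classification stage described above can be sketched as a small deep LSTM over per-frame descriptors. This is a hypothetical illustration, not the authors' implementation: the descriptor dimension (600), hidden size, and layer count are assumptions; only the four gesture classes and the roughly 150-frame sequences come from the abstract.

```python
import torch
import torch.nn as nn

class GestureLSTM(nn.Module):
    """Deep LSTM classifier over sequences of per-frame 3D descriptors.

    Hyperparameters below are illustrative assumptions, not taken
    from the paper.
    """

    def __init__(self, descriptor_dim=600, hidden_dim=128,
                 num_layers=2, num_classes=4):
        super().__init__()
        self.lstm = nn.LSTM(descriptor_dim, hidden_dim,
                            num_layers=num_layers, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, frames, descriptor_dim)
        out, _ = self.lstm(x)
        # Classify from the last time step; reading the head at earlier
        # steps would give 'ahead of time' predictions before the
        # gesture is completed.
        return self.head(out[:, -1, :])

model = GestureLSTM()
frames = torch.randn(1, 150, 600)  # one gesture: 150 descriptor frames
logits = model(frames)             # shape: (1, 4) class scores
```

Because the LSTM emits a hidden state at every frame, the same network supports per-frame intermediate classification, which matches the abstract's sub-millisecond per-step claim in spirit.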



Author: Ayanava Sarkar, Alexander Gepperth, Uwe Handmann, Thomas Kopinski
Parent Title (English): Intelligent Human Computer Interaction. IHCI 2017. Lecture Notes in Computer Science
Document Type: Conference Proceeding
Year of Completion: 2017
Release Date: 2019/07/02
Issue: vol. 10688
Page Number: 13
First Page: 19
Last Page: 31
Institutes: Fachbereich 1 - Institut Informatik
DDC class: 600 Technology, medicine, applied sciences / 600 Technology
Licence (German): Creative Commons - CC BY - Attribution 4.0 International