Dynamic Hand Gesture Recognition for Mobile Systems Using Deep LSTM

  • We present a pipeline for recognizing dynamic freehand gestures on mobile devices, based on depth information obtained from a single Time-of-Flight sensor. Hand gestures are recorded with a mobile 3D sensor, transformed frame by frame into an appropriate 3D descriptor, and fed into a deep LSTM network for recognition. As a recurrent neural model, the LSTM is well suited to classifying explicitly time-dependent data such as hand gestures. For training and testing, we create a small database of four hand gesture classes, each comprising 40 × 150 3D frames. We conduct experiments on execution speed on a mobile device, generalization capability as a function of network topology, and classification ability 'ahead of time', i.e., before the gesture is completed. Recognition rates are high (>95%) and can be maintained in real time, as a single classification step requires less than 1 ms of computation, making freehand gesture interaction viable on mobile systems.

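The abstract describes the pipeline only at a high level. The following is a minimal, hypothetical sketch (not the authors' implementation) of a deep LSTM classifier over per-frame 3D descriptors in PyTorch. Only the sequence length of 150 frames and the four gesture classes come from the abstract; the descriptor dimensionality, hidden size, and layer count are assumptions.

```python
# Hypothetical sketch, not the authors' code: a stacked ("deep") LSTM that
# classifies a sequence of per-frame 3D descriptors into four gesture classes.
import torch
import torch.nn as nn

class GestureLSTM(nn.Module):
    def __init__(self, descriptor_dim=512, hidden_dim=128, num_layers=2, num_classes=4):
        super().__init__()
        # descriptor_dim, hidden_dim, num_layers are assumed values.
        self.lstm = nn.LSTM(descriptor_dim, hidden_dim,
                            num_layers=num_layers, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, time, descriptor_dim). Returning class scores at every
        # time step allows 'ahead of time' classification before the gesture ends.
        out, _ = self.lstm(x)
        return self.classifier(out)

# Example: a batch of 8 gesture sequences, 150 frames each.
model = GestureLSTM()
frames = torch.randn(8, 150, 512)
logits = model(frames)                          # (8, 150, 4)
final_prediction = logits[:, -1].argmax(dim=1)  # decision after the full gesture
```

Reading out the prediction at an earlier time step (e.g. `logits[:, t]`) corresponds to the 'ahead of time' evaluation mentioned in the abstract; the per-step classification cost is what the paper reports as below 1 ms.
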
Metadata
Author:Ayanava Sarkar, Alexander Gepperth, Uwe Handmann, Thomas Kopinski
DOI:https://doi.org/10.1007/978-3-319-72038-8_3
ISBN:978-3-319-72038-8
Parent Title (English):Intelligent Human Computer Interaction. IHCI 2017. Lecture Notes in Computer Science
Publisher:Springer
Document Type:Conference Proceeding
Language:English
Year of Completion:2017
Release Date:2019/07/02
Issue:vol. 10688
Page Number:13
First Page:19
Last Page:31
Institutes:Fachbereich 1 - Institut Informatik
DDC class:600 Technology, medicine, applied sciences / 600 Technology
Licence:Creative Commons - CC BY - Attribution 4.0 International