Finger Gesture Tracking From Arbitrary Hand Position Using Smart Skin Technology and Deep Learning

Date of Award

4-30-2024

Document Type

Thesis

Degree Name

Master of Science in Machine Learning

Department

Machine Learning

First Advisor

Dr. Hava Siegelmann

Second Advisor

Dr. Bin Gu

Abstract

The exploration of new approaches to finger gesture recognition has been a subject of extensive study, with particular interest in leveraging Electromyography (EMG). The allure of EMG lies in its potential utility in applications where visual imaging is not practical. Until recently, EMG-based recognition relied heavily on hand position, presenting intricate challenges in distinguishing between overlapping patterns associated with different positions. This study proposes a novel solution, employing a transformer model and a motion sensor-based training approach to achieve unprecedented accuracy and utility. We tested our approach on eight subjects performing 10 distinct finger gestures in arbitrary hand positions. We designed a dedicated dual-branch transformer architecture with the following features: (1) it utilizes spatio-temporal patches of the input to decode the intricate relationship between the spatial and temporal features present in EMG; (2) it maps the temporal and spectral representations of the signal to obtain a holistic feature map; (3) it aligns the feature maps of time and frequency by introducing a time-frequency loss term into the training objective. With this, the model was able to make robust predictions under varying hand positions, dramatically enhancing the applicability of EMG to finger gesture recognition over previously reported approaches.
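The time-frequency loss term described in point (3) can be sketched as a classification loss plus an alignment penalty between the two branches' feature maps. The sketch below is illustrative only, not the thesis's actual implementation: the names (`time_feats`, `freq_feats`, `lambda_tf`), the choice of mean-squared error as the alignment term, and the weighting scheme are all assumptions.

```python
import numpy as np

def cross_entropy(logits, label):
    """Softmax cross-entropy for a single example (numerically stable)."""
    shifted = logits - logits.max()
    log_probs = shifted - np.log(np.exp(shifted).sum())
    return -log_probs[label]

def time_frequency_loss(logits, label, time_feats, freq_feats, lambda_tf=0.1):
    """Hypothetical combined objective: classification loss plus an MSE
    term that pulls the time-branch and frequency-branch feature maps
    toward a shared representation."""
    cls_loss = cross_entropy(logits, label)
    align_loss = np.mean((time_feats - freq_feats) ** 2)
    return cls_loss + lambda_tf * align_loss

# Example: 10 gesture classes, 64-dim feature maps from each branch
rng = np.random.default_rng(0)
logits = rng.normal(size=10)
time_feats = rng.normal(size=64)
freq_feats = rng.normal(size=64)
loss = time_frequency_loss(logits, 3, time_feats, freq_feats)
```

When the two branches produce identical feature maps, the alignment term vanishes and the objective reduces to the classification loss alone, which is the intended fixed point of such an alignment scheme.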

Comments

Thesis submitted to the Deanship of Graduate and Postdoctoral Studies

In partial fulfilment of the requirements for the M.Sc. degree in Machine Learning

Advisors: Hava Siegelmann, Bin Gu

With a 1-year embargo period

This document is currently not available here.
