Multimodal Human Activity Recognition for Smart Healthcare Applications

Document Type

Conference Proceeding

Publication Title

2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC)


Abstract

Human Activity Recognition (HAR) has emerged as a prominent research topic for smart healthcare owing to the rapid growth of wearable and smart devices in recent years. Significant applications of HAR in ambient assisted living environments include monitoring the daily activities of elderly and cognitively impaired individuals and assisting them by observing their health status. In this research, we present a deep learning-based fusion approach for multimodal HAR that combines different data modalities to obtain robust outcomes. Convolutional Neural Networks (CNNs) extract high-level features from the image data, and a Convolutional Long Short-Term Memory (ConvLSTM) network captures significant patterns from the multi-sensor data. The features extracted from the two modalities are then fused through a self-attention mechanism that enhances relevant activity information and suppresses superfluous, potentially confusing information by measuring their compatibility. Finally, extensive experiments were performed to measure the efficiency and robustness of the developed fusion approach on the UP-Fall detection dataset. The experimental findings show that the proposed fusion technique outperforms existing state-of-the-art methods. © 2022 IEEE.
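The fusion step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the feature dimension, the single-token-per-modality layout, the random projection weights, and the mean pooling are all assumptions chosen to show the idea of scaled dot-product self-attention weighing two modality feature vectors by their compatibility.

```python
import numpy as np


def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def self_attention_fusion(img_feat, sensor_feat, rng=np.random.default_rng(0)):
    """Fuse two modality feature vectors via scaled dot-product self-attention.

    img_feat    : (d,) vector, stand-in for CNN image features.
    sensor_feat : (d,) vector, stand-in for ConvLSTM sensor features.
    Returns a single fused (d,) representation.
    """
    d = img_feat.shape[0]
    tokens = np.stack([img_feat, sensor_feat])            # (2, d): one token per modality
    # Hypothetical random projections; in a trained model these are learned.
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d))                  # (2, 2) compatibility scores
    attended = attn @ V                                   # modality tokens re-weighted by relevance
    return attended.mean(axis=0)                          # pooled fused representation


# Usage with random stand-in features.
rng = np.random.default_rng(1)
img = rng.standard_normal(64)
sen = rng.standard_normal(64)
fused = self_attention_fusion(img, sen)
print(fused.shape)  # (64,)
```

The attention matrix here plays the role the abstract ascribes to the mechanism: rows that assign a higher weight to one modality's value vector amplify it in the fused output, while low-compatibility tokens are down-weighted.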


Keywords

Convolutional Long Short-Term Memory, Convolutional Neural Network, Human Activity Recognition, Self-Attention, Smart Healthcare

