Intelligent Recognition of Multimodal Human Activities for Personal Healthcare
Authors
S R Sannasi Chakravarthy
N Bharanidharan
V Vinoth Kumar
T R Mahesh
Dr Surbhi Bhatia Khan (S.Khan138@salford.ac.uk), Lecturer
Ahlam Almusharraf
Eid Albalawi
Abstract
Nowadays, advancements in wearable consumer devices have given them a predominant role in healthcare. There is a persistent demand for robust recognition of heterogeneous human activities in complicated IoT environments, and the knowledge attained by these recognition models can then be combined with healthcare applications. Accordingly, this paper proposes a novel deep learning framework to recognize heterogeneous human activities using multimodal sensor data. The proposed framework is composed of four phases: dataset acquisition and preprocessing, implementation of the deep learning model, performance analysis, and application development. The paper utilizes the recent KU-HAR database, which covers eighteen different activities performed by 90 individuals. After preprocessing, a hybrid model integrating the Extreme Learning Machine (ELM) and Gated Recurrent Unit (GRU) architectures is used. An attention mechanism is then included to further enhance the robustness of human activity recognition in the IoT environment. Finally, the performance of the proposed model is evaluated and comparatively analyzed against conventional CNN, LSTM, GRU, ELM, Transformer, and Ensemble algorithms. To this end, an application is developed using the Qt framework that can be deployed on any consumer device. In this way, the research sheds light on how healthcare professionals can remotely monitor the activities of critical patients. The proposed ELM-GRUaM model achieved superior performance in recognizing multimodal human activities, with an overall accuracy of 96.71% compared with existing models.
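For a concrete picture of the kind of architecture the abstract describes, the following is a minimal, illustrative sketch in Python/Keras of a GRU encoder with an additive attention layer feeding an ELM-style classification head (a wide hidden layer with frozen random weights, so only the final classifier is trained). The window length, channel count, and layer sizes below are assumptions for illustration only, not the configuration published in the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Illustrative dimensions (assumptions, not the paper's settings):
# KU-HAR-style windows of tri-axial accelerometer + gyroscope data,
# 18 activity classes.
TIMESTEPS, CHANNELS, NUM_CLASSES = 300, 6, 18

inputs = layers.Input(shape=(TIMESTEPS, CHANNELS))

# GRU encoder producing one feature vector per timestep.
seq = layers.GRU(64, return_sequences=True)(inputs)

# Additive attention over time: score each timestep, normalize,
# then take the attention-weighted sum of the GRU features.
scores = layers.Dense(1, activation="tanh")(seq)        # (batch, T, 1)
weights = layers.Softmax(axis=1)(scores)                # weights over time
context = layers.Multiply()([seq, weights])             # reweight features
context = layers.Lambda(lambda t: tf.reduce_sum(t, axis=1))(context)  # (batch, 64)

# ELM-style head: a wide hidden layer with frozen random weights
# (trainable=False), mirroring ELM's fixed random feature mapping;
# only the softmax classifier on top is learned.
elm_features = layers.Dense(256, activation="sigmoid", trainable=False)(context)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(elm_features)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```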
Citation
Sannasi Chakravarthy, S. R., Bharanidharan, N., Vinoth Kumar, V., Mahesh, T. R., Khan, S. B., Almusharraf, A., & Albalawi, E. (2024). Intelligent Recognition of Multimodal Human Activities for Personal Healthcare. IEEE Access, 1-1. https://doi.org/10.1109/access.2024.3405471
Journal Article Type | Article |
---|---|
Acceptance Date | May 22, 2024 |
Publication Date | Jun 14, 2024 |
Deposit Date | Jun 18, 2024 |
Publicly Available Date | Jun 18, 2024 |
Journal | IEEE Access |
Publisher | Institute of Electrical and Electronics Engineers |
Peer Reviewed | Peer Reviewed |
Pages | 1-1 |
DOI | https://doi.org/10.1109/access.2024.3405471 |
Files
Published Version (2.7 MB, PDF)
Publisher Licence URL: http://creativecommons.org/licenses/by-nc-nd/4.0/