Towards automation of dynamic-gaze video analysis taking functional upper-limb tasks as a case study

Authors: M. Alyaman; M. Sobuh; A. Zaid; Prof. Laurence Kenney (L.P.J.Kenney@salford.ac.uk); Dr Adam Galpin (A.J.Galpin@salford.ac.uk); M. Al-Taee
Abstract
Background and objective: Previous studies in motor control have yielded clear evidence that gaze behavior (where someone looks) quantifies the attention paid to performing actions. However, eliciting clinically meaningful results from gaze data has so far been done manually, making the process tedious, time-consuming, and highly subjective. This paper studies the feasibility of automating the coding of gaze data, taking functional upper-limb tasks as a case study.
Methods: This is achieved by developing a new algorithm that codes the collected gaze data in three main stages: data preparation, data processing, and output generation. The input data, in the form of a crosshair and a gaze video, are converted into a frame sequence at a 25 Hz frame rate. Keyframes and non-keyframes are then obtained and processed using a combination of image-processing techniques and a fuzzy logic controller. For each trial, the location and duration of gaze fixations on the areas of interest (AOIs) are obtained. Once the gaze data are coded, they can be presented in different forms and formats, including stacked color bars.
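At its core, the coding task described above amounts to classifying each gaze sample against the AOIs and aggregating per-AOI dwell times at the 25 Hz frame rate. The sketch below illustrates only that idea in highly simplified form: the AOI bounding boxes, function names, and data layout are illustrative assumptions, not the paper's implementation (which combines image processing with a fuzzy logic controller).

```python
# Simplified illustration of frame-by-frame gaze coding against AOIs.
# AOIs are modeled here as axis-aligned bounding boxes; gaze samples
# are (x, y) pixel coordinates sampled at the video frame rate.

FRAME_RATE = 25  # frames per second, matching the 25 Hz sequence

def code_gaze(samples, aois):
    """Label each gaze sample with the AOI it falls inside ('none' otherwise)."""
    labels = []
    for x, y in samples:
        hit = "none"
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                hit = name
                break
        labels.append(hit)
    return labels

def fixation_durations(labels):
    """Total dwell time (in seconds) per AOI across one trial."""
    durations = {}
    for label in labels:
        durations[label] = durations.get(label, 0.0) + 1.0 / FRAME_RATE
    return durations
```

For example, with a hypothetical "hand" AOI at (0, 0, 100, 100), 25 consecutive samples inside it would accumulate one second of dwell time; the resulting per-AOI durations could then be rendered as a stacked color bar per trial.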
Results: The results showed that the developed coding algorithm agrees closely with the manual coding method while being significantly faster and less prone to unsystematic errors. Statistical analysis showed that Cohen's kappa ranges from 0.705 to 1.0. Moreover, based on the intra-class correlation coefficient (ICC), the agreement between the computerized and manual coding methods is (i) 0.908 with a 95% confidence interval of (0.867, 0.937) for the anatomical hand and (ii) 0.923 with a 95% confidence interval of (0.888, 0.948) for the prosthetic hand. A Bland-Altman plot also showed that all data points are closely scattered around the mean. These findings confirm the validity and effectiveness of the developed coding algorithm.
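The Cohen's kappa statistic used above to compare computerized and manual coding follows a standard formula: observed agreement corrected for chance agreement. The sketch below is a minimal illustration of that formula, assuming two equal-length sequences of frame labels (one per coder); it is not code from the paper.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two equal-length label sequences.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement rate and p_e the agreement expected by chance from
    each coder's marginal label frequencies.
    """
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    counts_a = Counter(coder_a)
    counts_b = Counter(coder_b)
    p_e = sum(counts_a[k] * counts_b.get(k, 0) for k in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)
```

Identical label sequences give kappa = 1.0 (perfect agreement, the top of the range reported above), while agreement no better than chance gives kappa = 0.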
Conclusion: The developed algorithm demonstrates that it is feasible to automate the coding of gaze data, reducing the coding time and improving the reliability of the coding process.
Citation
Alyaman, M., Sobuh, M., Zaid, A., Kenney, L., Galpin, A., & Al-Taee, M. (2021). Towards automation of dynamic-gaze video analysis taking functional upper-limb tasks as a case study. Computer Methods and Programs in Biomedicine, 203, 106041. https://doi.org/10.1016/j.cmpb.2021.106041
Journal Article Type | Article |
---|---|
Acceptance Date | Mar 3, 2021 |
Online Publication Date | Mar 7, 2021 |
Publication Date | May 1, 2021 |
Deposit Date | Mar 8, 2021 |
Publicly Available Date | Mar 7, 2022 |
Journal | Computer Methods and Programs in Biomedicine |
Print ISSN | 0169-2607 |
Publisher | Elsevier |
Volume | 203 |
Pages | 106041 |
DOI | https://doi.org/10.1016/j.cmpb.2021.106041 |
Publisher URL | https://doi.org/10.1016/j.cmpb.2021.106041 |
Related Public URLs | https://www.journals.elsevier.com/computer-methods-and-programs-in-biomedicine |
Files
Manuscript(CMPB-D-20-00596)_R2_Published.pdf (746 Kb, PDF)
Licence: http://creativecommons.org/licenses/by-nc-nd/4.0/
Publisher Licence URL: http://creativecommons.org/licenses/by-nc-nd/4.0/