Towards automation of dynamic-gaze video analysis taking functional upper-limb tasks as a case study

Alyaman, M, Sobuh, M, Zaid, A, Kenney, LPJ, Galpin, AJ and Al-Taee, M 2021, 'Towards automation of dynamic-gaze video analysis taking functional upper-limb tasks as a case study', Computer Methods and Programs in Biomedicine, 203, p. 106041.

PDF - Accepted Version
Available under License Creative Commons Attribution Non-commercial No Derivatives 4.0.



Background and objective: Previous studies in motor control have provided clear evidence that gaze behavior (where someone looks) quantifies the attention paid to performing actions. However, deriving clinically meaningful results from gaze data has so far been done manually, a process that is tedious, time-consuming, and highly subjective. This paper studies the feasibility of automating the coding of gaze data, taking functional upper-limb tasks as a case study.

Methods: This is achieved by developing a new algorithm that codes the collected gaze data in three main stages: data preparation, data processing, and output generation. The input data, in the form of a crosshair and a gaze video, are converted into a 25 Hz frame-rate sequence. Keyframes and non-keyframes are then extracted and processed using a combination of image-processing techniques and a fuzzy logic controller. For each trial, the location and duration of gaze fixations on the areas of interest (AOIs) are obtained. Once the gaze data are coded, they can be presented in different forms and formats, including a stacked color bar.

Results: The results show that the developed coding algorithm agrees closely with the manual coding method while being significantly faster and less prone to unsystematic errors. Statistical analysis showed Cohen's kappa ranging from 0.705 to 1.0. Moreover, based on the intra-class correlation coefficient (ICC), the agreement index between the computerized and manual coding methods was found to be (i) 0.908 with 95% confidence interval (0.867, 0.937) for the anatomical hand and (ii) 0.923 with 95% confidence interval (0.888, 0.948) for the prosthetic hand. A Bland-Altman plot also showed all data points closely scattered around the mean. These findings confirm the validity and effectiveness of the developed coding algorithm.

Conclusion: The developed algorithm demonstrates that it is feasible to automate the coding of gaze data, reduce coding time, and improve the reliability of the coding process.
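The Results section reports Cohen's kappa as the agreement index between the automated and manual codings. As an illustrative sketch (not the authors' code), kappa can be computed from two frame-by-frame AOI label sequences; the AOI names below are hypothetical examples:

```python
# Cohen's kappa between two frame-by-frame AOI codings (illustrative sketch).
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two equal-length sequences of categorical labels."""
    if len(coder_a) != len(coder_b) or not coder_a:
        raise ValueError("need two non-empty sequences of equal length")
    n = len(coder_a)
    # Observed agreement: fraction of frames given identical AOI labels.
    po = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's label marginals.
    ca, cb = Counter(coder_a), Counter(coder_b)
    pe = sum(ca[label] * cb[label] for label in ca) / n**2
    if pe == 1.0:
        return 1.0  # both coders use one identical label throughout
    return (po - pe) / (1 - pe)

# Example: two codings of five video frames (hypothetical AOI names).
manual    = ["hand", "hand", "object", "object", "hand"]
automated = ["hand", "object", "object", "object", "hand"]
print(round(cohens_kappa(manual, automated), 3))  # → 0.615
```

Kappa corrects the raw percentage agreement for agreement expected by chance, which is why it is preferred over simple accuracy when comparing two coders.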

Item Type: Article
Schools: School of Health and Society > Centre for Health Sciences Research
Journal or Publication Title: Computer Methods and Programs in Biomedicine
Publisher: Elsevier
ISSN: 0169-2607
Depositing User: Professor Laurence Kenney
Date Deposited: 08 Mar 2021 09:39
Last Modified: 07 Mar 2022 02:30
