
Toward Kinecting cognition by behaviour recognition-based deep learning and big data

Soufian, M; Nefti-Meziani, S; Drake, J



Authors

M Soufian

S Nefti-Meziani

J Drake



Abstract

The majority of older people wish to live independently at home for as long as possible, despite having a range of age-related conditions including cognitive impairment. To facilitate this, there has been extensive focus on exploring the capability of new technologies, with limited success. This paper investigates whether MS Kinect (a motion-sensing 3-D scanner device), used within the MiiHome (My Intelligent Home) project in conjunction with other sensory data, machine learning and big data techniques, can assist in the diagnosis and prognosis of cognitive impairment and hence prolong independent living. A pool of Kinect devices and various sensors, powered by minicomputers providing internet connectivity, are being installed in up to 200 homes. This enables continuous remote monitoring of elderly residents living alone. Passive, off-the-shelf sensor technologies were chosen for data acquisition, specifically from sources that are part of the fabric of the homes, so that no extra effort is required from the participants. Various constraints, including environmental, geometrical and big data constraints, were identified and appropriately dealt with. A visualization tool (MAGID) was developed for validation and verification of numerous behavioural activities. A subset of data from twelve pensioners aged over 65 with age-related cognitive decline and frailty was then collected over a period of six months. These data were subjected to several machine learning algorithms (multilayer perceptron neural network, neuro-fuzzy and deep learning) for classification and for extracting routine behavioural patterns. These patterns were then analysed further to ascertain any health-related information and their attributes. For the first time, important routine behaviour related to the Activities of Daily Living (ADL) of elderly people with cognitive and physical decline has been learnt by machine learning techniques from selected sample data obtained by MS Kinect.
Medically important behaviours, e.g. eating, walking and sitting, were best learnt by deep learning, with an accuracy of 99.30% during the training stage and an average error rate of 1.83% (maximum 12.98%) during the implementation phase. Observations obtained by applying the learnt behaviour models are presented as trends over time. These trends, supplemented by other sensory signals, have provided a clearer picture of the physical (in)activities (including falls) of the pensioners. The calculated behavioural attributes related to key indicators of health events can be used to model the trajectory of health status associated with cognitive decline in a home setting. These results, based on a small number of elderly residents over a short period of time, imply that indicators of cognitive decline can be found within the data obtained from the MiiHome project. However, further studies are needed for full clinical validation of these indications, in conjunction with assessment of the participants' cognitive decline.
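The core classification task described in the abstract maps per-window features derived from Kinect skeleton data to activity labels such as eating, walking and sitting. The paper itself used multilayer perceptron, neuro-fuzzy and deep learning models; the minimal sketch below substitutes a nearest-centroid classifier on entirely synthetic two-dimensional features, purely to illustrate the shape of the task. All feature values, labels and thresholds are hypothetical and not taken from the study.

```python
# Illustrative sketch only: the study used MLP, neuro-fuzzy and deep
# learning models; a nearest-centroid classifier stands in here to show
# the task shape (Kinect skeleton features -> activity label).
# All feature values and labels are synthetic.

import math

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, centroids):
    """Return the label whose centroid is nearest (Euclidean) to sample."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Hypothetical per-window features, e.g. (mean hip height, mean hand speed)
training = {
    "sitting": [[0.45, 0.05], [0.50, 0.04]],
    "walking": [[0.95, 0.30], [0.90, 0.35]],
    "eating":  [[0.50, 0.20], [0.55, 0.25]],
}
centroids = {label: vecs for label, vecs in
             ((lbl, centroid(v)) for lbl, v in training.items())}

print(classify([0.92, 0.33], centroids))  # a walking-like window -> walking
```

In a realistic pipeline the features would be windowed statistics over tracked skeleton joints, and the classifier would be replaced by one of the trained models the paper evaluates.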

Citation

Soufian, M., Nefti-Meziani, S., & Drake, J. (2020). Toward Kinecting cognition by behaviour recognition-based deep learning and big data. Universal Access in the Information Society, 21, 33-51. https://doi.org/10.1007/s10209-020-00744-5

Journal Article Type Article
Acceptance Date Jun 25, 2020
Online Publication Date Sep 26, 2020
Publication Date Sep 26, 2020
Deposit Date Oct 8, 2020
Publicly Available Date Oct 8, 2020
Journal Universal Access in the Information Society
Print ISSN 1615-5289
Publisher Springer-Verlag
Volume 21
Pages 33-51
DOI https://doi.org/10.1007/s10209-020-00744-5
Publisher URL https://doi.org/10.1007/s10209-020-00744-5
Related Public URLs https://link.springer.com/journal/10209
Additional Information Grant Number: R119333
