
Evaluating motor and cognitive function through gaze-control technology

Primary Supervisor: Dr David Souto, Department of Neuroscience, Psychology and Behaviour

Secondary supervisor: Dr Simon Judge, University of Sheffield

PhD project title: Evaluating motor and cognitive function through gaze-control technology

University of Registration: University of Leicester

Project outline:

Gaze-control technology has proved invaluable in assisting people with conditions such as Cerebral Palsy (CP), at times dramatically improving their quality of life by extending their ability to communicate independently. Although there is a substantial literature on gaze control focusing on performance (e.g. gaze-typing speed), it generally ignores the goal-directed eye movements that underlie this performance. There is a wealth of physiological data (from eye movement, pupil and head movement tracking) that could be recorded during day-to-day use of gaze-control technology and used to infer cognitive function. An understanding of how these data relate to cognitive function would improve the ability to detect change in a range of conditions, whether deterioration or improvement due to the use of gaze-control technology [1, 2].

The aims of the project will be to:

(i) link behaviour recorded while using gaze-control technology to performance in standardised eye-movement tasks measuring motor control and cognitive function (e.g. memory, visual attention and executive control). We will do so by extracting low-level metrics (e.g. eye movement latency) and higher-level metrics (e.g. the coefficient K of ambient/focal attention) developed in investigations of scene exploration [e.g. 3]; a sketch of the coefficient K computation follows these aims.

(ii) use a bottom-up, machine learning-based approach to detect eye-movement features that are predictive of changes in standardised measures of cognitive function and motor control. This approach will be developed in collaboration with Professors Huiyu Zhou and Yu-Dong Zhang from the School of Informatics, University of Leicester.
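As an illustration of the higher-level metrics in aim (i), the sketch below computes the coefficient K of Krejtz et al. [3], which contrasts z-scored fixation durations with the z-scored amplitudes of the saccades that follow them; positive values suggest focal viewing (long fixations, short saccades) and negative values suggest ambient viewing. This is a minimal Python sketch, assuming fixation durations and saccade amplitudes have already been parsed from the eye-tracker output; the function and example values are illustrative rather than part of the project's fixed pipeline.

    # Minimal sketch of coefficient K (Krejtz et al., 2016): the mean
    # difference between each fixation's z-scored duration and the
    # z-scored amplitude of the saccade that follows it.
    import numpy as np

    def coefficient_k(fixation_durations, saccade_amplitudes):
        """fixation_durations[i] is followed by saccade_amplitudes[i]."""
        d = np.asarray(fixation_durations, dtype=float)
        a = np.asarray(saccade_amplitudes, dtype=float)
        n = min(len(d), len(a))            # pair each fixation with the next saccade
        d, a = d[:n], a[:n]
        z_d = (d - d.mean()) / d.std(ddof=1)
        z_a = (a - a.mean()) / a.std(ddof=1)
        return float(np.mean(z_d - z_a))

    # Example with made-up values (durations in ms, amplitudes in degrees)
    if __name__ == "__main__":
        durations = [180, 220, 450, 520, 160, 300]
        amplitudes = [6.0, 5.5, 1.2, 0.8, 7.1, 2.5]
        print(f"coefficient K = {coefficient_k(durations, amplitudes):.2f}")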

Specifically, there is a wealth of gaze data (eye and head movement) that could be used to detect changes in motor control indicative of clinically significant outcomes related to loss of mobility or cognition, which is especially relevant in rapidly evolving conditions. Prior studies of oculomotor function in clinical conditions have understandably used small numbers of trials to avoid testing fatigue [4], whereas recording everyday use of a gaze-controlled interface would generate several times this amount of data in a single day, greatly increasing the ability to detect change reliably.
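To illustrate the bottom-up approach in aim (ii), the sketch below fits a regression model to per-session gaze features and reports cross-validated predictive accuracy together with feature importances. The feature names, the placeholder data and the use of scikit-learn are assumptions for illustration only; in practice the inputs would be per-day summaries of everyday gaze-control use and the targets would be scores on standardised cognitive and motor tasks.

    # Minimal sketch of the bottom-up approach in aim (ii): predict a
    # standardised score from per-session gaze features and inspect which
    # features carry predictive weight. Data are placeholders.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import KFold, cross_val_score

    rng = np.random.default_rng(0)

    feature_names = ["saccade_latency_ms", "fixation_duration_ms",
                     "coefficient_k", "head_movement_rms"]

    # Placeholder data: one row per recording session
    X = rng.normal(size=(120, len(feature_names)))
    y = X @ np.array([0.6, -0.3, 0.4, -0.2]) + rng.normal(scale=0.5, size=120)

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    cv = KFold(n_splits=5, shuffle=True, random_state=0)
    scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
    print(f"cross-validated R^2: {scores.mean():.2f} +/- {scores.std():.2f}")

    model.fit(X, y)
    for name, importance in zip(feature_names, model.feature_importances_):
        print(f"{name}: {importance:.2f}")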

In this project, we will adopt a multidisciplinary approach combining gaze-control technology, optometry, experimental psychology, artificial intelligence, and augmentative and alternative communication (AAC) technology. The student will work with a multidisciplinary team, meshing the latest AAC and machine learning technology, with the potential to deliver clinically relevant insights, assist users with interface use, and inform the design of communication interfaces that are resilient to disease progression and to variability between individuals and between conditions.

References:

  1. Souto D, Marsh O, Paterson KB. Use-dependent plasticity in assistive interfaces: Gaze-typing improves inhibitory control. Poster presented at the European Conference on Eye Movements, Alicante, Spain, 2019.
  2. Nelles G, Pscherer A, De Greiff A, Forsting M, Gerhard H, Esser J, Diener HC. Eye-movement training-induced plasticity in patients with post-stroke hemianopia. Journal of Neurology 256: 726–733, 2009.
  3. Krejtz K, Duchowski A, Krejtz I, Szarkowska A, Kopacz A. Discerning ambient/focal attention with coefficient K. ACM Transactions on Applied Perception (TAP) 13: 11, 2016.
  4. Anderson TJ, MacAskill MR. Eye movements in patients with neurodegenerative disorders. Nature Reviews Neurology 9: 74–85, 2013.

BBSRC Strategic Research Priority: Understanding the Rules of Life: Neuroscience and Behaviour

Techniques that will be undertaken during the project:

• Eye-tracking
• Computational modelling
• Machine learning

Contact: Dr David Souto, University of Leicester