Researching Game Design with Eye-Gaze Tracking Technologies for Children with Profound Disabilities
Updated: May 12, 2023
I had the opportunity to participate in some ground-breaking research at the University of Toronto over the past five years. I was a research assistant on a team investigating the use of eye-gaze tracking technology in educational games and communication software designed for children with profound disabilities. You can view details of the study and our conclusions in the peer-reviewed paper we published.
Background
One of the tremendous advantages of innovations in technology has been their capacity to enable people with a wide range of abilities to interact with software that facilitates communication. This is especially true for people with cognitive deficits and/or physical disabilities.
Yet even as we make great strides, there is still a tendency to leave behind those with more complex disabilities, because little is known or understood about the ways in which they learn. Our aim in this study was to examine one mode of communication that is coming into its own for non-verbal children with limited use of their hands: eye-gaze tracking software coupled with voice output.
We wanted to analyze the interplay between the three units involved in the communicative exchange. They are:
The Student - Children with multiple disabilities whose cognitive and physical impairments are coupled with speech impediments. Four of the 12 children we studied had a condition called Rett syndrome; the rest had other conditions such as cerebral palsy and seizure disorder. Eye gaze has been identified as a promising mode of communication for this group in previous studies that we cited in our own. This raises the potential that eye-tracking technology could reveal intentionality in the child's gaze; in other words, it could help confirm whether or not the child was deliberately concentrating their gaze on a certain object in order to communicate.
The Teacher (Communicative Partner) - This is the person with whom the child is communicating, usually a teacher or educational assistant. Previous research indicates the partner's communication style has some significance, with a more enthusiastic, prompting affect increasing the success of communication. This is still debated, and there is literature supporting a more neutral affect. Surveys also showed that familiarity with the communicative partner improved outcomes. But there is concern that reliance on a small number of familiar partners may cost the child autonomy, limiting them to communicating only when a familiar partner is available.
Eye-Gaze Tracking Software - Analog methods for eye-gaze communication exist, such as low-tech choice boards, but these often require a partner to decode the child's gaze. In electronic eye-gaze tracking systems, the task of decoding is left to the software, and because the system also has voice output, the child can initiate communication by looking at the screen and gazing intentionally at an object. The communicative partner can focus on the conversation, and the child gains agency.
Research Questions
We formulated our three research questions around four measures of effectiveness:
Research Question 1: To what extent can eye-tracking devices be used for assisted communication? We measured session length in minutes and seconds, taking longer sessions to indicate greater interest and motivation.
Research Question 2: What role does the communicative partner's voice affect or style play in communication outcomes? We examined interactions in both neutral (dull and monotonous) and affected (excited and engaged) tones.
Research Question 3: How does familiarity with the communicative partner impact engagement? We measured the amount of contact time the child had with the communicative partner prior to the start of our study.
Participants
We had 12 participants with a range of cognitive and/or developmental disabilities, aged between 4 and 12. All of them met our inclusion criteria: difficulty using their hands, little to no spoken language ability, and functional vision (glasses were allowed).
Method
Prior to the start of the research, the team received ethics clearance from the Toronto District School Board and the University of Toronto. We also obtained written consent from the parents.
We collected data over a three-month period, twice a week, at a school in downtown Toronto. We used a Tobii Dynavox eye-tracker system loaded with software designed to assist children in various tasks, including facial recognition and the targeting and identification of objects. We had one all-in-one system with the eye-tracking bar built into the device, and a second system improvised by attaching an eye-gaze tracker to a laptop on a portable stand.
The participants used software designed to assist children with profound disabilities in communicating using objects with labels and phrases. The user could select an object on the screen by concentrating their gaze on it, and the system would respond with an auditory output of the object's name.
The data were collected in a setting familiar to the children, their classroom, during regular school hours.
We started by setting a baseline for the children's communication abilities using an online tool called the Communication Matrix, which is used to track changes in communication for children with disabilities. Our premise was to assess where each child was on the matrix before using the eye-gaze software and then reassess at the end of the study to see whether their scores improved. An improving trend would suggest some correlation between the eye-gaze software and improved communication abilities for the children.
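To make the pre/post comparison concrete, here is a minimal sketch of how per-child score changes could be tallied. The IDs and scores are entirely hypothetical (the study's actual per-child scores are not published in this post), and this is an illustration, not our analysis code:

```python
from statistics import mean

# Hypothetical pre/post Communication Matrix scores, keyed by
# anonymized child ID (illustrative values only).
pre_scores  = {"C01": 42, "C02": 55, "C03": 38, "C04": 61}
post_scores = {"C01": 48, "C02": 55, "C03": 45, "C04": 66}

def score_changes(pre, post):
    """Per-child score deltas (post minus pre)."""
    return {cid: post[cid] - pre[cid] for cid in pre}

deltas = score_changes(pre_scores, post_scores)
improved = sum(1 for d in deltas.values() if d > 0)
print(f"{improved} of {len(deltas)} children improved; "
      f"mean change = {mean(deltas.values()):+.1f}")
```

With real assessments, the same tally distinguishes children who improved from those whose scores were unchanged, which is how the results below are reported.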
Data Collection
We collected screenshots of heatmaps produced by the software, along with videos of the children playing the games or communicating with partners. All of the recordings were stored online in a password-protected assessment tool for teachers. Data were anonymized by assigning a random ID to each child.
Coding and Analysis
We coded the data by selecting a sample of videos from the study and analyzing them independently to identify all the variables that could help us test our hypotheses. We created a codebook based on our results and used it to code the remaining videos. The resulting data were analyzed using IBM SPSS.
We also transcribed a sample of 7 videos, including descriptions of the surroundings and any gestures, movements, or facial expressions the children made. We analyzed them using a framework for assessing multi-modal interactions.
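As an illustration of the kind of two-group comparison we ran in SPSS on session lengths (the session durations below are made-up values, not the study's data), a Welch-style t statistic can be sketched with Python's standard library:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples
    (no equal-variance assumption)."""
    return (mean(a) - mean(b)) / sqrt(
        variance(a) / len(a) + variance(b) / len(b))

# Hypothetical session lengths in seconds, grouped by the
# partner's voice affect (illustrative values only).
neutral   = [88.0, 95.5, 101.2, 79.3, 100.8]
inflected = [140.2, 155.7, 162.4, 133.9, 150.8]

print(f"mean neutral   = {mean(neutral):6.1f} s")
print(f"mean inflected = {mean(inflected):6.1f} s")
print(f"Welch's t      = {welch_t(inflected, neutral):.2f}")
```

In practice the significance test (degrees of freedom and p-value) came from SPSS; the sketch only shows the shape of the comparison between the neutral- and inflected-tone conditions.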
Results
Communication Matrix assessments: 11 of the 12 children completed Communication Matrix assessments before and after the trial. Eight of the children showed improvements in their scores and one showed no change. We also found an increase in session times. Teachers reported that typical communication with non-eye-gaze methods lasts 35-50 seconds; with the eye-gaze tracking software, sessions lasted an average of 154.54 seconds. Voice affect significantly influenced the length of interactions. Sessions where the communicative partner used a neutral tone of voice lasted an average of 92.96 seconds, compared to 148.61 seconds when the partner used an inflected tone. In terms of partner familiarity, we found no statistically significant difference in session length between familiar and unfamiliar partners.
Conclusions
In response to research question 1, our results show that children with profound disabilities and complex communication needs are able to engage in longer, more meaningful exchanges with communicative partners, be they familiar or unfamiliar. Our measurements on the Communication Matrix showed that children gained communication skills.
With respect to research question 2, voice affect was found to be a strong determining factor in communication outcomes across the triad of child, device, and communicative partner. There is still debate in the literature on this topic, but our research showed that when teachers used inflected tones, students were more expressive, experienced less frustration, and were visibly enthusiastic.
Finally, for research question 3 we assessed the impact of familiarity on communication outcomes. Our results showed that while sessions with familiar partners were on average longer, the difference was not statistically significant. Both familiar and unfamiliar partners were able to interpret communicative acts.
Overall, our research participants demonstrated increased joint attention and reciprocal communication. This is exciting to me because the eye-tracking software increased the students' agency and gave them back so much control that it motivated them to communicate more. Theirs was a triadic relationship between student, communicative partner, and software. Changing some conditions and controlling for others helped us present a case for the best possible environment and methods for getting the most out of this exciting new technology.