Improving AAC for ASD
By David Mahmarian / May 10, 2016
In 2007, a severely autistic, nonverbal woman named Amanda Baggs published a video on YouTube describing the experience of living as a person with autism. In the video she types:
“However the thinking of people like me is only taken seriously if we learn your language, no matter how we previously thought or interacted. It is only when I type something in your language that you refer to me as having communication.”
She goes on to explain how she thinks and experiences life, describing the purpose of her behaviors, and continues:
“We are even viewed as non-communicative if we don’t speak the standard language, but other people are not considered non-communicative if they are so oblivious to our own languages as to believe they don’t exist.”
The video shattered the perception that people with autism who could not speak were not aware.
Currently, the rate of diagnosis for Autism Spectrum Disorder (ASD) in the United States is 1 in 68 children. With rates this high, most people have a relative or know someone who is affected by it. In 1990 the rate of autism was 1 in 5,000, and it was considered a rare disorder. Because of that, there was minimal research into the causes of autism, and little was known about how to effectively teach children with the disorder. One of the core deficits in autism is communication: difficulties can range from idiosyncratic or repetitive use of language to having no verbal language at all. The use of augmentative communication for individuals with autism continues to be an emerging field. As technology continues to develop, the possibilities for innovative communication devices and methods also increase.
There are many types of AAC (Augmentative and Alternative Communication), including sign language, gestures, picture symbols, and speech-generating devices. These methods have proven useful for people with any type of disability that impairs communication, such as cerebral palsy and autism, and have also helped those with degenerative diseases such as ALS and Parkinson’s disease. Difficulty with communication impacts quality of life in many ways and creates barriers to accessing education and community services. An individual’s ability to express their needs, wants, thoughts, and feelings is vital to being human and connecting with others.
Historically, many of the employed techniques came from other fields and were designed for people with various disabilities. Many of the applications that currently exist to help people with ASD communicate were designed by speech-language pathologists and engineers. Human-centered design has not, until recently, started to become a part of the process, and companies like AssistiveWare have started investing in building UX teams.
Proloquo2Go and TouchChat are currently the two biggest communication applications on the market, and both operate similarly. Communication begins on the home screen, which displays different categories; within those categories are choices that allow the user to construct sentences. These choices are represented by the default icons in the system and can also be customized. Unfortunately, there is a steep learning curve for the user and the parent or caretaker to set up and customize the app to fit the needs of the individual. While both of these applications have improved the lives of many people with ASD, there are many on the spectrum who have been excluded.
What has proven to be very successful when designing products and services for people with ASD is adaptability. Joy Jensen is a certified equine-assisted therapist who works at Opening Gaits Therapeutic Riding Society in Calgary, where they work with nonverbal autistic individuals using therapeutic horse riding. She says, “Horses mimic the way we walk, it’s very rhythmic, it’s like a white noise for people with autism that can calm and help focus. We usually see better communication and ability to stay on task.” She believes in adaptability in their process and said that they will even match the horse to the individual based on how rhythmic that specific horse is. When asked how she thought technology could better help people with autism communicate, she said, “We need to start with the way an autistic person thinks and interacts with society, then develop the technology that makes their interaction with the world understandable to society at large.”
We also see this approach used in the different robots and applications that are being designed to teach social skills. Robots4Autism has developed a robot called Milo that teaches kids to understand emotions and expressions as well as appropriate responses.
Stanford Medicine has created the Autism Glass Project that is focused on using Google Glass and game mechanics to help people respond in social situations. Both of these projects use facial recognition and adapt to the user’s environment.
Game designers are also starting to see the potential of video games to treat cognitive disorders. Akili Interactive has been working on an autism-specific video game that adapts in real time based on the way the user plays. For the past few years they have researched and tested what different people on the spectrum respond positively to, and created different versions of the game within the game that are triggered by the way the user plays. The UI is constantly changing and responding in a closed-loop cybernetic system.
A New Approach
Although the current communication applications for ASD have given a voice to many people who were never able to speak, there is still room for improvement. The current system is based on a linear category model in which the goal of the system is to construct thoughts and trigger a voice output (see diagram below).
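To make the linear category model concrete, here is a minimal sketch of how such an app is organized: categories on a home screen, word choices inside each category, and selections accumulating into a sentence that finally triggers a voice output. The class, category names, and method names are illustrative assumptions, not the real APIs of Proloquo2Go or TouchChat.

```python
# Hypothetical sketch of a linear category model for an AAC app.
# All names here are illustrative, not any real product's API.

CATEGORIES = {
    "actions": ["want", "go", "stop"],
    "food": ["apple", "juice", "sandwich"],
    "feelings": ["happy", "tired", "hurt"],
}

class LinearAACBoard:
    def __init__(self, categories):
        self.categories = categories
        self.sentence = []

    def home(self):
        """Home screen: list the available categories."""
        return list(self.categories)

    def open_category(self, name):
        """Show the word choices inside one category."""
        return self.categories[name]

    def select(self, word):
        """Add a chosen word to the sentence being constructed."""
        self.sentence.append(word)

    def speak(self):
        """The goal of the system: trigger a voice output."""
        utterance = " ".join(self.sentence)
        self.sentence = []
        return utterance  # in a real app this would feed a text-to-speech engine

board = LinearAACBoard(CATEGORIES)
board.select("want")   # chosen from the "actions" screen
board.select("juice")  # chosen from the "food" screen
print(board.speak())   # want juice
```

Note how the structure itself is static: the categories and choices never change based on who is using the board, which is exactly the limitation the adaptive approach below tries to address.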
How could we incorporate the learnings and insights of designers, roboticists, and therapists into a communication product that could reach an even greater number of people? What if there was an application where the UI responded in real-time and adapted based on what the system is learning about the user?
Using a first-order cybernetic model (see diagram below), we can visualize how this could work, where the goal of the system is to display appropriate content based on the ability of the user. It could measure and compare variables such as the selections made, the speed of interactions, and the time spent in different categories to trigger a targeted UI. The complexity of the interface would adapt to a version that matches the ability of the user. The system could get smarter over time by learning the complexity of the interactions and recognizing when a mistake is made.
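The feedback loop described above can be sketched in a few lines of code. This is a simplified illustration under stated assumptions: it measures two of the suggested signals (selection speed and mistakes), and the three complexity tiers and thresholds are invented for the example, not taken from any real product.

```python
# A minimal sketch of the first-order cybernetic loop: measure the
# user's interactions, compare them against the goal, and adjust the
# UI complexity. Tiers and thresholds are illustrative assumptions.

class AdaptiveUI:
    LEVELS = {1: 4, 2: 9, 3: 16}  # complexity level -> icons shown per screen

    def __init__(self):
        self.level = 1
        self.samples = []  # list of (seconds_to_select, was_mistake)

    def record_selection(self, seconds, mistake=False):
        """Measure each interaction; adapt after every 10 selections."""
        self.samples.append((seconds, mistake))
        if len(self.samples) >= 10:
            self._adapt()

    def _adapt(self):
        """Compare measured behavior against the goal and adjust the UI."""
        times = [s for s, _ in self.samples]
        mistakes = sum(1 for _, m in self.samples if m)
        avg = sum(times) / len(times)
        # Fast, accurate use -> the user is ready for more choices.
        if avg < 2.0 and mistakes <= 1 and self.level < 3:
            self.level += 1
        # Slow or error-prone use -> simplify the screen.
        elif (avg > 5.0 or mistakes >= 4) and self.level > 1:
            self.level -= 1
        self.samples = []  # close the loop and keep measuring

    def icons_per_screen(self):
        return self.LEVELS[self.level]

ui = AdaptiveUI()
for _ in range(10):          # ten quick, correct selections...
    ui.record_selection(1.2)
print(ui.icons_per_screen())  # ...and the UI offers a richer screen
```

The design point is that the content displayed is an output of the loop, not a fixed configuration a caretaker must build by hand; the same mechanism could just as easily step the complexity back down when the user struggles.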
We have only recently stopped focusing our efforts on teaching autistic people to be like us and started thinking about how to create things that empower them to be comfortable in their world and play to their strengths. There is now a dire need for people with ASD to have adaptable solutions based on their individual needs and wishes, making it easier for them to interact with the world. It is the responsibility of designers to make certain that these needs and wishes are not left out of the design process. It is likely that we will begin to see more people on the spectrum designing solutions for other autistic individuals in the near future.