Autism is a developmental disorder that affects as many as 1 out of every 59 children in the United States. One of its symptoms is difficulty acquiring social skills, along with trouble recognizing and responding appropriately to social cues. Part of the problem is that it’s incredibly difficult for a parent to know which interventions are beneficial, therapists are incredibly expensive, and practitioners are few and far between. But what about robots?
Robots are also few and far between, currently, but that could change. Ayanna Howard, a researcher at Georgia Tech, has been using interactive humanoid robots to guide children with autism on ways to socially and emotionally engage with others, as a supplement to therapy.
Dr. Howard is the founder of Zyrobotics, whose tagline is “Freedom through technology.” The freedom in question is the liberation of a person who is unable to communicate verbally or to share, into a person who can interact with others and is aware of some typical social cues.
So, these robots? In a recent study, Howard and other researchers wanted to understand how robots could help children interpret sensory experiences. The study had 18 participants between the ages of 4 and 12: five diagnosed with autism and 13 who were neurotypical.
Two androids were programmed to express boredom, excitement, nervousness, and 17 other emotional states. Children would go around to stations set up for hearing, seeing, smelling, tasting, and touching, and the robots would model socially appropriate responses. The idea is that children would see the modeled behavior and learn to imitate it.
At the IEEE VIC Summit 2019 she spoke as part of a panel on “Ethics in AI – Impacts of (Anti?) Social Robotics.” “If a child’s expression is one of happiness or joy, the robot will have a corresponding response of encouragement,” Howard says. “If there are aspects of frustration or sadness, the robot will provide input to try again.” The study suggested that many children with autism exhibit stronger levels of engagement when the robots interact with them at such sensory stations.
Which is great – making more interventions available to more people, earlier, can only be a good thing. Personally, I’d like to see the results replicated with larger samples, and I’d like some answers about what would keep a robot’s cloud up and running.
I keep this in mind because, earlier this year, Jibo, a robot initially meant to work with people with autism and to be a human’s companion, began to be crippled as its servers went dark.
Anki, maker of machine-learning-powered race cars that debuted in an Apple keynote and sold in Apple and Target retail stores, let its customers know that its servers were going to go dark, too.
We need a plan for robots that rely on the cloud, especially robots intended for human companionship, to continue working when their companies fail. How devastating is it when one of these companions is lobotomized as its cloud is taken down?
We need more research into using robots to teach social skills. And we need more companies to avoid the harm they cause when their robots go dark.