“Let’s do something about it!” There’s no waiting around for someone else to fix things for these guys; they’re taking matters into their own hands, from skiing to raise money for the leprosy-afflicted to rewriting an unbiased history of India and Pakistan, from recycling tyres and pushing solar energy to solutions for the hearing impaired and making earthquake relief count. Motherland picks out a bunch of promising superheroes from across the country, determined to light up the world we live in.
During our short phone conversation, we overhear Arsh Dilbagi’s father prompting him, with an understandable sense of pride, to tell us how much he scored in his board exams. (He got 95.2%, in case you’re wondering. And a perfect score in computer science.) Arsh, who’s 17 and graduated from high school only last year, seems, by all accounts, to be a bit of a savant — he’s designing robots now (!) with Aribo Labs, a robotics-focused subsidiary of the start-up Aribo. Our topic of discussion, though, is TALK, an assistive device — an Augmentative and Alternative Communication (AAC) device — that helps people with speech impairments and developmental disabilities speak. With TALK, Arsh was one of 15 finalists at the Google Science Fair in 2014.
It works entirely through one’s breath: the person using it gives two distinguishable kinds of exhalation — short and long — into a MEMS microphone positioned in front of the mouth or under the nose. Its size is comparable to a smartphone, Arsh tells us, and the breaths are interpreted as the dots and dashes of Morse code. The code is then synthesised into speech using the nine voices available on TALK, spanning not only male and female voices but also varying accents and age groups. It’s currently a prototype in beta-testing, and the company plans to license it to manufacturers. It’s already been tested by 35 users, and Arsh has also worked with neurologists and ENT specialists during this phase. While its launch date is as yet uncertain, subject to a range of variables including testing, licensing, and manufacturing, Arsh says it might hit the Indian market by 2017.
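For readers curious how a breath-to-Morse pipeline of this kind might work in principle, here is a minimal sketch. Everything in it — the duration threshold, the way breaths are represented, the function names — is an illustrative assumption for demonstration, not a detail of TALK’s actual implementation.

```python
# Illustrative sketch: classify exhale durations as Morse dots/dashes,
# then decode groups of symbols into letters. All parameters are assumed.

MORSE_TO_CHAR = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

# Assumed cutoff (in milliseconds) between a "short" and a "long" exhale.
SHORT_BREATH_MAX_MS = 400


def breaths_to_symbols(breath_durations_ms):
    """Map each exhale's measured duration to a Morse dot or dash."""
    return "".join(
        "." if d <= SHORT_BREATH_MAX_MS else "-" for d in breath_durations_ms
    )


def decode_letters(symbol_groups):
    """Decode groups of dots/dashes (one group per letter) into text."""
    return "".join(MORSE_TO_CHAR.get(group, "?") for group in symbol_groups)


# Four short exhales spell "H" (....), two short exhales spell "I" (..):
print(breaths_to_symbols([200, 200, 200, 200]))   # -> "...."
print(decode_letters(["....", ".."]))             # -> "HI"
```

In a real device the hard part would be the signal processing in front of this — reliably segmenting exhales from microphone input — which this sketch deliberately leaves out.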
He’s from Panipat, but moves around a lot and, he tells us, has applied to schools in the U.S. for further education. He started working on TALK in the summer of 2012, developing it independently until early 2015, when he joined Aribo Labs. The initial spark, he recalls, was ignited during a visit to his doctor’s clinic. “I was sitting in the clinic, and there was a person there in a wheelchair, crying. I asked the doctor about him, and it turned out he had a brain-stem stroke. I knew about Stephen Hawking; I knew solutions existed. That’s when the entire thought chain began.” The original plan didn’t involve using the breath as the medium; that came later.
Besides TALK, Arsh is working on a couple of other start-ups as well. Along with the team at Aribo Labs, he has also developed a robot dog that has the potential to self-learn, which sounds as intimidating as it does exciting. “It’s a robotic dog with an AI neural network,” he explains. “Normally, you tell a machine how to walk. What we’ve developed is, it learns on its own. So if you cut out one of its motors, it adapts… it’ll teach itself how to walk with three legs. It’s very unique. The personal robot market is very saturated; they’re either like toys or high-tech expensive showpieces. There’s no robot yet that’s touched that sweet spot.”
Arsh is on it though; he tells us how they’re working on a humanoid robot with the ability to self-learn — a self-aware, human-like machine — and how they’ve already developed the neural network for it. How long will it take? Not very. “Within the next decade.”