Tareq Nabhan has circumnavigated the globe over the past seven years, helping to bring vision care to underserved communities.
Nabhan, an alumnus of the College of Optometry at the University of Missouri–St. Louis and now an assistant clinical professor, has made visits to countries such as China, Jordan, Peru, Rwanda and Thailand through his involvement with OneSight. The nonprofit organization is committed to bringing eye exams and glasses to the estimated 1.1 billion people worldwide who lack access.
In 2019, he spent time in Brazil on a floating clinic that traveled up the Amazon River, going from village to village and providing eye care services.
“We would manufacture the lenses on the boat, so the patients would get their glasses either the same day or the next day,” Nabhan says. “These are custom made. They’re not donated. We make the glasses that the patients needed, exactly to their prescription.”
Such trips always seem to come with a few hiccups, with clinicians such as Nabhan often forced to divide hundreds of pounds of equipment among them and fit it into their luggage. They have to transport it through airports, on planes and ultimately into the field so they have the tools they need to make diagnoses. It can be a difficult and costly endeavor, and there’s always a chance they’ll be stopped at customs and barred from carrying an important piece of equipment into a country.
Nabhan started wondering if there was a way to use something as ubiquitous as a smartphone to perform the functions of those existing tools. He’s been working with Sanjiv Bhatia, a professor in UMSL’s Department of Computer Science, to develop a diagnostic tool that relies on computer vision and image processing technology – Bhatia’s area of research expertise.
The two, along with Senior Research Engineering Technician Michael Howe, are building a prototype tool that works with a smartphone, using the phone to collect data that is then analyzed with computer vision techniques.
“You can reduce the cost using the power inside the smartphone to do things that are dependent on very expensive instruments otherwise,” Bhatia says.
The tool doesn’t even require a trained clinician at the controls.
Smartphones can also store data when there is poor or no connectivity and be used to deliver information asynchronously to a trained clinician for evaluation and to make a diagnosis.
“What we want is to develop a tool that has essentially low degrees of freedom – that guides the operator in not just making a diagnosis but learning how to use the tool and understanding what they see,” Nabhan says. “It can’t be inferior to current standards. That’s where we draw the line. I think that’s where others have failed in the field.”
He adds, “If we take advantage of that technology, we think we can penetrate not just the market but penetrate communities that are hard to get to and provide more access to care.”
They hope the tool will be ready for testing by the fall. They’ve already received funding to beta-test it on campus, and OneSight has granted them permission to test the device at its clinics.
Nabhan and Bhatia have had a fruitful partnership since they first started collaborating on hardware-software solutions in 2015.
They met that year at an institutional research workshop, though Nabhan had already been aware of Bhatia and his expertise and had been looking to connect with him when the workshop began.
“I don’t remember if I heard his name during roll call or he raised his hand and asked a question and they responded back with his name, but I was in the back, looking around trying to see where this guy was,” Nabhan says. “I made a beeline as soon as that meeting was over to introduce myself.”
Much of Bhatia’s research had been connected to the defense industry when they started working together, but collaborating with an optometrist made a lot of sense.
“I teach computer vision, and early on in the class, we talk about the human vision system because that is what we are trying to replicate,” Bhatia says.
The software driving computer vision technology attempts to mimic the way the eye and the brain work to process information. The pixels of data captured by cameras can serve the same function as the photoreceptor cells in the eye that read color or gray scale.
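The pixel-to-photoreceptor analogy can be made concrete with a short sketch. This is purely illustrative and not code from Nabhan and Bhatia’s project: it collapses each color pixel to a single gray intensity, much as the analogy describes photoreceptors reading color or gray scale.

```python
# Illustrative sketch only: like photoreceptor cells reading light, each
# camera pixel records intensity. Here we collapse an RGB pixel to one
# grayscale value using the standard ITU-R BT.601 luminance weights.
def rgb_to_gray(r, g, b):
    """Convert one RGB pixel (0-255 per channel) to a grayscale value."""
    return round(0.299 * r + 0.587 * g + 0.114 * b)

# A tiny 2x2 "image" of RGB pixels, standing in for camera sensor data.
image = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 0, 255), (255, 255, 255)],
]

# Apply the conversion to every pixel, as a vision pipeline's first stage might.
gray = [[rgb_to_gray(*px) for px in row] for row in image]
print(gray)  # [[76, 150], [29, 255]]
```

Real computer vision systems run many such per-pixel operations, then layer feature detection and analysis on top; the weights reflect the eye’s greater sensitivity to green light.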
The technology he’s developing with Nabhan can afford to process images a bit more slowly than his defense projects require.
“When you are flying an airplane and trying to look for threats, you cannot afford to drop any frames,” Bhatia says. “The computers you need to have are much more powerful. Over here, it’s still very important, but if it takes a couple of seconds longer to make a diagnosis, it’s not going to be a matter of life and death.”
Nabhan and Bhatia have learned a lot from each other through their shared efforts, and they plan to continue tackling other challenges in the future.
“It’s great being around people who don’t do what you do because they think about it in different ways,” Nabhan says. “I see it as a no-brainer to try to work with computer scientists to solve these problems, and Dr. Bhatia has been my No. 1 advocate and mentor. We’re really excited to be solving problems together.”
This story was originally published in the spring 2021 issue of UMSL Magazine. If you have a story idea for UMSL Magazine, email magazine@umsl.edu.