Published: Jan. 19, 2023

Engineers at CU Boulder are tapping into advances in artificial intelligence to develop a new kind of walking stick for people who are blind or visually impaired.

Think of it as assistive technology meets Silicon Valley.

The researchers say that their “smart” walking stick could one day help blind people navigate tasks in a world designed for sighted people—from shopping for a box of cereal at the grocery store to picking a private place to sit in a crowded cafeteria.

“I really enjoy grocery shopping and spend a significant amount of time in the store,” said Shivendra Agrawal, a doctoral student in the Department of Computer Science. “A lot of people can’t do that, however, and it can be really restrictive. We think this is a solvable problem.”

In a new study, Agrawal, his advisor Bradley Hayes, assistant professor of computer science, and their colleagues in the Collaborative Artificial Intelligence and Robotics (CAIRO) Lab got one step closer to solving it.

The team’s walking stick resembles the white-and-red canes that you can buy at Walmart. But it also includes a few add-ons: Using a camera and computer vision technology, the walking stick maps and catalogs the world around it. It then guides users by using vibrations in the handle and spoken directions, such as “reach a little bit to your right.”
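A minimal sketch of that sense-and-guide loop, in Python, might look like the following. This is not the team's code; the class, thresholds, and feedback phrases are hypothetical stand-ins for the behavior described above.

```python
# Hypothetical sketch of the cane's sense-and-guide loop.
# Names, thresholds, and feedback phrases are illustrative only.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "chair" or "cereal_box"
    bearing_deg: float  # angle from the camera's centerline; negative = left
    distance_m: float   # estimated distance to the target

def guide(target: Detection) -> tuple[str, str]:
    """Turn a detected target into haptic and spoken feedback."""
    if target.distance_m < 0.3:
        return ("double_pulse", "You have reached it.")
    if target.bearing_deg > 10:
        return ("buzz_right", "Reach a little bit to your right.")
    if target.bearing_deg < -10:
        return ("buzz_left", "Reach a little bit to your left.")
    return ("steady_pulse", "Keep moving straight ahead.")

# Example: a chair detected slightly to the user's right
haptic, speech = guide(Detection("chair", bearing_deg=18.0, distance_m=1.2))
print(haptic, "-", speech)  # buzz_right - Reach a little bit to your right.
```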

The device isn’t supposed to be a substitute for designing places like grocery stores to be more accessible, Agrawal said. But he hopes the CAIRO Lab's prototype will show that, in some cases, AI can help millions of Americans become more independent.

“AI and computer vision are improving, and people are using them to build self-driving cars and similar inventions,” Agrawal said. “But these technologies also have the potential to improve quality of life for many people.”

Take a seat

Agrawal and his colleagues first explored that potential by tackling a familiar problem: Where do I sit?

“Imagine you’re in a café,” he said. “You don't want to sit just anywhere. You usually take a seat close to the walls to preserve your privacy, and you usually don't like to sit face-to-face with a stranger.”

Previous research has suggested that making these kinds of decisions is a priority for people who are blind or visually impaired. To see if their smart walking stick could help, the researchers set up a café of sorts in their lab—complete with several chairs, patrons and a few obstacles.

Study subjects strapped on a backpack with a laptop in it and picked up the smart walking stick. They swiveled to survey the room with a camera attached near the cane handle. Like a self-driving car, algorithms running inside the laptop identified the various features in the room, then calculated the route to an ideal seat.
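A rough sketch of how such a seat-scoring pass could work appears below. The weights, distances, and chair layout are assumptions made for illustration, not the published method.

```python
# Hypothetical scoring of candidate chairs, loosely following the
# preferences described above: near a wall, not facing a stranger.
# Weights and geometry are made up for illustration.

def score_chair(dist_to_wall_m: float, faces_stranger: bool, occupied: bool) -> float:
    if occupied:
        return float("-inf")                           # can't take an occupied chair
    score = 1.0 - min(dist_to_wall_m, 3.0) / 3.0       # closer to a wall is better
    if faces_stranger:
        score -= 0.5                                   # penalize face-to-face seating
    return score

chairs = {
    "A": (0.4, False, False),   # by the wall, private
    "B": (2.5, True, False),    # middle of the room, facing someone
    "C": (0.2, False, True),    # great spot, but occupied
}
best = max(chairs, key=lambda c: score_chair(*chairs[c]))
print("Guide the user toward chair", best)   # chair A
```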

[Image: a grocery shelf of cereal boxes, each marked with a red or green bounding box and a score between 0.36 and 0.91]

A computer vision algorithm scores boxes of cereal to identify a target product—in this case, a box of Kashi GO Coconut Almond Crunch. (Credit: Collaborative Artificial Intelligence and Robotics Lab)

The team reported its findings this fall at the International Conference on Intelligent Robots and Systems in Kyoto, Japan. Researchers on the study also included Hayes, director of the CAIRO Lab, and doctoral student Mary Etta West.

The study showed promising results: Subjects found a socially desirable chair in 10 out of 12 trials of varying difficulty, and they found an open chair in every trial. So far, the subjects have all been sighted people wearing blindfolds. But the researchers plan to evaluate and improve their device by working with people who are blind or visually impaired once the technology is more dependable.

“Shivendra’s work is the perfect combination of technical innovation and impactful application, going beyond navigation to bring advancements in underexplored areas, such as assisting people with visual impairment with social convention adherence or finding and grasping objects,” Hayes said.

Let’s go shopping

Next up for the group: grocery shopping.

In new research, which the team hasn’t yet published, Agrawal and his colleagues adapted their device for a task that can be daunting for anyone: finding and grasping products in aisles filled with dozens of similar-looking and similar-feeling choices.

Again, the team set up a makeshift environment in their lab: this time, a grocery shelf stocked with several different kinds of cereal. The researchers loaded a database of product photos, such as boxes of Honey Nut Cheerios or Apple Jacks, into their software. Study subjects then used the walking stick to scan the shelf, searching for the product they wanted.

“It assigns a score to the objects present, selecting what is the most likely product,” Agrawal said. “Then the system issues commands like ‘move a little bit to your left.’”
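A toy example of that matching-and-guidance step might look like the following. The detection scores echo the range shown in the figure above, but the matching threshold, pixel offsets, and command wording are assumptions, not the team's implementation.

```python
# Hypothetical product matching: compare detected boxes against a target
# and convert the best match's position into a spoken command.
# Scores and the pixel-offset threshold are illustrative only.

detections = [
    {"name": "Honey Nut Cheerios", "score": 0.36, "x_offset_px": -220},
    {"name": "Kashi GO Coconut Almond Crunch", "score": 0.91, "x_offset_px": -60},
    {"name": "Apple Jacks", "score": 0.48, "x_offset_px": 180},
]

def pick_and_direct(detections, threshold=0.6):
    best = max(detections, key=lambda d: d["score"])
    if best["score"] < threshold:
        return "Target not found. Keep scanning the shelf."
    if best["x_offset_px"] < -30:
        return f"Move a little bit to your left toward the {best['name']}."
    if best["x_offset_px"] > 30:
        return f"Move a little bit to your right toward the {best['name']}."
    return f"The {best['name']} is right in front of you."

print(pick_and_direct(detections))
# Move a little bit to your left toward the Kashi GO Coconut Almond Crunch.
```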

He added that it will be a while before the team’s walking stick makes it into the hands of real shoppers. The group, for example, wants to make the system more compact, designing it so it can run off a standard smartphone attached to a cane.

But the researchers also hope their preliminary results will inspire other engineers to rethink what robotics and AI are capable of.

“Our aim is to make this technology mature but also attract other researchers into this field of assistive robotics,” Agrawal said. “We think assistive robotics has the potential to change the world.”