Robot guide dog. Photo credit: Binghamton University, State University of New York

Traditional guide dogs are loyal and fiercely protective companions, but they lack one crucial ability for navigating the modern world: they cannot hold a conversation. Now, an innovative team of researchers has bridged that gap by developing a chatty robotic service dog powered by advanced artificial intelligence.

Created by scientists at Binghamton University, State University of New York, the new system uses large language models to determine the safest routes and to hold a spoken, back-and-forth dialogue with visually impaired users in real time.

Shiqi Zhang, an associate professor at the Thomas J. Watson College of Engineering and Applied Science’s School of Computing, highlighted the massive leap in capability over traditional service animals.

“For this work, we’re demonstrating an aspect of the robotic guide dog that is more advanced than biological guide dogs. Real dogs can understand around 20 commands at best. But for robotic guide dogs, you can just put GPT-4 with voice commands. Then it has very strong language capabilities.”

Scene verbalisation

Zhang and his team had previously trained robot guide dogs to lead users by responding to physical tugs on a leash. This latest iteration takes the technology a major step further by introducing two new features: “plan verbalisation”—providing detailed route information before departure—and “scene verbalisation,” which continuously describes the surroundings during travel.
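To make the distinction between the two modes concrete, here is a minimal Python sketch of how they might be structured. This is an illustration only, not code from the paper: the `llm()` stub stands in for any large language model call (such as GPT-4 behind a voice interface), and every function name here is an assumption.

```python
# Minimal illustrative sketch of the two verbalisation modes (not the
# authors' code). llm() is a placeholder for a real language model call.

def llm(prompt: str) -> str:
    """Placeholder for a real LLM call; returns canned text here."""
    return f"[spoken LLM response to: {prompt}]"

def verbalise_plan(route: list[str], eta_minutes: float) -> str:
    """Plan verbalisation: describe the chosen route before departure."""
    waypoints = " -> ".join(route)
    return llm(
        f"Summarise this walking route for a blind user in plain speech: "
        f"{waypoints}. Estimated travel time: {eta_minutes:.0f} minutes."
    )

def verbalise_scene(detections: list[str]) -> str:
    """Scene verbalisation: narrate the surroundings during travel."""
    return llm(
        "Briefly describe these detected surroundings to a blind user: "
        + ", ".join(detections)
    )

print(verbalise_plan(["lobby", "long corridor", "room 214"], 3))
print(verbalise_scene(["long corridor", "open door on the left"]))
```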

Zhang said: “This is very important for visually impaired or blind people, because situational and scene awareness is relatively limited without vision.”

To test the system, the research team recruited seven legally blind participants to navigate a large, multi-room office space. The robotic dog would proactively ask the user where they wanted to go, present possible routes, and estimate the travel time. Once a route was selected, the robot guided the user while actively verbalising obstacles and surroundings, such as notifying the user when they entered a “long corridor”.
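As a rough illustration, the interaction flow described above could be organised along these lines. The route table, the timing estimate, and the function name are all hypothetical, invented for this sketch rather than drawn from the study.

```python
# Hypothetical sketch of the trial's interaction flow; the routes and the
# half-minute-per-waypoint timing are invented for illustration.

ROUTES = {"room 214": ["lobby", "long corridor", "room 214"]}

def guide_to(destination: str) -> None:
    route = ROUTES.get(destination)
    if route is None:
        print("Robot: Sorry, I don't know that destination.")
        return
    # Plan verbalisation: present the route and an estimated travel time.
    eta = len(route) * 0.5  # toy estimate: half a minute per waypoint
    print(f"Robot: I can take you via {' then '.join(route)}, "
          f"about {eta:.1f} minutes.")
    # Scene verbalisation: narrate surroundings while guiding the user.
    for waypoint in route:
        print(f"Robot: We are now entering the {waypoint}.")

guide_to("room 214")
```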

Future integration

Following the trials, participants completed questionnaires rating the system’s helpfulness and ease of communication. The results showed a strong preference for the combined approach of pre-planning explanations mixed with real-time narration.

Going forward, the team hopes to increase the robot’s autonomy and test it over much longer distances in outdoor environments.

Zhang noted that the study participants were incredibly enthusiastic about the possibility of integrating the mechanical companions into their everyday lives.

“They were super excited about the technology, about the robots. They asked many questions. They really see the potential for the technology and hope to see this working.”

The paper, titled “From Woofs to Words: Towards Intelligent Robotic Guide Dogs with Verbal Communication,” was presented at the 40th Annual AAAI Conference on Artificial Intelligence.
