Hui Yang and Kevin Mekulu. Photo credit: Hui Yang

Artificial intelligence systems capable of analysing subtle features of human speech could detect signs of dementia and Alzheimer’s disease years before traditional clinical methods, according to new research from Penn State.

In a series of papers published in the Journal of Alzheimer’s Disease Reports and Frontiers in Aging Neuroscience, researchers propose a new framework that uses “Agentic AI” to screen for neurodegenerative conditions. The system looks for minute linguistic changes — such as hesitations, specific word choices, and structural repetition — that often escape human detection.

The approach addresses a critical bottleneck in geriatric care: traditional screening tools are often subjective, resource-intensive, and require specialists who are in short supply.

“Traditional dementia screening tools are paper-based, subjective and resource-intensive, requiring 10 to 15 minutes of staff time for administration,” says Hui Yang, Chair in Industrial and Manufacturing Engineering at Penn State. “With the U.S. facing a shortage of geriatric specialists, having roughly one geriatrician for every 10,000 geriatric patients… a scalable AI solution is urgently needed.”

Beyond ‘static’ computing

The researchers distinguish their approach from the “static” AI models currently used in healthcare, which typically just process an input to produce an output. Instead, they are utilising “Agentic AI” — systems capable of independent planning and dynamic interaction.

“Most AI models used today in health care are static… Agentic AI, by contrast, are systems capable of independently planning and executing complex tasks without human oversight,” says Kevin Mekulu, a doctoral candidate and co-author of the research.

“In our work, AI agents are not just scoring a test — they guide a screening interaction, adapt prompts based on a person’s responses and integrate multiple signals… into a coherent assessment,” Mekulu adds. This transforms the screening from a one-time measurement into an evolving process that better reflects the nuances of cognitive decline.
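To make that idea concrete, the sketch below shows one way such an adaptive screening loop could be structured: pick the next prompt based on the previous response, and fold each response’s signals into a running assessment rather than a single score. It is a minimal illustration only; the names (CognitiveSignal, ScreeningSession, select_next_prompt), thresholds, and weights are hypothetical and are not the Penn State team’s implementation.

```python
# Illustrative sketch of an adaptive screening loop; all names and numbers
# below are hypothetical, not the published system.
from dataclasses import dataclass, field

@dataclass
class CognitiveSignal:
    """Signals extracted from one spoken response (hypothetical schema)."""
    prompt_id: str
    hesitation_rate: float   # e.g. filled pauses per word
    repetition_score: float  # e.g. proportion of repeated phrases

@dataclass
class ScreeningSession:
    signals: list = field(default_factory=list)

    def integrate(self) -> float:
        """Combine per-response signals into a running indicator (toy weighting)."""
        if not self.signals:
            return 0.0
        return sum(0.6 * s.hesitation_rate + 0.4 * s.repetition_score
                   for s in self.signals) / len(self.signals)

def select_next_prompt(last: CognitiveSignal | None) -> str:
    """Adapt the next prompt to the previous response instead of following a fixed script."""
    if last is None:
        return "Please describe what you did this morning."
    if last.hesitation_rate > 0.3:
        return "Take your time. Can you name the objects in this picture?"
    return "Can you retell the short story I read to you earlier?"
```

In a real system, the per-response signals would come from speech and language models rather than hand-set thresholds; the point of the sketch is only the loop structure, in which the interaction itself is steered by what the person has said so far.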

Speech as a window to the brain

The researchers focused on speech because it is one of the most “information-dense” behaviours humans perform. Speaking requires the complex coordination of memory, attention, executive function, and motor planning—all systems that are compromised early in the progression of neurodegenerative disease.

By analysing the “hidden transitions” in speech, the AI can identify patterns in fluency and language structure that a human listener might miss.

“Our AI analyses complex dynamics and transitions hidden in speech rather than relying on subjective clinical impressions alone,” says Yang. “This approach allows us to extract objective, quantitative biomarkers from natural patient behaviour, which removes a lot of the subjective interpretation associated with traditional tests.”
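For a rough sense of what objective, quantitative markers drawn from speech might look like, the toy function below computes three transcript-level measures corresponding to the cues mentioned earlier: filled pauses, vocabulary diversity, and phrase repetition. The function and its feature set are illustrative assumptions, not the researchers’ published pipeline.

```python
# Illustrative only: toy transcript-level speech markers (hesitation, word
# choice, repetition); not the feature set used in the published studies.
import re
from collections import Counter

FILLED_PAUSES = {"um", "uh", "er", "hmm"}

def speech_markers(transcript: str) -> dict:
    words = re.findall(r"[a-z']+", transcript.lower())
    if not words:
        return {"pause_rate": 0.0, "type_token_ratio": 0.0, "repeated_bigrams": 0}
    pauses = sum(w in FILLED_PAUSES for w in words)
    bigrams = Counter(zip(words, words[1:]))
    return {
        # filled pauses per word: a crude proxy for hesitation
        "pause_rate": pauses / len(words),
        # vocabulary diversity: lower values suggest narrower word choice
        "type_token_ratio": len(set(words)) / len(words),
        # two-word phrases said more than once: a crude proxy for repetition
        "repeated_bigrams": sum(c > 1 for c in bigrams.values()),
    }

print(speech_markers("um I went to the... um the store and the store was um closed"))
```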

Future applications

The team believes the technology could eventually analyse more than just speech, incorporating eye-movement patterns, physiological signals, and motor behaviour to create a holistic view of a patient’s cognitive health.

“Speech is a powerful starting point, but it’s only one piece of the puzzle,” says Mekulu. “Interpreting all these signals together offers clinicians a more holistic view of cognitive health, not just whether someone passes or fails a test.”

The researchers are now working with neuropsychologists and communication science experts to validate the tools in real-world settings, specifically assisted living and memory care environments where early signs of decline are often first observed.

“We aim to bridge the gap between academic research and everyday clinical decision-making by validating these methods in real-world care environments,” says Yang.
