AI Spots Rare Hormonal Disorder from Hand Photos with Accuracy Surpassing Specialists
Acromegaly is the kind of disease that hides in plain sight. It develops slowly over years or decades, driven by excess growth hormone produced by a pituitary tumor. Hands and feet enlarge. Facial features coarsen. Joints ache. Because the changes are gradual and the condition is rare, the average time from first symptoms to confirmed diagnosis runs close to ten years. By then, potentially life-threatening complications - heart disease, diabetes, colon polyps - may already have taken hold. Life expectancy can be reduced by roughly a decade if the condition goes untreated.
An AI model developed at Kobe University can now identify the condition from photographs of the back of the hand and the clenched fist, without requiring a face shot. Published in The Journal of Clinical Endocrinology & Metabolism, the research shows the system outperforms experienced endocrinologists evaluating the same photographs - an achievement that could fundamentally change how suspected cases get routed to specialists.
The privacy problem that steered research toward hands
Earlier attempts to use AI for acromegaly detection relied on facial photographs. That approach has a practical obstacle: patients may reasonably object to having their faces entered into diagnostic databases, shared across medical facilities, or potentially exposed in a data breach. Endocrinologist Hidenori Fukuoka, who led the research, explains that despite considerable interest, facial-image AI tools have not found clinical adoption, in part because of these privacy concerns.
Graduate student Yuka Ohmachi proposed a different approach. "Trying to address this concern, we decided to focus on the hands, a body part we routinely examine alongside the face in clinical practice for diagnostic purposes, particularly because acromegaly often manifests changes in the hands," she said. The team went further still, deliberately excluding palm images - which contain individual fingerprint and line patterns that could identify a person - and training only on the dorsal (back) view and clenched fist.
This design choice enabled a recruitment scale that would have been harder to achieve with facial photographs. The team enlisted 725 patients across 15 medical facilities throughout Japan, generating more than 11,000 images for training and validation.
Performance that surprised the researchers
The model achieved high sensitivity and high specificity - meaning it correctly identified most acromegaly cases while generating few false positives. More striking, when experienced endocrinologists evaluated the same photographs, the AI outperformed them. "Frankly, I was surprised that the diagnostic accuracy reached such a high level using only photographs of the back of the hand and the clenched fist," Ohmachi said. "Achieving this level of performance without facial features makes this approach a great deal more practical for disease screening."
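For readers less familiar with the terms, sensitivity and specificity fall out directly from a confusion matrix. The sketch below uses made-up counts, not the study's results, purely to show how the two metrics relate to detected cases and false positives:

```python
# Illustrative only: computing sensitivity and specificity from
# confusion-matrix counts. The numbers are hypothetical, not the study's.

def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # fraction of true acromegaly cases flagged
    specificity = tn / (tn + fp)  # fraction of healthy hands correctly cleared
    return sensitivity, specificity

# Hypothetical screening batch: 90 of 100 cases detected,
# 950 of 1000 controls correctly cleared.
sens, spec = sensitivity_specificity(tp=90, fn=10, tn=950, fp=50)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
# sensitivity=0.90, specificity=0.95
```

Note that the false negative rate discussed later is simply 1 minus sensitivity, which is why characterizing missed cases matters so much for a rare disease.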
The researchers are careful to frame the tool's role accurately. In clinical practice, diagnosis relies on many data points - blood tests, imaging, symptom history, physical examination. The hand-based AI is not a replacement for that process. It is designed to serve as a screening layer: a system that could flag probable cases during routine health check-ups and direct them toward specialist evaluation earlier than would otherwise happen.
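The screening-layer idea can be sketched in a few lines: the model scores each patient, and anyone above a referral threshold is routed to a specialist rather than diagnosed outright. The threshold value, field names, and scores below are hypothetical illustrations, not details from the study:

```python
# A minimal sketch of a screening layer, assuming a model that emits a
# per-patient probability. All names and values here are hypothetical.

REFERRAL_THRESHOLD = 0.5  # screening typically favors sensitivity

def triage(patients):
    """Return the subset of patients to refer for specialist evaluation."""
    return [p for p in patients if p["acromegaly_score"] >= REFERRAL_THRESHOLD]

checkup_batch = [
    {"id": "A", "acromegaly_score": 0.82},
    {"id": "B", "acromegaly_score": 0.07},
    {"id": "C", "acromegaly_score": 0.55},
]
print([p["id"] for p in triage(checkup_batch)])  # ['A', 'C']
```

The key design point is that a flagged patient is referred, not diagnosed: the definitive workup (blood tests, imaging, examination) still happens with the endocrinologist.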
Extending the platform to other hand-visible conditions
Acromegaly is one of several conditions that produce hand changes distinctive enough to train a classifier on. The team identifies rheumatoid arthritis, anemia, and finger clubbing as conditions where similar approaches might work. This broader application potential is significant: a single hand-photograph screening system could serve as an entry point for detecting multiple conditions, improving early detection rates across a range of diseases that currently face long diagnostic delays.
The study has limitations inherent to its design. All 725 patients came from Japanese medical facilities, and the model's performance in different ethnic groups with different hand morphology is untested. The training images were collected in clinical settings under controlled conditions; real-world performance using smartphone photographs taken in varied lighting may differ. And the model's false negative rate - cases it misses - will need careful characterization before any clinical deployment, since missed diagnoses in a rare disease context carry real costs.
The next step the team identifies is extending validation to external datasets and eventually piloting the tool within comprehensive health check-up infrastructure, where it could refer suspected cases to endocrinologists. Fukuoka sees this as part of a broader infrastructure shift: using AI to connect regional or non-specialist healthcare settings to the expertise they currently lack access to, reducing disparities in diagnosis rates across geographic and economic lines.