Summary:
Florida Atlantic University (FAU) researchers have developed a novel AI-powered platform that uses real-time video analysis and deep learning to diagnose nystagmus remotely and cost-effectively, offering a promising alternative to traditional vestibular testing methods.
Key Takeaways:
- Real-Time, Smartphone-Based Diagnosis: The AI system enables patients to record and upload eye movement videos via smartphone for remote diagnostic evaluation, reducing the need for in-person visits and expensive equipment.
- Clinical Accuracy and Telehealth Integration: In pilot testing, the model produced diagnostic results comparable to gold-standard devices, supporting its use in telehealth and underserved settings.
- Scalable, Multi-Disciplinary Innovation: Backed by a cross-campus and multi-institutional team, the project aims to enhance accessibility, streamline clinical workflows, and expand to real-time wearable applications.
Most current AI models are based on static datasets, limiting their adaptability and real-time diagnostic potential. To address this gap, researchers from Florida Atlantic University (FAU) and collaborators have developed a novel proof-of-concept deep learning model that leverages real-time data to assist in diagnosing nystagmus, a condition characterized by involuntary, rhythmic eye movements often linked to vestibular or neurological disorders.
The Challenges of Traditional Vestibular Testing
Gold-standard diagnostic tools such as videonystagmography (VNG) and electronystagmography have long been used to detect nystagmus. However, these methods come with notable drawbacks: high costs (with VNG equipment often exceeding $100,000), bulky setups, and inconvenience for patients during testing. FAU's AI-driven system offers a cost-effective and patient-friendly alternative, enabling quick and reliable screening for balance disorders and abnormal eye movements.
FAUโs Real-Time Solution for Nystagmus
The platform allows patients to record their eye movements using a smartphone, securely upload the video to a cloud-based system, and receive remote diagnostic analysis from vestibular and balance expertsโall without leaving their home.
At the heart of this innovation is a deep learning framework that uses real-time facial landmark tracking to analyze eye movements. The AI system automatically maps 468 facial landmarks and evaluates slow-phase velocity, a key metric for identifying nystagmus intensity, duration and direction. It then generates intuitive graphs and reports that can easily be interpreted by audiologists and other clinicians during virtual consultations.
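The article does not publish the team's code, but the core metric it describes, slow-phase velocity, can be illustrated with a minimal sketch. The snippet below assumes a tracked horizontal eye-position trace (in degrees) and a simple velocity-threshold rule to separate the slow drift phase of nystagmus from fast corrective saccades; the function name, threshold, and sampling rate are illustrative, not taken from the study.

```python
import numpy as np

def slow_phase_velocity(positions, fps=30.0, saccade_thresh=100.0):
    """Estimate mean slow-phase velocity (deg/s) from a horizontal
    eye-position trace sampled at `fps`. Samples whose instantaneous
    velocity exceeds `saccade_thresh` are treated as fast-phase
    (saccadic) motion and excluded."""
    velocity = np.diff(np.asarray(positions, dtype=float)) * fps
    slow = np.abs(velocity) < saccade_thresh  # mask out fast phases
    if not slow.any():
        return 0.0
    return float(np.mean(velocity[slow]))

# Synthetic sawtooth trace: slow drift at 10 deg/s with fast resets
# every 0.5 s, mimicking the slow/fast alternation of nystagmus.
fps = 30.0
t = np.arange(0, 2, 1 / fps)
drift = (10.0 * t) % 5.0
print(slow_phase_velocity(drift, fps))
```

On this synthetic trace the estimate recovers the 10 deg/s drift, because the brief reset samples are rejected by the velocity threshold. A clinical pipeline would additionally need calibration from landmark pixels to gaze angle.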
Results of the pilot study involving 20 participants, published in Cureus (part of Springer Nature), demonstrated that the AI systemโs assessments closely mirrored those obtained through traditional medical devices. This early success underscores the modelโs accuracy and potential for clinical reliability, even in its initial stages.
A Cost-Effective Alternative to Traditional Tools
"Our AI model offers a promising tool that can partially supplement, or in some cases replace, conventional diagnostic methods, especially in telehealth environments where access to specialized care is limited," says Ali Danesh, PhD, principal investigator of the study, senior author, a professor in the Department of Communication Sciences and Disorders within FAU's College of Education and a professor of biomedical science within FAU's Charles E. Schmidt College of Medicine. "By integrating deep learning, cloud computing, and telemedicine, we're making diagnosis more flexible, affordable, and accessible, particularly for low-income rural and remote communities."
The team trained their algorithm on more than 15,000 video frames, using a structured 70:20:10 split for training, testing and validation. This rigorous approach ensured the modelโs robustness and adaptability across varied patient populations. The AI also employs intelligent filtering to eliminate artifacts such as eye blinks, ensuring accurate and consistent readings.
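The 70:20:10 split and blink filtering described above can be sketched in a few lines. This is a generic illustration, not the study's actual preprocessing code: the per-frame "eye openness" score and its threshold are assumptions standing in for whatever blink-detection heuristic the team used.

```python
import numpy as np

def filter_and_split(eye_openness, open_thresh=0.2,
                     ratios=(0.70, 0.20, 0.10), seed=0):
    """Drop blink frames (openness score below `open_thresh`), then
    shuffle the surviving frame indices into train/test/validation
    sets at the 70:20:10 ratio described in the study."""
    openness = np.asarray(eye_openness, dtype=float)
    keep = np.flatnonzero(openness >= open_thresh)  # blink filter
    idx = np.random.default_rng(seed).permutation(keep)
    n_train = int(round(ratios[0] * idx.size))
    n_test = int(round(ratios[1] * idx.size))
    return (idx[:n_train],
            idx[n_train:n_train + n_test],
            idx[n_train + n_test:])

# 15,000 frames with a per-frame openness score; every 100th frame
# is marked as a blink and filtered out before splitting.
openness = np.full(15000, 0.5)
openness[::100] = 0.05
train, test, val = filter_and_split(openness)
print(len(train), len(test), len(val))
```

Shuffling before splitting avoids grouping consecutive (highly correlated) video frames into one partition; in practice, splitting by participant rather than by frame would give a stricter test of generalization.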
Beyond diagnostics, the system is designed to streamline clinical workflows. Physicians and audiologists can access AI-generated reports via telehealth platforms, compare them with patients' electronic health records, and develop personalized treatment plans. Patients, in turn, benefit from reduced travel, lower costs and the convenience of conducting follow-up assessments by simply uploading new videos from home, enabling clinicians to track disorder progression over time.
In parallel, FAU researchers are also experimenting with a wearable headset equipped with deep learning capabilities to detect nystagmus in real time. Early tests in controlled environments have shown promise, though improvements are still needed to address challenges such as sensor noise and variability among individual users.
"While still in its early stages, our technology holds the potential to transform care for patients with vestibular and neurological disorders," says Harshal Sanghvi, PhD, first author, an FAU electrical engineering and computer science graduate, and a postdoctoral fellow at FAU's College of Medicine and College of Business. "With its ability to provide non-invasive, real-time analysis, our platform could be deployed widely, in clinics, emergency rooms, audiology centers, and even at home."
Interdisciplinary Initiative
Sanghvi worked closely with his mentors and co-authors on this project, including Abhijit S. Pandya, PhD, FAU Department of Electrical Engineering and Computer Science and FAU Department of Biomedical Engineering, and B. Sue Graves, EdD, Department of Exercise Science and Health Promotion, FAU Charles E. Schmidt College of Science.
This interdisciplinary initiative includes collaborators from FAU's College of Business, College of Medicine, College of Engineering and Computer Science, College of Science, and partners from Advanced Research, Marcus Neuroscience Institute (part of Baptist Health) at Boca Raton Regional Hospital, Loma Linda University Medical Center, and Broward Health North. Together, they are working to enhance the model's accuracy, expand testing across diverse patient populations, and move toward FDA approval for broader clinical adoption.
"As telemedicine becomes an increasingly integral part of health care delivery, AI-powered diagnostic tools like this one are poised to improve early detection, streamline specialist referrals, and reduce the burden on health care providers," says Danesh. "Ultimately, this innovation promises better outcomes for patients, regardless of where they live."
Along with Pandya and Graves, study co-authors are Jilene Moxam, Advanced Research LLC; Sandeep K. Reddy, PhD, FAU College of Engineering and Computer Science; Gurnoor S. Gill, FAU College of Medicine; Sajeel A. Chowdhary, MD, Marcus Neuroscience Institute (part of Baptist Health) at Boca Raton Regional Hospital; Kakarla Chalam, MD, PhD, Loma Linda University; and Shailesh Gupta, MD, Broward Health North.
Featured image: FAU researchers are also experimenting with a wearable headset equipped with deep learning capabilities to detect nystagmus in real time. Photo: Florida Atlantic University