RESEARCH: New AI-powered robotic system performs heart ultrasounds without human guidance
A Concordia-led team of researchers has developed a new AI-driven robotic system that can perform cardiac ultrasound scans autonomously. The researchers say this approach could expand access to cardiac imaging in remote or underserved areas, reduce operator fatigue and standardize scan quality.
Ultrasound exams, especially for the heart, usually depend on skilled sonographers to carefully position and adjust the probe. This study introduces a system that replaces manual expertise with an AI agent trained to guide a robotic arm holding an ultrasound probe. The goal is to automatically find the correct imaging views needed for diagnosis.
Instead of relying only on real-world data, which can be slow and difficult to collect, the researchers built a highly realistic simulation environment using generative artificial intelligence (GenAI). The simulation generates synthetic ultrasound images that closely mimic real ones, allowing the AI agent to be trained safely and efficiently in simulation before it is deployed on physical equipment.
Model illustration by Ehsan Zakeri
The AI agent uses deep reinforcement learning to improve its movements based on how close it gets to the desired cardiac images. Over time, the model learns how to adjust the probe’s position and pressure to produce clear, clinically useful scans.
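The training loop described above can be illustrated with a toy sketch: the probe pose is reduced to a single discretized coordinate, an image-similarity reward peaks at the target view, and tabular Q-learning stands in for the paper's deep reinforcement learning agent. All names, grid sizes and hyperparameters here are illustrative assumptions, not the authors' actual setup.

```python
import random

GRID = 11             # discretized probe positions 0..10 (illustrative)
TARGET = 7            # pose that yields the desired cardiac view (assumed)
ACTIONS = (-1, 0, 1)  # move probe left, hold, move right

def reward(pos):
    # Stand-in for image-similarity feedback: highest at the target view,
    # decreasing as the probe drifts away from it.
    return 1.0 if pos == TARGET else -abs(pos - TARGET) / GRID

def train(episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    # Tabular Q-learning: a simple analogue of the deep-RL policy update.
    rng = random.Random(seed)
    Q = [[0.0] * len(ACTIONS) for _ in range(GRID)]
    for _ in range(episodes):
        pos = rng.randrange(GRID)
        for _ in range(30):
            if rng.random() < eps:  # explore
                a = rng.randrange(len(ACTIONS))
            else:                   # exploit the current value estimates
                a = max(range(len(ACTIONS)), key=lambda i: Q[pos][i])
            nxt = min(GRID - 1, max(0, pos + ACTIONS[a]))
            r = reward(nxt)
            Q[pos][a] += alpha * (r + gamma * max(Q[nxt]) - Q[pos][a])
            pos = nxt
            if pos == TARGET:
                break
    return Q

def run_greedy(Q, pos, max_steps=20):
    # Follow the learned policy; return the number of steps to the target
    # view, or None if it is not reached.
    for step in range(max_steps):
        if pos == TARGET:
            return step
        a = max(range(len(ACTIONS)), key=lambda i: Q[pos][i])
        pos = min(GRID - 1, max(0, pos + ACTIONS[a]))
    return None

Q = train()
```

In the real system, the state is a high-dimensional ultrasound image, the actions are continuous probe motions and contact pressure, and a deep network replaces the Q-table; the underlying idea of improving a policy from view-quality feedback is the same.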
When tested on a robotic setup with a cardiac ultrasound training phantom, the agent was able to locate standard cardiac views faster and more accurately than remote human operators. These results were consistent across repeated trials.
The researchers say that with further testing on real patients, the system could support more autonomous and widely available heart diagnostics.
The study was published in IEEE Transactions on Medical Robotics and Bionics.
The study’s lead author is Ehsan Zakeri, PhD candidate, Mechanical, Industrial and Aerospace Engineering. Additional research was conducted by Amanda Spilkin and Hanae Elmekki, Concordia PhD candidates; Wen-Fang Xie and Lyes Kadem, professors in the Department of Mechanical, Industrial and Aerospace Engineering at the Gina Cody School of Engineering and Computer Science; Jamal Bentahar, professor in the Department of Cybersecurity and Intelligent Systems Engineering; and Antonela Zanuttini, Master’s student, and Philippe Pirabot, professor, both at the Faculty of Medicine at Université Laval.
Read the cited paper: “Deep Reinforcement Learning-Based Ultrasound Visual Servoing Scheme for Autonomous Robotic Echocardiography”