Decades of technological advances have produced a revolution in ultrasound: modern devices weigh less than a pound and can display images on smartphones. But they still require an expert to make sense of the resulting images.
“It’s not as easy as it looks,” said Richard Fabian, CEO of Fujifilm SonoSite, a pioneer of ultrasound technologies. “A slight movement of your hand means all the difference in the world.”
That’s why SonoSite is focused on a future in which artificial intelligence helps healthcare workers make sense of ultrasounds in real time. The idea is that computers can be trained to identify and label critical pieces of a medical image, helping clinicians get answers without the need for specially trained radiologists.
“Using AI you can really quickly interpret what’s going on. And the focus is on accuracy, it’s on confidence, and it’s on expanding ultrasound users,” Fabian said during a talk at Life Science Washington’s annual summit in Bellevue, Wash. on Friday.
Bothell, Wash.-based SonoSite recently partnered with the Allen Institute for Artificial Intelligence (AI2) in Seattle on an effort to train AI to interpret ultrasound images. To train the models, SonoSite is using large quantities of clinical data that it gathered with the help of Partners HealthCare, a Boston-based hospital network.
Artificial intelligence has shown promise in interpreting medical imaging to diagnose diseases such as early-stage lung cancer, breast cancer and cervical cancer. The advances have drawn tech leaders including Google and Microsoft, which hope their AI and cloud capabilities can one day become an essential element of healthcare diagnostics.
SonoSite was initially launched with the idea of creating portable ultrasounds for the military. Its lightweight units are widely used by healthcare teams in both low-resource settings and emergency rooms.
Ultrasound imaging is significantly more affordable and portable than X-ray imaging, CT scans or PET scans, without the risk of radiation exposure. While the images it provides are not as clear, researchers think deep learning can make up some of that difference.
AI2 researchers are training deep learning models on ultrasound images in which the veins and arteries have been labeled by sonographers. One application of AI-powered ultrasound would be helping clinicians find veins faster and more accurately.
Fabian also gave the example of AI models labeling organs and fluid buildups inside the body, which could inform care decisions without the need for specialists. He thinks future ultrasounds could deliver medical insights without ever displaying an image.
“If ultrasound becomes cheap enough, it could become a patch [that gives] you the information that you need,” said Fabian.