Air Force Staff Sgt. Christine Blanco performs an ultrasound examination on a patient in Afghanistan. (Air Force Photo / Justyn M. Freeman)

A new collaboration between Fujifilm SonoSite and the Allen Institute for Artificial Intelligence aims to use AI to generate better interpretations of ultrasound images, opening the way for new applications and enhanced accuracy.

The collaboration demonstrates how Pacific Northwest connections can pay off: Fujifilm SonoSite, a subsidiary of Japan’s Fujifilm that’s headquartered in Bothell, Wash., reached out to the startup incubator at the Seattle-based institute, known as AI2, for advice on improving its compact ultrasound imaging systems.

“The AI2 Incubator was a perfect place to look for help in creating breakthrough technology,” Rich Fabian, SonoSite’s president and chief operating officer, said in a news release. “They have the type of talent that is hard to recruit, combined with the hunger of a startup. We look forward to collaborating more.”

Ultrasound imaging is significantly more affordable and portable than X-ray imaging, CT scans or PET scans, with none of the downside associated with radiation exposure. “Ultrasound’s comparative disadvantage is its lower image quality, which we aim to address with the use of deep learning,” Vu Ha, technical director at the AI2 Incubator, told GeekWire in an email.

Ha said deep learning and computer vision could be applied to a wide range of ultrasound scenarios. “The plan is to incorporate this technology in multiple products,” he said.

SonoSite hasn’t specified how AI-assisted ultrasound could be used. But Ha suggested that one potential application could be identifying blood vessels beneath the skin.

“We train deep learning models on ultrasound images, where veins and arteries have been carefully labeled by sonographers,” Ha explained. “When deployed, these trained models would detect vessels and visualize them on the ultrasound screen in real time. Clinicians would then use these visualizations to locate veins with much higher confidence, minimizing the need to poke patients multiple times.”
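The workflow Ha describes — a trained model classifying each pixel of a live ultrasound frame and color-coding the result on screen — can be sketched in broad strokes. The snippet below is purely illustrative: the `segment_vessels` stand-in uses a trivial intensity threshold in place of a real deep learning model, and the class labels and colors are assumptions, not SonoSite's actual design.

```python
import numpy as np

def segment_vessels(frame: np.ndarray) -> np.ndarray:
    """Return a per-pixel class mask: 0 = background, 1 = vein, 2 = artery.

    Stand-in for a trained segmentation model; a real system would run
    a network trained on sonographer-labeled ultrasound images here.
    """
    mask = np.zeros_like(frame, dtype=np.uint8)
    mask[frame < 50] = 1    # dark (anechoic) regions as candidate veins
    mask[frame > 200] = 2   # placeholder rule standing in for arteries
    return mask

def overlay(frame: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Blend color-coded vessel labels onto the grayscale frame (RGB out)."""
    rgb = np.stack([frame] * 3, axis=-1).astype(np.float32)
    rgb[mask == 1] = 0.5 * rgb[mask == 1] + 0.5 * np.array([0.0, 0.0, 255.0])  # veins: blue
    rgb[mask == 2] = 0.5 * rgb[mask == 2] + 0.5 * np.array([255.0, 0.0, 0.0])  # arteries: red
    return rgb.astype(np.uint8)

# One simulated frame of the real-time loop:
frame = np.random.default_rng(0).integers(0, 256, size=(64, 64), dtype=np.uint8)
annotated = overlay(frame, segment_vessels(frame))
```

In a deployed device this loop would run per frame at video rate, with the model inference replacing the threshold rule; the overlay step is what gives clinicians the on-screen visualization Ha mentions.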

Ha said visualizing the arteries would show clinicians which spots on the body to avoid when inserting a hypodermic needle.

“Annually, 20% of the population in the U.S. goes through the painful experience of getting poked three or more times during IV procedures, due to the fact that they have hard-to-find veins,” Ha said. “With an AI-assisted vein detection ultrasound device, this problem can potentially be solved for many millions of patients.”

Artificial intelligence has already been applied to image interpretation for the diagnosis of maladies ranging from malaria to early-stage lung cancer, breast cancer and cervical cancer. AI2 and SonoSite aim to make similar advances in the ultrasound realm.

“The combination of deep learning and medical imaging is very exciting for the future of detection,” Diku Mandavia, senior vice president and chief medical officer of Fujifilm SonoSite, said in the company’s news release. “Better care and catching anomalies earlier and faster is a core mission.”

Update for 5:05 p.m. PT Sept. 20: This story has been revised to reflect the fact that SonoSite has not yet announced how AI-assisted ultrasound image recognition would be used in its devices.
