
You’ve no doubt covered a flashlight with your hand in a dark room. Light from the flashlight’s beam makes your hand glow red and orange as it bounces off skin, tissue, muscles and other organs. But the light is scattered so thoroughly that you can’t see the outlines of individual parts, just a bright glow.

Imagine being able to decode that glow to get a precise image of your bones, or even a count of all the white blood cells in your veins. That’s exactly what researchers from Houston’s Rice University and Pittsburgh’s Carnegie Mellon University hope to do, powered by a $10 million grant from the National Science Foundation.

The universities announced the new project Tuesday. It aims to develop a camera that can literally see through human skin, using elements of machine learning. The work will be led by Rice University’s Ashutosh Sabharwal, the grant’s principal investigator, and CMU computer vision researcher Srinivasa Narasimhan.

“Essentially, we are trying to build the next generation super-cameras that see the unseen,” Narasimhan told GeekWire via email. He said the new camera is less like a single device and more like a platform, which could power new medical technology addressing more than 100 diseases.

Carnegie Mellon University computer vision researcher Srinivasa Narasimhan is the associate director of the project. (CMU Photo)

“This could significantly reduce the number of biopsies as well as blood tests and help with early diagnosis and treatment of diseases,” Narasimhan said. “At this point, we are able to peer through a couple of millimeters below the skin but with the project, we are aiming to go much deeper. With each millimeter, we will be able to diagnose and treat progressively more and more conditions.”

The work will be done by research teams at CMU and Rice with additional collaborators at Harvard University, MIT and Cornell University.

CMU researchers will also work with researchers at the University of Pittsburgh School of Medicine to investigate applications in critical care and cardiovascular health and with physicians at the Allegheny Health Network, the region’s second-largest health system, on skin cancer applications.

The camera will use a technique called computational scatterography that reads the path of individual photons, or light particles, as they travel through a person’s body.

“An ordinary camera integrates too many photons that traveled in widely different paths, thus the information carried by each photon or a group of photons is impossible to unravel,” Narasimhan said. The image is essentially blurred out by all the noise of the scattered photons.

“Computational photo-scatterography uses a combination of specialized control of illumination and sensing to subselect photons that travel along informative paths within the body and machine learning algorithms to decode those photons,” he said.

In other words, the technique isolates the photons that carry the most reliable information and then uses machine learning to reconstruct a clear image of what's happening inside the body.
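Narasimhan's description boils down to two stages: single out the photons whose paths through tissue stayed short and direct, then train a model to invert the blur that remains. The toy Python sketch below is purely illustrative and is not the project's actual method; the random-walk photon model, the path-length gate, and the ridge-regression "decoder" are all assumed stand-ins for the real illumination control and machine learning.

```python
# Illustrative sketch only: hypothetical parameters, not the Rice/CMU method.
import numpy as np

rng = np.random.default_rng(0)

def simulate_photon(depth_mm, scatter_prob=0.4):
    """Random walk of one photon through tissue; returns total path length
    and number of scattering events (both quantities are toy assumptions)."""
    path, scatters, pos = 0.0, 0, 0.0
    while pos < depth_mm:
        step = rng.exponential(0.5)        # assumed mean free path ~0.5 mm
        path += step
        if rng.random() < scatter_prob:
            scatters += 1
            path += rng.exponential(0.5)   # detour added by the scattering event
        pos += step
    return path, scatters

# Stage 1 -- photon selection: keep only "early" photons whose path length is
# close to the straight-line depth; these are the least-scattered, most
# informative ones (analogous to time-of-flight gating).
depth = 3.0                                # mm below the skin (assumed)
photons = [simulate_photon(depth) for _ in range(20_000)]
gate = 1.2 * depth                         # assumed path-length gate
selected = [p for p in photons if p[0] <= gate]
print(f"kept {len(selected)}/{len(photons)} photons after gating")

# Stage 2 -- learned decoding: recover a hidden 1-D "structure" x from blurred
# measurements y = A @ x + noise, where A mixes neighboring voxels the way
# scattering mixes photon paths. Ridge regression stands in for the decoder.
n = 32
A = np.exp(-0.5 * (np.subtract.outer(range(n), range(n)) / 3.0) ** 2)
x_true = np.zeros(n)
x_true[[8, 20]] = 1.0                      # two structures "under the skin"
y = A @ x_true + 0.01 * rng.standard_normal(n)
x_hat = np.linalg.solve(A.T @ A + 1e-2 * np.eye(n), A.T @ y)
print("recovered peaks near indices:", sorted(np.argsort(x_hat)[-2:]))
```

In this simplified picture, gating throws away the heavily scattered photons that would otherwise swamp the signal, and the regression step plays the role of the machine-learning algorithms that decode what the surviving photons reveal.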

The technology could give doctors information down to the level of individual cells, a kind of resolution that MRIs and X-rays can't provide. CMU researchers, including Narasimhan, have used the same techniques to build cameras that can see through smoke and fog, such as the Episcan3D system.

The potential health applications are vast. One, which Rice's Sabharwal cites in a press release, is as simple as monitoring a patient's white blood cell count. Today, getting that count requires a trip to the hospital for a blood draw or finger prick, and millions of these tests are performed on cancer patients every week.

“Imagine a wearable device no larger than a watch that uses sensors to continuously measure white blood cell count and wirelessly communicate with the oncologist’s office,” Sabharwal said. “The patient could go about their daily life. They’d only have to go to the hospital if there was a problem.”

The technology could also power next-generation medical imaging and replace many common blood tests, giving doctors better opportunities to catch diseases early on.

Of course, it will take years of research and development before the technology reaches a hospital or clinic, but the project is a striking example of how artificial intelligence and other advanced computing techniques are making inroads against healthcare problems.

It also points toward another trend: building broadly applicable technology for healthcare settings rather than highly specialized devices. That approach is becoming practical thanks to advances in machine learning, among other factors.

It's a trend embraced by companies like Senosis Health, a startup founded by University of Washington innovator Shwetak Patel that develops smartphone apps to serve as basic diagnostic devices. Google acquired Senosis last year.

The same approach is evident in Adaptive Biotechnologies’ recent partnership with Microsoft, which aims to create a universal blood test that scans a patient’s immune system for dozens or hundreds of diseases at once.

If the Rice and CMU project’s camera technology is fully realized, it could replace everything from imaging devices to biopsies to blood tests. It would be a radical change for the healthcare system, but one that has the potential to address issues like rising costs and an increase in chronic disease.
