
A 3-D visualization of human cells is color-coded to highlight substructures. (Allen Institute for Cell Science)

What happens when you cross cell biology with artificial intelligence? At the Allen Institute for Cell Science, the answer isn’t super-brainy microbes, but new computer models that can turn simple black-and-white pictures of live human cells into color-coded, 3-D visualizations filled with detail.

The online database, known as the Allen Integrated Cell, is now being made publicly available — and its creators say it could open up new windows into the workings of our cells.

“From a single, simple microscopy image, you could get this very high-contrast, integrated 3-D image where it’s very easy to see where all the separate structures are,” Molly Maleckar, director of modeling at the Seattle-based Allen Institute, told GeekWire.

“You can actually look at the relationships between them, and eventually apply a time series, so you can see dynamically how those change as well,” she said. “That’s something that’s totally new.”

Molly Maleckar and Graham Johnson work on the Allen Integrated Cell project. (Allen Institute Photos)

Eventually, the database could make it easier to monitor how stem cells transform themselves into the different types of cells in our bodies, see how diseases affect cellular processes, and check the effects that drugs have on individual cells.

“These methods are allowing us to see multiple structures where they are in the cell, relative to one another, reliably, at the same time, while perturbing the cell as little as possible,” said Graham Johnson, director of the Allen Institute’s Animated Cell project. “Our goal is to get them as close to their native, happy state as possible, without hurting them with light, without messing up their function.”

The effort began with the institute’s collection of gene-edited human induced pluripotent stem cell lines, or hiPSC lines. These cells have been engineered to carry fluorescent labels, making it possible for researchers to pinpoint the substructures inside them.

Examples of such substructures include the central area of a cell’s nucleus, the energy-producing mitochondria and the microtubules that serve as cellular scaffolds.

Researchers trained an artificial intelligence program to recognize the glow-in-the-dark substructures in thousands of cells. Then they applied that deep-learning model to simpler black-and-white images of cells that didn’t have fluorescent labels.

The resulting “label-free model” makes it possible to generate highly detailed 3-D visualizations from the kinds of views you get from a standard high-school microscope. The method is described in depth in a research paper posted to the preprint server bioRxiv.
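The training setup described above is supervised image-to-image learning: paired examples of unlabeled (transmitted-light) images and their fluorescent counterparts teach a model to “paint” the fluorescence onto new unlabeled images. The institute’s actual model is a deep 3-D convolutional network; the toy sketch below stands in for it with a per-pixel linear map fit by gradient descent on synthetic data, purely to illustrate the paired-training idea. All names and data here are illustrative assumptions, not the researchers’ code.

```python
# Toy illustration of the "label-free" idea: learn to predict a fluorescence
# channel from an unlabeled brightfield image using paired training examples.
# The real model is a deep 3-D convolutional network; a per-pixel linear map
# stands in for it here. All data is synthetic.
import random

random.seed(0)

def make_pair(n=64):
    """Synthetic training pair: brightfield pixels and the fluorescence
    pixels they correspond to, from a hidden linear relation plus noise."""
    bright = [random.random() for _ in range(n)]
    fluor = [0.8 * b + 0.1 + random.gauss(0, 0.01) for b in bright]
    return bright, fluor

# Fit fluor ≈ w * bright + c by gradient descent on mean squared error.
w, c = 0.0, 0.0
for _ in range(2000):
    bright, fluor = make_pair()
    grad_w = grad_c = 0.0
    for x, y in zip(bright, fluor):
        err = (w * x + c) - y
        grad_w += 2 * err * x
        grad_c += 2 * err
    w -= 0.05 * grad_w / len(bright)
    c -= 0.05 * grad_c / len(bright)

# The trained map now predicts a fluorescence image for any new
# unlabeled brightfield image -- no dyes or labels needed at test time.
test_bright, test_fluor = make_pair()
pred = [w * x + c for x in test_bright]
mse = sum((p - y) ** 2 for p, y in zip(pred, test_fluor)) / len(pred)
print(f"learned w={w:.2f}, c={c:.2f}, test MSE={mse:.4f}")
```

The key property this sketch shares with the published method is that the expensive signal (fluorescence) is only needed during training; at prediction time, a plain unlabeled image suffices.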

Another model developed at the institute can accurately predict the most probable shape and location of structures in any pluripotent stem cell, based solely on the shape of the cell membrane and the nucleus.

The models could help researchers develop detailed information about cellular interactions over time without having to use chemical dyes, laser scans or other methods that disrupt the cells being studied.

Maleckar said the technique could be used for drug discovery, but she’s just as excited about the potential applications for regenerative medicine.

“A really interesting thing there is, how do we engineer heart muscle cells so we can grow them, and they become a functional cell and eventually functional tissue?” she said. “One way we can improve that process is by learning how that process occurs.”

For now, the Allen Institute is working to fine-tune the computer modeling tools rather than moving on to the clinical applications.

“We’re really excited about the downstream applications, but that’s not our major focus right now,” Maleckar said. “We’re really trying to probe the limits of the technology.”

Today, the tools can turn high-school-level microscopy into visualizations for professional researchers — but someday, even high-school students could benefit.

Johnson recalled how some of the cells he saw through the microscope during his high-school years just looked like blobs with a couple of spots on them.

“To be able to take that same high-school scope, run the software on it one day, and be able to see six or eight different things inside that cell, and understand how those different components are connected, and why the pieces are moving the way they are — that would be so exciting,” he said.

Johnson said he was so intrigued by the idea that he wrote himself a reminder to try out the system on microscope images from an actual high school. “That’d be really cool,” he said.
