Senior scientist Jerome Lecoq and research associate Kate Roll inspect a microscope platform from the Allen Brain Observatory that was used to record real-time cellular activity in the visual cortex of mice as they were shown pictures and movies. (Credit: Allen Institute)

The Allen Brain Observatory is open for business, revealing what’s running through the mind of a mouse as it sees patterns of light and dark, pictures of butterflies and tigers – or even the opening scene of Orson Welles’ 1958 classic film, “Touch of Evil.”

The online repository of 30 trillion bytes’ worth of brain-cell readings represents the latest scientific offering from the Allen Institute for Brain Science, funded by Microsoft co-founder Paul Allen. It follows through on a $300 million pledge that Allen made more than four years ago.

The Allen Institute’s president and chief scientific officer, Christof Koch, has compared the project to a Hubble Space Telescope for the brain.

“No one has ever taken this kind of industrial approach to surveying the active brain at cellular resolution in order to measure how the brain processes information in real time,” Koch said today in the institute’s announcement of the data release. “This is a milestone in our quest to decode how the brain’s computations give rise to perception, behavior and consciousness.”

To gather their readings, scientists opened up tiny windows into the skulls of a few dozen genetically engineered mice. These mice carry a gene that makes brain cells glow under just the right amount of laser light when they’re active. Each mouse was put into a darkened chamber the size of a storage box, and then shown a series of images on a video screen.

As each mouse watched the imagery, a camera captured the microscopic flashes of activity in its visual cortex. More than 18,000 individual neurons could be monitored.

“We basically take little movies of the cells that are responding. … We can start mapping on a cell-by-cell basis what those neurons are responding to when they respond to visual stimuli,” Amy Bernard, product architect at the Allen Institute, told GeekWire.

The Brain Observatory’s prime objective is to make the data available to neuroscientists around the world via the institute’s website. But Bernard and the institute’s other researchers have already identified some intriguing frontiers to explore. “We’re reeling in joy for the sheer amount of data we’re able to play with,” she said.

This corona data plot shows the response of a single mouse brain cell in the Allen Brain Observatory to still pictures of natural scenes. Each dot represents the cell’s mean response to a single presentation of an image. Each image is shown multiple times, and all presentations of an image form a ray. The two longest arms of this plot correspond to two different images of butterflies. (Credit: Allen Institute)

For example, there’s one particular neuron that lights up more brightly when a mouse sees a butterfly, even if the winged insect is seen from two different perspectives. Previous studies conducted at the institute have shown that specialized neurons respond to different patterns of light and dark, such as vertical vs. horizontal stripes. But the butterfly neuron seems to be different.

“There’s something beyond just the physical pattern of blotches and lines that that cell is responding to: There’s something like ‘butterflyness,’” Bernard said.
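The kind of per-cell analysis behind the corona plot — averaging a cell’s responses across repeated presentations of each image, then picking out the stimuli that drive it most strongly — can be sketched with synthetic data. Everything below (the cell counts, image indices, and response values) is hypothetical, not drawn from the Allen dataset:

```python
import random
from statistics import mean

random.seed(0)

# Hypothetical setup (not the real dataset): 5 cells, 8 images,
# each image presented 10 times. Values stand in for the fluorescence
# changes recorded through the cranial window.
n_cells, n_images, n_trials = 5, 8, 10
responses = [[[random.gauss(0.1, 0.05) for _ in range(n_trials)]
              for _ in range(n_images)]
             for _ in range(n_cells)]

# In this toy scenario, make cell 0 respond strongly to images 3 and 6 --
# two different views of a butterfly.
for img in (3, 6):
    responses[0][img] = [random.gauss(0.6, 0.05) for _ in range(n_trials)]

def mean_responses(cell):
    """Mean response of one cell to each image, averaged over trials.

    Averaging over repeated presentations of an image is what forms
    one 'ray' of the corona plot.
    """
    return [mean(trials) for trials in cell]

def preferred_images(cell, top=2):
    """Indices of the images that drive the cell most strongly."""
    means = mean_responses(cell)
    return sorted(range(len(means)), key=means.__getitem__, reverse=True)[:top]

print(preferred_images(responses[0]))  # indices of the two "butterfly" images
```

A cell whose two strongest rays come from different photographs of the same object — as with the butterfly images here — is responding to something beyond any one pattern of light and dark.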

The Orson Welles movie provides another example. The institute’s scientists decided to have the mice watch the first three minutes of “Touch of Evil” because it consists of one unbroken shot that flows smoothly through the streets of a Mexican town. Not that the mice are film critics, mind you. Rather, the researchers were curious to see how the visual cortex deals with a smooth, flowing scene rather than a slideshow of rapidly changing, unconnected images.

Bernard said the difference was noticeable. “The specific neurons that respond to images from a ‘natural’ movie seemed to be more coordinated than when the images are seen one by one,” she said.

The findings suggest that the visual cortex does much more than merely capture images for processing in other regions of the brain.

The mouse brain readings are presented on the Allen Brain Atlas website using a novel visualization interface, and researchers can download the data in a standardized file format known as Neurodata Without Borders. They also have access to the analytical tools created by the Allen Institute. Even more data will become available in the months to come.

Meanwhile, researchers at the institute are gearing up for another observing session under a new set of circumstances. Next time, the mice will be allowed to roam around in what amounts to a virtual-reality theater, but without the headsets. When the mice move on a turntable in the viewing chamber, that will affect what they see on the screen.

The point is to see how the mice process information when they’re not just being couch potatoes. “When animals are moving, all of their responses seem to be amped up in terms of neurons,” Bernard explained.

Over the long term, data from the Allen Brain Observatory will be fed into even more ambitious brain research efforts. The institute is already collaborating with the Swiss-based Blue Brain Project to create networks of virtual mouse neurons. Blue Brain’s aim is to simulate the brain in a supercomputer, which sounds a lot like the plot for a science-fiction movie. Just imagine what Orson Welles could have done with that.
