Friday, April 25, 2025

Google reveals ZAPBench to predict brain activity in zebrafish and unlock new AI-powered neuroscience research

If we are lucky, artificial intelligence might one day help scientists understand the human brain the same way language models predict the next word in a sentence. And that future is now a step closer thanks to a new project from Google Research, Harvard University, and HHMI Janelia. You see, these teams have introduced the Zebrafish Activity Prediction Benchmark, better known as “ZAPBench,” which could help researchers build more accurate models for predicting brain activity.

ZAPBench isn’t just another dataset, folks. This new benchmark is built on two hours of brain recordings from larval zebrafish, capturing how roughly 70,000 neurons fired in response to different virtual reality scenarios. The tiny fish were shown various environmental changes, including shifting light patterns and moving water currents, while researchers recorded activity across the whole brain at the level of individual neurons.

But why did they choose zebrafish? Well, while their brains are small, they are complex enough to offer valuable insights into neural activity. Even better, larval zebrafish are transparent, allowing scientists to observe their entire brain under a microscope without invasive procedures. This makes them perfect for whole-brain imaging, something that is not possible with larger animals.

Believe it or not, the project goes beyond traditional brain mapping. While connectomics focuses on showing how neurons are wired together, ZAPBench adds another layer by offering real-time activity data. In other words, it’s not just about seeing the roadmap of the brain – it’s about watching the traffic flow in real time.

Researchers used genetically modified zebrafish that express a calcium indicator called GCaMP, which glows when neurons are active. The fish were immobilized in a soft, jelly-like substance while a light-sheet microscope scanned their brains one thin layer at a time. During the process, different VR-generated stimuli were projected around the fish to simulate natural situations the animals might encounter.

With this new benchmark, Google and its collaborators challenge the research community to build models that can predict what happens next in the brain. The idea is simple but powerful: Given a clip of brain activity, how well can a model forecast the next 30 seconds? Just like AI tools have transformed weather forecasting and language modeling, ZAPBench could help do the same for neuroscience.
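To make that challenge concrete, here is a minimal sketch of the forecasting setup in plain NumPy. Everything in it is an illustrative assumption: the array shapes are shrunk toy stand-ins for the real recording, and the copy-last-frame baseline and MAE scoring merely mimic the general shape of a forecasting benchmark, not ZAPBench’s actual data loaders or metrics.

```python
import numpy as np

# Toy stand-ins for per-neuron activity traces. The real recordings span
# roughly two hours (~1 volume per second) across ~70,000 neurons; the
# shapes below are shrunk so the sketch runs instantly.
rng = np.random.default_rng(0)
T, N = 2048, 1000
activity = rng.random((T, N)).astype(np.float32)

CONTEXT = 256   # timesteps of past activity the model sees
HORIZON = 32    # timesteps to forecast (~30 s at ~1 volume/s)

def copy_last_frame(context: np.ndarray, horizon: int) -> np.ndarray:
    """Trivial baseline: repeat the most recent frame for every future step."""
    return np.repeat(context[-1:], horizon, axis=0)

# Slice one (context, target) pair from the recording and score the baseline.
t0 = 1000
context = activity[t0 - CONTEXT:t0]       # shape (CONTEXT, N)
target = activity[t0:t0 + HORIZON]        # shape (HORIZON, N)
prediction = copy_last_frame(context, HORIZON)
mae = np.abs(prediction - target).mean()  # mean absolute error over all neurons
print(f"copy-last-frame MAE: {mae:.4f}")
```

A real entry in the benchmark would swap the baseline for a trained model, but the contract stays the same: a window of past activity in, a window of future activity out.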

The research team experimented with two types of data. One approach used the 3D volumetric video of brain activity, while the other worked with time-series data that tracks the activity of individual neurons over time. Early results suggest that models trained on the full 3D data sometimes perform better, likely because they can consider how neurons are arranged in space. Models that had access to longer sequences of context data also made better predictions.
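The difference between the two representations is easiest to see in array shapes. Below is a hedged sketch in NumPy, with toy dimensions and a random placeholder standing in for a real cell-segmentation mask: the volumetric movie keeps the spatial arrangement of neurons, while the per-neuron trace matrix throws it away.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-ins for the two representations (real volumes are far larger;
# these dimensions are illustrative only).
T, X, Y, Z = 100, 32, 32, 8
video = rng.random((T, X, Y, Z)).astype(np.float32)  # 3D volumetric movie

# A hypothetical segmentation mask assigning each voxel to a neuron id
# (0 = background). In practice this comes from cell segmentation.
n_neurons = 50
mask = rng.integers(0, n_neurons + 1, size=(X, Y, Z))

# Collapse the movie into per-neuron time series by averaging each
# neuron's voxels at every timestep.
traces = np.zeros((T, n_neurons), dtype=np.float32)
for neuron_id in range(1, n_neurons + 1):
    voxels = mask == neuron_id
    if voxels.any():
        traces[:, neuron_id - 1] = video[:, voxels].mean(axis=1)

print(video.shape, traces.shape)  # (100, 32, 32, 8) (100, 50)
```

Spatial context is exactly what the trace matrix discards, which may explain why the volumetric models sometimes come out ahead.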

Interestingly, the research showed that some areas of the zebrafish brain are harder to predict than others. Even the best models struggled with certain regions, a finding that raises new questions about how brain activity varies across different systems.
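One straightforward way to surface such differences is to break the forecasting error down by anatomical region. A minimal sketch, assuming a hypothetical per-neuron error array and placeholder region names rather than the annotations the benchmark actually ships:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical per-neuron absolute errors from some forecaster, plus a
# region label for each neuron (names here are placeholders).
n_neurons = 5000
errors = rng.random(n_neurons).astype(np.float32)
regions = np.array(["forebrain", "midbrain", "hindbrain"])
region_of = rng.integers(0, len(regions), size=n_neurons)

# Average the error within each region to see which ones resist prediction.
for i, name in enumerate(regions):
    region_mae = errors[region_of == i].mean()
    print(f"{name:>9s}: MAE = {region_mae:.3f}")
```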

This work is far from over, however. Google and its partners are now working on creating a detailed connectome for the same zebrafish brains used in ZAPBench. Once complete, this map of neuron connections will pair perfectly with the activity data, opening the door to even deeper research into how brain structure and function work together.
