In 2021, a team of Australian scientists spent 18 months teaching a cluster of human brain cells to play Pong.

In February 2026, an independent developer with no formal biology background taught 200,000 human neurons to play DOOM — in about a week.

That compression tells you everything you need to know about where biological computing is headed.


What Actually Happened

Cortical Labs, a Melbourne-based biotech company founded in 2019, has been quietly building what they call “the world’s first code-deployable biological computer.” Their device is called the CL1.

The CL1 is not a metaphor. It is not a simulation of neurons. It is real, living human brain cells — derived from adult skin and blood samples, reprogrammed into stem cells, then differentiated into neurons — grown directly onto a silicon chip embedded with a multi-electrode array.

Those electrodes do two things: they send electrical signals into the neurons (input), and they listen to the neurons firing back (output). Feed the neurons a pattern that represents, say, an enemy appearing on the left side of a screen, and the neurons respond. Interpret that response as a game command — move right, shoot, turn — and you have a biological controller.
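In code, that encode/stimulate/decode loop might look something like the sketch below. Everything here is hypothetical: the electrode count, the function names, and the mappings are illustrative stand-ins, not the actual Cortical Labs API.

```python
# Hypothetical sketch of the input/output mapping described above.
# Function names and electrode layout are illustrative, not the
# actual Cortical Labs API.

def encode_state(enemy_x: float, screen_width: float,
                 n_electrodes: int = 64) -> list[int]:
    """Input side: map a game state (an enemy's horizontal position)
    to the indices of electrodes to stimulate. An enemy on the left
    excites electrodes on the left side of the array."""
    idx = min(int(enemy_x / screen_width * n_electrodes), n_electrodes - 1)
    return [max(0, idx - 1), idx, min(n_electrodes - 1, idx + 1)]

def decode_response(spike_counts: list[int]) -> str:
    """Output side: interpret the culture's firing pattern as a game
    command. Here, whichever half of the array fired more wins."""
    half = len(spike_counts) // 2
    left, right = sum(spike_counts[:half]), sum(spike_counts[half:])
    return "MOVE_RIGHT" if left > right else "MOVE_LEFT"

# Enemy near the far left -> stimulate the leftmost electrodes;
# a strong left-side response is read as a command to move right.
stim = encode_state(enemy_x=10, screen_width=640)
command = decode_response([9] * 32 + [2] * 32)
```

The point of the sketch is the shape of the loop, not the specifics: the "program" lives entirely in how stimulation patterns are chosen and how firing patterns are interpreted.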

No traditional software. No gradient descent. No model weights measured in terabytes.

Just neurons doing what neurons do.


The Doom Experiment

The Pong demonstration in 2021 was genuinely remarkable, but it was also extraordinarily narrow. The neurons controlled a single paddle responding to a ball moving in one dimension. The relationship between input and output was nearly direct. It worked — but it required years of careful scientific work to make it work.

Doom is a different problem entirely.

Doom involves a three-dimensional environment (technically 2.5D), multiple enemy types, weapons, navigation through complex geometry, and real-time decision-making under uncertainty. There is no direct input-output relationship. The system has to learn to associate patterns of electrical stimulation with meaningful actions in an environment it has never experienced and has no conceptual framework to understand.

Cortical Labs solved the interface problem by building a Python API for the CL1 — the Cortical Cloud. Developers can now deploy code to living neurons remotely, the same way you would spin up a cloud compute instance.

An independent researcher named Sean Cole, described as having “relatively little expertise working directly with biology,” used that API to pipe the Doom video feed into patterns of electrical stimulation across the neural culture. When a demon appeared on the left, electrodes on the left side of the array fired. The neurons responded. Those responses were translated into game commands.
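A closed loop of that shape can be sketched as below. The `MockCL1Client` class stands in for the remote culture; its behavior, its method names, and the frame-to-stimulation mapping are all assumptions made for illustration, not the real Cortical Cloud API.

```python
import random

class MockCL1Client:
    """Stand-in for a remote neural culture. Modeled assumption:
    stimulating one side of the electrode array biases firing toward
    that side. Not the real Cortical Cloud API."""

    def __init__(self, n_electrodes=64, seed=0):
        self.n = n_electrodes
        self.rng = random.Random(seed)
        self.last_stim = [0.0] * n_electrodes

    def stimulate(self, pattern):
        self.last_stim = list(pattern)

    def read_spikes(self):
        # Spontaneous firing plus stimulation-driven activity.
        return [self.rng.randint(0, 2) + round(s * 5) for s in self.last_stim]

def frame_to_stimulation(frame, n_electrodes=64):
    """Collapse a grayscale frame (rows of pixel values) into one
    drive level per electrode: bright pixels on the left of the
    screen drive the left side of the array harder."""
    width = len(frame[0])
    col_means = [sum(row[x] for row in frame) / len(frame) for x in range(width)]
    per_electrode = width / n_electrodes
    drive = []
    for e in range(n_electrodes):
        lo, hi = int(e * per_electrode), int((e + 1) * per_electrode)
        chunk = col_means[lo:hi] or [0.0]
        drive.append(sum(chunk) / len(chunk))
    peak = max(max(drive), 1e-9)
    return [d / peak for d in drive]

def spikes_to_command(spikes):
    half = len(spikes) // 2
    return "TURN_LEFT" if sum(spikes[:half]) > sum(spikes[half:]) else "TURN_RIGHT"

client = MockCL1Client()
frame = [[255.0] * 40 + [0.0] * 120 for _ in range(120)]  # bright shape on the left
client.stimulate(frame_to_stimulation(frame))
command = spikes_to_command(client.read_spikes())
```

What the actual experiment adds, and the mock cannot capture, is plasticity: a living culture's responses drift with feedback over time, which is where the learning happens.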

The neurons learned to seek out enemies, shoot, spin, and navigate.

“Is it an eSports champion? Absolutely not,” said Dr. Alon Loeffler of Cortical Labs. “Right now, the cells play a lot like a beginner who’s never seen a computer. And in all fairness, they haven’t. But they show evidence that they can seek out enemies, they can shoot, they can spin. And while they die a lot, they are learning.”

That last part is the entire point.


The Energy Argument Is Not Subtle

The AI industry is on a collision course with its own infrastructure demands. Data center spending reached an estimated $200 billion in 2025. Training a frontier language model can draw tens of megawatts. The compute requirements for frontier AI are doubling every six to twelve months, and the energy grid is not keeping pace.

A 30-unit CL1 server rack draws between 850 and 1,000 watts total.

Your brain — which performs reasoning, language, memory, motor control, emotional processing, and sensory integration simultaneously — runs on roughly 20 watts.

A single GPU cluster training a large language model can consume megawatts.
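Taking the figures quoted above, and reading "tens of megawatts" at its conservative end of 10 MW (an assumption for this comparison), the gap is easy to quantify:

```python
# Back-of-envelope comparison using the wattages quoted above.
# "Tens of megawatts" is taken at its conservative end, 10 MW.
brain_w = 20                 # human brain, sustained
cl1_rack_w = 1_000           # 30-unit CL1 rack, upper bound of 850-1,000 W
gpu_cluster_w = 10_000_000   # frontier training cluster, lower bound

print(f"training cluster vs. brain:    {gpu_cluster_w // brain_w:,}x")      # 500,000x
print(f"training cluster vs. CL1 rack: {gpu_cluster_w // cl1_rack_w:,}x")   # 10,000x
```

Even with every assumption tilted in silicon's favor, the ratios stay at four to five orders of magnitude.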

The energy economics of biological compute are not incrementally better than silicon. They are orders of magnitude better. And that gap does not shrink as the tasks get harder — it widens, because biological neurons evolved specifically to handle exactly the problems where silicon struggles most: real-time adaptation under uncertainty, learning from minimal data, processing ambiguous signals without brute-force compute.

Individual CL1 units cost $35,000, and Cortical Labs shipped its first 115 units in 2025. Through the Cortical Cloud, access is priced at the level of a software subscription — developer access to living neurons without touching a lab.


Who Is Paying Attention

Cortical Labs raised $11 million in a Series A round led by Horizons Ventures, with participation from Blackbird Ventures, LifeX Ventures, Radar Ventures — and In-Q-Tel, the CIA’s venture capital arm.

In-Q-Tel does not fund science projects. They fund infrastructure they believe will matter for national security within a five-to-ten-year horizon. Their portfolio includes foundational investments in data analytics, cybersecurity, and — increasingly — engineered biology.

Biological compute excels at exactly the tasks most relevant to intelligence applications: real-time adaptation in uncertain environments, pattern recognition from incomplete data, autonomous decision-making at low power in remote or constrained settings.

The Doom demo is marketing. The investor list is the signal.


What Researchers Are Saying

The scientific community has been measured but genuinely excited.

Dr. Andrew Adamatzky at the University of the West of England described the Doom result as a significant step forward: “What’s exciting here is not just that a biological system can play Doom, but that it can cope with complexity, uncertainty, and real-time decision-making. That’s much closer to the kinds of challenges future biological or hybrid computers will need to handle.”

Steve Furber at the University of Manchester acknowledged that Doom represents a meaningful level up from Pong, while noting that many fundamental questions remain — including how the neurons “know” what is expected of them, and how they effectively “see” a screen without visual apparatus.

Brett Kagan, Chief Science Officer at Cortical Labs, was more direct about what the demo actually proves: “Unlike the Pong work that we did a few years ago, which represented years of painstaking scientific effort, this demonstration has been done in a matter of days by someone who previously had relatively little expertise working directly with biology. It’s this accessibility and this flexibility that makes it truly exciting.”

That is the inflection point. Not the neurons playing Doom. The fact that anyone can now deploy code to neurons through an API.


The Ethical Dimension

It would be dishonest to discuss this technology without acknowledging the questions it raises.

These neurons are not conscious. Experts are consistent on this point — 200,000 neurons responding to electrical feedback are not experiencing anything, any more than a single transistor experiences current. The neurons do not know they are playing Doom. They are not afraid of the demons.

But consciousness is not a binary switch. It exists on a continuum. As organoid complexity increases — as these cultures grow larger, more structurally sophisticated, more interconnected — the question of where “responsive to stimulation” ends and “experience” begins becomes genuinely difficult.

The sourcing of neurons from human adult cells raises consent questions that the field is still working through. The comparison to HeLa cells — Henrietta Lacks’s cell line, taken and used in research for decades without her knowledge or consent — is not a perfect analogy, but it is not a frivolous one either.

As Cortical Labs opens its API to developers and researchers globally, the range of applications being built will expand rapidly. Most will be benign. Some will not be. The regulatory frameworks for biological computing are essentially nonexistent right now.

These conversations need to happen faster than the technology is developing. They are not happening fast enough.


What Comes After Doom

Cortical Labs is explicit about the roadmap. Doom was chosen because it represents a category of problem — complex 3D environments, real-time uncertainty, adaptive goal-directed behavior — that maps directly to useful real-world applications.

The near-term targets:

  • Robotic arm control: Using biological neurons as a processing hub for precision manipulation tasks, where the continuous real-time adaptation of living tissue outperforms rigid silicon controllers
  • Prosthetics and sensory interfaces: Biological compute that can learn to interpret ambiguous signals from nerve endings
  • Drug discovery: Brain organoid systems that respond to pharmaceutical compounds the way actual human neurons respond
  • Edge compute: Low-power autonomous systems that can operate in environments where megawatt infrastructure is not available

The Cortical Cloud positions the company not as a hardware vendor but as a platform. If biological compute proves to have the advantages it appears to have — particularly in energy efficiency and adaptive learning — the platform that controls access to it will be extraordinarily valuable.


The Question That Matters

The internet reacted to this story the way the internet reacts to everything: memes about renting your brain cells to AI companies, existential jokes, a predictable cycle of hype and dismissal.

The question that got asked over and over was: “Can it run Doom?”

That question has been answered.

The question that actually matters is different.

Silicon AI has reached a point where advancing capability requires exponentially more energy, more data, and more infrastructure. The returns are diminishing. The costs are compounding.

Biological neurons learn from minimal data, adapt in real time, run on fractions of a watt, and handle ambiguity without brute force.

Cortical Labs built a Python API. An independent developer spent a week on it.

The neurons are ready.

The question is what happens when they can run everything else.


Cortical Labs CL1 units are currently available for research institutions and developers through the Cortical Cloud platform. The Python API is open for application.