The AI industry just spent two years building the most energy-intensive infrastructure in human history. Data centers drew more than $200 billion in investment in 2025. Training a single frontier model can consume as much electricity as a small city.
Somewhere in Melbourne, a dish of human neurons just learned to play Doom in a week.
Somewhere in Vevey, Switzerland, sixteen brain organoids are being rented to universities through a web browser.
Somewhere in Washington D.C., two neurosurgeons are feeding stock market data into living cells and pulling out predictions.
These are not science fiction scenarios. They are happening right now, in 2026, run by three small companies that most people have never heard of. Together, they represent the earliest stages of a computing paradigm that does not compete with silicon AI so much as it runs alongside it — built not from transistors, but from biology itself.
Why Now
The timing is not a coincidence.
Silicon-based AI has hit a familiar ceiling. Bigger models require more data, more compute, and more energy. The returns are diminishing. The costs are compounding. The infrastructure required to train and run frontier models is becoming one of the largest engineering and energy challenges on the planet.
The human brain, by comparison, runs on 20 watts. It learns from a handful of examples. It handles ambiguity, real-time adaptation, and uncertainty without brute-force compute. It has been doing this for hundreds of thousands of years.
Biocomputing is the bet that you can harness that efficiency — not by simulating it in silicon, as neuromorphic chips attempt to do, but by using the actual biological substrate itself.
Three companies are currently the clearest proof that this bet is not crazy.
Company 1: Cortical Labs (Melbourne, Australia)
Founded: 2019
Funding: ~$11M Series A
Investors: Horizons Ventures, Blackbird, In-Q-Tel (CIA)
Product: CL1 biological computer + Cortical Cloud
What They Built
Cortical Labs grows real human neurons — derived from adult skin and blood samples reprogrammed into stem cells — directly onto silicon chips embedded with multi-electrode arrays. The electrodes send signals in and read signals out. The result is a hybrid device where living biology and conventional hardware are fused into a single computational system.
Their flagship product, the CL1, is described as “the world’s first code-deployable biological computer.” It houses approximately 200,000 neurons. It ships as hardware. It interfaces via a Python API.
In 2021, they demonstrated neurons learning to play Pong. That took 18 months of painstaking scientific work.
In February 2026, an independent developer with no formal biology background used their API to teach the same system to play Doom. That took about a week.
The Cortical Cloud
The more significant development is not the Doom demo — it is the infrastructure behind it. Cortical Labs built the Cortical Cloud, a platform that lets developers deploy code to living neurons remotely, the same way you would spin up a cloud compute instance. No lab required. No biology expertise required. Just a Python API and a subscription.
This is the platform play. If biological neurons prove to have the advantages Cortical Labs believes they do, the company that controls remote access to them at scale will occupy an extraordinarily powerful position.
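The shape of that workflow can be sketched in a few lines of Python. Everything below is hypothetical: the class and method names are invented for illustration, Cortical Labs' actual API will differ, and the "session" is a local stub rather than a real connection to a culture.

```python
# Hypothetical sketch of a closed-loop stimulate/read cycle against a
# remote neuron culture. All names are invented for illustration; this
# stub only simulates the shape of such an API, not the real thing.

class NeuronSession:
    """Local stand-in for a remote connection to a neuron culture."""

    def __init__(self):
        self.pulses = []  # log of (electrode, amplitude) pulses sent

    def stimulate(self, electrode, amplitude_uv):
        # A real client would deliver an electrical pulse here.
        self.pulses.append((electrode, amplitude_uv))

    def read_activity(self):
        # A real client would return recorded spikes per electrode.
        # This stub just echoes back the most recent stimulation.
        return dict(self.pulses)

def closed_loop_step(session, game_state):
    """Encode a game state as stimulation, decode activity as an action."""
    for electrode, value in enumerate(game_state):
        session.stimulate(electrode, amplitude_uv=value * 100)
    activity = session.read_activity()
    # Naive decoder: act on whichever electrode shows the most activity.
    return max(activity, key=activity.get)

session = NeuronSession()
action = closed_loop_step(session, game_state=[0.2, 0.8, 0.1])
```

The point of the sketch is the loop structure, not the specifics: encode world state in, read activity out, repeat. That loop is what the Doom and Pong demos run, whatever the real method names are.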
Why It Matters
A 30-unit CL1 server rack draws between 850 and 1,000 watts total. For context, a single GPU cluster training a large language model can draw megawatts. The energy economics are not marginally better. They are orders of magnitude better.
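To put rough numbers on that gap, take the rack at 1 kW and assume, purely for illustration, a 10 MW training cluster; real cluster draws vary widely and the 10 MW figure is not from the article.

```python
# Back-of-envelope comparison of the figures above. The 10 MW cluster
# figure is an assumed round number for illustration, not a measurement.

rack_watts = 1_000          # 30-unit CL1 rack, upper end of 850-1,000 W
cluster_watts = 10_000_000  # assumed GPU training cluster draw (10 MW)

ratio = cluster_watts / rack_watts
print(f"The cluster draws ~{ratio:,.0f}x the power of the rack.")
```

Even with a generous estimate for the rack and a conservative one for the cluster, the gap is four orders of magnitude.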
The investor list tells you who else has noticed. In-Q-Tel — the CIA’s venture arm — does not fund science projects. They fund infrastructure they believe will matter for national security within a five to ten year horizon. Biological compute that excels at real-time adaptation under uncertainty, pattern recognition from incomplete data, and autonomous decision-making at low power fits that description precisely.
115 CL1 units began shipping in 2025. The neurons are ready.
Company 2: FinalSpark (Vevey, Switzerland)
Founded: 2014
Funding: Bootstrapped / seeking CHF 50M next round
Founders: Dr. Fred Jordan and Dr. Martin Kutter (both PhDs from EPFL)
Product: Neuroplatform — remotely accessible biocomputing research platform
What They Built
FinalSpark took a different approach to the same fundamental problem. Rather than selling hardware, they built a remote laboratory — a cloud-accessible platform where researchers anywhere in the world can run experiments on living brain organoids through a web browser or Python API.
The Neuroplatform launched commercially on May 15, 2024. It houses 16 brain organoids arranged across four multi-electrode arrays, each organoid roughly 0.5 mm in diameter and containing approximately 10,000 neurons. In total: around 160,000 living human neurons, always on, always accessible, always processing.
The system uses dopamine as a reward signal — chemically reinforcing correct outputs the same way a human brain reinforces learned behavior. Researchers can trigger neurotransmitters including dopamine, glutamate, and serotonin through the API. This is not a metaphor for reward learning. It is literally reward learning, running on real biology.
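The loop FinalSpark describes can be caricatured in a few lines. This is a toy: the "culture" below is an ordinary Python object whose weights stand in for synaptic change, and the reward call stands in for the dopamine release the real API triggers; none of the names come from the Neuroplatform.

```python
# Toy caricature of chemically reinforced learning. A real organoid's
# plasticity is vastly richer; here, "reward" just bumps a weight.

import random

random.seed(0)  # deterministic for the sake of the example

class ToyCulture:
    """Stand-in for an organoid: emits 0 or 1, reinforced by reward."""

    def __init__(self):
        self.weights = [1.0, 1.0]  # preference weight per output
        self.last = None

    def respond(self):
        # Pick an output in proportion to the learned weights.
        total = sum(self.weights)
        self.last = 0 if random.uniform(0, total) < self.weights[0] else 1
        return self.last

    def reward(self):
        # On the real platform this would be a dopamine release; here
        # it simply strengthens the output the culture just produced.
        self.weights[self.last] += 0.5

culture = ToyCulture()
target = 1
for _ in range(200):
    if culture.respond() == target:
        culture.reward()

# After training, the culture favours the rewarded output.
```

The structure is the same as the platform's: outputs that earn the reward signal become more likely, outputs that do not fade in relative terms.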
The Numbers
FinalSpark’s bioprocessors consume roughly one-millionth the energy of traditional digital processors. That figure is not marketing copy — it comes from peer-reviewed research published in Frontiers in Artificial Intelligence, which reached top 1% most-read status by October 2024.
By early 2026, the company had tested approximately 10 million neurons and collected 20 terabytes of experimental data across more than 1,000 experiments.
Nine research institutions currently have access to the platform, with over three dozen universities on the waiting list. Academic access is priced at $500 per month per user. Paying commercial clients can develop proprietary applications.
The Roadmap
At a presentation in London in June 2025, FinalSpark outlined a 10-year vision: bio-servers accessible via the cloud, delivering computational power for generative AI applications at a fraction of today’s energy cost.
Phase 1 (2024–2026) focuses on platform optimization — extending organoid lifespan from 100 days to 200 days. The early organoids survived only a few hours. Today they last around 100 days. The trajectory is clear.
Co-founder Fred Jordan is honest about where the technology stands: biocomputers remain “real black boxes,” he says. Unlike digital computers, the internal workings are opaque and the protocols for programming them are still being invented. But he is also clear about what makes neurons uniquely suited to this: “Of all the cells to pick from, human neurons are the best at learning.”
Why It Matters
FinalSpark is building the research infrastructure layer — the platform that lets the scientific community figure out what biological compute can actually do, at scale, from anywhere in the world. If Cortical Labs is the Apple of biocomputing (building a closed, polished product), FinalSpark is closer to AWS (building the access layer for everyone else to build on).
The business model reflects this. Free access for select academic labs. Paid access for commercial clients. A community of over 1,000 researchers on Discord actively working the problem.
Company 3: The Biological Computing Co. (Washington D.C., USA)
Founded: 2022
Funding: $25M seed (February 2026)
Investors: Primary, Builders VC, E1 Ventures, Refactor Capital, Tusk Ventures
Founders: Alex Ksendzovsky and Jon Pomeraniec (both neurosurgeon-neuroscientists)
What They Built
The origin story of The Biological Computing Co. (TBC) is one of the stranger founding stories in recent tech history.
It is 2021. Two neurosurgeons are in a Washington D.C. Airbnb — not on vacation, but feeding stock market data into a dish of living brain cells. They encode daily S&P 500 prices into electrical patterns and feed those patterns into neurons. The neurons do not just respond. They recognize patterns in the data.
That experiment became the first proof point. You could take information from the real world, translate it into a language biology understands, put it into living neurons, draw something out, and analyze it. The neurons were not just reacting. They were computing.
TBC’s approach differs from Cortical Labs and FinalSpark in one important way: they are not trying to replace silicon with biology. They are trying to connect biology to existing AI models to make those models more stable, scalable, and efficient.
How It Works
TBC’s neuroscience and engineering team encodes data — text, images, video — into living neurons. Those neurons process the data and produce richer, more complex representations. Those representations are then mapped back onto conventional AI models.
The insight is that neurons handle certain types of processing — particularly ambiguous, uncertain, real-time data — in ways that current transformers handle poorly and expensively. Rather than replacing the transformer, TBC adds a biological preprocessing layer that makes the transformer better.
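One plausible version of the encoding step is rate coding: mapping each scalar value to a pulse count per time window, a standard trick in neuroscience for turning numbers into stimulation. Whether TBC uses this scheme is not public; the function below, including its name and the sample prices, is only an illustration of the general idea.

```python
# Rate-coding sketch: map a numeric series (e.g. daily index closes)
# to per-window pulse counts a stimulator could deliver. This is one
# standard encoding in neuroscience, not TBC's published method.

def rate_encode(values, max_pulses=10):
    """Map each value to a pulse count, scaled to the series' range."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid dividing by zero on a flat series
    return [round((v - lo) / span * max_pulses) for v in values]

# Daily closes of a hypothetical index, encoded as stimulation pulses.
closes = [4781.6, 4763.5, 4688.7, 4697.2, 4756.5]
pulses = rate_encode(closes)
```

Whatever encoding is actually used, the pipeline shape is the one the text describes: real-world data in, a biology-legible signal through the culture, and the resulting activity mapped back into a representation a conventional model can consume.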
Co-founder Ksendzovsky put it directly: “Our current AI systems have gotten to the point where they’ve been extremely inefficient, and require way too much data to train. That’s because they’ve become very unbiological.”
The planned products include an algorithm discovery platform and a software adapter — tools that let existing AI systems tap into biological compute without requiring developers to understand the underlying neuroscience.
Why It Matters
TBC raised $25 million at seed stage in February 2026 from some of the most pattern-matched investors in tech. That is not a science grant. That is a bet that the augmentation approach — biology plus silicon, not biology instead of silicon — is the fastest path to commercial value.
The neurosurgeon background matters here. Ksendzovsky and Pomeraniec understand how biological neural systems actually work at a clinical level. They are not building from analogy. They are building from direct knowledge of how neurons process information in functioning human brains.
The Bigger Picture
These three companies are building at different layers of the same stack.
FinalSpark is building the research platform — remote access to living neurons so the scientific community can figure out what is possible. Cortical Labs is building the hardware and developer tools — a deployable product that anyone can write code for. TBC is building the integration layer — connecting biological compute to the AI infrastructure that already exists.
None of them are competing with OpenAI or Anthropic. All of them are working on the problem that OpenAI and Anthropic cannot solve from inside silicon: the energy and data efficiency ceiling that every frontier AI lab is approaching.
The parallel to AI is not that biocomputing will replace AI. It is that both are converging toward the same set of hard problems at the same time, from opposite directions. AI is discovering that learning efficiently from limited data and handling real-world uncertainty is harder than scaling transformers. Biocomputing is discovering that biological neurons do exactly those things by default.
There is a reason the CIA is paying attention. There is a reason two neurosurgeons just raised $25 million at seed. There is a reason a developer with no biology background taught living neurons to play Doom in about a week, using nothing more than a Python API.
The infrastructure is being built right now. Most people have not noticed yet.
They will.
This is the first in an ongoing series covering the companies building the biocomputing era. More coverage on DNA computing, organoid intelligence, and bio-silicon hybrid systems at BioComputer.com.