Imagine a future where artificial intelligence, from generating stunning images to powering intelligent assistants, runs on a fraction of the energy that today's massive, power-hungry supercomputers consume. It sounds like science fiction, yet a startup named Extropic is making a bold claim that could fundamentally reshape the landscape of AI: a new “probabilistic” AI chip that it says can slash energy consumption by an astounding 10,000 times compared to today’s powerful GPUs.
This isn’t merely an incremental improvement; it’s a paradigm shift, addressing what many in the tech world see as the looming bottleneck for AI’s limitless potential: electricity.
The AI Energy Crisis
The artificial intelligence boom, particularly with the rise of generative AI models, has brought with it an unprecedented demand for computational power. This demand, in turn, translates directly into an insatiable hunger for electricity. Data centers worldwide are already struggling to secure enough power to support the training and inference of advanced AI models. As Extropic officials aptly put it, “With today’s technology, serving advanced models to everyone all the time would consume vastly more energy than humanity can produce.”
For years, the focus has largely been on developing faster chips and more efficient algorithms. However, Extropic recognized that the fundamental limit might not be chip speed or data availability, but rather the sheer amount of energy required to keep these intelligent systems running. Instead of chasing new energy sources, Extropic decided to tackle the problem at its core: making AI itself radically more energy-efficient.
Extropic’s Probabilistic Computer
Modern AI, especially neural networks, relies heavily on graphics processing units (GPUs). Originally designed for rendering complex graphics, GPUs excel at matrix multiplication, the core mathematical operation behind how neural networks learn and make predictions. However, GPUs are not inherently energy-efficient for this purpose. A significant portion of their power consumption is dedicated to moving vast amounts of data around the chip, rather than the computations themselves.
Extropic’s innovation lies in its entirely new class of AI chip: a scalable probabilistic computer. This hardware is purpose-built for directly sampling probability, rather than performing the energy-intensive matrix math that GPUs specialize in. This new device is called the Thermodynamic Sampling Unit, or TSU.
How the Thermodynamic Sampling Unit (TSU) Works
Unlike traditional AI chips (including GPUs and TPUs) that perform extensive calculations to estimate probabilities, Extropic claims its TSUs bypass this step entirely, generating samples directly from complex probability distributions. Here’s a closer look at the key features:
- Probabilistic Cores: TSUs are built from large arrays of probabilistic cores, small circuits that produce random samples directly in hardware.
- Energy-Based Models (EBMs): TSUs sample from energy-based models, a class of machine learning model that defines a probability distribution through an energy function rather than through explicit probability calculations.
- Local Communication: Crucially, TSUs minimize energy consumption by keeping communication strictly local, meaning circuits only interact with their immediate neighbors. This design avoids the costly long-distance wiring and voltage changes that are major energy drains in GPUs.
In essence, TSUs are physically and energetically optimized for probability, not arithmetic.
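The “strictly local” sampling idea can be sketched in ordinary software with Gibbs sampling on a small energy-based model, where each unit’s update depends only on its immediate neighbors. This is an illustrative simulation, not Extropic’s hardware or API; the ring topology, the coupling value, and the function names here are assumptions made purely for the sketch.

```python
import numpy as np

def gibbs_step(spins, coupling, rng, beta=1.0):
    """One sweep of Gibbs sampling over a 1D ring of +1/-1 units.

    Each unit consults only its two immediate neighbors -- the same
    locality constraint the TSU design uses to avoid energy-hungry
    long-distance wiring.
    """
    n = len(spins)
    for i in range(n):
        # Local field from nearest neighbors only.
        field = coupling * (spins[(i - 1) % n] + spins[(i + 1) % n])
        # Conditional probability of flipping to +1 under the Ising energy.
        p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * field))
        spins[i] = 1 if rng.random() < p_up else -1
    return spins

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=16)
for _ in range(100):
    spins = gibbs_step(spins, coupling=1.0, rng=rng)
```

Because every update reads only neighboring state, a grid of such units can, in principle, run all its updates in parallel with no global data movement, which is the intuition behind the energy savings.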
At the heart of the TSU is the pbit, Extropic’s fundamental building block. While a traditional digital bit is always a definitive 1 or 0, a pbit fluctuates randomly between 1 and 0, and the probability of it landing in either state is programmable. This makes a pbit essentially a hardware random number generator. Unlike previous academic pbit designs, which required exotic components, Extropic has engineered its pbit entirely from standard transistors, making it commercially viable and highly energy-efficient.
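A pbit’s behavior can be modeled in a few lines of code. The class below is a hypothetical software stand-in, not Extropic’s circuit design: it simply emits 1 with a programmable probability, which is the behavior the physical device realizes with transistor noise.

```python
import random

class PBit:
    """Software model of a pbit: a bit that fluctuates between 0 and 1,
    reading 1 with a programmable probability.

    Illustrative only -- the real device is an analog transistor
    circuit, not a pseudorandom number generator.
    """

    def __init__(self, p_one, seed=None):
        self.p_one = p_one              # programmable bias toward 1
        self._rng = random.Random(seed)

    def read(self):
        # Each read is an independent random observation of the bit.
        return 1 if self._rng.random() < self.p_one else 0

pbit = PBit(p_one=0.8, seed=42)
samples = [pbit.read() for _ in range(10_000)]
# Over many reads, the fraction of 1s converges to the programmed bias.
```

Large collections of such biased random bits, coupled together, are what let the hardware draw samples from a full probability distribution rather than compute it numerically.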
Denoising Thermodynamic Model (DTM)
To showcase the practical application of their novel hardware, Extropic has also developed a new generative AI algorithm called the Denoising Thermodynamic Model (DTM). Inspired by diffusion models—the same family of algorithms used by popular image generators like Stable Diffusion—DTMs iteratively transform noise into structured output.
However, DTMs are specifically designed to leverage the unique architecture of TSUs, making them far more energy-efficient than their GPU-reliant counterparts. Extropic’s simulations of their first production-scale TSUs running small-scale generative AI benchmarks suggest a staggering 10,000 times greater energy efficiency compared to modern algorithms on GPUs. These results can even be replicated using thrml, Extropic’s open-source Python library, allowing developers to experiment with thermodynamic machine learning today.
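The noise-to-structure loop that diffusion-style models share can be sketched as follows. This toy hard-codes the denoiser instead of learning it, and it is not the DTM algorithm or the thrml API; it only illustrates the general idea of iteratively blending a noisy sample toward structured output.

```python
import numpy as np

def denoise_sample(target, steps=50, seed=0):
    """Toy noise-to-structure loop in the spirit of diffusion models.

    Starts from pure Gaussian noise and repeatedly blends toward a
    hard-coded 'denoised' estimate (the target), re-injecting a
    shrinking amount of noise at each step. Real diffusion models
    learn the denoiser from data; here it is fixed for illustration.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(target.shape)  # begin as pure noise
    for t in range(steps, 0, -1):
        x = 0.8 * x + 0.2 * target  # pull the sample toward structure
        # Noise injection fades as t counts down to 1.
        x += 0.1 * (t / steps) * rng.standard_normal(target.shape)
    return x

target = np.linspace(-1.0, 1.0, 8)
sample = denoise_sample(target)
```

In a DTM, per the article’s description, steps like these would be carried out by sampling hardware rather than by matrix arithmetic, which is where the claimed efficiency gain comes from.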
Why Extropic’s Breakthrough Matters
The implications of Extropic’s claims, if validated at scale, are profound. The current trajectory of AI development is heading toward a hard energy ceiling: every major AI model increases compute requirements, which in turn increases energy demand, an unsustainable path for data centers already grappling with power limitations.
Extropic’s proposed solution attacks half of this equation: drastically reducing the energy consumed per AI workload. This could remove the “energy ceiling” that currently prevents the widespread, always-on AI applications envisioned for the future. While new power generation methods (like nuclear-powered data centers) remain vital, TSUs aim to make AI less reliant on massive, continuous increases in electricity supply.
The Road Ahead
Extropic states that the “fundamental science is done” and the company has now entered the critical build-out phase. This involves scaling their small prototypes into production-grade systems, a process that requires top talent in mixed-signal integrated circuit design, hardware systems engineering, and probabilistic machine learning.
Their XTR-0 development platform has already undergone beta testing with early partners, a crucial step in moving from theoretical promise to practical application. While still in its nascent stages, Extropic is confident in its mission to “rebuild computing from the ground up to match what AI actually is — probabilistic, not deterministic.”
Key Takeaways for Enterprise Leaders
- AI’s Growth Will Hit Energy Limits: The demand for electricity is rapidly becoming the next major bottleneck for scaling AI, even before compute limits are reached.
- A Fundamentally Different AI Chip: Extropic’s TSUs offer a new class of AI chip that doesn’t just run matrix math faster but avoids it entirely by directly generating samples.
- Early Validation is Promising: Small-scale tests in hardware and simulations, including a working pbit design and open-source algorithm replication, demonstrate the concept’s viability.
- Transformative Energy Efficiency: The claim of up to 10,000 times greater energy efficiency, if proven at scale, would drastically alter AI infrastructure economics and accessibility.
As Extropic moves from breakthrough research to commercial deployment, the world watches with anticipation. Should their vision materialize, we could be on the cusp of an era where advanced AI is not constrained by its immense power requirements, opening doors to possibilities currently unimaginable.
(For more such informative technology and innovation stories, keep reading The Inner Detail.)