VZ editorial frame
Read this piece through one operating lens: AI does not automate first, it amplifies first. If the underlying decision architecture is clear, AI scales clarity. If it is noisy, AI scales noise and cost.
VZ Lens
From the VZ perspective, this topic matters only when translated into execution architecture. A ChatGPT query consumes ten times more energy than a Google search; the EROI has plummeted from 100:1 to 10–20:1; the window of opportunity closes between 2025 and 2035. The business impact starts when this becomes a weekly operating discipline.
TL;DR
Civilization was built on cheap energy, and that foundation is now crumbling. Meanwhile, we are developing the most energy-intensive technology of all, artificial intelligence, at the very moment energy is at its most expensive. EROI (Energy Return on Investment) has dropped from 100:1 to 10–20:1; we may be slipping backward on the Kardashev scale instead of advancing; and training a single AI model consumes as much electricity as a small town uses in a year. The solution is not to reject technology but to reprogram civilization for a circular energy system. The question is not whether everything will change. It is whether you will be standing on the platform when the train departs.
Wind-blown Thoughts by Lake Balaton
The last orange streaks of the sun caress the waves as I sit on the shore. The lights dancing on the water’s surface slowly fade, and the evening breeze brushes my face with a cold sweep. In the distance, a sailboat rocks gently, its lamp shining alone on the darkening water. I feel the moisture in the air, the cool breath of the approaching night. The water murmurs softly beneath the shore’s stones, as if the lake itself were breathing. I sit here, where the horizon seems endless, and the wind constantly reminds me: all movement is fueled by energy. The sailboat’s light grows ever lonelier in the darkness, while the rustling of the trees along the shore deepens.
When the cursor begins to tremble
Humanity is developing its most energy-intensive technology, artificial intelligence, while the efficiency of energy production has been declining for decades. The EROI (Energy Return on Investment) has dropped from roughly 100:1 in the 1930s to 10–20:1, and a ChatGPT query consumes ten times more energy than a Google search. This is the energy dilemma of civilization. The window of opportunity is open between 2025 and 2035; after that, inertia will decide for us.
The future is slipping through our fingers unnoticed. We believe it’s natural that answers to all our questions arrive in an instant, that algorithms assist our decisions, that information is available everywhere and always. But every question we ask, every decision that artificial intelligence helps us make—demands more and more energy. Quietly, softly, turning into light.
And there will come a moment when it runs out somewhere. Not dramatically, not permanently—you’ll just be left out of it. As if a train were leaving, and you just happened not to be on the platform.
The modern world functions like a giant, interconnected computer system. Energy is the processor, society is the memory, and technology is the software that connects everything. But what happens when this system starts to slow down, and we realize that our hardware can no longer run the programs we wrote?
Cities pulse like overloaded processors. The network of roads is like the wiring in a machine assembled by an unknown engineer long, long ago. And we—tiny users in our own history—can’t decide whether we’re controlling the program or just executing its scripts running in the background, unconsciously.
The illusion of cheap energy is fading. The system is starting to slow down. Windows open jerkily, the cursor trembles, and a message appears on the screen: Fatal error. Restart required. This is the screen flash of our era.
Why is civilization slowing down like an overloaded computer?
EROI (Energy Return on Investment) is like a performance metric for an operating system. It shows how much useful energy we can produce per unit of energy invested. In the 1930s, this ratio was 100:1 for oil production—today it is 10–20:1, and for renewables, often only 5–10:1.
This number is not merely a technical statistic. It is the “frame rate” of civilization—it shows how fast we can work, develop, and dream. As EROI declines, the world slows down, like an overloaded computer where every operation takes longer and longer.
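The arithmetic behind the "frame rate" metaphor is simple: out of every unit of gross energy produced, the fraction left over for society is 1 − 1/EROI. A minimal sketch (the EROI values are the article's own figures, not measurements of any specific field):

```python
def net_energy_fraction(eroi: float) -> float:
    """Fraction of gross energy left for society after paying
    the energy cost of producing that energy in the first place."""
    return 1.0 - 1.0 / eroi

# The decline from 100:1 to today's 10-20:1, and renewables at 5-10:1
for eroi in (100, 20, 10, 5):
    print(f"EROI {eroi:>3}:1 -> {net_energy_fraction(eroi):.0%} net energy for society")
```

Note how nonlinear the cost is: falling from 100:1 to 10:1 only drops the net fraction from 99% to 90%, but every further step downward eats into the surplus faster, which is why low-EROI systems feel like a machine that spends most of its cycles on overhead.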
Herman Daly, the father of ecological economics, put it simply: Nature’s basic rule is that there is no such thing as a free lunch—the bill always comes later. Right now, we are receiving the bill for the last two hundred years.
The drama is not that energy is running out. Energy is not running out—the cost of conversion is rising. To extract the same amount of usable energy, we have to invest more and more. It’s as if a movie’s resolution were steadily decreasing: the plot remains the same, but the details blur, the frames stutter, and the viewer has to squint harder and harder to understand what’s happening.
Buckminster Fuller—the inventor-engineer who dedicated his entire life to maximizing efficiency—called this an evolutionary necessity. The question isn’t whether we want to be efficient. The question is whether we will survive if we are not efficient.
Why is AI’s energy consumption a paradox?
As the energy foundations of our civilization weaken, a new, massive energy consumer has emerged: artificial intelligence. A single ChatGPT query consumes ten times more energy than a Google search. Training deep learning models requires as much energy as a small city’s annual consumption.
This is an absurd competitive situation. We are developing our most energy-intensive technologies at the very moment when energy is becoming increasingly expensive and less accessible. It is as if we were trying to build the most modern navigation system on a sinking ship. The navigation would undoubtedly help—but the ship would still sink.
French sociologist Jacques Ellul put it this way: all technology is ultimately energy transformation. What happens is not what we think—we do not “process information,” we do not “ask questions.” In a physical sense, we transform energy into heat, light, and motion. Behind every single prompt, every single response, every single generated image, there are watts, megawatts, and gigawatts.
But there is also a flip side. Artificial intelligence is not merely a consumer—it can also be a potential solution. Machine learning algorithms can optimize energy consumption, predictive analytics can forecast consumption peaks, and deep learning can help discover new energy sources. But all of this also requires energy. It’s an endless feedback loop—like a program that runs to reduce its own running costs, while simultaneously increasing them.
The question isn’t whether we should use AI. The question is whether its use consumes energy faster than it can save.
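The scale question raised in this section can be made concrete with a back-of-the-envelope sketch. Both per-query figures below are illustrative assumptions; the article's only claim is the tenfold ratio between them:

```python
# Illustrative assumption: ~0.3 Wh per web search (not a measured figure),
# with an LLM query at ten times that, per the article's ratio.
SEARCH_WH = 0.3
LLM_WH = 10 * SEARCH_WH

def extra_mwh_per_day(queries_per_day: float) -> float:
    """Additional daily demand, in MWh, if the given query volume
    shifts from conventional search to LLM queries."""
    return queries_per_day * (LLM_WH - SEARCH_WH) / 1e6  # Wh -> MWh

print(f"{extra_mwh_per_day(1e9):,.0f} MWh/day at a billion queries")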
Are we moving backward on the Kardashev Scale?
The Kardashev Scale, proposed by Russian astrophysicist Nikolai Kardashev in 1964, classifies civilizations by how much energy they are capable of harnessing.
Humanity currently stands at level 0.7 on this scale—we haven’t even reached the level of a Type I civilization. Type I would mean that we are capable of utilizing all the available energy on the planet. But based on current trends, it is also conceivable that we are not moving forward on this scale, but backward. It’s as if we were moving our character backward in a video game—the level remains the same, but our abilities are diminishing.
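The "level 0.7" figure comes from Carl Sagan's continuous interpolation of Kardashev's originally discrete scale: K = (log₁₀ P − 6) / 10, with P the civilization's power use in watts, so that Type I sits at 10¹⁶ W. A quick sketch (humanity's power use of roughly 2 × 10¹³ W is an order-of-magnitude assumption):

```python
import math

def kardashev(power_watts: float) -> float:
    """Carl Sagan's interpolation of the Kardashev scale:
    K = (log10(P) - 6) / 10, with P in watts."""
    return (math.log10(power_watts) - 6) / 10

# Humanity's primary power use is on the order of 2e13 W (assumption)
print(f"K = {kardashev(2e13):.2f}")  # roughly 0.73
```

On this formula, moving "backward" would mean total harnessed power actually shrinking, which is why the article frames the risk as a collapse of complexity rather than a smooth dial turning down.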
Carl Sagan—who, as an astrophysicist and science popularizer, dedicated his entire career to helping people understand cosmic scale—said that if a species survives its own technology, it can have its own universe. The key word is “survives.” Because technology and energy are not automatically allies. Technology requires energy—but energy does not require technology. This is an asymmetrical relationship, and anyone who doesn’t understand this doesn’t understand the rules of the game.
A true Type I civilization is not merely capable of “exploiting” all of its planet’s energy—it employs circular energy management. This includes, in addition to geothermal, wind, hydro, and solar energy, the recovery of waste energy, the recycling of biomass, and closed-loop systems where energy “waste” becomes the input for the next process. A true Type I civilization does not merely consume energy—it recreates, optimizes, and regenerates it.
This difference is not technical—it is paradigmatic. It is not about how much we consume, but about how we think about consumption.
Updating the Operating System of Civilization
Lewis Mumford, one of the deepest thinkers in the history of technology, wrote that the development of civilization can be measured in energy transformations. Not in wars, not in empires, not in religions—in energy transformations. Fire, coal, oil, nuclear energy—each was an operating system update. Now the next update is needed, but the old operating system is still running and refuses to shut down.
The paradigm shift is taking place simultaneously along three axes:
From linear to circular. Closed-loop energy management instead of the “extract-use-waste” logic. All “waste” becomes a resource for the next process. This is not environmental romanticism—it is a thermodynamic necessity. Minimizing energy losses, utilizing waste heat, and closing material cycles are technological imperatives without which a civilization becomes energetically unsustainable.
From extractive to regenerative. Not the exploitation of energy, but its recreation and optimization. Type I civilization does not merely consume the planet’s energy—it regenerates it. Waste heat from circular data centers can heat cities, cryogenic cooling of quantum computers can recover energy, and neural network learning patterns can optimize the operation of the entire energy grid.
From ownership to access. Economist Jeremy Rifkin put it this way: ownership is an illusion; access is reality. The sharing economy is like a distributive network that optimizes resource allocation. Car-sharing, Airbnb, and tool-sharing are not merely business models—they are social software that reprograms the concept of ownership. A car that sits idle 95% of the time is an energy absurdity. It’s not a luxury item—it’s a waste.
How does the brain function on 20 watts, while AI requires megawatts?
The human brain consumes only 20 watts. Twenty watts. As much as a dim light bulb. Yet it is capable of performance that current artificial intelligence systems require megawatts to achieve. This is no coincidence—it is a matter of architecture.
Biomimetic computing is a field that mimics the processes, systems, and organizational principles observed in nature, whose survival strategies are often more energy-efficient and robust than human-designed algorithms or hardware. Nature has been optimizing for four billion years—we’ve been at it for a few decades at most.
Neural mimicry—that is, computer architecture that follows the operating principles of the human brain—could be the energy breakthrough the artificial intelligence industry has been waiting for. If we could even come close to matching the brain’s efficiency, AI’s energy consumption would decrease by orders of magnitude.
Donald Knuth, one of the founders of computer science, said that every elegant solution is simple—and every simple solution is hard. The brain’s “solution” is astonishingly simple: it is brutally energy-efficient. What is hard is reproducing this artificially.
Meanwhile, social complexity is only increasing. According to the research of historian Joseph Tainter, complex societies function like multi-layered software systems—every new layer, every new functionality increases energy demand. The digitization of healthcare, the shift of education to the online space, and the computerization of public administration—all of these create new energy demands. A hospital’s IT system today consumes more energy than an entire village did fifty years ago.
The Organic Reprogramming of Society
Social adaptation is already underway. Not as a central plan, but as an emergent phenomenon—as millions of individual decisions coalesce into a new civilizational paradigm. Peter Drucker, the father of management thought, said that the future cannot be predicted, only lived. This is especially true now.
The culture of remote work is redefining physical space. Since COVID, it has become not just an option, but a new social norm. This means not only flexibility—but a radical restructuring of energy needs. Less commuting, fewer office buildings, a different kind of infrastructure. Work is becoming dematerialized—it is turning into information, into a stream of data.
Minimalism is a kind of mental defragmentation. Just as computers slow down due to unnecessary files, so too is the human mind slowed down by excessive material burdens. Minimalism is not poverty—it is efficiency. It is not an aesthetic choice—it is an energy strategy. Every unnecessary object consumes energy in its production, transportation, and storage.
The digital DNA of Generation Z is particularly instructive. They aren’t adapting to the digital world—they grew up in it. For them, the blurring of physical and virtual space is natural. Their value-based consumption isn’t an ideology, but a default setting. An ethical and sustainable brand isn’t a premium feature—it’s a basic expectation. For them, the circular economy is not an alternative concept, but a natural way of operating.
Communities that are able to respond to change creatively and cooperatively will not merely survive the energy transition—they will thrive in it. It’s like evolution: it’s not the strongest or the smartest, but the most adaptive species that survive.
What scenarios await us after 2035?
The future does not follow a single path. As with any complex system, multiple outcomes are possible. What they share is a narrow window of opportunity: between 2025 and 2035. After that, inertia takes over.
Scenario 1 — Successful reprogramming. A Circular Type I civilization by 2050. Biomimetic AI, nearly 100% waste-to-energy recovery, regenerative technology. In this version, humanity not only learns to live with energy constraints but turns those constraints into innovation. Waste heat from data centers heats cities. Neural network learning models optimize the energy grid. Artificial intelligence is not the enemy—it is a tool for reprogramming.
Scenario 2 — Energy Collapse. EROI drops below 5, social complexity collapses, and technological regression sets in. Not explosively—quietly, gradually. First the peripheral areas fall away, then services, then institutions. Tainter documented exactly this, from the Roman Empire to the Mayan civilization: complexity is a kind of debt that must be repaid with energy, and when the energy runs out, the debt defaults.
Scenario 3 — Bifurcated Evolution. This is the most uncomfortable scenario because it is not black and white. Energy elite versus low-energy society. Technological apartheid based on energy. Those who have access to cheap, renewable energy, AI, and circular systems will continue to develop exponentially. Those who do not—will gradually fall behind. Not dramatically, not permanently. They’ll just be left out. Like someone who isn’t on the platform when the train departs.
The existential dimension
There is something profoundly absurd about the way our civilization functions. Like Sisyphus, we roll the boulder of energy consumption uphill, knowing that cheap fossil fuels are running out. Technological optimism is like the rebellion of Albert Camus’s absurd man—we continue to progress, even though we know that the laws of physics will ultimately prevail.
From the perspective of existential psychology, the energy crisis is a source of fundamental anxiety. Humans have always struggled with the tension between freedom and security, growth and sustainability. Now this struggle has risen to the level of civilization.
In the world of data analysis, there is a concept called concept drift—when the model’s underlying assumptions change, and previous predictions lose their validity. Our civilization’s model is based on cheap energy, but this assumption is no longer valid. We must learn a new algorithm—or accept that the old one no longer works.
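The concept-drift analogy can be shown in miniature. A deliberately minimal sketch, with made-up numbers standing in for a civilizational metric such as EROI (production systems would use a proper drift test like Page-Hinkley or ADWIN rather than this threshold check):

```python
import statistics

def drift_detected(baseline: list[float], recent: list[float],
                   threshold: float = 2.0) -> bool:
    """Flag drift when recent observations wander more than
    `threshold` baseline standard deviations from the baseline mean."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return abs(statistics.mean(recent) - mu) > threshold * sigma

cheap_energy_era = [100, 98, 102, 99, 101]  # stand-in for, e.g., EROI readings
today = [12, 11, 13, 10, 12]
print(drift_detected(cheap_energy_era, today))  # True: the old model no longer holds
```

The point of the metaphor survives the simplification: a model trained on the cheap-energy era keeps producing predictions, but once the underlying distribution shifts this far, those predictions are confidently wrong.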
The question is: do we still have enough imagination left to recode the whole thing before the machine freezes up for good? Or will we accept that progress is not a linear upward trajectory, but a cyclical transformation—like a neural network that learns from its mistakes to function more intelligently?
The future does not lie in the quantity of technology
Energy is not just a physical resource—it is a mindset, a paradigm, a belief in the future. Waste is not an inevitable byproduct of progress—it is a design flaw that can be solved.
We do not need more energy—we need smarter energy management. The Kardashev scale is not a hierarchy, but a measure of efficiency. Energy efficiency is a fundamental prerequisite for the next civilizational upgrade.
The code is already being written. The only question is: will we be part of its programming, or will we remain passive users in an increasingly sluggish system? The answer lies in the cycle—every endpoint is a new beginning, every waste is a new opportunity.
Key Ideas
- The EROI spiral is civilization’s speedometer — it has dropped from 100:1 to 10–20:1, and this is not a technical detail but an operational limit on the entire social complex
- The AI energy paradox is real — we are building the most energy-intensive technology precisely when energy is most expensive; the solution is not to stop, but to achieve biomimetic efficiency
- We may be moving backward on the Kardashev scale — from level 0.7, we are not moving toward Type I, but rather face the threat of a collapse of complexity if we do not shift to a circular paradigm
- The window of opportunity is open between 2025 and 2035 — after that, the momentum of inertia will decide for us; the question is not whether everything will change, but whether we will consciously steer the process
Key Takeaways
- The energy foundation of civilization is weakening: the EROI (Energy Return on Investment) for oil has fallen from 100:1 to 10–20:1, meaning that the same amount of usable energy requires an ever-greater investment. This is a slowing of civilization’s “frame rate.”
- The development of artificial intelligence and the deterioration of energy supply create a timing paradox: we are developing the most energy-intensive technologies (e.g., a ChatGPT query consumes 10 times more electricity than a Google search) at a time when energy is the most expensive and least efficient.
- The window of opportunity is narrow: according to the article, we must transition the system to a circular energy model between 2025 and 2035, because after that, inertia (and not intentional decisions) will drive the processes.
- The solution is not to reject technology, but to transition to a new energy paradigm, as Buckminster Fuller pointed out: efficiency is not a choice, but a matter of survival.
- The energy problem is not about depletion, but about the ever-increasing costs of transformation—it’s as if a movie’s resolution were deteriorating, and the details were being lost.
Frequently Asked Questions
What is EROI, and why should anyone who isn’t an energy engineer care?
EROI (Energy Return on Investment) shows how much useful energy we get back for each unit of energy invested. In the 1930s, the EROI of oil production was 100:1—a hundredfold return. Today it is 10–20:1, and for renewables it is often 5–10:1. This directly affects food prices, heating bills, transportation costs, and the quality of healthcare. When EROI declines, civilization “slows down”—the same level of social complexity requires more energy, and sooner or later something falls out of the system. Usually, it’s what we’ve paid the least attention to.
Is artificial intelligence really that energy-intensive, or is that an exaggeration?
It’s not an exaggeration—but it’s a matter of context. A ChatGPT query consumes ten times as much energy as a Google search. Training a large language model (LLM) requires as much energy as a small city’s annual consumption. The global energy demand of AI data centers could exceed the total consumption of some countries by 2030. At the same time, AI is capable of optimizing energy grids, forecasting consumption peaks, and discovering new energy sources. The question is not whether AI consumes a lot of energy—but whether the savings it generates exceed the amount consumed. For now, this remains an open question.
What can I do individually to address a problem on a civilizational scale?
The paradox of civilizational-scale problems is that they seem unsolvable individually, yet they consist solely of the sum of individual decisions. Remote work reduces commuting energy. Minimalism frees up energy from the production-transportation-storage chain. The sharing economy (car-sharing, equipment-sharing) optimizes utilization. Conscious consumption is not a moral decision—it is an energy strategy. You don’t have to save civilization. It is enough if you are not an active participant in its decline—and if, in your decisions, you recognize that every choice is an energy transformation.
Related thoughts
- Energy Colonization and Digital Isolation — when access to energy becomes a geopolitical weapon
- The AI Productivity Paradox — Jevons’ Revenge — why we consume more when we are more efficient
- The Hungarian and CEE AI Exceptionalism — an Eastern European perspective on the technological divide
Zoltán Varga - LinkedIn
Neural • Knowledge Systems Architect | Enterprise RAG architect
PKM • AI Ecosystems | Neural Awareness • Consciousness & Leadership
The grid remembers every watt you didn’t save.
Strategic Synthesis
- Map the key risk assumptions before scaling further.
- Measure both speed and reliability so optimization does not degrade quality.
- Use a two-week cadence to update priorities from real outcomes.
Next step
If you want your brand to be represented with context quality and citation strength in AI systems, start with a practical baseline and a priority sequence.