This is great, but do they have an actual example of something that would have been passed on to consumers? Or is it just a hypothetical?
In the location I’m familiar with, large infrastructure projects have to pay their own interconnection costs. Utilities are diverse across the country, so I wouldn’t be surprised if there are differences, but in general I doubt there are many situations where utilities were going to raise consumers’ monthly rates specifically to connect some large commercial infrastructure.
Maybe someone more familiar with these locations can provide more details, but I think this public promise is rather easy to make.
However there are some examples where increased demand by one sector leads to higher prices for everyone. The PJM electricity market has a capacity market, where generators get compensated for being able to promise the ability to deliver electricity on demand. When demand goes up, prices increase in the capacity market, and those prices get charged to everyone. In the last auction, prices were sky high, which leads to higher electricity prices for everyone:
https://www.utilitydive.com/news/pjm-interconnection-capacit...
A lot of electricity markets in other places allow procurement processes where increased costs to meet demand get passed to all consumers equally. If these places were actually using IRPs with up-to-date pricing, adding new capacity from renewables and storage would lower prices; instead, many utilities go with what they know, gas generators, which are in short supply and coming in at very high prices.
And the cost of the grid is high everywhere. As renewables and storage drive down electricity generation prices, the grid will come to be a larger and larger percentage of electricity costs. Interconnection is just one bit of the cost, transmission needs to be upgraded all around as overall demand grows. We've gone through a few decades of stagnant to lessening electricity demand, and utilities are hungry to do very expensive grid projects because they get a guaranteed rate of return on grid expansion in most parts of the country.
The demand is still there, connected to the grid or not. The grid can help make things more efficient and resilient in some ways (and less resilient in other ways), which is why the grid came about in the first place.
That's not the question. We aren't discussing trivialities like what change in supply is necessary to satisfy increased demand; that's like debating "is water wet" or "do you need more or less water to satisfy your thirst".
The real question is: who is going to pay for building the additional supply?
Residential and other prior customers have already paid the capex for the existing supply and now you want them to pay the capex for enormous amounts of new capacity which the AI corps convert exclusively into their own revenue.
The public is already paying through the nose for new semiconductor capacity because the same scam-geniuses cornered the RAM, GPU and related chips market and they are mercilessly scalping it too, again at the expense of the public.
> The grid can help make things more efficient and resilient in some ways
In a perfect world it can; in this world it makes things more unstable and far more unfair when large new consumers use it for their exclusive revenue extraction while pretending that the new capacity somehow benefits everybody instead of just them.
> The muddied water is just supply and demand.
Indeed, "just supply and demand" is the mud in the eyes.
It got blocked by FERC as it would raise other consumers' energy prices and the deal wasn't fully transparent (probably intentionally so they could shift costs onto others).
> Training a single frontier AI model will soon require gigawatts of power, and the US AI sector will need at least 50 gigawatts of capacity over the next several years.
These things are so hideously inefficient. All of you building these things for these people should be embarrassed and ashamed.
Quite the opposite, really. I did some napkin math for energy and water consumption, and compared to humans these things are very resource efficient.
If LLMs improve productivity by even 5% (studies actually peg productivity gains across various professions at 15 - 30%, and these are from 2024!) the resource savings by accelerating all knowledge workers are significant.
Simplistically, during 8 hours of work a human would consume 10 kWh of electricity + 27 gallons of water. Sped up by 5%, that drops by 0.5 kWh and 1.35 gallons. Even assuming a higher-end estimate of resources used by LLMs, 100 large prompts (~1 every 5 minutes) would only consume 0.25 kWh + 0.3 gallons. So we're still saving ~0.25 kWh + ~1 gallon overall per day!
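That arithmetic is easy to sanity-check in a few lines; every figure below is the commenter's assumption, not a measured value:

```python
# All numbers are the comment's napkin-math assumptions.
human_kwh_per_day = 10.0   # electricity attributed to 8 hours of knowledge work
human_gal_per_day = 27.0   # water footprint over the same period
speedup = 0.05             # assumed 5% productivity gain from LLM use

llm_kwh = 0.25             # ~100 large prompts, high-end estimate
llm_gal = 0.30

# Resources saved = human resources freed by the speedup, minus LLM cost.
kwh_saved = human_kwh_per_day * speedup - llm_kwh   # 0.5 - 0.25 = 0.25 kWh
gal_saved = human_gal_per_day * speedup - llm_gal   # 1.35 - 0.30 = 1.05 gal

print(f"Net savings per worker-day: {kwh_saved:.2f} kWh, {gal_saved:.2f} gal")
```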
That is, humans + LLMs are way more efficient than humans alone. As such, the more knowledge workers adopt LLMs, the more efficiently they can achieve the same work output!
If we assume a conservative 10% productivity speedup, adoption across all ~100M knowledge workers in the US would recoup the resource cost of a full training run in a few business days, even after accounting for the inference costs!
Additional reading with more useful numbers (independent of my napkin math):
https://www.nature.com/articles/s41598-024-76682-6
https://cacm.acm.org/blogcacm/the-energy-footprint-of-humans...
Saying “we can do the same work with less resource use” doesn’t mean resource consumption is reduced. You’ve just gone from humans using resources to humans using the same resources and doing less work, plus AI using more resources.
Your question is a variant of: what do we do with all those humans now that they don't have to walk miles to the well every day, because we invented aqueducts? The point is that they didn't want to walk to the well, but they had to (and in some places they still have to). Very few people want to work, even now and even us, but they have to.
We will see what happens this time when we won't have to walk to that well.
Soon enough, we won't be able to avoid this question.
For instance, I think operating at this level of productivity is unsustainable (https://news.ycombinator.com/item?id=46938038). As discussed in detail by the recent "AI vampire" blog: https://news.ycombinator.com/item?id=46972179 -- most humans are not designed for that level of cognitive intensity.
But even then, the productivity per human will explode, and we will still have the problem of "too many humans." Cynically, if most knowledge workers get laid off, it's good from an environmental perspective because that means much less commuting and pollution! But then they're starving and we will have riots!
This is where I foresee the near-term problems with GenAI: social turmoil rather than resource consumption. I suspect it's not all bad news though. While it's impossible to put numbers on it, it helps to think about the first-order economic principles that are in play:
1. This is hand-wavy, but knowledge work boosts economic growth. If this is massively accelerated, we should be creating surplus value that compensates for a lot of costs.
2. However, a huge chunk of knowledge work is busy work which will be automated away. People can try upskilling, but the skill gap is already huge and growing quickly, and they will lose jobs.
3. The economy is essentially people providing and paying for services and goods. If people lose jobs and cannot earn, they cannot drive the economy and it shrinks.
4. The elite, counter-intuitively enough, do NOT want that because they get richer by taking a massive cut of the economy! (Not to mention life in a doomsday bunker can get pretty dull if starving people start rioting -- https://news.ycombinator.com/item?id=46896066)
There are many more dynamics at play of course, but I think an equilibrium will be found purely because everyone is incentivized to find a solution (UBI?) that keeps both the elites and the plebes living long and prospering. I expect some turmoil, but luckily, the severe resource crunch of GPUs gives us time to figure things out.
If it helps better interpret my posts, these days my frame of mind is "We're living in very interesting times" in the sense of https://en.wikipedia.org/wiki/May_you_live_in_interesting_ti...
Have fun watching the car crash.
I have no expertise here, but a couple years ago I had a prototype using locally deployed Llama 2 that cached the context (now deprecated: https://github.com/ollama/ollama/issues/10576) from previous inference calls and reused it for subsequent calls. The subsequent calls were much, much faster. I suspect prompt caching works similarly, especially given that changed code is very small compared to the rest of the codebase.
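A minimal sketch of what that prototype pattern looked like, assuming a local Ollama server at the default `localhost:11434` and a `llama2` model (the exact model name and URL are assumptions): the deprecated `context` field returned by `/api/generate` is threaded into the next request, so the server can reuse the cached prompt state.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # assumed default local server


def build_payload(prompt, context=None):
    """Build a /api/generate request, threading in the previous call's context."""
    payload = {"model": "llama2", "prompt": prompt, "stream": False}
    if context is not None:
        # 'context' is the (now-deprecated) cached-state token list
        # returned by the previous response.
        payload["context"] = context
    return payload


def generate(prompt, context=None):
    """Call Ollama once; return the text and the context for the next call."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt, context)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["response"], body.get("context")
```

Each call after the first reuses the server-side cache instead of reprocessing the whole prompt, which is where the speedup came from.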
Mostly lines up with this reference too, which focuses only on water usage at work: https://quench.culligan.com/blog/average-water-usage-per-per...
Buying electricity isn't inherently destructive. That's a very bad analogy.
> These things are so hideously inefficient. All of you building these things for these people should be embarrassed and ashamed.
I'm not arguing that they are efficient right now, but how would you measure that? What kind of output does it have to produce per kWh of input to be acceptable? Keep in mind that the baseline of US power use is around 500 GW, and that AI currently accounts for maybe 10 GW.
> AI sector will need at least 50 gigawatts of capacity over the next several years.
The error bars on this prediction are extremely large. It would represent a 5% increase in capacity in "the next several years" which is only a percent or two per year, but it could also only be 5GW over the next several years. 50GW represents about 1 year of actual grid additions.
> All of you building these things for these people should be embarrassed and ashamed.
I'm not building these things, and I think there should be AI critique, but this is far over the top. There's great value for all of humanity in these tools. The actual energy use of a typical user is not much more than a typical home appliance, because so many requests are batched together and processed in parallel.
We should be ashamed of getting into our cars every day; that's a true harm to the environment. We should have built something better and allowed more transit. A daily commute of 30 miles is disastrous for the environment compared to any AI use that's really possible at the moment.
Let's be cautious of AI but keep our critiques grounded in reality, so that we have enough powder left to fight the rest of things we need to change in society.
Of course I could do all of my coding alone again, but I would be slower. It's like walking to the mall several times per week, several hours each time, instead of driving once or twice per week, three cumulative hours. I trade higher energy consumption for more time to do other things and the ability to live far away from shops.
Counting the 100w for 24 hours for a human doesn’t match up with counting the power usage from “AI” for only the 10 minutes it’s doing a task.
Also - units issue: 100 watts for a day is 2,400 watt-hours. It's a moot point anyway, because the power draw for the frontier models is an order of magnitude off, so the division by 24 is basically meaningless.
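The unit fix is just multiplication (the 100 W resting-human figure is the one used in the thread):

```python
human_watts = 100            # rough resting metabolic power of a human, from the thread
hours_per_day = 24

wh_per_day = human_watts * hours_per_day   # 2400 Wh per day
kwh_per_day = wh_per_day / 1000            # 2.4 kWh per day

print(f"{human_watts} W for {hours_per_day} h = {kwh_per_day} kWh")
```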
You should start from beef industry.
These days, it's about framing - every country is scrambling to up their game just to stay in power. The companies that are riding this wave are spending millions in marketing, lobbying and billions on consuming energy so that they can make trillions in valuation.
I am also an ardent user of AI - but sometimes I do feel guilty when I use so many tokens - because I know I am burning energy, and feeding part of this mission. If there is a solution, I would like to be a part of it.
This is by far the best article I've seen on it [0]. Which leads me to conclude: if you use coding agents, then yes, it's definitely a concern. Yet if you drive daily, even an EV, it's very small compared to that. Let alone flying. Personally, even if my "AI emissions" are at 10x his estimated usage (they almost certainly aren't), the other sacrifices I make to reduce emissions have such an impact that I'd still be multiple times below the national average.
Note how the above measures energy usage (kWh), not emissions. For anyone taking fossil-fuel transit regularly, whether ICE car, taxi, or airplane, AI usage is all but guaranteed to be meaningless compared to their transport emissions. One hamburger is at least 5x the emissions of his "median day with Claude Code", so there's another one. If you're feeling guilty, track how much beef you're eating, cut it down by 20%, and use agents to your heart's content.
Now of course, a different form of AI usage like image generation and especially video generation is incomparably more energy-intensive per query. We'd need separate math on that.
For example, the article says their daily average use of Claude code is similar to the dishwasher running. Is that just including inference or also training Opus 4.5?
What we need to do here is write an article that makes a wild claim in either direction ("99% is inference!"), post it on HN, and wait for the comments to roll in that prove it right or wrong.
But it's the bullshit some people like so it's not going to go away soon.
This frames the dilemma as: you just need to make this little sacrifice so Trillion.ai can make its trillion. We shouldn't sacrifice anything.
Hah, if only. Man, I wish that companies succeeded in doing that, then we'd have a lot more people making such sacrifices. That'd be great.
No one wants their customer to feel guilty because it makes them less likely to buy the product. It's the worst nightmare of any marketer.
Yet another example of socialize the costs, privatize the profits (except AI isn't profitable yet, lol)
But even on the subject of electricity costs. It looks like the biggest electricity consuming sector globally is.. the oil industry! So we're back to the Lambo drivers.
There's nothing particularly worse about money spent on AI vs. anything else. I don't feel guilty for having 6 shirts even though I can only wear one at a time.
> trillions in valuation.
This is more or less literally the "yes we destroyed the planet, but for a brief moment we created trillions in shareholder value" meme. Perhaps we need to take a step back and ask to what extent this benefits humans as humans, not as economic units. Especially given the explicit threat in the AI marketing material to destroy all creative industries and replace human fulfilment and even connection with AI.
You're just describing individual action, which like you said, isn't gonna do anything.
They don’t generally just have GW of power sitting idle for a rainy day (I’m not talking about the capacity they reserve for hot july days).
Maybe then, we could afford to smelt an ingot of aluminum in the USA.
Until then, I guess we're sadly just burning coal to create cat memes. I hope Anthropic can lead the charge. Crypto was already a massive setback in terms of clean power, and AI is already very dirty.
[1] https://abcnews.com/International/wireStory/china-building-c...
China has also been installing more clean energy than the rest of the world combined, and their emissions might have peaked.
https://www.economist.com/cdn-cgi/image/width=600,quality=10...
https://www.economist.com/china/2025/05/29/chinas-carbon-emi...
Wind and solar combined generation increased by 12.2% during the first 11 months of 2025, providing 19% of total US electricity compared to 17.3% during the same period in 2024.
Between January and November 2025, utility-scale solar capacity grew by about 22,237 MW, while small-scale solar capacity increased by 5,461 MW. https://electrek.co/2026/01/28/eia-99-of-new-us-capacity-in-...
That's a reasonable assumption. At the same time, I don't know that you can neatly attribute things happening during one administration to the prior administration. We need more rigorous analysis than that. For instance, the economy tends to do better under Democratic than Republican rule, but using your lag mental model, should we then actually ascribe that to Republican policy? Back to energy: notably, in July 2025, more coal was added than wind... should we ascribe that to the prior admin due to lag?
> The current administration has been extremely hostile to renewables in terms of rhetoric, I would be surprised if they were lying about that.
Yes, that's clear. They are very hostile in rhetoric and action.
The administration characterizes wind and solar as expensive and unreliable energy sources that have been subsidized by taxpayers for too long. In July 2025, President Trump signed an executive order to eliminate subsidies for wind and solar in accordance with the "One Big Beautiful Bill Act". On his first day in office, Trump issued an order blocking the government from auctioning off the rights to build wind farms on public lands or in public waters. The administration has halted already-issued permits for offshore wind projects and suspended leases for five major wind projects in December. Solar and wind projects are now subject to an elevated review process likely to slow down approval. Tax credits for renewable energy projects were restricted, requiring projects to begin construction within a year or produce electricity by 2028.
The administration prefers fossil fuels (oil, natural gas, coal), hydropower, nuclear energy, and critical minerals as domestic energy resources.
Despite all that, 99%+ of new capacity in 2026 is still projected to be solar, wind, and storage.
Congress provided $320 million for DOE solar and wind programs despite the White House requesting zero funding for these programs. https://www.utilitydive.com/news/solar-gas-nuclear-ferc-infr...
So I, for one, have hope.
But still, it's possible that a smaller, dirtier build-out in the US will significantly drop prices relative to today, and certainly relative to the rest of the world (which is failing spectacularly at building out power infrastructure).
But yes, the only way you're ever going to smelt Aluminum in the US again is if you have customers who can't/won't buy Chinese Aluminum. And even then, worth keeping an eye on the richer Arabs states. They're quickly roofing over their deserts, and certainly don't worry about local NIMBY opposition to power lines...
What I infer from the Anthropic post is that they will estimate what the energy price would have been if they weren't using it, and pay the difference if their use pushed the price up.
With day ahead forecasting, we can try to turn that peak load into base load. Grid operations are a non trivial part in how this AI energy situation plays out.
First of all, the resources those tiny batteries use are less than a drop in the bucket of what we ideally would like to add to the grid, and secondly, we're currently nowhere close to being limited by battery capacity. China alone had the industrial capacity to produce more than 2 TWh of new batteries last year, but they actually produced a bit less than 1 TWh because there was no market demand for this many batteries.
They just overbuilt production capacity by over 100%, per their industrial policy.
"We will cover the cost of upgrading the electricity grid so we can use more energy" yeah. Of course you will. What?
Go to the bathroom upstairs, don't use our ecosystem as your latrines if you can direct it straight to the CMB.
We lived without aluminum soda cans for 100k years
1. humans use soda cans, that people could do without for 100k years
2. ???
3. Don't put datacenters into space?
"data centers use a lot of electricity we shouldn't do that"
"why?"
"computers are different"
People have been correctly indoctrinated about global warming and the dominant heating terms coming from excess CO2 concentration, but because of this over-emphasis they neglect the prompt heating that comes with nearly all energy generation mechanisms (from fossil fuels, to solar panels, to nuclear energy).
When the whole world starts raising its living standards, and when a computational race erupts, there is no taming total human energy consumption.
But what we can do is offload the bulk of computational energy consumption, like training common goods such as LLM weights...
Oh no the burden of actually explaining why you want to de-emphasize a comment.
One single square meter of land in direct sunlight receives a constant 6kW (21MJ) of energy. The heat rejected by industrial and other processes is absolutely minuscule in comparison, a rounding error.
Comments that are incorrect but posted in an authoritative voice get downvoted, for good reason.
This is incorrect: at ground level it's about 1 kW of sunlight per square meter, and only if that square meter is orthogonal to the line of sight to the sun; otherwise it's diminished by cos(theta), where theta is the angle between the line of sight to the sun and the normal of the square meter of land. It cannot receive 6 kW no matter the orientation. And 6 kW is a power, while 21 MJ is an energy.
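The cos(theta) relationship is a one-liner; the ~1 kW/m² figure is the rough clear-sky ground-level value from the correction above:

```python
import math

GROUND_IRRADIANCE = 1000.0  # W/m^2, rough clear-sky value at ground level


def irradiance(theta_deg):
    """Power received by 1 m^2 whose normal makes angle theta with the sun line."""
    if theta_deg >= 90:
        return 0.0  # surface edge-on or facing away from the sun
    return GROUND_IRRADIANCE * math.cos(math.radians(theta_deg))


# Orthogonal to the sun: full ~1 kW; tilted 60 degrees: half of that.
print(irradiance(0), irradiance(60))
```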
> Comments that are incorrect but posted in an authoritative voice get downvoted, for good reason.
Indeed your incorrect comment in an authoritative voice might get downvoted, for good reason, but I won't be the one doing it...
Please don't interrupt the discussion to meta-discuss the scoring system.
but now you say I "interrupted" the back-then-non-existent discussion... whatever rsync, whatever...
See, the AI is gonna create jobs, not eliminate them lol. Now let us strip mine your hood G.
How does paying more monthly cover an infrastructure build out that requires up front capital?