Cost Allocation Is Destiny: Why Digital Infrastructure Economics Now Runs Through the Rate Case
A few years ago, if you wanted to understand the future of digital infrastructure, you looked at chip roadmaps, cloud demand curves, and AI model size. You studied Nvidia earnings. You tracked hyperscale capex guidance. You debated whether inference would outpace training.
Today, if you want to understand the future of digital infrastructure, you read rate case filings.
That is not a metaphor. It is the structural shift underway.
The buildout of AI infrastructure is not constrained by demand. Demand is overwhelming. Capital is available. Technology is accelerating. What constrains expansion now is not appetite. It is cost allocation: who pays for the wires, the substations, the gas plants, the transmission lines, the capacity payments, and the reliability margin.
Infrastructure expansion is never stopped by demand. It is shaped, slowed, and redirected by the framework that decides who bears the cost.
Every infrastructure boom eventually becomes a cost allocation fight.
The Structural Rule
The structural rule is simple and unforgiving. Infrastructure economics is governed not by enthusiasm, but by allocation.
When a new class of load or traffic overwhelms an existing system, the initial reaction is expansion. Utilities build. Investors finance. Politicians celebrate growth. The narrative is progress.
Then the bills arrive.
Residential ratepayers see higher electricity costs. Industrial customers see capacity charges rise. Grid operators warn of reliability margins tightening. Suddenly the conversation shifts from growth to fairness.
That is the moment when cost allocation frameworks move from background policy to front page politics.
We are there.
In Virginia, lawmakers are debating bills that would shift more of the system cost of new infrastructure onto data centers served by Dominion Energy. In Wisconsin, We Energies has asked the Public Service Commission to establish a new rate structure for massive data centers that could require as much energy as all current customers combined. In Oklahoma, House Bill 3392 directs the Corporation Commission to examine who pays for grid upgrades driven by large loads. In Texas, ERCOT is restructuring how it studies and integrates large-load interconnections because individual transmission studies are collapsing under volume. In the PJM region, governors and the Trump administration have pressed for reforms that would allocate additional generating and transmission costs to large-load users while extending wholesale price caps to protect consumers.
These are not isolated policy skirmishes. They are structural responses to concentrated demand.
The Historical Precedent
This is not the first time.
Telecom went through it when long-distance traffic surged and regulators had to decide how access charges would be allocated. Pipelines went through it when new shale basins required transmission buildout and fights erupted over who would underwrite capacity. Electric transmission itself went through it in earlier cycles when regional markets like PJM had to decide how to allocate the cost of reliability projects across states and customers.
In every case, the pattern is consistent.
First comes growth. Then comes concentration. Then comes cross-subsidy anxiety. Then comes redesign.
No infrastructure market escapes the moment when someone asks whether grandma is subsidizing growth.
Digital infrastructure believed it was different because the tenants were creditworthy hyperscalers and the narrative was innovation. But physics and politics are stronger than narrative. When a single facility consumes more power than a small city, it is no longer an abstract cloud node. It is a rate class problem.
Why AI Infrastructure Is Entering That Phase
Three forces are accelerating this transition.
The first is large-load concentration. Modern AI training facilities can exceed 100 megawatts. Clusters of facilities can reach gigawatt scale. In Wisconsin, consumer advocates warned that proposed data centers could ultimately require as much energy as all existing We Energies customers combined. In the PJM footprint, summertime peak demand is projected to increase by as much as 35 percent over the next decade, driven largely by data center growth.
The second force is residential bill sensitivity. In Virginia, projections suggest residential customers could see significant monthly increases tied in part to system expansion. Polling shows voters increasingly connect data center growth with higher bills. When 78 percent of voters blame data centers for rising electricity costs, affordability becomes a legislative priority.
The third force is grid reliability scrutiny. PJM’s Base Residual Auction for capacity cleared at record prices, prompting price caps and political intervention. ERCOT has described a “study doom loop” in which interconnection requests invalidate one another. When reliability margins narrow, regulators move quickly to protect the system.
Layered on top of these is environmental reporting pressure. Texas regulators are preparing to require data centers to report water usage. New York lawmakers have proposed a moratorium on permits for data centers capable of using 20 megawatts or more until environmental impact studies and rate allocation rules are clarified. South Carolina legislators are debating whether data centers should pay entirely for new generation, transmission, and capacity costs, while also considering water and noise standards.
Demand is not the constraint. Distribution of burden is.
The Three Fracture Points
As cost allocation frameworks tighten, three fracture points emerge that will define the next phase of digital infrastructure economics.
Separate Rate Classes
The first fracture point is the creation of distinct rate classes for large loads.
In Virginia, the State Corporation Commission approved a GS-5 high-load rate class with long-term contracts, minimum demand charges, and exit fees. In Wisconsin, We Energies is seeking approval for a data-center-specific rate structure that would ensure new generation and distribution are paid for by the facilities that require them.
A separate rate class formalizes cost causation. It moves large loads from the blended pool into a defined bucket with contractual obligations. That increases transparency. It also increases fixed cost exposure.
For developers and operators, this means power pricing becomes less about marginal kilowatt-hours and more about long-term commitments tied to infrastructure recovery.
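To make that arithmetic concrete, here is a minimal sketch of how a minimum demand charge raises effective power cost when utilization falls. The rate levels, the 80 percent minimum-billing floor, and the function itself are illustrative assumptions, not terms from any actual tariff.

```python
# Illustrative sketch: how a minimum demand charge changes effective power
# cost for a large-load customer. All figures are hypothetical, not taken
# from any filed tariff.

def effective_cost_per_mwh(contract_mw: float,
                           utilization: float,
                           demand_charge_per_mw_month: float,
                           energy_price_per_mwh: float,
                           min_demand_fraction: float = 0.8) -> float:
    """Blended $/MWh when demand charges are billed on at least
    `min_demand_fraction` of contracted capacity, used or not."""
    hours = 730  # average hours in a month
    billed_mw = max(contract_mw * min_demand_fraction,
                    contract_mw * utilization)
    demand_cost = billed_mw * demand_charge_per_mw_month
    energy_mwh = contract_mw * utilization * hours
    energy_cost = energy_mwh * energy_price_per_mwh
    return (demand_cost + energy_cost) / energy_mwh

# A facility running at 90% vs. 40% of its 200 MW contract:
for u in (0.9, 0.4):
    print(u, round(effective_cost_per_mwh(200, u, 15_000, 40), 2))
# prints 0.9 60.55, then 0.4 81.1
```

The point of the sketch: once the billing floor binds, an idle facility still pays for the capacity built to serve it, so per-MWh cost rises sharply as utilization drops.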
Interconnection Collateralization
The second fracture point is interconnection collateralization.
If a utility builds a gas plant or transmission line to serve a new data center and that tenant’s business plan changes, who absorbs the stranded cost?
Consumer advocates in Wisconsin have explicitly raised the risk of an AI bubble burst that leaves ratepayers responsible for “boatloads” of new power plants. South Carolina’s S.784 would require large commercial data centers to enter into 15 year contracts with collateral requirements to cover unrecovered infrastructure costs. Oklahoma lawmakers are asking their Corporation Commission to examine how large loads affect ratepayers and who should fund grid upgrades.
Collateral, minimum-take provisions, and exit fees are not anti-growth tools. They are risk transfer mechanisms. They shift the burden of uncertainty from residential customers to the large-load applicant.
Interconnection is no longer just a queue position. It is a financial instrument.
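One way to see why collateral works as a risk transfer mechanism is to size it to the infrastructure cost that would go unrecovered if the tenant walked away early. The sketch below assumes straight-line recovery over a 15-year term in the spirit of contracts like South Carolina's S.784; the numbers and the recovery schedule are illustrative, not the bill's actual formula.

```python
# Hypothetical sketch of collateral sized to the unrecovered balance of
# dedicated infrastructure. Straight-line recovery and all dollar figures
# are illustrative assumptions.

def unrecovered_balance(capex: float, term_years: int, years_elapsed: int) -> float:
    """Remaining infrastructure cost if the customer left today,
    assuming straight-line recovery over the contract term."""
    recovered = capex * min(years_elapsed, term_years) / term_years
    return capex - recovered

capex = 500_000_000  # dedicated substation and transmission, hypothetical
for year in (0, 5, 10, 15):
    print(year, unrecovered_balance(capex, 15, year))
```

The balance starts at the full capex and amortizes to zero at the end of the term, which is exactly the exposure a collateral requirement is meant to cover at any point in time.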
Mandatory Self-Supply and Behind-the-Meter Conditions
The third fracture point is the push toward self-generation or behind-the-meter solutions.
PJM has proposed rules encouraging data center owners to build their own power plants and offering expedited grid connection processes for those who do. South Carolina legislation explicitly permits and encourages self-generation under Public Service Commission oversight. Virginia stakeholders are debating how air permitting standards affect the use of on-site generators for demand response.
As public grids strain, regulators increasingly ask whether large loads should provide part of their own capacity. That changes the capital stack. It also shifts environmental and operational risk back to the developer.
Behind-the-meter generation is no longer just a resilience play. It is becoming a regulatory expectation.
The Capital Market Feedback Loop
Once cost allocation frameworks tighten, capital markets adjust.
Asset-backed securities tied to digital infrastructure require certainty of revenue and operating cost. If power cost exposure becomes variable or politically contingent, risk premiums rise.
Public REIT multiples depend on deliverable megawatts. If interconnection timelines become dependent on new rate design proceedings or collateral negotiations, projected capacity online dates become less certain.
Power purchase agreements are shifting from branding tools to defensive instruments. Long-term contracts with utilities or self-generation assets now function as hedges against regulatory volatility. They protect not just price, but access.
In PJM, extending wholesale price caps is framed as consumer protection while market reforms are implemented. That signals to capital that short term returns may be moderated to stabilize the system. In Texas, ERCOT’s batch transmission studies reflect recognition that individual project evaluation is insufficient in a clustered load environment.
The feedback loop is clear. Regulatory tightening increases cost transparency. Increased transparency changes underwriting. Changed underwriting influences capital allocation.
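The underwriting step in that loop can be sketched with a toy present-value calculation: the same contracted revenue stream is worth less when regulatory uncertainty pushes up the discount rate. All inputs here, including the two discount rates, are hypothetical.

```python
# Toy sketch of the underwriting step in the feedback loop: the present
# value of a site's contracted revenue falls as regulatory uncertainty
# raises the discount rate. All inputs are hypothetical.

def pv_of_revenue(annual_revenue: float, years: int, discount_rate: float) -> float:
    """Present value of a level annual revenue stream."""
    return sum(annual_revenue / (1 + discount_rate) ** t
               for t in range(1, years + 1))

# Same 15-year, $50M/yr contract, underwritten at 8% vs. 11% after a
# jurisdiction's rate design becomes contested:
base = pv_of_revenue(50_000_000, 15, 0.08)
stressed = pv_of_revenue(50_000_000, 15, 0.11)
print(round(base / 1e6), round(stressed / 1e6), round(1 - stressed / base, 3))
# prints 428 360 0.16
```

A three-point move in the discount rate erases roughly 16 percent of the asset's value before a single megawatt changes, which is why rate design proceedings show up in capital allocation decisions.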
The Inevitable Outcome
The inevitable outcome is that the market will price sites based on tariff durability and interconnection credibility.
Land will matter. Fiber will matter. Incentives will matter. But the decisive variable will be whether the rate design governing a site is stable, enforceable, and aligned with cost causation principles.
Sites in jurisdictions with unresolved cost allocation debates will be priced with an uncertainty premium. Sites with defined large-load rate classes, clear collateral rules, and credible interconnection processes will command capital confidence.
This does not mean growth slows to a halt. It means growth becomes structured.
Digital infrastructure economics now runs through the rate case because the grid is the binding constraint. Not in aggregate supply. In allocation of burden.
Investors who treat rate design as background noise will misprice risk. Operators who understand that cost allocation is destiny will adapt first.
The AI revolution is real. The capital is real. The demand is real.
But the wires belong to regulators.
And that is where the next decade of digital infrastructure will be decided.



I concur: "But the wires belong to regulators."
Your post was highlighted over at the Bad Boys the other day:
https://energybadboys.substack.com/p/necessarily-skyrocket
I kept up with PG&E's rate cases while we still lived in CA. The PV system we had installed back in 2006 is going to lose its NEM rate schedule later this year.