Large Load Doesn’t Mean Blended Rate Anymore: The Quiet Death of Socialized Upgrades
Something quiet but structural just happened.
Anthropic announced it will cover all grid upgrade costs needed to connect its data centers, recovering them through its own monthly electricity charges rather than letting them flow through to other customers. It committed to bringing new power generation and grid capacity online to serve its load instead of leaning on existing capacity. That is not a press release detail. That is a rate design signal.
In the same month, the Georgia Senate stalled a bill that would have barred data centers from passing power costs onto utility customers. Senate Bill 34 did not advance, but the debate itself is the point. Legislators openly questioned whether Georgia Power customers should be exposed if the AI buildout slows or collapses. One state is negotiating cost responsibility through private initiative. Another is attempting to mandate it through statute.
The political narrative has flipped. Data centers are no longer framed only as economic development assets. They are being framed as potential rate pressure drivers.
That shift changes underwriting.
For years, large load strategy relied on a quiet assumption. Utilities would expand transmission and distribution infrastructure, rate base the upgrades, and recover costs broadly across the customer base. The data center secured service under an existing tariff or negotiated rider. The rest of the system absorbed the incremental expansion.
That assumption is now fragile.
If cost causation becomes the organizing principle rather than economic development, upgrade costs move. They can be directly assigned to the customer. They can be embedded in new large load rate classes. They can be structured through special contract riders that lock in minimum demand commitments. They can be paired with mandatory term lengths that protect the utility against stranded asset risk.
Power cost is no longer just the tariff rate.
Power cost equals tariff plus upgrade contribution plus the timeline risk premium associated with regulatory review.
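The arithmetic behind that formula is easy to sketch. A minimal Python example of the effective all-in rate, with every figure a hypothetical assumption for illustration, not data from any actual tariff or interconnection study:

```python
# Illustrative effective power cost per MWh for a large load customer.
# Every number below is a hypothetical assumption, not actual tariff data.

def effective_cost_per_mwh(
    tariff_per_mwh: float,        # posted tariff rate, $/MWh
    upgrade_cost: float,          # directly assigned grid upgrade cost, $
    annual_mwh: float,            # expected annual consumption, MWh
    amortization_years: int,      # years over which the upgrade is recovered
    risk_premium_per_mwh: float,  # premium for regulatory timeline risk, $/MWh
) -> float:
    # Spread the assigned upgrade cost over expected lifetime consumption,
    # then add it and the risk premium on top of the posted tariff.
    upgrade_per_mwh = upgrade_cost / (annual_mwh * amortization_years)
    return tariff_per_mwh + upgrade_per_mwh + risk_premium_per_mwh

# A hypothetical 100 MW campus at 85% load factor:
annual_mwh = 100 * 8760 * 0.85  # about 744,600 MWh per year

# Socialized upgrades: the customer sees only the tariff.
blended = effective_cost_per_mwh(70.0, 0.0, annual_mwh, 15, 0.0)

# Direct assignment: a $300M upgrade plus a timeline risk premium.
assigned = effective_cost_per_mwh(70.0, 300_000_000, annual_mwh, 15, 4.0)

print(f"Socialized upgrades: ${blended:.2f}/MWh")
print(f"Directly assigned:   ${assigned:.2f}/MWh")
```

Under these assumed inputs, direct assignment adds roughly $30/MWh to the effective rate, which is exactly the kind of gap a pro forma built on the blended number cannot absorb.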
In Georgia, the debate centers on whether data center demand is contributing to rising bills. Customers have already seen rate increases tied to grid expansion. Legislators are asking who ultimately carries the burden. In Virginia, Dominion Energy has created a new high energy user class that requires significant financial commitments and long term contracts to mitigate stranded asset risk as data center load grows. In Michigan, DTE secured a $242 million rate increase and signaled another filing within days, drawing political scrutiny. Across jurisdictions, regulators are confronting the same question. Who pays when load growth becomes a step function rather than incremental?
Developers and infrastructure investors need to internalize this shift.
Secondary markets that advertise available transmission headroom are exposed if upgrade costs are not fully specified. Developers pricing land before final interconnection study results are exposed if network upgrades are ultimately assigned rather than socialized. Merchant generation strategies that rely on behind the meter netting or cost arbitrage are exposed if rulemakings eliminate those advantages, as PJM has proposed with its 50 megawatt threshold for new behind the meter loads.
What breaks first is sequencing.
Land options secured without clarity on cost allocation agreements become speculative inventory rather than bankable sites. Capital stacks sized around assumed blended rates become stressed when upgrade contributions are explicitly allocated. Debt underwriting premised on tariff stability becomes vulnerable when new rate classes or minimum demand requirements alter the cost curve.
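The capital stack stress is just sensitivity arithmetic. A hedged sketch, using entirely hypothetical revenue, opex, and rate figures, of how an explicitly allocated upgrade contribution compresses the margin a pro forma was sized around:

```python
# Hypothetical sensitivity: a pro forma sized around an assumed blended
# rate versus one facing a directly assigned upgrade contribution.
# All figures are illustrative assumptions, not data from any project.

def annual_power_bill(mw: float, load_factor: float, rate_per_mwh: float) -> float:
    # Annual energy consumed (MWh) times the effective rate.
    return mw * 8760 * load_factor * rate_per_mwh

mw, load_factor = 100, 0.85
revenue = 180_000_000    # assumed annual colocation revenue, $
other_opex = 25_000_000  # assumed non-power operating costs, $

# Blended rate under socialized upgrades vs. an effective rate that
# includes a directly assigned upgrade contribution and risk premium.
bill_blended = annual_power_bill(mw, load_factor, 70.0)
bill_assigned = annual_power_bill(mw, load_factor, 100.0)

margin_blended = revenue - other_opex - bill_blended
margin_assigned = revenue - other_opex - bill_assigned

print(f"Margin at blended rate:  ${margin_blended:,.0f}")
print(f"Margin at assigned rate: ${margin_assigned:,.0f}")
print(f"Margin compression:      {1 - margin_assigned / margin_blended:.0%}")
```

With these assumed inputs, a $30/MWh move in the effective rate cuts operating margin by more than a fifth. Debt sized against the blended-rate margin is the exposure.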
None of this means data center growth stops. Dominion has nearly 48.5 gigawatts of contracted data center capacity in Virginia. Load is real. Capital is flowing. But the framework governing how that load is integrated into the grid is being rewritten in real time.
This is not a political story.
It is a cost allocation regime change.
Large load does not mean blended rate anymore. It means scrutiny. It means minimum term commitments. It means direct upgrade contributions. It means regulators asking whether residential customers should backstop AI infrastructure.
If you are underwriting a site in 2026, the first question is no longer how much land costs.
The first question is who pays for the grid.



I’ll be teaching a structured version of this constraint-based underwriting framework this summer at New York University in “Data Centers Demystified.” We’ll break down grid access, rate cases, and capital sequencing step by step. Details here: https://www.sps.nyu.edu/courses/REDV1-CE2000-data-centers-demystified.html