In the case of your quoted article - taking it at face value - this means "everyone" is paying $0.02/kWh more on their bill. A datacenter is going to be paying thousands of times more than your average household, as it should.
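A quick back-of-the-envelope check of the "thousands of times more" claim. The $0.02/kWh figure comes from the comment above; the 900 kWh/month household and the 10 MW average-load datacenter are made-up illustrative assumptions, not numbers from the article:

```python
# Illustrative numbers only: assumed 900 kWh/month household and an
# assumed 10 MW average-load datacenter running ~730 hours/month.
SURCHARGE = 0.02                    # $/kWh, from the comment above

household_kwh = 900                 # assumed typical monthly usage
datacenter_kwh = 10_000 * 730       # 10 MW average load * hours/month

household_extra = SURCHARGE * household_kwh
datacenter_extra = SURCHARGE * datacenter_kwh

print(f"household pays an extra ${household_extra:,.2f}/month")
print(f"datacenter pays an extra ${datacenter_extra:,.2f}/month")
print(f"usage ratio: {datacenter_kwh / household_kwh:,.0f}x")
```

Under those assumptions the datacenter's surcharge is roughly 8,000 times the household's - the same flat per-kWh rate applied to a vastly larger load.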
I don't see a problem with this at all. Cheap electricity is required to have any sort of industrial base in any country. Paying a proportionate share of what it costs the grid to serve you seems about as fair a model as I can come up with.
If you need to subsidize some households, then having subsidized rates for usage under the average household consumption level for the area might make sense?
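A lifeline-style tier like that could be sketched as follows; the two rates and the 900 kWh/month area average are hypothetical placeholders, not real tariff figures:

```python
def tiered_bill(kwh, avg_kwh=900, sub_rate=0.10, std_rate=0.15):
    """Bill usage up to the area's average household consumption at a
    subsidized rate, and everything above it at the standard rate.
    All rates and the 900 kWh average are hypothetical placeholders."""
    subsidized = min(kwh, avg_kwh)      # portion billed at the low rate
    standard = max(kwh - avg_kwh, 0)    # portion billed at the full rate
    return subsidized * sub_rate + standard * std_rate

print(tiered_bill(600))    # entirely within the subsidized tier
print(tiered_bill(1200))   # 900 kWh subsidized + 300 kWh at standard rate
```

A light household stays entirely in the cheap tier, while heavy users pay the standard rate on the margin, so the subsidy targets baseline consumption rather than total usage.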
I don't really blame the last watt added to the grid for the incremental uptick in costs. It was coming either way due to our severe lack of investment in dispatchable power generation and transmission capacity - datacenters simply brought the timeline forward a few years.
There are plenty of actually problematic things going into these datacenter deals. The fact that they expose how fragile our grid is, after 50 years of underinvestment, is about the least interesting one to me. I'd start with local (and state) tax credits/abatements myself.
Data centers get commercial or maybe even industrial rates depending on their grid hookup, and utilities love predictable loads. Those rates are lower than residential rates. If you're dishonest and don't understand the cost of operating a grid, you could say that's regular users paying for data centers. But then you'd need to apply that logic to every commercial/industrial user.
If regular users were paying for data centers' usage, why are so many of them going off-grid with turbines, or at least adding partial on-prem generation?
The solution is more and cheaper energy.
wait what? consumers are literally paying for server farms? this isn't a supply-demand gap?