It's not about the price they're paying for power (peak hours from the power company's perspective); it's about the peak usage of their own data center.
As the article (correctly) says, the capital costs are dependent on your peak usage. If you have 1000 servers using 400kW at peak, you need sufficient air conditioning to extract 400kW worth of heat, and backup generation capable of producing 400kW. It doesn't matter if you only use 100kW 16 hours a day - the capital costs are the same.
I'm suggesting that Glacier could live entirely in non-peak periods, meaning the capital costs are unchanged and the demand curve is flattened.
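The arithmetic here can be sketched in a few lines. This is a toy model with made-up numbers (the 400kW/100kW figures from above, plus a hypothetical 250kW of Glacier work): scheduling extra load only into off-peak hours raises average utilization without touching the peak that the capital costs are sized for.

```python
# Toy 24-hour load profile: 8 peak hours at 400 kW, 16 off-peak hours at 100 kW
PEAK_KW = 400
load = [400] * 8 + [100] * 16

# Hypothetical extra draw from Glacier jobs, scheduled only in off-peak hours
GLACIER_KW = 250
shifted = [kw + (GLACIER_KW if kw < PEAK_KW else 0) for kw in load]

print(max(load), max(shifted))              # peak unchanged: 400 400
print(sum(load) / 24, sum(shifted) / 24)    # average load rises: 200.0 vs ~366.7
```

Cooling and backup generation still need to be sized for 400kW either way, so the added Glacier work is effectively free from a capital-cost perspective.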
The real question then is: is there really a peak hour for data center usage/consumption? What's the consumption difference between peak hours, regular hours, and low-demand hours?
How much power does Amazon use in the mornings compared to Netflix's evening viewing peak?