I don't think it contradicts the article. They wrote that the energy consumption graph of data centers is basically flat. That's unexpected because big data, machine learning, etc. have gotten popular in the last decade or so. The way I interpreted this is that by improving efficiency, we kept the total energy usage close to constant.
I'm not saying that if you write Rust you save the planet, but in certain scenarios, the energy you save by having some parts of your services written in Rust can be significant.
> If you look at the graph of energy consumption, the top line is basically flat going back as far as 2010.
> There have been too many data center efficiency improvements to list
This seems consistent with Jevons paradox. Efficiency improvements have not reduced the overall consumption.
Unless you believe that efficiency improvements have just randomly coincided with growth and they happen to cancel each other out… but it seems more likely to me that cheap compute contributed to the huge uptick in machine learning, big data, etc.
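As a rough illustration of that cancellation (the figures below are purely hypothetical, just to show the arithmetic): if per-unit energy falls by the same factor that demand grows, the total line stays flat.

```rust
// Back-of-envelope sketch of Jevons-style cancellation.
// All numbers are made up and only illustrate the arithmetic.
fn total_energy(workload_units: f64, joules_per_unit: f64) -> f64 {
    workload_units * joules_per_unit
}

fn main() {
    // Baseline: some workload at some baseline efficiency.
    let before = total_energy(1.0e9, 100.0);
    // Later: 10x more efficient per unit of work, but cheap compute
    // invited roughly 10x more work (ML, big data, ...).
    let after = total_energy(1.0e10, 10.0);
    // The top line of the graph stays flat: efficiency gains are
    // absorbed by demand growth rather than reducing consumption.
    println!("before: {before:.0} J, after: {after:.0} J");
}
```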