We can’t see electricity. We can sometimes see its effects, such as when we turn on a light, but we certainly can’t see or quantify it once it’s been used, or when we aren’t using it.
What I’m getting at is the rise of data centres that run “cloud” computing. The energy needed to run these is colossal, and it rises with each new service the providers (typically Microsoft, Google and Amazon) bring out.
The latest is Microsoft bringing AI to Edge, Windows and Office 365. This is a massive power user in the data centre, but as it’s also massively profitable for the companies concerned, they will continue to roll it out while claiming they use “green” energy in their data centres (we can’t all use green energy).
This post focuses on BITCOIN. Can we really afford the energy required to produce, effectively, a load of numbers?
This provides further information:
Cambridge Bitcoin Electricity Consumption Index (CBECI) (ccaf.io)
There are many different estimates of how much carbon dioxide generating electricity produces; let’s say 1 kWh produces 500 g (approx 1 lb) of carbon dioxide.
If Bitcoin is using 15 million kWh every day, at 500 g per kWh that’s 7.5 million kg, or 7,500 tonnes, of carbon dioxide every day. Can we afford such pointless exercises? Pointless as opposed to the need for transport, or the need to heat, light or air-condition our homes (in certain countries).
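The arithmetic can be sketched in a few lines. Note that the daily consumption figure and the 500 g/kWh emissions factor are this post’s rough assumptions, not official CBECI data:

```python
# Back-of-the-envelope estimate of daily CO2 from Bitcoin's electricity use.
# Both inputs are assumptions from the text above, not measured values.

KWH_PER_DAY = 15_000_000      # assumed daily Bitcoin electricity use (kWh)
CO2_G_PER_KWH = 500           # assumed grid emissions factor (grams CO2 per kWh)

co2_grams = KWH_PER_DAY * CO2_G_PER_KWH
co2_tonnes = co2_grams / 1_000_000   # 1 tonne = 1,000,000 g

print(f"{co2_tonnes:,.0f} tonnes of CO2 per day")  # 7,500 tonnes of CO2 per day
```

Changing either assumption scales the answer linearly, so you can plug in your own grid’s emissions factor or a more recent consumption estimate from the CBECI link above.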