Cost of AI for the environment

sourajit roy chowdhury
3 min read · Apr 15, 2020


(Image credit: Google)

Many of us are machine learning engineers, enthusiasts, or otherwise connected to the field of computer science. As we know, the field of AI is growing every day, along with the curiosity of engineers to apply it. The usage of AI is a blessing to the whole world, but it obviously comes with a cost, and I would say one of the major costs we shall pay, not now but in the near future, is its 'carbon footprint'.

Computing Power

It is evident that using computers means we need computing power, and any AI workload is far more compute-intensive than traditional IT tasks. For those who are not from an AI domain, a little clarification: training a model means a huge number of mathematical operations being performed continuously.

So, how can we measure it?

Let’s go step by step

Let's start with my own personal laptop, which has an Intel i5 processor (CPU). It dissipates roughly 90 watts. So if I run an ML model for a day, the energy consumed will be 90 W × 24 h = 2.16 kWh.

But I am running a deep learning model, so the CPU is not good enough; I switch to my Nvidia 1080 Ti GPU, which dissipates roughly 250 watts. The calculation now becomes 250 W × 24 h = 6 kWh.
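To make the arithmetic concrete, here is a minimal Python sketch of the watts-to-kilowatt-hours conversion used above (the wattage figures are the rough estimates from the text, not measured values):

```python
def energy_kwh(power_watts: float, hours: float) -> float:
    """Convert a sustained power draw (watts) into energy (kWh)."""
    return power_watts * hours / 1000.0

# Rough wattage estimates from the discussion above.
cpu_day = energy_kwh(90, 24)   # Intel i5 CPU: ~2.16 kWh per day
gpu_day = energy_kwh(250, 24)  # Nvidia 1080 Ti GPU: 6 kWh per day

print(f"CPU for a day: {cpu_day:.2f} kWh")
print(f"GPU for a day: {gpu_day:.2f} kWh")
```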

A lot of power is dissipated by just my personal laptop. Now, how does a real-world deep learning model get trained? Obviously not on a single laptop or desktop. It uses either on-premise servers or the cloud; to simplify, a data center. A data center is essentially a stack of CPUs and GPUs, each far more powerful than a regular laptop's CPU and GPU. Google has even gone beyond GPUs with its TPUs (Tensor Processing Units). I guess from these heavyweight terms you have already started the calculations in your head.

Now, the question is: how many data centers are running on this globe? Say one million, although the actual number is much higher (you can do a quick Google search 😉). So, what is the power consumption of all these data centers in a day?

Let's calculate: 1 million × 6 kWh (assuming they also use a 1080 Ti GPU, with one GPU per data center) = 6,000 MWh. A big number to think about 😵
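Extending the sketch above, the same scaling estimate in Python (the one-million count and the one-GPU-per-data-center assumption are the deliberately conservative placeholders from the text):

```python
# Deliberately conservative assumptions from the text:
# one 1080 Ti (~250 W) per data center, one million data centers.
NUM_DATA_CENTERS = 1_000_000
gpu_day_kwh = 250 * 24 / 1000.0  # 6 kWh per GPU per day

total_kwh = NUM_DATA_CENTERS * gpu_day_kwh
print(f"{total_kwh:,.0f} kWh/day = {total_kwh / 1000:,.0f} MWh/day")
# -> 6,000,000 kWh/day = 6,000 MWh/day
```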

How does power reach the data centers?

(Image credit: Google)

It's clear that, unlike our personal laptops, the power for a data center is not supplied from a three-pin plug. Rather, it is supplied by big power plants that mostly burn non-renewable energy sources like coal and diesel to keep the data centers running 24x7.

How do we calculate the quantity of carbon emitted from these wattage ratings?

carbon emitted = power consumption × time × carbon intensity of the local power grid

The carbon intensity of the local power grid differs from one data center to another. For example, for a GCP (Google Cloud Platform) data center located in asia-east1, the average carbon intensity is 0.56 kg CO2 eq./kWh.

Had the same model been run in Google Cloud Platform's europe-west6 region, the carbon emitted would have been just 0.05 kg CO2 eq.
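Putting the formula into code, here is a minimal sketch; the asia-east1 intensity is the figure quoted above, while the europe-west6 value below is a hypothetical placeholder to be replaced with the actual intensity for your region:

```python
def carbon_kg(power_watts: float, hours: float, grid_kg_per_kwh: float) -> float:
    """carbon emitted = power consumption x time x grid carbon intensity."""
    energy_kwh = power_watts * hours / 1000.0
    return energy_kwh * grid_kg_per_kwh

# Grid carbon intensities in kg CO2 eq. per kWh.
ASIA_EAST1 = 0.56    # quoted above for GCP asia-east1
EUROPE_WEST6 = 0.02  # hypothetical placeholder; europe-west6's grid is far cleaner

run = dict(power_watts=250, hours=24)  # our 1080 Ti running for a day
print(f"asia-east1:   {carbon_kg(**run, grid_kg_per_kwh=ASIA_EAST1):.2f} kg CO2 eq.")
print(f"europe-west6: {carbon_kg(**run, grid_kg_per_kwh=EUROPE_WEST6):.2f} kg CO2 eq.")
```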

Conclusion

We have made many assumptions while calculating the power for a deep learning model:

  1. The data centers use GPUs the same as a personal laptop's, which is not true; the GPUs, and even TPUs, in data centers are far more powerful.
  2. We calculated the power dissipation assuming one GPU per data center, but in fact there are thousands of GPUs and CPUs running in a single data center.
  3. A data center also contains a lot of other hardware and cooling systems, which we did not include in our discussion.

Still, we can control such carbon footprints by some means, like avoiding unnecessary or redundant model training; even so, in the future it might still have an effect on our environment.
