Date of Submission

Spring 2020

Academic Program

Computer Science

Project Advisor 1

Keith O'Hara

Abstract/Artist's Statement

With the rapid development of machine learning, deep learning has demonstrated superior performance over other machine learning approaches. These advances were made possible by big data and high-end GPUs, but at considerable computational and environmental cost. This not only slows the advancement of deep learning research, because not all researchers have access to such expensive hardware, but also accelerates climate change through increased carbon emissions. It is essential for machine learning research to achieve high accuracy and efficiency without contributing to global warming. This paper discusses some current approaches to estimating energy consumption. We compare the energy consumption of the training phase of two convolutional neural networks, SimpleNet and AlexNet, using RAPL. Although we were not able to reproduce the networks exactly as described in their original papers, we found that AlexNet uses more than six times as much energy, and produces more than six times the carbon emissions, as SimpleNet.
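As a rough illustration of the measurement approach: on Linux systems with Intel CPUs, RAPL exposes cumulative package energy counters through the powercap sysfs interface. The sketch below is a minimal, assumption-laden example (the sysfs path `/sys/class/powercap/intel-rapl:0` is the common default but varies by machine, and it is not the exact tooling used in this project); the counter wraps around at a documented maximum, which the delta helper accounts for.

```python
# Minimal sketch: reading CPU package energy via Intel RAPL's powercap
# sysfs interface on Linux. Paths are illustrative and system-dependent.
from pathlib import Path

# Common default path for the package-0 RAPL domain (an assumption;
# check your own /sys/class/powercap layout).
RAPL_DOMAIN = Path("/sys/class/powercap/intel-rapl:0")

def read_energy_uj(domain: Path = RAPL_DOMAIN) -> int:
    """Read the cumulative package energy counter, in microjoules."""
    return int((domain / "energy_uj").read_text())

def read_max_energy_uj(domain: Path = RAPL_DOMAIN) -> int:
    """Read the counter's maximum range before it wraps, in microjoules."""
    return int((domain / "max_energy_range_uj").read_text())

def energy_delta_uj(start_uj: int, end_uj: int, max_range_uj: int) -> int:
    """Energy consumed between two readings, handling counter wraparound."""
    if end_uj >= start_uj:
        return end_uj - start_uj
    # The counter wrapped around its maximum range between readings.
    return (max_range_uj - start_uj) + end_uj
```

In practice one would take a reading before and after the training run and compute the delta; carbon emissions can then be estimated by multiplying the energy in kWh by a regional grid emission factor.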

Open Access Agreement

Open Access

Creative Commons License

This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.

This work is protected by a Creative Commons license. Any use not permitted under that license is prohibited.