Author

June Knauth

Date of Award

2023

First Advisor

Eric Kramer

Second Advisor

Zachary While

Abstract

Artificial Intelligence (AI), a field currently undergoing a massive technological and financial boom, has created possibilities in computing that are still being explored. However, it demands significant computing (and thus electrical) resources, and few players in the AI space are considering whether the technology represents a reasonable and ethical use of those resources in the face of climbing CO2 emissions. In this thesis, I explain in simple terms what AI is, how it came to be, and why it uses so much energy. From there, I explore Approximate Multipliers (AMs), a recent technique that can drastically reduce the energy usage of a neural network, the computational backbone of modern AI. Most previous investigations into AMs did not consider their use during training, which accounts for the majority of the energy a neural network consumes. Using ApproxTrain, a framework for implementing and testing AMs within the machine-learning library TensorFlow, I reimplement AlexNet, a network commonly used to evaluate new AI techniques. I test performance in training and inference on the CIFAR-10 dataset, which contains 60,000 images in 10 categories for classification. I show that AMs do not reduce classification accuracy, even when used during training.
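The abstract does not specify which approximate multiplier design the thesis evaluates. A widely cited example of the class, however, is Mitchell's logarithmic multiplier, which replaces the hardware multiplier array with cheap additions in the log domain at the cost of bounded error (roughly 11% worst case). The NumPy sketch below is a hypothetical illustration of that general idea, not the design or the ApproxTrain integration used in the thesis; the function name mitchell_multiply is ours.

import numpy as np

def mitchell_multiply(a, b):
    # Approximate a*b with Mitchell's logarithmic method: write
    # |x| = 2**k * (1 + f) with 0 <= f < 1, approximate
    # log2(1 + f) by f, add the logs, then invert with a
    # piecewise-linear antilog. Hardware versions need only
    # shifters and adders instead of a multiplier array.
    sign = np.sign(a) * np.sign(b)
    a_abs, b_abs = np.abs(a), np.abs(b)
    # Guard zeros: the exact product is zero when either input is zero.
    zero = (a_abs == 0) | (b_abs == 0)
    a_abs = np.where(zero, 1.0, a_abs)
    b_abs = np.where(zero, 1.0, b_abs)
    ka, kb = np.floor(np.log2(a_abs)), np.floor(np.log2(b_abs))
    fa = a_abs * 2.0 ** (-ka) - 1.0  # mantissa fraction in [0, 1)
    fb = b_abs * 2.0 ** (-kb) - 1.0
    s = fa + fb  # approximate log2 of the mantissa product
    approx = np.where(
        s < 1.0,
        2.0 ** (ka + kb) * (1.0 + s),
        2.0 ** (ka + kb + 1.0) * s,
    )
    return np.where(zero, 0.0, sign * approx)

# Quick check against exact multiplication: the relative error stays
# within Mitchell's known worst-case bound of about 11%, and the
# approximation always underestimates the true product.
rng = np.random.default_rng(0)
x = rng.uniform(0.1, 10.0, 1000)
y = rng.uniform(0.1, 10.0, 1000)
err = (x * y - mitchell_multiply(x, y)) / (x * y)
print(f"max relative error: {err.max():.3f}")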
