## Eyan Forsythe

MIT EECS | Analog Devices Undergraduate Research and Innovation Scholar

Accuracy of Analog Neural Network Accelerators

2023–2024

EECS

- Computer Architecture

Joel Emer

Deep neural networks are invaluable for the modeling and statistical inference of complex systems, but deploying them is often infeasible due to the prohibitive computational cost of matrix multiplication. Analog neural network accelerators have emerged as a strategy to reduce the computational requirements of these operations, but they trade off accuracy in the process. This project aims to quantify the accuracy lost when analog accelerators compute the outputs of deep neural networks, evaluating how different accelerator architectures and multiplication techniques affect the accuracy of various networks. The goal is to provide quantitative data that can inform the development of future analog accelerators.
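One way to see the accuracy trade-off described above is to simulate it in software. The sketch below is purely illustrative (it is not the project's methodology): it models an analog multiply as a fixed-point quantization of weights and activations, then compares a matrix-vector product computed at full precision against one computed with 4-bit quantized operands. The `quantize` helper, the bit width, and the uniform random data are all assumptions made for the example.

```python
import random

def quantize(x, bits, max_abs=1.0):
    """Snap x to a signed fixed-point grid with `bits` bits,
    a crude stand-in for the limited precision of an analog multiply."""
    levels = 2 ** (bits - 1) - 1
    step = max_abs / levels
    q = round(x / step) * step
    return max(-max_abs, min(max_abs, q))

def matvec(weights, vec, bits=None):
    """Matrix-vector product; if `bits` is set, each weight and input
    is quantized before multiplying, mimicking analog computation."""
    out = []
    for row in weights:
        acc = 0.0
        for w, x in zip(row, vec):
            if bits is not None:
                w = quantize(w, bits)
                x = quantize(x, bits)
            acc += w * x
        out.append(acc)
    return out

random.seed(0)
n = 64
W = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(4)]
x = [random.uniform(-1, 1) for _ in range(n)]

exact = matvec(W, x)             # full-precision reference
approx = matvec(W, x, bits=4)    # simulated low-precision analog result
errors = [abs(a - b) for a, b in zip(exact, approx)]
print("max output error:", max(errors))
```

Sweeping `bits` in a setup like this shows output error growing as precision drops, which is the kind of accuracy-versus-cost curve the project seeks to characterize for real accelerator designs.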

Through participating in SuperUROP, I would like to gain research experience beyond what a standard UROP can provide. I have taken courses on computer architecture, circuits, and machine learning, and I am interested in using low-level circuit elements to solve computational problems. I hope this project will allow me to become more involved in research related to these interests.