Research Project Title:
Fine-Grained Mixed Precision Computing with Deep Neural Networks
Abstract: Low-precision training of neural networks has been shown to reduce training time and save energy. However, the corresponding loss of information incurs a performance cost. We explore fine-grained mixed-precision training of neural networks. Specifically, given an assignment of operators to floating-point precisions in the forward-propagation and back-propagation computations, we try to minimize truncation error while maintaining test-set performance. Furthermore, we compute probabilistic bounds for our allocation in order to provide guarantees on the overall precision of the computation.
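As a minimal illustration of the idea (not the project's actual method), the sketch below assigns each operator in a toy two-layer forward pass its own floating-point precision and measures the resulting truncation error against a float64 reference. The operator names and the precision map are hypothetical, introduced only for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
W1 = rng.standard_normal((8, 8))
W2 = rng.standard_normal((8, 8))

# Hypothetical fine-grained assignment: one precision per operator.
precision = {"matmul1": np.float16, "matmul2": np.float32}

def forward(x, dtype_map):
    # Cast each operator's inputs to its assigned precision before computing.
    h = W1.astype(dtype_map["matmul1"]) @ x.astype(dtype_map["matmul1"])
    y = W2.astype(dtype_map["matmul2"]) @ h.astype(dtype_map["matmul2"])
    return y.astype(np.float64)

# Truncation error of the mixed-precision run relative to full precision.
reference = forward(x, {"matmul1": np.float64, "matmul2": np.float64})
mixed = forward(x, precision)
error = np.max(np.abs(mixed - reference))
print(f"max truncation error: {error:.3e}")
```

The search problem described in the abstract would then be to choose the precision map (over many more operators) so that this error stays small while cheaper dtypes are used wherever possible.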
My interest in doing a SuperUROP came from my desire to spend more time doing research during the semester and to learn more from Ben Sherman and Michael Carbin. My research with them last semester touched on related topics. I want to explore ways to represent rational numbers on computers and to understand the corresponding effects on computation. I like that this area of research has rich theory and interesting applications.