Angelos Assos

Scholar Title

MIT EECS | Citadel

Research Title

Non-Convex Optimization for Deep Neural Networks

Cohort

2022–2023

Department

Electrical Engineering and Computer Science

Research Areas
  • Artificial Intelligence & Machine Learning
  • Theory of Computation
Supervisor

Luca Daniel

Abstract

Deep neural networks have shown great success in many machine learning tasks. Training them is challenging, however, because the loss surface of the network architecture is generally non-convex, or even non-smooth. There is therefore a long-standing question of how optimization algorithms can converge to a global minimum. In general, existing optimization algorithms cannot guarantee convergence to a global solution without convexity or other strong assumptions. In this project, we study optimization frameworks for machine learning tasks and design (stochastic) gradient-based methods to solve this problem. We aim to provide theoretical guarantees of convergence to global solutions for these methods under reasonable assumptions.
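
For illustration only, the sketch below shows plain mini-batch stochastic gradient descent on a tiny two-layer network with a non-convex loss, i.e., the generic class of gradient-based methods the abstract refers to. It is not the project's actual algorithm; the architecture, step size, and data are hypothetical.

```python
# Illustrative sketch: mini-batch SGD on a two-layer tanh network.
# Everything here (data, width, learning rate) is a made-up example.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: inputs in R^2, scalar targets.
X = rng.normal(size=(256, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2

# Two-layer network: pred = W2 @ tanh(W1 @ x + b1) + b2 (non-convex in the weights).
W1 = rng.normal(scale=0.5, size=(8, 2))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=8)
b2 = 0.0

lr, batch, epochs = 0.05, 32, 200
for epoch in range(epochs):
    perm = rng.permutation(len(X))
    for start in range(0, len(X), batch):
        idx = perm[start:start + batch]
        xb, yb = X[idx], y[idx]

        # Forward pass.
        h = np.tanh(xb @ W1.T + b1)            # hidden activations, shape (batch, 8)
        pred = h @ W2 + b2                      # predictions, shape (batch,)
        err = pred - yb                         # residuals

        # Backward pass: gradients of the loss 0.5 * mean(err**2).
        gW2 = h.T @ err / len(idx)
        gb2 = err.mean()
        gh = np.outer(err, W2) * (1 - h ** 2)   # chain rule through tanh
        gW1 = gh.T @ xb / len(idx)
        gb1 = gh.mean(axis=0)

        # Stochastic gradient step.
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

final_loss = 0.5 * np.mean((np.tanh(X @ W1.T + b1) @ W2 + b2 - y) ** 2)
print(f"final training loss: {final_loss:.4f}")
```

On such a non-convex objective, SGD of this kind is only guaranteed to reach stationary points in general; establishing convergence to global solutions under reasonable assumptions is precisely the gap the project targets.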

Quote

I am excited to participate in SuperUROP because I want to gain more experience in doing research and leading a project in an area I am interested in. I am looking forward to working with people who have years of experience in research and to gaining as much as I can from them.
