Yasaman Bahri

Research Scientist
Google Research (Brain Team)


I am a theoretical and computational scientist working at the intersection of machine learning and physical science.

In one direction, I have worked on building foundations for deep learning and investigating core machine learning problems. In the other, I am interested in connections between machine learning and specific domains of physical science, and in applications of machine learning to those domains.

A sample of topics from my past and ongoing work includes:

  • Theoretical and empirical understanding of the role of scale in deep learning ("scaling laws")

  • Exact connections between deep neural networks, Gaussian processes, and kernel methods

  • Phase transitions and the dynamics of gradient descent in supervised deep learning

  • Connections between information propagation in deep neural networks, neural network priors, and trainability

  • Distribution shift and the pre-training/fine-tuning paradigm

  • Graph neural networks for prediction in molecular physics

See a recent Quanta Magazine article for coverage of some of my older work.

I was trained as a theoretical quantum condensed matter physicist and received my Ph.D. in Physics from UC Berkeley in 2017. My graduate work was in the field of quantum many-body theory and strongly correlated physics, and I was fortunate to have Professor Ashvin Vishwanath as my thesis advisor. My doctoral research spanned several areas, including topological phases, many-body localization, and non-Fermi liquids. My dissertation proposed new classes of quantum behavior, new routes toward realizing exotic quantum phases, and new classes of mechanical behavior arising through topological mechanisms. I got started in theory as an undergraduate through research with Professor Joel Moore at UC Berkeley on tensor networks and entanglement in quantum systems, which was also the subject of my honors senior thesis.

Link to Google Scholar

Contact: yasamanbahri@gmail.com

Recent & Upcoming News

  • Gave a guest lecture in CS 159 at Caltech on theoretical aspects of deep learning.