Yasaman Bahri

Research Scientist
Google Research (Brain Team)

I am a theoretical and computational scientist working at the boundary of machine learning and physics.

In one direction, I am interested in scientific questions within machine learning and have been working on building a theoretical and empirical understanding of deep learning that will be useful for a broad range of communities. I am also interested in using machine learning to advance condensed matter physics (many-body systems, materials) and in connections between the disciplines.

A sample of topics from some of my past & ongoing work in this area includes:

  • Theoretical and empirical understanding of the role of scale in deep learning ("scaling laws")

  • Exact connections between deep neural networks, Gaussian processes, and kernel methods

  • Phase transitions and the dynamics of gradient descent in supervised deep learning

  • Connections between information propagation in deep neural networks, neural network priors, and trainability

  • Distribution shift and the pre-training, fine-tuning paradigm

  • Graph neural networks for prediction in molecular physics

See a recent Quanta magazine article for coverage on older work.

I was trained as a theoretical quantum condensed matter physicist, and I received my Ph.D. in Physics from UC Berkeley in 2017. My graduate work was in the field of quantum many-body theory and strongly correlated physics. I was fortunate to have Professor Ashvin Vishwanath as my thesis advisor. My scientific interests have always been broad, and I worked on several different areas as part of my doctoral research, including topological phases, many-body localization, and non-Fermi liquids. My dissertation proposed new classes of quantum behavior, new routes toward realizing exotic quantum phases, and new classes of mechanical behavior through topological mechanisms. I got started in theory as an undergraduate through research on tensor networks and entanglement in quantum systems with Professor Joel Moore at UC Berkeley, which was also the subject of my honors senior thesis.

Link to Google Scholar


Recent News

  • Gave a guest lecture in CS 159 at Caltech on theoretical aspects of deep learning.