Generative Modeling by Estimating Gradients of the Data Distribution

Time: June 9th, 2021 10:00    
Location: N412, Mong Man-wei Science Technology Building

Lecturer: Stefano Ermon, Assistant Professor of Computer Science, Stanford University

Existing generative models are typically based on explicit representations of probability distributions (e.g., autoregressive models or VAEs) or implicit sampling procedures (e.g., GANs). We propose an alternative approach based on directly modeling the vector field of gradients of the data distribution (scores). Our framework allows flexible energy-based model architectures and requires neither sampling during training nor adversarial training methods. Using annealed Langevin dynamics, we produce samples comparable to those of GANs, achieving new state-of-the-art inception scores. Finally, I will discuss challenges in evaluating bias and generalization in generative models.
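The sampling procedure mentioned above can be sketched as follows. This is a minimal illustration, not the speaker's implementation: in the actual method the score function is a trained neural network conditioned on the noise level, whereas here it is replaced by the closed-form score of a standard Gaussian (`grad_x log N(x; 0, I) = -x`) so the example is self-contained. The step-size schedule `eps * (sigma / sigma_min)**2` follows the annealing rule described in the talk's setting; the specific constants are illustrative.

```python
import numpy as np

def gaussian_score(x, sigma):
    # Stand-in for a learned score network s_theta(x, sigma).
    # For a standard Gaussian target, grad_x log p(x) = -x.
    return -x

def annealed_langevin_sample(score_fn, shape, sigmas,
                             steps_per_sigma=100, eps=2e-5, rng=None):
    """Annealed Langevin dynamics: run Langevin updates at a sequence of
    decreasing noise levels, warm-starting each level from the previous one."""
    rng = np.random.default_rng(rng)
    x = rng.standard_normal(shape)        # initialize from pure noise
    sigma_min = sigmas[-1]
    for sigma in sigmas:                  # anneal from large to small noise
        alpha = eps * (sigma / sigma_min) ** 2   # per-level step size
        for _ in range(steps_per_sigma):
            z = rng.standard_normal(shape)
            # Langevin update: gradient ascent on log-density plus noise
            x = x + 0.5 * alpha * score_fn(x, sigma) + np.sqrt(alpha) * z
    return x

sigmas = np.geomspace(1.0, 0.01, num=10)  # geometrically decreasing noise scales
samples = annealed_langevin_sample(gaussian_score, (1000, 2), sigmas, rng=0)
```

With the Gaussian score stand-in, the chain's stationary distribution at each level is approximately the standard normal, so the final samples should have mean near 0 and standard deviation near 1.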

Stefano Ermon is an Assistant Professor in the Department of Computer Science at Stanford University, where he is affiliated with the Artificial Intelligence Laboratory, and a fellow of the Woods Institute for the Environment. His research centers on techniques for probabilistic modeling of data and is motivated by applications in the emerging field of computational sustainability. He has won several awards, including four Best Paper Awards (AAAI, UAI, and CP), an NSF CAREER Award, ONR and AFOSR Young Investigator Awards, a Sony Faculty Innovation Award, a Hellman Faculty Fellowship, a Microsoft Research Fellowship, a Sloan Fellowship, and the IJCAI Computers and Thought Award. Stefano earned his Ph.D. in Computer Science at Cornell University in 2015.