UNIVERSITY OF HERTFORDSHIRE
COMPUTER SCIENCE RESEARCH COLLOQUIUM
presents
"GANs with integral probability metrics:
some results and conjectures"
Prof. Arthur Gretton
(Gatsby Computational Neuroscience Unit,
University College London)
12 February 2020
12:00 - 13:00
Hatfield, College Lane Campus
Seminar Room C154
Everyone is welcome to attend.
Refreshments will be available.
Abstract
I will explore issues of critic design for generative adversarial networks. The talk will focus on integral probability metric (IPM) losses, specifically the Wasserstein loss as implemented in the WGAN-GP, and the MMD GAN. We will begin with an introduction to IPM losses, their relation to moment matching in the case of the Maximum Mean Discrepancy (MMD), and how IPMs relate to f-divergences (answer: almost not at all). Next, we will look at GAN design using these IPM losses: we will address the question of critic gradient bias, and discuss the convergence of the GAN training algorithm when critic and generator are alternately trained. We'll end with some conjectures on the results that would be needed to establish a "theory of IPM GANs": in particular, I will claim that a problem-specific critic is needed, and that a critic that is a good approximation to a generic off-the-shelf divergence (Wasserstein, KL, MMD) is less likely to be useful in GAN training.
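As background for the talk, the MMD mentioned above compares two distributions by matching their moments in a kernel feature space. A minimal sketch of a sample-based squared-MMD estimate with a Gaussian kernel (the function names and bandwidth choice here are illustrative, not from the talk):

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd2_biased(X, Y, sigma=1.0):
    # Biased (V-statistic) estimate of squared MMD between samples X ~ P and Y ~ Q:
    # mean k(X, X) - 2 mean k(X, Y) + mean k(Y, Y). Zero when P = Q (in the limit).
    return (gaussian_kernel(X, X, sigma).mean()
            - 2.0 * gaussian_kernel(X, Y, sigma).mean()
            + gaussian_kernel(Y, Y, sigma).mean())

# Samples from the same distribution give a small value; well-separated
# distributions give a large one.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y = rng.normal(0.0, 1.0, size=(200, 2))  # same distribution as X
Z = rng.normal(3.0, 1.0, size=(200, 2))  # shifted distribution
print(mmd2_biased(X, Y), mmd2_biased(X, Z))
```

In the MMD GAN setting discussed in the talk, the kernel itself is parameterised by a learned critic network rather than fixed as above.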
Biography
Arthur Gretton is a Professor with the Gatsby Computational Neuroscience Unit, and director of the Centre for Computational Statistics and Machine Learning (CSML) at UCL. He received degrees in Physics and Systems Engineering from the Australian National University, and a PhD with Microsoft Research and the Signal Processing and Communications Laboratory at the University of Cambridge. He previously worked at the MPI for Biological Cybernetics, and at the Machine Learning Department, Carnegie Mellon University.
Arthur's recent research interests in machine learning include the design and training of generative models, both implicit (e.g. GANs) and explicit (high- and infinite-dimensional exponential family models), nonparametric hypothesis testing, and kernel methods.
He was an associate editor at IEEE Transactions on Pattern Analysis and Machine Intelligence from 2009 to 2013, has been an Action Editor for JMLR since April 2013, served as an Area Chair for NeurIPS in 2008 and 2009, a Senior Area Chair for NeurIPS in 2018, and an Area Chair for ICML in 2011 and 2012, was a member of the COLT Program Committee in 2013, and has been a member of the Royal Statistical Society Research Section Committee since January 2020. Arthur was program chair for AISTATS in 2016 (with Christian Robert), tutorials chair for ICML 2018 (with Ruslan Salakhutdinov), workshops chair for ICML 2019 (with Honglak Lee), program chair for the DALI workshop in 2019 (with Krikamol Muandet and Shakir Mohamed), and co-organiser of the Machine Learning Summer School 2019 in London (with Marc Deisenroth).