BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//132.216.98.100//NONSGML kigkonsult.se iCalcreator 2.20.4//
BEGIN:VEVENT
UID:20250817T194931EDT-4007emggfl@132.216.98.100
DTSTAMP:20250817T234931Z
DESCRIPTION:Title: Generalized Energy-Based Models\n\n\nAbstract:\n\n\nI will introduce Generalized Energy-Based Models (GEBMs) for generative modelling. These models combine two trained components: a base distribution (generally an implicit model)\, which can learn the support of data with low intrinsic dimension in a high-dimensional space\; and an energy function\, which refines the probability mass on the learned support. Both the energy function and the base jointly constitute the final model\, unlike GANs\, which retain only the base distribution (the “generator”). In particular\, while the energy function is analogous to the GAN critic function\, it is not discarded after training. GEBMs are trained by alternating between learning the energy and the base\, much like a GAN. Both training stages are well-defined: the energy is learned by maximising a generalized likelihood\, and the resulting energy-based loss provides informative gradients for learning the base. Samples from the posterior on the latent space of the trained model can be obtained via MCMC\, thus finding regions in this space that produce better-quality samples. Empirically\, the GEBM samples on image-generation tasks are of better quality than those from the learned generator alone\, indicating that\, all else being equal\, a GEBM will outperform a GAN of the same complexity. GEBMs also achieve state-of-the-art performance on density-modelling tasks when using base measures with an explicit form.\n\n\nSpeaker\n\n\nArthur Gretton is a Professor with the Gatsby Computational Neuroscience Unit\, and director of the Centre for Computational Statistics and Machine Learning (CSML) at UCL. He received degrees in Physics and Systems Engineering from the Australian National University\, and a PhD with Microsoft Research and the Signal Processing and Communications Laboratory at the University of Cambridge. He previously worked at the MPI for Biological Cybernetics\, and at the Machine Learning Department\, Carnegie Mellon University.\n\nArthur’s recent research interests in machine learning include the design and training of generative models\, both implicit (e.g. GANs) and explicit (high/infinite-dimensional exponential family models and energy-based models)\, nonparametric hypothesis testing\, survival analysis\, causality\, and kernel methods.\n\nHe was an associate editor at IEEE Transactions on Pattern Analysis and Machine Intelligence from 2009 to 2013\, and has been an Action Editor for JMLR since April 2013\, an Area Chair for NeurIPS in 2008 and 2009\, a Senior Area Chair for NeurIPS in 2018\, an Area Chair for ICML in 2011 and 2012\, a member of the COLT Program Committee in 2013\, and a member of the Royal Statistical Society Research Section Committee since January 2020. Arthur was program chair for AISTATS in 2016 (with Christian Robert)\, tutorials chair for ICML 2018 (with Ruslan Salakhutdinov)\, workshops chair for ICML 2019 (with Honglak Lee)\, program chair for the Dali workshop in 2019 (with Krikamol Muandet and Shakir Mohamed)\, and co-organiser of the Machine Learning Summer School 2019 in London (with Marc Deisenroth).\n\nZoom Link\n\nMeeting ID: 924 5390 4989\n\nPasscode: 690084
DTSTART:20201106T203000Z
DTEND:20201106T213000Z
SUMMARY:Arthur Gretton
URL:/mathstat/channels/event/arthur-gretton-325928
END:VEVENT
END:VCALENDAR