Invited Talk by Dr. Emtiyaz Khan on Inference through the Optimizer: Bayesian deep learning via perturbed adaptive learning-rate methods
Approximate Bayesian inference holds promise for improving the generalization and reliability of deep learning, but most inference methods are difficult to implement within existing deep-learning code bases. I will present new methods that perform inference within existing deep-learning optimizers by simply perturbing the network weights during gradient evaluations. The resulting methods require less memory, computation, and implementation effort than existing ones. They also improve the performance of existing deep-learning methods, e.g., by avoiding sharp local minima and by performing exploration during deep reinforcement learning.
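The core idea of weight-perturbation inference can be illustrated with a minimal sketch (this is an illustrative toy, not the speaker's exact algorithm): maintain a Gaussian posterior over a weight, sample a perturbed weight before each gradient evaluation, and use reparameterization gradients to update the posterior mean and scale. All names and the toy loss below are assumptions chosen for illustration.

```python
import numpy as np

# Toy example: mean-field Gaussian variational inference on a 1-D
# quadratic loss L(w) = 0.5 * (w - 3)^2, done by perturbing the weight
# with Gaussian noise before each gradient evaluation.

rng = np.random.default_rng(0)

def loss_grad(w):
    # Gradient of the toy loss L(w) = 0.5 * (w - 3)^2.
    return w - 3.0

mu, log_sigma = 0.0, 0.0   # variational parameters: mean and log std-dev
lr = 0.05                  # learning rate

for _ in range(2000):
    eps = rng.standard_normal()
    sigma = np.exp(log_sigma)
    w = mu + sigma * eps           # perturbed weight sample
    g = loss_grad(w)               # ordinary gradient, evaluated at w
    # Reparameterization gradients of the variational objective
    # E_q[L(w)] - entropy(q); the -1 comes from the entropy term.
    mu -= lr * g
    log_sigma -= lr * (g * eps * sigma - 1.0)

# mu converges near the loss minimum (3.0); sigma converges near the
# inverse curvature of the loss (here 1.0), giving posterior uncertainty.
```

Plugging the same perturb-then-evaluate step into an adaptive optimizer such as Adam or RMSprop is what makes this style of inference easy to add to an existing training loop.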
Dr. Emtiyaz Khan is a team leader (equivalent to Full Professor) at the Center for Advanced Intelligence Project (AIP), RIKEN, in Tokyo, where he leads the Approximate Bayesian Inference (ABI) Team. He is an Action Editor for the prestigious Journal of Machine Learning Research (JMLR) and has served as an area chair for top machine-learning conferences such as ICML and NIPS. Before joining RIKEN, he was a scientist at EPFL, Switzerland, where he taught two large machine-learning courses for which he received a teaching award. Before that, he was a post-doc at EPFL with Prof. Matthias Seeger. He finished his PhD at the University of British Columbia (UBC) in 2012 under the supervision of Prof. Kevin Murphy.