Similarity-based Contrastive Divergence Methods for Energy-based Deep Learning Models

Authors
Adepu Ravi Sankar, Vineeth N Balasubramanian
Abstract
Energy-based deep learning models such as Restricted Boltzmann Machines are increasingly used for real-world applications. However, these models inherently depend on the Contrastive Divergence (CD) method for training, i.e., for maximizing the log-likelihood of the given data distribution. CD, which internally uses Gibbs sampling, often does not perform well due to issues such as biased samples, poor mixing of Markov chains, and high-mass probability modes. Variants of CD such as PCD, Fast PCD and Tempered ...
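
For context, CD-k trains an RBM by contrasting a data-driven ("positive") statistic with a model ("negative") statistic obtained from k steps of block Gibbs sampling. Below is a minimal NumPy sketch of one CD-k update for a binary RBM; it illustrates plain CD only, not the similarity-based variant proposed in this paper, and the function and parameter names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd_k_update(v0, W, b, c, k=1, lr=0.01):
    """One CD-k gradient step for a binary RBM (illustrative sketch).

    v0   : (batch, n_visible) binary data batch
    W    : (n_visible, n_hidden) weight matrix
    b, c : visible / hidden bias vectors
    """
    # Positive phase: hidden probabilities given the observed data
    ph0 = sigmoid(v0 @ W + c)

    # Negative phase: k steps of block Gibbs sampling starting from the data
    h = (rng.random(ph0.shape) < ph0).astype(float)
    for _ in range(k):
        pv = sigmoid(h @ W.T + b)
        v = (rng.random(pv.shape) < pv).astype(float)
        ph = sigmoid(v @ W + c)
        h = (rng.random(ph.shape) < ph).astype(float)

    # Approximate log-likelihood gradient: data statistics minus model statistics
    batch = v0.shape[0]
    W += lr * (v0.T @ ph0 - v.T @ ph) / batch
    b += lr * (v0 - v).mean(axis=0)
    c += lr * (ph0 - ph).mean(axis=0)
    return W, b, c
```

Variants such as PCD differ mainly in how the negative-phase chain is initialized (e.g., persisted across parameter updates rather than restarted from the data batch), which is one way to mitigate the poor mixing the abstract refers to.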