Paper ID | MLSP-21.2
Paper Title | Relaxed Wasserstein with Applications to GANs
Authors | Xin Guo, Johnny Hong, Tianyi Lin, Nan Yang, University of California, Berkeley, United States
Session | MLSP-21: Generative Neural Networks
Location | Gather.Town
Session Time | Wednesday, 09 June, 15:30 - 16:15
Presentation Time | Wednesday, 09 June, 15:30 - 16:15
Presentation | Poster
Topic | Machine Learning for Signal Processing: [MLR-APPL] Applications of machine learning
Abstract | Wasserstein Generative Adversarial Networks (WGANs) provide a versatile class of models that has attracted great attention in a range of applications. However, the framework has two main drawbacks: (i) the Wasserstein-1 (or Earth-Mover) distance is restrictive, so WGANs cannot always fit the data geometry well; (ii) it is difficult to train WGANs quickly. In this paper, we propose a new class of Relaxed Wasserstein (RW) distances by generalizing the Wasserstein-1 distance with Bregman cost functions. We show that RW distances enjoy favorable statistical properties without sacrificing computational tractability. Combined with the GAN framework, we develop Relaxed WGANs (RWGANs), which are not only statistically flexible but can also be approximated efficiently using heuristic approaches. Experiments on real images demonstrate that RWGAN with a Kullback-Leibler (KL) cost function outperforms competing approaches such as WGANs, even with gradient penalty.
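To make the construction in the abstract concrete, the following is a minimal sketch assuming the standard definition of a Bregman divergence for a strictly convex, differentiable generator \phi; the notation D_\phi and RW_\phi is illustrative and not necessarily the paper's own:

    D_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y \rangle
    \mathrm{RW}_\phi(\mu, \nu) = \inf_{\pi \in \Pi(\mu, \nu)} \mathbb{E}_{(X,Y) \sim \pi}\!\left[ D_\phi(X, Y) \right]

where \Pi(\mu, \nu) denotes the set of couplings of \mu and \nu. Taking \phi quadratic recovers a squared-Euclidean ground cost, while the negative-entropy choice \phi(x) = \sum_i x_i \log x_i yields the generalized KL cost

    D_\phi(x, y) = \sum_i \left( x_i \log(x_i / y_i) - x_i + y_i \right)

which is presumably the KL cost function the abstract refers to. A minimal Python sketch of this KL Bregman cost follows; the helper name bregman_kl and the eps smoothing are hypothetical choices of ours, not the paper's implementation:

    import numpy as np

    def bregman_kl(x, y, eps=1e-12):
        """Generalized KL divergence: the Bregman divergence induced by
        phi(x) = sum_i x_i * log(x_i), for nonnegative inputs."""
        x = np.asarray(x, dtype=float) + eps  # smooth away exact zeros
        y = np.asarray(y, dtype=float) + eps
        return np.sum(x * np.log(x / y) - x + y)

    # On probability vectors the -x + y terms cancel, so this reduces
    # to the ordinary KL divergence:
    p = np.array([0.2, 0.3, 0.5])
    q = np.array([0.1, 0.4, 0.5])
    print(bregman_kl(p, q))  # nonnegative; zero iff p == q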