Paper ID | MLSP-15.4
Paper Title | DEEP DETERMINISTIC INFORMATION BOTTLENECK WITH MATRIX-BASED ENTROPY FUNCTIONAL
Authors | Xi Yu, University of Florida, United States; Shujian Yu, NEC Laboratories Europe, Germany; Jose C. Principe, University of Florida, United States
Session | MLSP-15: Learning Algorithms 2
Location | Gather.Town
Session Time | Wednesday, 09 June, 13:00 - 13:45
Presentation Time | Wednesday, 09 June, 13:00 - 13:45
Presentation | Poster
Topic | Machine Learning for Signal Processing: [MLR-INFO] Information-theoretic learning
Abstract | We introduce the matrix-based Rényi's $\alpha$-order entropy functional to parameterize Tishby et al.'s information bottleneck (IB) principle with a neural network. We term our methodology Deep Deterministic Information Bottleneck (DIB), as it avoids variational inference and distributional assumptions. We show that deep neural networks trained with DIB outperform their variational counterparts, as well as networks trained with other forms of regularization, in terms of generalization performance and robustness to adversarial attacks.
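For context on the matrix-based Rényi's $\alpha$-order entropy functional the abstract refers to (Giraldo, Rao & Principe), the quantity is computed from the eigenvalues of a trace-normalized Gram matrix, $S_\alpha(A) = \frac{1}{1-\alpha}\log_2 \operatorname{tr}(A^\alpha)$, with mutual information defined via the Hadamard product of two Gram matrices. The following is a minimal NumPy sketch of these definitions, not the paper's implementation; the RBF kernel choice, the bandwidth `sigma`, and all function names are illustrative assumptions:

```python
import numpy as np

def gram_matrix(X, sigma=1.0):
    """Trace-normalized RBF Gram matrix of samples X (n x d).

    Illustrative sketch: kernel and bandwidth are assumptions, not the
    paper's exact choices. Since K_ii = 1 for the RBF kernel, dividing
    by trace(K) = n makes the eigenvalues sum to 1.
    """
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-np.clip(d2, 0.0, None) / (2.0 * sigma ** 2))
    return K / np.trace(K)

def renyi_entropy(A, alpha=2.0):
    """Matrix-based Renyi entropy: S_alpha(A) = log2(sum_i lambda_i^alpha) / (1 - alpha).

    Requires alpha != 1; the limit alpha -> 1 recovers the von Neumann entropy.
    """
    lam = np.clip(np.linalg.eigvalsh(A), 0.0, None)  # A is PSD; clip noise
    return np.log2(np.sum(lam ** alpha)) / (1.0 - alpha)

def mutual_information(X, Y, alpha=2.0, sigma=1.0):
    """I_alpha(X;Y) = S(A) + S(B) - S(A∘B / tr(A∘B)), with ∘ the Hadamard product."""
    A, B = gram_matrix(X, sigma), gram_matrix(Y, sigma)
    AB = A * B
    AB = AB / np.trace(AB)
    return renyi_entropy(A, alpha) + renyi_entropy(B, alpha) - renyi_entropy(AB, alpha)
```

Because everything is a differentiable function of Gram-matrix eigenvalues, such terms can be optimized directly by gradient descent, which is what lets DIB sidestep the variational bounds and distributional assumptions mentioned in the abstract.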