Paper ID: MLSP-3.4
Paper Title: Parametric Spectral Filters for Fast Converging, Scalable Convolutional Neural Networks
Authors: Luke Wood, Google, United States; Eric Larson, Southern Methodist University, United States
Session: MLSP-3: Deep Learning Training Methods 3
Location: Gather.Town
Session Time: Tuesday, 08 June, 13:00 - 13:45
Presentation Time: Tuesday, 08 June, 13:00 - 13:45
Presentation: Poster
Topic: Machine Learning for Signal Processing: [MLR-DEEP] Deep learning techniques
Abstract: Computing convolution via spectral multiplication in neural networks has been investigated by a number of researchers because of its potential to speed up computation for large images. However, previous methods require learning arbitrarily large convolution filters in the spectral domain, causing two untenable problems: an explosion in the number of trainable parameters per filter and an inability to reuse filters across images of differing sizes. To address this, we propose using spectral approximation functions to approximate the massive spectral-domain filters with only a few trainable parameters. Our empirical analysis suggests that the proposed approximation maintains the benefits of arbitrarily large filters (such as improved rate of convergence in training, accuracy, and stability) while relying on significantly fewer trainable parameters.
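The abstract's core idea can be sketched in a few lines of NumPy: instead of learning a full H×W spectral-domain filter, a smooth parametric function with a handful of trainable parameters is evaluated on the frequency grid of whatever image size arrives, and convolution becomes a pointwise multiplication in the spectral domain. The Gaussian parameterization and the function/parameter names below (`parametric_spectral_filter`, `amplitude`, `sigma`) are illustrative assumptions for this sketch, not the authors' exact choice of approximation function.

```python
import numpy as np

def parametric_spectral_filter(shape, amplitude, sigma):
    """Evaluate a 2-parameter Gaussian low-pass filter on an HxW frequency grid.

    Illustrative stand-in for the paper's spectral approximation functions:
    the filter is regenerated from the same few parameters for any image size.
    """
    h, w = shape
    fy = np.fft.fftfreq(h)[:, None]  # vertical frequency coordinates
    fx = np.fft.fftfreq(w)[None, :]  # horizontal frequency coordinates
    return amplitude * np.exp(-(fx**2 + fy**2) / (2.0 * sigma**2))

def spectral_conv(image, amplitude, sigma):
    """Convolve by pointwise multiplication in the spectral domain."""
    spectrum = np.fft.fft2(image)
    filt = parametric_spectral_filter(image.shape, amplitude, sigma)
    return np.real(np.fft.ifft2(spectrum * filt))

# The same 2 parameters apply to any image size -- no per-size filter weights.
small = spectral_conv(np.random.rand(32, 32), amplitude=1.0, sigma=0.1)
large = spectral_conv(np.random.rand(128, 128), amplitude=1.0, sigma=0.1)
```

This makes the parameter-count contrast concrete: a learned 128×128 spectral filter would need 16,384 values, while the parametric version here reuses the same two scalars at every resolution.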