Paper ID | CI-1.1
Paper Title | GATE TRIMMING: ONE-SHOT CHANNEL PRUNING FOR EFFICIENT CONVOLUTIONAL NEURAL NETWORKS
Authors | Fang Yu, Chuanqi Han, Pengcheng Wang, Xi Huang, Li Cui, Institute of Computing Technology, Chinese Academy of Sciences, China
Session | CI-1: Theory for Computational Imaging
Location | Gather.Town
Session Time | Wednesday, 09 June, 15:30 - 16:15
Presentation Time | Wednesday, 09 June, 15:30 - 16:15
Presentation | Poster
Topic | Computational Imaging: [IMT] Computational Imaging Methods and Models
Abstract | Channel pruning is a promising technique for model compression and acceleration because it reduces the space and time complexity of convolutional neural networks (CNNs) while maintaining their performance. Existing methods perform channel pruning by iterative optimization or by training with sparsity-inducing regularization, both of which are inefficient and thus limit practical utility. In this work, we propose a one-shot global pruning approach called Gate Trimming (GT), which compresses CNNs more efficiently. GT performs the pruning operation only once, avoiding expensive retraining or repeated re-evaluation of channel redundancy. In addition, GT globally estimates the effect of channels across all layers using information gain (IG). Based on the IG of each channel, GT accurately prunes redundant channels with little negative effect on the network's performance. Experimental results show that the proposed GT is superior to state-of-the-art methods.
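The abstract does not specify the exact form of the information-gain criterion, but the overall recipe it describes (score every channel once across all layers, then prune globally in a single pass without iterative re-evaluation) can be sketched. The PyTorch snippet below is a hypothetical illustration, not the paper's Gate Trimming algorithm: it uses BatchNorm scale magnitude as a stand-in for the per-channel score, and the function names `collect_channel_scores` and `one_shot_global_prune` are ours.

```python
# Minimal sketch of one-shot GLOBAL channel pruning, assuming a generic
# per-channel saliency score. GT's actual criterion is information gain;
# here |BN scale| is used purely as a placeholder score.
import torch
import torch.nn as nn


def collect_channel_scores(model):
    """Score every channel once, across all layers (one-shot, global)."""
    scores = []  # (bn_module, channel_index, score)
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            for c, g in enumerate(m.weight.detach().abs()):
                scores.append((m, c, g.item()))
    return scores


def one_shot_global_prune(model, prune_ratio=0.5):
    """Gate the globally lowest-scoring channels closed in a single pass."""
    scores = collect_channel_scores(model)
    scores.sort(key=lambda t: t[2])  # one global ranking, not per-layer
    n_prune = int(len(scores) * prune_ratio)
    with torch.no_grad():
        for m, c, _ in scores[:n_prune]:
            m.weight[c] = 0.0  # zeroing scale and shift silences the channel
            m.bias[c] = 0.0
    return n_prune


if __name__ == "__main__":
    net = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
        nn.Conv2d(16, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
    )
    pruned = one_shot_global_prune(net, prune_ratio=0.5)
    print(f"pruned {pruned} of 48 channels in one shot")
```

Because the ranking is global, layers with many redundant channels are pruned more aggressively than others, which is the point of estimating channel effect across all layers rather than within each layer independently; an actual implementation would then physically remove the gated channels to realize the speed-up.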