Paper ID | IVMSP-7.6
Paper Title | DEEP NEURAL NETWORKS WITH FLEXIBLE COMPLEXITY WHILE TRAINING BASED ON NEURAL ORDINARY DIFFERENTIAL EQUATIONS
Authors | Zhengbo Luo, Sei-ichiro Kamata, Zitang Sun, Weilian Zhou, Waseda University, Japan
Session | IVMSP-7: Machine Learning for Image Processing I |
Location | Gather.Town |
Session Time | Wednesday, 09 June, 13:00 - 13:45
Presentation Time | Wednesday, 09 June, 13:00 - 13:45
Presentation | Poster
Topic | Image, Video, and Multidimensional Signal Processing: [IVTEC] Image & Video Processing Techniques
Abstract | Most deep neural network (DNN) architectures have a fixed complexity in both computational cost (parameters and FLOPs) and expressiveness. In this work, we experimentally investigate the effectiveness of using neural ordinary differential equations (NODEs) as a component that provides additional depth to relatively shallow networks, in place of stacked layers; this achieves improved performance with fewer parameters. Moreover, we construct deep neural networks with flexible complexity based on NODEs, which enables the system to adjust its complexity while training. The proposed method achieves more parameter-efficient performance than stacking standard DNN layers, and it alleviates the heavy computational cost that NODEs typically require.
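The abstract does not give implementation details, but the core idea it describes, a NODE block whose solver step count acts as an adjustable "depth" that can be changed during training without changing the parameter count, can be illustrated with a minimal sketch. The class names (`ODEFunc`, `NODEBlock`), the `num_steps` knob, and the choice of fixed-step Euler integration below are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' method): a NODE block whose effective
# depth is the number of solver steps, adjustable at any point during
# training without changing the parameter count.
import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    """Dynamics f(h, t): one shared conv layer reused at every solver step."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.norm = nn.GroupNorm(8, channels)

    def forward(self, t, h):
        return self.conv(torch.relu(self.norm(h)))

class NODEBlock(nn.Module):
    """Integrates dh/dt = f(h, t) from t=0 to t=1 with fixed-step Euler.
    `num_steps` controls effective depth (FLOPs) but not parameters."""
    def __init__(self, channels, num_steps=4):
        super().__init__()
        self.func = ODEFunc(channels)
        self.num_steps = num_steps  # can be reassigned while training

    def forward(self, h):
        dt = 1.0 / self.num_steps
        t = 0.0
        for _ in range(self.num_steps):
            h = h + dt * self.func(t, h)  # Euler update
            t += dt
        return h

# Usage: a shallow backbone gains depth from the NODE block, and that
# depth can be adjusted between epochs to trade accuracy for compute.
block = NODEBlock(channels=32, num_steps=2)
x = torch.randn(1, 32, 8, 8)
y_shallow = block(x)   # 2 solver steps
block.num_steps = 8    # "deepen" the network mid-training
y_deep = block(x)      # 8 steps, same parameters
```

Because the weights of `ODEFunc` are shared across all solver steps, increasing `num_steps` deepens the computation graph while leaving the parameter count fixed, which is one plausible reading of how complexity could be made flexible while training.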