2021 IEEE International Conference on Acoustics, Speech and Signal Processing

6-11 June 2021 • Toronto, Ontario, Canada

Extracting Knowledge from Information

Technical Program

Paper Detail

Paper ID: IVMSP-22.2
Paper Title: MULTI-SCALE CASCADE DISPARITY REFINEMENT STEREO NETWORK
Authors: Xiaogang Jia, Wei Chen, Zhengfa Liang, Xin Luo, Mingfei Wu, Yusong Tan, Libo Huang, National University of Defense Technology, China
Session: IVMSP-22: Image & Video Sensing, Modeling and Representation
Location: Gather.Town
Session Time: Thursday, 10 June, 14:00 - 14:45
Presentation Time: Thursday, 10 June, 14:00 - 14:45
Presentation: Poster
Topic: Image, Video, and Multidimensional Signal Processing: [IVELI] Electronic Imaging
Abstract: Stereo matching has attracted much attention in recent years. Traditional methods can quickly generate a disparity result, but their accuracy is low. In contrast, methods based on neural networks achieve high accuracy, but they struggle to reach real-time speed. Therefore, this paper presents MCDRNet, which combines traditional methods with neural networks to achieve real-time and accurate stereo matching. Concretely, our network first generates a rough disparity map using the traditional ADCensus algorithm. Then we design a novel Multi-Scale Cascade Network to refine the disparity map from coarse to fine. We evaluate our best-trained model on the KITTI official website. The results show that our network is much faster than most current top-performing methods (31× faster than CSPN, 56× faster than GANet, etc.). Meanwhile, it is more accurate than traditional stereo methods (SGM, SPS-St) and other fast 2D convolution networks (Fast DS-CS, DispNetC, etc.), demonstrating the rationality and feasibility of our method.
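
The abstract only sketches the pipeline (a traditional ADCensus initial disparity map followed by a multi-scale, coarse-to-fine refinement network), so the following is a minimal illustrative sketch of such a cascade in PyTorch. All module names, scale counts, and layer widths here are assumptions for illustration; they are not taken from the MCDRNet paper itself.

```python
# Illustrative coarse-to-fine disparity refinement cascade (assumed design,
# not the actual MCDRNet architecture).
import torch
import torch.nn as nn
import torch.nn.functional as F


class RefineBlock(nn.Module):
    """Predicts a residual disparity correction at one scale."""

    def __init__(self, in_ch: int = 4, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, hidden, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(hidden, hidden, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(hidden, 1, 3, padding=1),
        )

    def forward(self, disp, image):
        # Concatenate the current disparity estimate with the resized image
        # and predict a per-pixel residual correction.
        return disp + self.net(torch.cat([disp, image], dim=1))


class CascadeRefinement(nn.Module):
    """Refines an initial (e.g. ADCensus) disparity map from coarse to fine."""

    def __init__(self, num_scales: int = 3):
        super().__init__()
        self.blocks = nn.ModuleList(RefineBlock() for _ in range(num_scales))
        self.num_scales = num_scales

    def forward(self, init_disp, image):
        # Start at the coarsest scale; disparity values shrink with image width.
        factor = 2 ** (self.num_scales - 1)
        disp = F.interpolate(init_disp, scale_factor=1 / factor,
                             mode="bilinear", align_corners=False) / factor
        for i, block in enumerate(self.blocks):
            scale = 2 ** (self.num_scales - 1 - i)
            img_s = F.interpolate(image, scale_factor=1 / scale,
                                  mode="bilinear", align_corners=False)
            disp = block(disp, img_s)
            if scale > 1:
                # Move to the next finer scale; disparities double when width doubles.
                disp = F.interpolate(disp, scale_factor=2, mode="bilinear",
                                     align_corners=False) * 2
        return disp


if __name__ == "__main__":
    left = torch.rand(1, 3, 256, 512)          # left RGB image
    coarse = torch.rand(1, 1, 256, 512) * 64   # stand-in for an ADCensus disparity map
    refined = CascadeRefinement()(coarse, left)
    print(refined.shape)  # torch.Size([1, 1, 256, 512])
```

Keeping the refinement as small residual blocks applied at successively finer scales is what makes a coarse-to-fine design attractive for real-time stereo: most of the work happens at low resolution, and only the final correction runs at full resolution.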