WPNets and PWNets: From the Perspective of Channel Fusion

D Liang, F Yang, T Zhang, J Tian, P Yang - IEEE Access, 2018 - ieeexplore.ieee.org
The performance and parameter count of neural networks are positively correlated, and existing neural network architectures contain substantial parameter redundancy. By exploring the channel relationships between the whole and the parts of a neural network, we obtain convolutional network architectures that trade off parameters against performance. Two network architectures are implemented by dividing the convolution kernels of one layer into multiple groups, thus ensuring that the network has more connections and fewer parameters. In one architecture, information flows from the whole to the parts, so we call it whole-to-part connected networks (WPNets); in the other, information flows from the parts to the whole, so we call it part-to-whole connected networks (PWNets). WPNets use the whole channel information to enhance partial channel information, and PWNets use partial channel information to generate or enhance the whole channel information. We evaluate the proposed architectures on four competitive object recognition benchmarks (CIFAR-10, CIFAR-100, SVHN, and ImageNet), and our models obtain comparable results with far fewer parameters than many state-of-the-art models. Our network architecture code is available at github.
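The parameter saving from splitting one layer's convolution kernels into groups can be illustrated with a short counting sketch (this is an illustration of grouped convolution in general, not the authors' released code; the function name and layer sizes are hypothetical):

```python
# Sketch: weight count of a k x k convolution layer, standard vs. split
# into `groups` kernel groups. With grouping, each output channel only
# convolves over in_ch / groups input channels, cutting weights by ~groups.
def conv_params(in_ch: int, out_ch: int, k: int, groups: int = 1) -> int:
    assert in_ch % groups == 0 and out_ch % groups == 0
    return out_ch * (in_ch // groups) * k * k

standard = conv_params(256, 256, 3)            # 256 * 256 * 9 = 589,824
grouped = conv_params(256, 256, 3, groups=4)   # 256 * 64 * 9  = 147,456
print(standard, grouped, standard // grouped)  # reduction factor of 4
```

Grouping alone removes cross-group connections, which is why the paper adds whole-to-part or part-to-whole information flow between the groups to recover connectivity while keeping the parameter savings.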