Authors: Bohan Zhuang, Chunhua Shen, Ian Reid
arXiv: 1808.02631
Abstract URL: http://arxiv.org/abs/1808.02631v1
In this paper, we propose to train a network with binary weights and
low-bitwidth activations, designed especially for mobile devices with limited
power budgets. Most previous works on quantizing CNNs uncritically assume the
same architecture as the full-precision model, merely with reduced precision.
However, we take the view that, for best performance, it is possible (and even
likely) that a different architecture may be better suited to dealing with
low-precision weights and activations.
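
To make the low-precision setting concrete, below is a minimal, hypothetical
PyTorch sketch of sign-based weight binarization trained with a
straight-through gradient estimator, a standard device for learning binary
weights; the paper's actual quantizer may differ.

```python
import torch

class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through gradient estimator (STE)."""

    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return torch.sign(w)  # latent real-valued weights -> {-1, +1}

    @staticmethod
    def backward(ctx, grad_out):
        (w,) = ctx.saved_tensors
        # STE: pass gradients through where the latent weight lies in [-1, 1],
        # since sign() itself has zero gradient almost everywhere.
        return grad_out * (w.abs() <= 1).float()

# Usage: gradients flow to the latent full-precision weights.
w = torch.randn(8, requires_grad=True)
BinarizeSTE.apply(w).sum().backward()
```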
Specifically, we propose a "network expansion" strategy in which we aggregate
a set of homogeneous low-precision branches to implicitly reconstruct the
full-precision intermediate feature maps. Moreover, we propose a group-wise
feature approximation strategy which is highly flexible and accurate.
Experiments on ImageNet classification demonstrate the superior performance of
the proposed model, named Group-Net, over various popular architectures. In
particular, with binary weights and activations, we outperform the previous
best binary neural network in accuracy while reducing computational complexity
by more than a factor of 5 on ImageNet with ResNet-18 and ResNet-50.
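
As an illustration of the branch-aggregation idea, the following hypothetical
PyTorch sketch sums the outputs of several homogeneous binary branches to
approximate one full-precision convolutional feature map; the branch count,
sum aggregation, and layer shapes are assumptions, not the paper's exact
design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def binarize(x):
    """Sign binarization to {-1, +1}; in training this would be paired with
    a straight-through estimator as sketched earlier."""
    return torch.sign(x)

class BinaryBranchBlock(nn.Module):
    """Sums num_branches homogeneous binary 3x3-conv branches so that the
    aggregate approximates one full-precision feature map (num_branches and
    sum aggregation are illustrative assumptions)."""

    def __init__(self, channels, num_branches=4):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
            for _ in range(num_branches)
        )

    def forward(self, x):
        xb = binarize(x)  # binary (low-precision) activations
        # Each branch convolves with binarized weights; summing the branch
        # outputs implicitly reconstructs the full-precision features.
        return sum(F.conv2d(xb, binarize(conv.weight), padding=1)
                   for conv in self.branches)

# Usage: one block over a 16-channel feature map.
block = BinaryBranchBlock(channels=16, num_branches=4)
y = block(torch.randn(1, 16, 32, 32))
```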