Generalized Ternary Connect: End-to-End Learning and Compression of Multiplication-Free Deep Neural Networks

Citation

S. Parajuli, A. Raghavan, and S. Chai, "Generalized Ternary Connect: End-to-End Learning and Compression of Multiplication-Free Deep Neural Networks," Innovative Applications of Artificial Intelligence (AAAI-19), Honolulu, HI, January 27 – February 1, 2019. Also available on the arXiv online archive.

Abstract

The use of deep neural networks in edge computing devices hinges on the balance between accuracy and computational complexity. Ternary Connect (TC) (Lin et al., 2015) addresses this issue by restricting the parameters to three levels, −1, 0, and +1, thus eliminating multiplications in the forward pass of the network during prediction. We propose Generalized Ternary Connect (GTC), which allows an arbitrary number of levels while still eliminating multiplications by restricting the parameters to integer powers of two. The primary contribution is that GTC learns the number of levels and their values for each layer, jointly with the weights of the network, in an end-to-end fashion. Experiments on MNIST and CIFAR-10 show that GTC naturally converges to an 'almost binary' network for deep classification networks (e.g., VGG-16) and deep variational auto-encoders, with negligible loss of classification accuracy and comparable visual quality of generated samples, respectively. GTC achieves superior compression and similar accuracy in comparison to several state-of-the-art methods for neural network compression. We conclude with simulations showing the potential benefits of GTC in hardware.
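The mechanism underlying the multiplication-free claim is that a weight constrained to a signed power of two turns each multiply into a sign flip plus a bit shift. The sketch below is a minimal illustration of that idea only, not the paper's learned end-to-end quantization scheme; the function name quantize_pow2 and the exponent range min_exp/max_exp are hypothetical choices for this example.

```python
import numpy as np

def quantize_pow2(w, min_exp=-8, max_exp=0):
    """Round each weight to the nearest signed power of two (or zero).

    Multiplying an activation by such a weight then reduces to a
    sign flip and a bit shift, eliminating hardware multiplies.
    """
    sign = np.sign(w)
    mag = np.abs(w)
    # Exponent of the nearest power of two, clipped to the allowed range.
    exp = np.clip(np.round(np.log2(np.maximum(mag, 2.0 ** (min_exp - 1)))),
                  min_exp, max_exp)
    q = sign * 2.0 ** exp
    # Weights too small to represent at min_exp collapse to zero.
    q[mag < 2.0 ** (min_exp - 1)] = 0.0
    return q

w = np.array([0.8, -0.3, 0.05, -0.001])
print(quantize_pow2(w))  # -> [ 1.     -0.25    0.0625  0.    ]
```

With min_exp=-1 and max_exp=0, the representable levels shrink toward the ternary set of TC; widening the range yields the multi-level, still multiplication-free, parameterization that GTC generalizes.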
