Sample-free Bayesian Learning for Quantized Neural Networks

Year
2020
Author(s)
Jiahao Su, Milan Cvitkovic, Furong Huang
Source
International Conference on Learning Representations (ICLR), 2020.
Url
https://openreview.net/pdf?id=rylVHR4FPB

Bayesian learning of model parameters in neural networks is important in scenarios where well-calibrated uncertainty estimates are desirable. In this paper, we propose Bayesian quantized networks (BQNs), quantized neural networks (QNNs) for which we learn a posterior distribution over their discrete parameters. We provide a set of efficient algorithms for learning and prediction in BQNs that avoid sampling the parameters or activations, which not only allows for differentiable learning in QNNs but also reduces the variance of the gradients. We evaluate BQNs on the MNIST, Fashion-MNIST, KMNIST, and CIFAR10 image classification datasets. Compared to bootstrap ensembles of QNNs (E-QNN), we demonstrate that BQNs achieve lower predictive errors and better-calibrated uncertainties (with less than 20% of the negative log-likelihood).
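
To make the sample-free idea concrete, below is a minimal sketch (not the paper's actual algorithm or released code) of analytic moment propagation through a single quantized linear layer. It assumes a mean-field categorical posterior over ternary weight values and a CLT-style Gaussian approximation of the pre-activations; the names `QUANT_LEVELS` and `layer_moments` are illustrative, not from the paper.

```python
# Hypothetical sketch: sample-free moment propagation through one quantized
# linear layer with a mean-field categorical posterior over ternary weights.
import torch

QUANT_LEVELS = torch.tensor([-1.0, 0.0, 1.0])  # assumed discrete weight values

def layer_moments(x_mean, x_var, logits):
    """Propagate input mean/variance analytically (no weight sampling).

    logits: (out_features, in_features, num_levels) categorical parameters.
    Returns mean and variance of the pre-activations under a CLT-style
    Gaussian approximation, assuming independent weights and inputs.
    """
    probs = torch.softmax(logits, dim=-1)            # posterior over levels
    w_mean = (probs * QUANT_LEVELS).sum(-1)          # E[W]
    w_sq = (probs * QUANT_LEVELS ** 2).sum(-1)       # E[W^2]
    w_var = w_sq - w_mean ** 2                       # Var[W]

    # For independent W and x: Var[Wx] = Var[W] E[x^2] + E[W]^2 Var[x],
    # summed over input units.
    out_mean = x_mean @ w_mean.t()
    out_var = (x_var + x_mean ** 2) @ w_var.t() + x_var @ (w_mean ** 2).t()
    return out_mean, out_var

# Usage: everything is differentiable in the logits, so gradients flow
# through learning and prediction without any sampling step.
x_mean = torch.randn(8, 784)                         # batch of input means
x_var = torch.zeros_like(x_mean)                     # deterministic inputs
logits = torch.zeros(256, 784, 3, requires_grad=True)  # uniform posterior init
mu, var = layer_moments(x_mean, x_var, logits)
mu.sum().backward()                                  # grads reach the logits
```

Because the categorical posterior is summarized by its first two moments rather than sampled, a single forward pass yields both a prediction and its uncertainty, which is the property the abstract attributes to BQNs.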

The code is here.