

Poster

Tight and Efficient Upper Bound on Spectral Norm of Convolutional Layers

Ekaterina Grishina · Mikhail Gorbunov · Maxim Rakhuba

Strong Double Blind: this paper was not made available on public preprint services during the review process.
Wed 2 Oct 1:30 a.m. PDT — 3:30 a.m. PDT

Abstract:

Controlling the spectral norm of convolutional layers has been shown to enhance generalization and robustness in CNNs, as well as training stability and the quality of generated samples in GANs. Existing methods for computing singular values either result in loose bounds or lack scalability with input and kernel sizes. In this paper, we obtain a new upper bound that is independent of the input size, differentiable, and efficiently computable during training. Through experiments, we demonstrate how this new bound can be used to improve the performance of convolutional architectures.
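The paper's bound itself is not reproduced here. For context only, the sketch below illustrates the standard alternative the abstract alludes to: a differentiable spectral-norm estimate for a convolutional layer obtained by power iteration on the convolution operator, which can then be penalized during training. The helper `conv_spectral_norm_estimate` and its parameters are hypothetical, and, unlike the bound proposed in the paper, this estimate is tied to a fixed input spatial size.

```python
# A minimal sketch, NOT the bound proposed in this paper: a differentiable
# spectral-norm estimate for a conv layer via power iteration on the
# convolution operator (as in standard spectral normalization). Unlike the
# paper's bound, this estimate depends on a fixed input spatial size.
import torch
import torch.nn.functional as F

def conv_spectral_norm_estimate(weight, input_shape, padding=1, n_iters=10):
    """Hypothetical helper: power-iteration estimate of the largest singular
    value of the convolution with `weight` acting on inputs of shape
    `input_shape` = (C_in, H, W), with zero padding and stride 1."""
    x = torch.randn(1, *input_shape)
    with torch.no_grad():
        for _ in range(n_iters):
            # One step of power iteration on A^T A, where A is the conv operator.
            y = F.conv2d(x, weight, padding=padding)
            x = F.conv_transpose2d(y, weight, padding=padding)
            x = x / (x.norm() + 1e-12)
    # Differentiate only through the final application of the operator.
    return F.conv2d(x, weight, padding=padding).norm()

# Usage: add the estimate as a penalty to control the layer's spectral norm.
conv = torch.nn.Conv2d(3, 16, kernel_size=3, padding=1)
sigma = conv_spectral_norm_estimate(conv.weight, (3, 32, 32))
(0.1 * sigma).backward()  # e.g. loss = task_loss + 0.1 * sigma
```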
