2022 • Conference Paper

Orthogonal regularizers in deep learning: how to handle rectangular matrices?

Authors:
Massart, Estelle
Published in:
2022 26th International Conference on Pattern Recognition (ICPR)

Orthogonal regularizers typically promote column orthonormality of some matrix W ∈ ℝⁿˣᵖ by measuring the discrepancy between W⊤W and the identity in some matrix norm. This paper explores the behavior of these regularizers when W is horizontal (n < p), so that column orthonormality cannot be achieved. Our motivation comes from orthogonal regularization of feed-forward neural networks, where one wishes to regularize all (vertical and horizontal) weight matrices of the model.

One possible solution is to transpose horizontal matrices before regularization. We prove that transposition is useless for the squared Frobenius norm, as the corresponding regularizer simultaneously promotes orthonormality of the rows and of the columns of W. In contrast, we highlight important qualitative differences for more recent regularizers, including the MC (mutual coherence) and SRIP (spectral restricted isometry property) orthogonal regularizers. We conclude the paper with numerical results supporting our theoretical findings.
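To illustrate the two results above, here is a minimal NumPy sketch (not the paper's code): it checks numerically that the squared-Frobenius regularizer applied to W and to W⊤ differs only by the constant p − n, so transposing before regularization leaves the gradients unchanged, whereas an SRIP-style regularizer (spectral norm of W⊤W − I) behaves qualitatively differently under transposition.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 3, 5                          # horizontal matrix: n < p
W = rng.standard_normal((n, p))

def fro_orth_reg(M):
    """Squared-Frobenius orthogonal regularizer ||M^T M - I||_F^2."""
    k = M.shape[1]
    G = M.T @ M - np.eye(k)
    return np.sum(G ** 2)

r_direct = fro_orth_reg(W)           # regularize W (column orthonormality)
r_transposed = fro_orth_reg(W.T)     # transpose first (row orthonormality)

# The two values differ only by the constant p - n, so their gradients
# with respect to W coincide: transposition is useless for this norm.
print(r_direct - r_transposed, p - n)    # both equal 2.0 here

# SRIP-style regularizer: spectral norm ||M^T M - I||_2.
# For a row-orthonormal horizontal matrix, the transposed variant vanishes
# while the direct one is stuck at 1: here transposition does matter.
Q = np.eye(p)[:n]                    # n x p matrix with orthonormal rows
srip_direct = np.linalg.norm(Q.T @ Q - np.eye(p), 2)       # = 1.0
srip_transposed = np.linalg.norm(Q @ Q.T - np.eye(n), 2)   # = 0.0
print(srip_direct, srip_transposed)
```

The constant gap p − n in the Frobenius case follows from tr((W⊤W)²) = tr((WW⊤)²) and tr(W⊤W) = tr(WW⊤): expanding both squared norms, only the identity terms p and n differ. For SRIP, W⊤W has rank at most n < p, so W⊤W − I_p always has an eigenvalue −1 and the spectral norm can never drop below 1, which is the kind of qualitative difference the paper analyzes.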
