Paper Title:

On permutation symmetries in Bayesian neural network posteriors: a variational perspective

Published on:

16 October 2023

Primary Category:

Machine Learning

Paper Authors:

Simone Rossi,

Ankit Singh,

Thomas Hannagan

• Recent work shows that gradient descent solutions in neural networks are linearly connected under permutation symmetries.

• This paper extends the concepts of marginalized loss barriers and solution interpolation to Bayesian neural networks (BNNs).

• The authors propose an algorithm that aligns the distributions of two BNN solutions using permutation matrices.

• Experiments show nearly zero marginalized loss barriers when BNN solutions are aligned this way.

• The alignment problem is framed as combinatorial optimization, solved via an approximation to the sum of bilinear assignment problems.

Permutation symmetries enable linear connectivity in Bayesian neural networks

This paper extends recent work showing that gradient descent solutions in neural networks are linearly connected when permutation symmetries are accounted for. The authors propose an algorithm to align the distributions of Bayesian neural network solutions, finding nearly zero marginalized loss barriers after alignment.
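To make the alignment idea concrete, here is a minimal sketch, not the paper's actual algorithm: if each hidden unit of a variational BNN layer carries a diagonal Gaussian over its incoming weights, one can match the units of two solutions by solving a linear assignment over per-unit 2-Wasserstein distances. The function name and this simplified cost are illustrative assumptions; the paper itself uses an approximation to a sum of bilinear assignment problems.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def align_gaussian_units(mu_a, var_a, mu_b, var_b):
    """Illustrative sketch: permute B's hidden units to match A's.

    Each row of (mu, var) parameterizes a diagonal Gaussian over one
    unit's incoming weights. The cost between unit i of A and unit j
    of B is the squared 2-Wasserstein distance between the two
    Gaussians; a plain linear assignment then picks the matching.
    This is a hypothetical simplification of the paper's
    sum-of-bilinear-assignment objective.
    """
    # W2^2 between diagonal Gaussians N(mu_i, var_i) and N(mu_j, var_j):
    # ||mu_i - mu_j||^2 + ||sqrt(var_i) - sqrt(var_j)||^2
    d_mu = ((mu_a[:, None, :] - mu_b[None, :, :]) ** 2).sum(-1)
    d_sd = ((np.sqrt(var_a)[:, None, :] - np.sqrt(var_b)[None, :, :]) ** 2).sum(-1)
    cost = d_mu + d_sd
    _, cols = linear_sum_assignment(cost)
    return cols  # cols[i] = index of B's unit matched to A's unit i
```

If B is an exact unit-permutation of A, the recovered assignment undoes that permutation, so `mu_b[cols]` coincides with `mu_a`; in practice the two solutions only match approximately, which is where the marginalized loss barrier is measured.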

Related Papers:

Neural network linear connectivity modulo permutation

Using symmetries to improve neural network PDE solvers

Learning layer equivariances with gradients

Symmetry-guided training for quantum neural networks

Learning non-linear functions in two-layer neural networks with a single gradient step

Understanding Hessian alignment for domain generalization
