Permutation symmetries enable linear connectivity in Bayesian neural networks

Published on:

16 October 2023

Primary Category:

Machine Learning

Paper Authors:

Simone Rossi, Ankit Singh, Thomas Hannagan

Key Details

Recent work shows gradient descent solutions in NNs are linearly connected under permutation symmetries.

This paper extends loss barriers and solution interpolation to Bayesian NNs by introducing marginalized loss barriers.

The authors propose an algorithm to align distributions of two BNN solutions using permutation matrices.

Experiments show nearly zero marginalized loss barriers when aligning BNN solutions this way.

The alignment is framed as a combinatorial optimization problem, solved via an approximation to a sum of bilinear assignment problems (see the sketch below).
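
The sketch below is an illustrative, simplified take on this idea, not the authors' exact algorithm: it aligns a single layer of two mean-field Gaussian BNN posteriors by matching posterior means with a plain linear assignment problem. The layer shapes, the cost choice, and the helper name align_layer are assumptions made for the example.

```python
# Illustrative sketch (not the paper's exact algorithm): align the hidden units
# of one layer of model B with those of model A by matching the means (and,
# optionally, standard deviations) of their mean-field Gaussian posteriors.
import numpy as np
from scipy.optimize import linear_sum_assignment


def align_layer(mu_a, mu_b, sigma_a=None, sigma_b=None):
    """Return a permutation of B's hidden units that best matches A's.

    mu_a, mu_b: posterior means of the layer weights, shape (n_units, n_inputs).
    sigma_a, sigma_b: optional posterior standard deviations, same shape.
    """
    # Cost of matching unit i of A with unit j of B: squared distance
    # between their posterior parameters.
    cost = ((mu_a[:, None, :] - mu_b[None, :, :]) ** 2).sum(-1)
    if sigma_a is not None and sigma_b is not None:
        cost += ((sigma_a[:, None, :] - sigma_b[None, :, :]) ** 2).sum(-1)
    _, perm = linear_sum_assignment(cost)  # Hungarian algorithm
    if sigma_b is None:
        return perm, mu_b[perm], None
    return perm, mu_b[perm], sigma_b[perm]
```

In a deep network, permuting a layer's output units also permutes the input dimension of the next layer's weights, so a full alignment couples consecutive layers; that coupling is what makes the exact problem a sum of bilinear assignment problems rather than independent linear assignments like the one above.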

AI generated summary

This paper extends recent work showing that gradient descent solutions of neural networks are linearly connected once permutation symmetries are accounted for. The authors propose an algorithm that aligns the posterior distributions of two Bayesian neural network solutions with permutation matrices, finding nearly zero marginalized loss barriers between the aligned solutions.
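
As a companion to the summary, here is a minimal sketch of how such a marginalized loss barrier could be estimated along the linear path between two aligned mean-field Gaussian posteriors; the function names (marginalized_loss, loss_barrier) and the loss_fn callback are hypothetical placeholders, not the paper's code.

```python
# Minimal sketch, assuming aligned mean-field Gaussian posteriors
# (mu_a, sigma_a) and (mu_b, sigma_b) over the flattened network weights.
# loss_fn(weights) is a hypothetical stand-in for the network's loss on
# held-out data; this is illustrative, not the paper's implementation.
import numpy as np


def marginalized_loss(mu, sigma, loss_fn, n_samples=32, rng=None):
    """Monte Carlo estimate of E_{w ~ N(mu, diag(sigma^2))}[loss_fn(w)]."""
    rng = np.random.default_rng() if rng is None else rng
    samples = mu + sigma * rng.standard_normal((n_samples,) + mu.shape)
    return float(np.mean([loss_fn(w) for w in samples]))


def loss_barrier(mu_a, sigma_a, mu_b, sigma_b, loss_fn, n_points=11):
    """Largest excess loss along the linear path between the two posteriors."""
    alphas = np.linspace(0.0, 1.0, n_points)
    losses = []
    for a in alphas:
        mu = (1 - a) * mu_a + a * mu_b            # interpolate posterior means
        sigma = (1 - a) * sigma_a + a * sigma_b   # and standard deviations
        losses.append(marginalized_loss(mu, sigma, loss_fn))
    losses = np.array(losses)
    # Barrier: how far the interpolated loss rises above the straight line
    # between the endpoint losses; near zero indicates linear connectivity.
    baseline = (1 - alphas) * losses[0] + alphas * losses[-1]
    return float(np.max(losses - baseline))
```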
