
Neural network linear connectivity modulo permutation

Published on: 9 April 2024

Primary Category: Machine Learning

Paper Authors: Ekansh Sharma, Devin Kwok, Tom Denton, Daniel M. Roy, David Rolnick, Gintare Karolina Dziugaite


Key Details

Refines arguments about the role of permutation symmetry in non-convex loss landscapes

Defines weak, strong, and simultaneous weak notions of linear connectivity modulo permutation (the loss-barrier notion behind these definitions is sketched after this list)

Shows that a single permutation can align entire optimization trajectories and iterative-pruning sequences

Provides the first evidence for strong connectivity in wide networks
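
For reference, a common definition of the loss barrier between two parameter vectors (standard in the linear mode connectivity literature; the notation below is illustrative, not quoted from the paper) is

B(\theta_A, \theta_B) = \max_{\lambda \in [0,1]} \Big[ \mathcal{L}\big((1-\lambda)\theta_A + \lambda\theta_B\big) - \big((1-\lambda)\,\mathcal{L}(\theta_A) + \lambda\,\mathcal{L}(\theta_B)\big) \Big]

Two networks are then linearly connected modulo permutation if some permutation \pi of hidden units makes B(\theta_A, \pi(\theta_B)) approximately zero. Roughly, the weak notion allows a different permutation for each pair of networks, while the strong notion asks for a single permutation per network that works against all the others simultaneously.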

AI generated summary


This paper refines arguments that permutation symmetry is the sole source of non-convexity in neural network loss landscapes. It introduces and distinguishes weak, strong, and simultaneous weak notions of linear connectivity modulo permutation. Experiments show that a single permutation can simultaneously align entire optimization trajectories and sequences of iteratively pruned networks. The paper also presents evidence that the barrier height when interpolating among three networks decreases with network width, suggesting that strong connectivity may be achievable in sufficiently wide networks.
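
The following is a minimal NumPy sketch (not the authors' code) of how such a barrier is typically measured: it trains nothing, just evaluates a toy two-layer MLP along the straight line between two weight settings and shows how a hidden-unit permutation is applied before interpolating. The data, architecture, and random permutation are assumptions for illustration; in practice the permutation would be chosen by a weight- or activation-matching algorithm to minimize the barrier rather than sampled at random.

import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative assumption, not from the paper).
X = rng.normal(size=(256, 10))
y = np.sin(X @ rng.normal(size=(10, 1)))

def init_mlp(width=64):
    # Random two-layer ReLU MLP standing in for an independently trained network.
    return {
        "W1": rng.normal(scale=0.3, size=(10, width)),
        "b1": np.zeros(width),
        "W2": rng.normal(scale=0.3, size=(width, 1)),
        "b2": np.zeros(1),
    }

def loss(params):
    # Mean squared error of the MLP on the toy data.
    h = np.maximum(X @ params["W1"] + params["b1"], 0.0)
    pred = h @ params["W2"] + params["b2"]
    return float(np.mean((pred - y) ** 2))

def permute_hidden(params, perm):
    # Re-order hidden units; this changes the parameters but not the function.
    return {
        "W1": params["W1"][:, perm],
        "b1": params["b1"][perm],
        "W2": params["W2"][perm, :],
        "b2": params["b2"],
    }

def interpolate(p_a, p_b, lam):
    return {k: (1 - lam) * p_a[k] + lam * p_b[k] for k in p_a}

def barrier(p_a, p_b, num_points=25):
    # Worst excess loss along the linear path, relative to the linear
    # interpolation of the two endpoint losses.
    lams = np.linspace(0.0, 1.0, num_points)
    path = [loss(interpolate(p_a, p_b, lam)) for lam in lams]
    baseline = [(1 - lam) * path[0] + lam * path[-1] for lam in lams]
    return max(p - b for p, b in zip(path, baseline))

model_a, model_b = init_mlp(), init_mlp()
perm = rng.permutation(model_b["W1"].shape[1])  # stand-in for a matched permutation
print("barrier A vs. B          :", barrier(model_a, model_b))
print("barrier A vs. permuted B :", barrier(model_a, permute_hidden(model_b, perm)))

With a random permutation the barrier will generally not shrink; the paper's notions concern permutations found by alignment, and whether one such permutation can serve an entire trajectory (simultaneous weak connectivity) or all networks at once (strong connectivity).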

