
Deep neural network prediction reliability

Published on: 8 February 2024

Primary Category: Machine Learning

Paper Authors: Lahav Dabah, Tom Tirer

Key Details

Calibration aligns confidence with correctness likelihood

Conformal prediction produces prediction sets with coverage guarantees

Temperature scaling calibration negatively impacts adaptive conformal prediction by enlarging prediction sets

Mathematical analysis attributes this behavior to properties of the temperature scaling procedure

Applying conformal prediction before calibration is recommended

AI generated summary

This paper investigates two techniques for assessing the reliability of deep neural network classifiers: calibration, which adjusts prediction confidence to better match the likelihood of correctness, and conformal prediction, which produces a set of predictions with a marginal coverage guarantee. The key finding is that calibrating confidence scores via temperature scaling can negatively impact adaptive conformal prediction methods by increasing prediction set sizes. Extensive experiments reveal this surprising phenomenon, and a mathematical analysis explains it through properties of the temperature scaling procedure. The paper therefore suggests applying adaptive conformal prediction methods before confidence calibration, in order to benefit from enhanced conditional coverage.
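
To make the interplay concrete, here is a minimal self-contained sketch (not the authors' code) of the two procedures the paper studies: temperature scaling, which divides logits by a temperature T, and an APS-style adaptive conformal prediction step that builds prediction sets from cumulative sorted softmax probabilities. The function names, the synthetic logits, and the choice T = 2 are illustrative assumptions, and the APS-style score is one common adaptive conformal score rather than necessarily the exact variant analyzed in the paper.

```python
# Illustrative sketch (not the paper's code): temperature scaling followed by an
# APS-style adaptive conformal prediction step, run on synthetic logits.
import numpy as np

rng = np.random.default_rng(0)


def softmax(logits, temperature=1.0):
    """Softmax of logits divided by a temperature; T > 1 flattens the distribution."""
    z = logits / temperature
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)


def aps_scores(probs, labels):
    """APS conformity score: cumulative probability of classes ranked at or above the true class."""
    order = np.argsort(-probs, axis=1)                   # classes sorted by descending probability
    cumsum = np.cumsum(np.take_along_axis(probs, order, axis=1), axis=1)
    ranks = np.argmax(order == labels[:, None], axis=1)  # position of the true class in the ranking
    return cumsum[np.arange(len(labels)), ranks]


def aps_prediction_sets(probs, threshold):
    """Smallest set of top-ranked classes whose cumulative probability reaches the threshold."""
    order = np.argsort(-probs, axis=1)
    cumsum = np.cumsum(np.take_along_axis(probs, order, axis=1), axis=1)
    sizes = np.minimum((cumsum < threshold).sum(axis=1) + 1, probs.shape[1])
    return [order[i, :sizes[i]] for i in range(len(probs))], sizes


# Synthetic "classifier" outputs: 10-class logits where the true class gets a +3 boost.
n_cal, n_test, n_classes = 1000, 1000, 10
labels_cal = rng.integers(0, n_classes, n_cal)
labels_test = rng.integers(0, n_classes, n_test)
logits_cal = rng.normal(size=(n_cal, n_classes))
logits_cal[np.arange(n_cal), labels_cal] += 3.0
logits_test = rng.normal(size=(n_test, n_classes))
logits_test[np.arange(n_test), labels_test] += 3.0

alpha = 0.1  # target 90% marginal coverage

for T in (1.0, 2.0):  # T = 1: raw softmax scores; T = 2: a calibration-style temperature > 1
    probs_cal = softmax(logits_cal, T)
    probs_test = softmax(logits_test, T)

    # Conformal calibration: empirical quantile of APS scores on a held-out calibration split.
    scores = aps_scores(probs_cal, labels_cal)
    q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
    threshold = np.quantile(scores, q_level, method="higher")

    sets, sizes = aps_prediction_sets(probs_test, threshold)
    coverage = np.mean([labels_test[i] in sets[i] for i in range(n_test)])
    print(f"T={T}: empirical coverage={coverage:.3f}, mean prediction-set size={sizes.mean():.2f}")
```

Raising the temperature flattens the softmax distribution, which is the mechanism the summary connects to larger adaptive prediction sets; comparing the printed mean set sizes across the two temperatures illustrates this, while the empirical coverage should stay near the 90% target in both cases thanks to the marginal guarantee.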
