Conference Paper, 2021

Beyond First-Order Uncertainty Estimation with Evidential Models for Open-World Recognition

Abstract

In this paper, we tackle the challenge of jointly quantifying in-distribution and out-of-distribution (OOD) uncertainties. We introduce KLoS, a KL-divergence measure defined on the class-probability simplex. By leveraging the second-order uncertainty representation provided by evidential models, KLoS captures more than existing first-order uncertainty measures such as predictive entropy. We design an auxiliary neural network, KLoSNet, to learn a refined measure directly aligned with the evidential training objective. Experiments show that KLoSNet acts as a class-wise density estimator and outperforms current uncertainty measures in the realistic setting where no OOD data is available during training. We also report comparisons in the presence of OOD training samples, which shed new light on the impact of the proximity of these samples to the OOD test data.
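To give intuition for a second-order measure of this kind, the sketch below scores a sample by the closed-form KL divergence between the Dirichlet distribution output by an evidential model and a prototype Dirichlet peaked on the predicted class. This is a minimal illustration under stated assumptions, not the authors' implementation: the helper names `dirichlet_kl` and `klos_score` and the prototype concentration `tau` are hypothetical choices made for this example.

```python
import numpy as np
from scipy.special import gammaln, digamma

def dirichlet_kl(alpha, beta):
    """Closed-form KL( Dir(alpha) || Dir(beta) )."""
    a0, b0 = alpha.sum(), beta.sum()
    return (gammaln(a0) - gammaln(alpha).sum()
            - gammaln(b0) + gammaln(beta).sum()
            + ((alpha - beta) * (digamma(alpha) - digamma(a0))).sum())

def klos_score(alpha, tau=10.0):
    """Illustrative KLoS-style score: KL between the model's output
    Dirichlet and a prototype Dirichlet concentrated on the predicted
    class. `tau` (prototype concentration) is an assumed value."""
    y_hat = int(np.argmax(alpha))   # predicted class
    gamma = np.ones_like(alpha)     # flat concentration on other classes
    gamma[y_hat] = tau              # peaked on the predicted class
    return dirichlet_kl(alpha, gamma)

# Toy example: strong evidence for one class vs. little evidence anywhere
alpha_confident = np.array([1.0, 1.0, 40.0])
alpha_flat = np.array([1.1, 1.0, 1.2])
print(klos_score(alpha_confident))  # small score: close to the prototype
print(klos_score(alpha_flat))       # large score: uncertain sample
```

On this toy input, the confident prediction yields a much smaller score than the flat one, matching the intended reading of such a measure: low values for certain in-distribution samples, high values for uncertain or OOD ones.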
Main file: UDL2021-paper-062.pdf (1.8 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03347628, version 1 (17-09-2021)

Identifiers

  • HAL Id: hal-03347628, version 1

Cite

Charles Corbière, Marc Lafon, Nicolas Thome, Matthieu Cord, Patrick Pérez. Beyond First-Order Uncertainty Estimation with Evidential Models for Open-World Recognition. ICML 2021 Workshop on Uncertainty and Robustness in Deep Learning, Sep 2021, Virtual, Austria. ⟨hal-03347628⟩