

Discriminant Distance-Aware Representation on Deterministic Uncertainty Quantification Methods

Jiaxin Zhang · Kamalika Das · Sricharan Kumar

MR1 & MR2 - Number 20
Sat 4 May 6 a.m. PDT — 8:30 a.m. PDT


Uncertainty estimation is a crucial aspect of deploying dependable deep learning models in safety-critical systems. In this study, we introduce a novel and efficient method for deterministic uncertainty estimation called Discriminant Distance-Awareness Representation (DDAR). Our approach constructs a DNN model that incorporates a set of prototypes in its latent representations, enabling us to extract discriminative feature information from the input data. By leveraging a distinction maximization layer over optimal trainable prototypes, DDAR learns a discriminant distance-awareness representation. We demonstrate that DDAR overcomes feature collapse by relaxing the Lipschitz constraint that hinders the practicality of deterministic uncertainty method (DUM) architectures. Our experiments show that DDAR is a flexible and architecture-agnostic method that can be easily integrated as a pluggable layer with distance-sensitive metrics, outperforming state-of-the-art uncertainty estimation methods on multiple benchmark problems.
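The core mechanism the abstract describes, distance-aware logits computed against a set of trainable prototypes in the latent space, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names (`distinction_maximization`, `uncertainty`), the choice of Euclidean distance, and the toy data are all illustrative assumptions; in the paper the prototypes would be trained jointly with the network rather than fixed.

```python
import numpy as np

def distinction_maximization(features, prototypes):
    # Illustrative distance-based layer: logits are negative Euclidean
    # distances between each latent feature vector and each prototype,
    # so larger logit means "closer to that class prototype".
    # features: (batch, d), prototypes: (k, d)
    diffs = features[:, None, :] - prototypes[None, :, :]  # (batch, k, d)
    dists = np.linalg.norm(diffs, axis=-1)                 # (batch, k)
    return -dists

def uncertainty(logits):
    # Distance-sensitive uncertainty: distance to the nearest prototype.
    # Inputs far from every prototype receive high uncertainty.
    return -logits.max(axis=1)

rng = np.random.default_rng(0)
prototypes = rng.normal(size=(3, 4))       # 3 hypothetical classes, 4-dim latent space
in_dist = prototypes[0] + 0.05 * rng.normal(size=(2, 4))  # near a prototype
ood = 10.0 + rng.normal(size=(2, 4))                       # far from all prototypes

u_in = uncertainty(distinction_maximization(in_dist, prototypes))
u_ood = uncertainty(distinction_maximization(ood, prototypes))
```

Because uncertainty grows with distance in the latent space, out-of-distribution inputs are flagged without Monte Carlo sampling or ensembles, which is the appeal of deterministic uncertainty methods; the feature-collapse problem the abstract mentions arises when the encoder maps OOD inputs onto in-distribution regions of this space, erasing the distance signal.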
