A critical evaluation of uncertainty estimation in neural networks
1 August 2021
Marian Anghel, Patrick Kelly, Nicolas Hengartner
Abstract
Quantifying the predictive uncertainty of Neural Network (NN) models remains a difficult, unsolved problem, especially since the ground truth is usually not available. In this work we evaluate several regression uncertainty estimation models and discuss their accuracy using training sets where the uncertainty is known exactly. We compare three regression models, a homoscedastic model, a heteroscedastic model, and a quantile model, and show that: while all models can learn an accurate estimate of the response, accurate estimation of the uncertainty is very difficult; the quantile model has the best performance in estimating uncertainty; model bias is confounded with uncertainty, and the two are very difficult to disentangle when only one measurement per training point is available; improved accuracy of the estimated uncertainty is possible, but the experimental cost of learning uncertainty is very large, since it requires multiple estimates of the response almost everywhere in the input space.
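The heteroscedastic and quantile models mentioned above are typically trained with the Gaussian negative log-likelihood and the pinball loss, respectively. The sketch below (a minimal NumPy illustration of these standard losses, not the authors' code) shows both:

```python
import numpy as np

def gaussian_nll(y, mu, log_var):
    """Heteroscedastic Gaussian negative log-likelihood: the network
    predicts both a mean mu(x) and a log-variance log_var(x), so the
    noise level is allowed to vary across the input space."""
    return 0.5 * (log_var + (y - mu) ** 2 / np.exp(log_var)
                  + np.log(2.0 * np.pi))

def pinball_loss(y, q_pred, tau):
    """Quantile (pinball) loss for a target quantile level tau in (0, 1).
    Minimizing it in expectation drives q_pred toward the tau-quantile
    of y given x; predicted quantile pairs then give uncertainty bands."""
    diff = y - q_pred
    return np.maximum(tau * diff, (tau - 1.0) * diff)
```

The pinball loss is asymmetric: for large tau, under-prediction is penalized more heavily than over-prediction, which is what pulls the estimate toward the upper quantile rather than the median.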
Conference Presentation
© (2021) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Marian Anghel, Patrick Kelly, and Nicolas Hengartner "A critical evaluation of uncertainty estimation in neural networks", Proc. SPIE 11843, Applications of Machine Learning 2021, 118430H (1 August 2021); https://doi.org/10.1117/12.2594739
KEYWORDS
Neural networks
Performance modeling
Error analysis
