TY - GEN
T1 - AutoDEUQ: Automated Deep Ensemble with Uncertainty Quantification
T2 - 26th International Conference on Pattern Recognition, ICPR 2022
AU - Egele, Romain
AU - Maulik, Romit
AU - Raghavan, Krishnan
AU - Lusch, Bethany
AU - Guyon, Isabelle
AU - Balaprakash, Prasanna
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Deep neural networks are powerful predictors for a variety of tasks. However, they do not capture uncertainty directly. Using neural network ensembles to quantify uncertainty is competitive with approaches based on Bayesian neural networks while benefiting from better computational scalability. However, building ensembles of neural networks is a challenging task because, in addition to choosing the right neural architecture or hyperparameters for each member of the ensemble, there is an added cost of training each model. To address this issue, we propose AutoDEUQ, an automated approach for generating an ensemble of deep neural networks. Our approach leverages joint neural architecture and hyperparameter search to generate ensembles. We use the law of total variance to decompose the predictive variance of deep ensembles into aleatoric (data) and epistemic (model) uncertainties. We show that AutoDEUQ outperforms probabilistic backpropagation, Monte Carlo dropout, deep ensemble, distribution-free ensembles, and hyper ensemble methods on a number of regression benchmarks.
AB - Deep neural networks are powerful predictors for a variety of tasks. However, they do not capture uncertainty directly. Using neural network ensembles to quantify uncertainty is competitive with approaches based on Bayesian neural networks while benefiting from better computational scalability. However, building ensembles of neural networks is a challenging task because, in addition to choosing the right neural architecture or hyperparameters for each member of the ensemble, there is an added cost of training each model. To address this issue, we propose AutoDEUQ, an automated approach for generating an ensemble of deep neural networks. Our approach leverages joint neural architecture and hyperparameter search to generate ensembles. We use the law of total variance to decompose the predictive variance of deep ensembles into aleatoric (data) and epistemic (model) uncertainties. We show that AutoDEUQ outperforms probabilistic backpropagation, Monte Carlo dropout, deep ensemble, distribution-free ensembles, and hyper ensemble methods on a number of regression benchmarks.
UR - http://www.scopus.com/inward/record.url?scp=85143623114&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85143623114&partnerID=8YFLogxK
U2 - 10.1109/ICPR56361.2022.9956231
DO - 10.1109/ICPR56361.2022.9956231
M3 - Conference contribution
AN - SCOPUS:85143623114
T3 - Proceedings - International Conference on Pattern Recognition
SP - 1908
EP - 1914
BT - 2022 26th International Conference on Pattern Recognition, ICPR 2022
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 21 August 2022 through 25 August 2022
ER -