TY - GEN
T1 - Stochastic Functional Verification of DNN Design through Progressive Virtual Dataset Generation
AU - Choi, Jinhang
AU - Irick, Kevin M.
AU - Hardin, Justin
AU - Qiu, Weichao
AU - Yuille, Alan
AU - Sampson, Jack
AU - Narayanan, Vijaykrishnan
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/4/26
Y1 - 2018/4/26
N2 - Deep Neural Networks (DNNs) have emerged as state-of-the-art solutions for complex intelligence problems. DNNs derive their predictive power by learning from millions of training examples in either a supervised or semi-supervised fashion. A critical aspect of the DNN system design procedure is therefore the collection of large annotated training datasets that exhibit high coverage of the problem space. While data synthesis and annotation techniques have been proposed to mitigate the burden of acquiring large datasets, these methods do not quantify the usefulness of each generated dataset or its subsequent impact on training effort. In this work, we establish parallels between the autonomous design of DNNs for machine vision applications and the task of functionally verifying a hardware design. Analogous to automatic test vector generation, we propose a technique that progressively generates training datasets from virtual synthetic models. Furthermore, we propose an automated DNN design framework that, drawing on insights from functional verification, stochastically maximizes training coverage while minimizing the number of training and validation cycles.
UR - https://www.scopus.com/pages/publications/85057072628
U2 - 10.1109/ISCAS.2018.8351686
DO - 10.1109/ISCAS.2018.8351686
M3 - Conference contribution
AN - SCOPUS:85057072628
T3 - Proceedings - IEEE International Symposium on Circuits and Systems
BT - 2018 IEEE International Symposium on Circuits and Systems, ISCAS 2018 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2018 IEEE International Symposium on Circuits and Systems, ISCAS 2018
Y2 - 27 May 2018 through 30 May 2018
ER -