Training Set Optimization with Uncertainty Quantification for Machine Learning Models of Electromagnetic Structures

Yiliang Guo, Osama Waqar Bhatti, Madhavan Swaminathan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Neural-network surrogate models for electromagnetic (EM) simulations save computational and design time. Introducing uncertainty estimates into deterministic prediction models provides insight into the model's reliability and confidence. However, gathering training data for such models is a time- and resource-intensive task. In this paper, we introduce a method that harnesses insights from confidence bounds to reduce the training set size required to train a model with reasonable accuracy and latency. Using a high-speed differential via structure, we show that the proposed method requires 35% fewer training samples, with a slight trade-off in accuracy.
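The abstract's idea of using confidence bounds to shrink the training set is, in spirit, uncertainty-guided sampling. The paper's exact selection rule and model are not given here; the sketch below only illustrates the generic pattern with a bootstrap-ensemble spread as the confidence proxy and a toy function standing in for an EM solver response (all names and parameters are illustrative assumptions).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Toy stand-in for an EM response surface (e.g., |S21| vs. geometry parameters).
X_pool = rng.uniform(0, 1, size=(500, 2))
y_pool = np.sin(6 * X_pool[:, 0]) * np.cos(4 * X_pool[:, 1])

# Start from a small randomly chosen seed set.
labeled = list(rng.choice(len(X_pool), size=20, replace=False))

for _ in range(5):  # a few acquisition rounds
    model = RandomForestRegressor(n_estimators=50, random_state=0)
    model.fit(X_pool[labeled], y_pool[labeled])
    # Per-tree predictions give a cheap ensemble spread (confidence proxy).
    preds = np.stack([t.predict(X_pool) for t in model.estimators_])
    std = preds.std(axis=0)
    std[labeled] = -np.inf  # never re-select already-labeled points
    # Query the most uncertain pool samples next (each simulates a new EM run).
    labeled.extend(np.argsort(std)[-10:].tolist())

print(len(labeled))  # 70 labeled samples instead of the full 500-point sweep
```

The design choice is that each acquisition round spends simulation budget only where the surrogate is least confident, which is how a reduction such as the reported 35% could arise relative to a uniform sweep.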

Original language: English (US)
Title of host publication: 2022 IEEE Electrical Design of Advanced Packaging and Systems, EDAPS 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665491945
DOIs
State: Published - 2022
Event: 2022 IEEE Electrical Design of Advanced Packaging and Systems, EDAPS 2022 - Urbana, United States
Duration: Dec 12, 2022 – Dec 14, 2022

Publication series

Name: IEEE Electrical Design of Advanced Packaging and Systems Symposium
Volume: 2022-December
ISSN (Print): 2151-1225
ISSN (Electronic): 2151-1233

Conference

Conference: 2022 IEEE Electrical Design of Advanced Packaging and Systems, EDAPS 2022
Country/Territory: United States
City: Urbana
Period: 12/12/22 – 12/14/22

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Electrical and Electronic Engineering
  • Automotive Engineering
  • General Computer Science
