TY - GEN
T1 - Sensitivity-Constrained Fourier Neural Operators for Forward and Inverse Problems in Parametric Differential Equations
AU - Behroozi, Abdolmehdi
AU - Shen, Chaopeng
AU - Kifer, Daniel
N1 - Publisher Copyright:
© 2025 13th International Conference on Learning Representations, ICLR 2025. All rights reserved.
PY - 2025
Y1 - 2025
N2 - Parametric differential equations of the form ∂u/∂t = f(u, x, t, p) are fundamental in science and engineering. While deep learning frameworks like the Fourier Neural Operator (FNO) efficiently approximate differential equation solutions, they struggle with inverse problems, sensitivity calculations ∂u/∂p, and concept drift. We address these challenges by introducing a novel sensitivity loss regularizer, demonstrated through Sensitivity-Constrained Fourier Neural Operators (SC-FNO). Our approach maintains high accuracy for solution paths and outperforms both standard FNO and FNO with Physics-Informed Neural Network regularization. SC-FNO exhibits superior performance in parameter inversion tasks, accommodates more complex parameter spaces (tested with up to 82 parameters), reduces training data requirements, and decreases training time while maintaining accuracy. These improvements apply across various differential equations and neural operators, enhancing their reliability without significant computational overhead (30%-130% extra training time per epoch). Models and selected experiment code are available at: https://github.com/AMBehroozi/SC_Neural_Operators.
UR - https://www.scopus.com/pages/publications/105010212726
UR - https://www.scopus.com/inward/citedby.url?scp=105010212726&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:105010212726
T3 - 13th International Conference on Learning Representations, ICLR 2025
SP - 29663
EP - 29684
BT - 13th International Conference on Learning Representations, ICLR 2025
PB - International Conference on Learning Representations, ICLR
T2 - 13th International Conference on Learning Representations, ICLR 2025
Y2 - 24 April 2025 through 28 April 2025
ER -