Abstract
Recent progress towards universal machine-learned interatomic potentials holds considerable promise for materials discovery. Yet the accuracy of these potentials for predicting phase stability may still be limited. In contrast, cluster expansions provide accurate phase stability predictions but are computationally demanding to parameterize from first principles, especially for structures of low dimension or with a large number of components, such as interfaces or multimetal catalysts. We overcome this trade-off via transfer learning. Using Bayesian inference, we incorporate prior statistical knowledge from machine-learned and physics-based potentials, enabling us to sample the most informative configurations and to efficiently fit first-principles cluster expansions. This algorithm is tested on Pt:Ni, showing robust convergence of the mixing energies as a function of sample size with reduced statistical fluctuations.
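The workflow sketched in the abstract — fit a prior over cluster-expansion coefficients from a cheap surrogate potential, update it by Bayesian inference with a few first-principles samples, and then pick the next configuration by an information criterion — can be illustrated compactly. The Python sketch below is not the paper's implementation: the Gaussian-prior Bayesian ridge form, the variance-based acquisition rule, and all names (`X_surrogate`, `w_prior`, `X_dft`, etc.) are illustrative assumptions on synthetic data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: rows of X are configurations, columns are cluster
# correlation functions; w holds effective cluster interactions (ECIs).
n_clusters = 8
w_true = rng.normal(0, 1, size=n_clusters)            # unknown "true" ECIs

# Step 1: many cheap surrogate (machine-learned potential) energies,
# slightly biased relative to the true model.
X_surrogate = rng.uniform(-1, 1, size=(200, n_clusters))
y_mlip = X_surrogate @ (w_true + 0.1 * rng.normal(size=n_clusters))
w_prior, *_ = np.linalg.lstsq(X_surrogate, y_mlip, rcond=None)  # prior mean

# Step 2: Bayesian update with a handful of first-principles samples.
# For prior N(w_prior, tau^2 I) and noise N(0, sigma^2):
#   Sigma_post = (X^T X / sigma^2 + I / tau^2)^{-1}
#   mu_post    = Sigma_post (X^T y / sigma^2 + w_prior / tau^2)
sigma2, tau2 = 0.01, 0.5
X_dft = rng.uniform(-1, 1, size=(15, n_clusters))     # few "DFT" configs
y_dft = X_dft @ w_true + np.sqrt(sigma2) * rng.normal(size=15)

Sigma_post = np.linalg.inv(X_dft.T @ X_dft / sigma2 + np.eye(n_clusters) / tau2)
mu_post = Sigma_post @ (X_dft.T @ y_dft / sigma2 + w_prior / tau2)

# Step 3: choose the next configuration to compute first-principles
# by maximizing posterior predictive variance x^T Sigma_post x
# (a simple stand-in for "sampling the most informative configurations").
candidates = rng.uniform(-1, 1, size=(500, n_clusters))
pred_var = np.einsum("ij,jk,ik->i", candidates, Sigma_post, candidates)
next_config = candidates[np.argmax(pred_var)]

print("prior mean error:    ", np.linalg.norm(w_prior - w_true))
print("posterior mean error:", np.linalg.norm(mu_post - w_true))
```

Under these assumptions, the surrogate-derived prior shrinks the posterior toward a physically reasonable region, so far fewer first-principles samples are needed before the coefficient estimates stabilize — consistent with the reduced statistical fluctuations the abstract reports for Pt:Ni.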
| Field | Value |
| --- | --- |
| Original language | English (US) |
| Article number | 113073 |
| Journal | Computational Materials Science |
| Volume | 242 |
| DOIs | |
| State | Published - Jun 2024 |
All Science Journal Classification (ASJC) codes
- General Computer Science
- General Chemistry
- General Materials Science
- Mechanics of Materials
- General Physics and Astronomy
- Computational Mathematics