Studies leveraging pre-trained or fine-tuned FAIR-Chem models

Many papers have now used FAIR-Chem models to accelerate screening and discovery efforts and to enable new kinds of computational chemistry simulations. We highlight a selection here to give a sense of the breadth of what is possible and how the models have been used. Feel free to reach out (or submit a PR) if you have a study you would like included!
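Most of these studies drive a pre-trained model through the ASE calculator interface. For orientation, a minimal sketch is below, assuming the `OCPCalculator` ASE wrapper from `fairchem.core`; the checkpoint path is a placeholder for any downloaded FAIR-Chem checkpoint.

```python
from ase.build import fcc111, add_adsorbate
from ase.optimize import BFGS
from fairchem.core import OCPCalculator

# Toy Cu(111) slab with an O adsorbate at an fcc hollow site.
slab = fcc111("Cu", size=(2, 2, 3), vacuum=10.0)
add_adsorbate(slab, "O", height=1.2, position="fcc")

# Placeholder path -- substitute any pre-trained FAIR-Chem checkpoint.
slab.calc = OCPCalculator(checkpoint_path="path/to/checkpoint.pt", cpu=True)

# Relax with the ML potential in place of DFT.
BFGS(slab).run(fmax=0.05, steps=100)
print(slab.get_potential_energy())
```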

Accelerating computational catalysis

[1] Christopher Blais, Chao Xu, and Richard West. Uncertainty quantification of linear scaling, machine learning, and DFT derived thermodynamics for the catalytic partial oxidation of methane on rhodium. ChemRxiv, 2024.

[2] Adeesh Kolluru and John R Kitchin. AdsorbDiff: adsorbate placement via conditional denoising diffusion. arXiv preprint arXiv:2405.03962, 2024.

[3] Janice Lan, Aini Palizhati, Muhammed Shuaibi, Brandon M Wood, Brook Wander, Abhishek Das, Matt Uyttendaele, C Lawrence Zitnick, and Zachary W Ulissi. AdsorbML: a leap in efficiency for adsorption energy calculations using generalizable machine learning potentials. npj Computational Materials, 9(1):172, 2023.

[4] Brook Wander, Muhammed Shuaibi, John R Kitchin, Zachary W Ulissi, and C Lawrence Zitnick. CatTSunami: accelerating transition state energy calculations with pre-trained graph neural networks. arXiv preprint arXiv:2405.02078, 2024.
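The AdsorbML and AdsorbDiff papers above, for example, speed up adsorption energy calculations by relaxing many candidate placements with an ML potential and keeping the best one. A simplified sketch of that idea follows, reusing the placeholder calculator from the introduction; the real pipelines add smarter placement generation and DFT verification of the top candidates.

```python
from ase.build import fcc111, add_adsorbate
from ase.optimize import BFGS
from fairchem.core import OCPCalculator

# Placeholder checkpoint path, as in the introductory sketch.
calc = OCPCalculator(checkpoint_path="path/to/checkpoint.pt", cpu=True)

energies = {}
for site in ("ontop", "bridge", "fcc", "hcp"):
    slab = fcc111("Cu", size=(2, 2, 3), vacuum=10.0)
    add_adsorbate(slab, "O", height=1.5, position=site)
    slab.calc = calc
    BFGS(slab, logfile=None).run(fmax=0.05, steps=100)
    energies[site] = slab.get_potential_energy()

# The lowest relaxed energy approximates the best adsorption configuration.
print(min(energies, key=energies.get), min(energies.values()))
```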

Transfer/fine-tuning strategies for FAIR-Chem pre-trained checkpoints

[1] John Falk, Luigi Bonati, Pietro Novelli, Michele Parrinello, and Massimiliano Pontil. Transfer learning for atomistic simulations using GNNs and kernel mean embeddings. Advances in Neural Information Processing Systems, 2024.

[2] Adeesh Kolluru, Nima Shoghi, Muhammed Shuaibi, Siddharth Goyal, Abhishek Das, C Lawrence Zitnick, and Zachary Ulissi. Transfer learning using attentions across atomic systems with graph neural networks (TAAG). The Journal of Chemical Physics, 2022.

[3] Joseph Musielewicz, Xiaoxiao Wang, Tian Tian, and Zachary Ulissi. FINETUNA: fine-tuning accelerated molecular simulations. Machine Learning: Science and Technology, 3(3):03LT01, 2022.

[4] Kento Nishio, Kiyou Shibata, and Teruyasu Mizoguchi. Lightweight and high-precision materials property prediction using pre-trained graph neural networks and its application to a small dataset. Applied Physics Express, 17(3):037002, 2024.

[5] Nima Shoghi, Adeesh Kolluru, John R Kitchin, Zachary W Ulissi, C Lawrence Zitnick, and Brandon M Wood. From molecules to materials: pre-training large generalizable models for atomic property prediction. arXiv preprint arXiv:2310.16802, 2023.

[6] Bowen Wang, Chen Liang, Jiaze Wang, Furui Liu, Shaogang Hao, Dong Li, Jianye Hao, Guangyong Chen, Xiaolong Zou, and Pheng-Ann Heng. DR-Label: improving GNN models for catalysis systems by label deconstruction and reconstruction. arXiv preprint arXiv:2303.02875, 2023.

[7] Xiaoxiao Wang, Joseph Musielewicz, Richard Tran, Sudheesh Kumar Ethirajan, Xiaoyan Fu, Hilda Mera, John R Kitchin, Rachel C Kurchin, and Zachary W Ulissi. Generalization of graph-based active learning relaxation strategies across materials. Machine Learning: Science and Technology, 5(2):025018, 2024.
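The strategies above differ in detail, but several share a freeze-and-retrain pattern: keep most of a pre-trained backbone fixed and adapt a small part of the network to new data. Below is a deliberately generic PyTorch illustration of that pattern only; it is not the FAIR-Chem trainer API, which is configured through YAML files, and the backbone here is a toy stand-in.

```python
import torch
from torch import nn

# Stand-ins for a pre-trained backbone and a fresh output head. In
# practice the backbone weights would come from a checkpoint, e.g.
# backbone.load_state_dict(torch.load("ckpt.pt")) (hypothetical path).
backbone = nn.Sequential(nn.Linear(32, 64), nn.SiLU(), nn.Linear(64, 64))
head = nn.Linear(64, 1)

for p in backbone.parameters():
    p.requires_grad = False  # freeze the pre-trained layers

opt = torch.optim.Adam(head.parameters(), lr=1e-3)
x, y = torch.randn(8, 32), torch.randn(8, 1)  # stand-in fine-tuning batch
for _ in range(100):
    loss = nn.functional.mse_loss(head(backbone(x)), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```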

Transfer/fine-tuning applications for FAIR-Chem pre-trained checkpoints

[1] Christian M Clausen, Jan Rossmeisl, and Zachary W Ulissi. Adapting OC20-trained EquiformerV2 models for high-entropy materials. The Journal of Physical Chemistry C, 2024.

[2] Roman A Eremin, Innokentiy S Humonen, Pavel N Zolotarev, Inna V Medrish, Leonid E Zhukov, and Semen A Budennyy. Hybrid DFT/data-driven approach for searching for new quasicrystal approximants in Sc-X (X = Rh, Pd, Ir, Pt) systems. Crystal Growth & Design, 22(7):4570–4581, 2022.

[3] Aaron G Garrison, Javier Heras-Domingo, John R Kitchin, Gabriel dos Passos Gomes, Zachary W Ulissi, and Samuel M Blau. Applying large graph neural networks to predict transition metal complex energies using the tmQM_wB97MV data set. Journal of Chemical Information and Modeling, 63(24):7642–7654, 2023.

[4] Kuzma Khrabrov, Anton Ber, Artem Tsypin, Konstantin Ushenin, Egor Rumiantsev, Alexander Telepov, Dmitry Protasov, Ilya Shenbin, Anton Alekseev, Mikhail Shirokikh, and others. ∇²DFT: a universal quantum chemistry dataset of drug-like molecules and a benchmark for neural network potentials. arXiv preprint arXiv:2406.14347, 2024.

Catalyst discovery or optimization

[1] Abhishek Agarwal, Sriram Goverapet Srinivasan, and Beena Rai. Heusler alloys as catalysts for hydrogen production by ammonia decomposition: data-driven screening via graph neural networks. ChemRxiv, 2024.

[2] Kirby Broderick, Robert A Burnley, Andrew J Gellman, and John R Kitchin. Surface segregation studies in ternary noble metal alloys: comparing DFT and machine learning with experimental data. ChemPhysChem, e202400073, 2024.

[3] Richard Tran, Liqiang Huang, Yuan Zi, Shengguang Wang, Benjamin M Comer, Xuqing Wu, Stefan J Raaijman, Nishant K Sinha, Sajanikumari Sadasivan, Shibin Thundiyil, and others. Rational design of oxide catalysts for OER with OC22. arXiv preprint arXiv:2311.00784, 2023.

[4] Richard Tran, Duo Wang, Ryan Kingsbury, Aini Palizhati, Kristin Aslaug Persson, Anubhav Jain, and Zachary W Ulissi. Screening of bimetallic electrocatalysts for water purification with machine learning. The Journal of Chemical Physics, 2022.

[5] Brook Wander, Kirby Broderick, and Zachary W Ulissi. Catlas: an automated framework for catalyst discovery demonstrated for direct syngas conversion. Catalysis Science & Technology, 12(20):6256–6267, 2022.

Uncertainty quantification

[1] Joseph Musielewicz, Janice Lan, Matt Uyttendaele, and John R Kitchin. Rotationally invariant latent distances for uncertainty estimation of relaxed energy predictions by graph neural network potentials. arXiv preprint arXiv:2407.10844, 2024.

[2] Janghoon Ock, Tian Tian, John Kitchin, and Zachary Ulissi. Beyond independent error assumptions in large GNN atomistic models. The Journal of Chemical Physics, 2023.

[3] Muhammed Shuaibi, Saurabh Sivakumar, Rui Qi Chen, and Zachary W Ulissi. Enabling robust offline active learning for machine learning potentials using simple physics-based priors. Machine Learning: Science and Technology, 2(2):025007, 2021.
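The papers above develop principled estimators (latent-space distances, correlated-error models, physics-based priors). As a much simpler baseline, committee disagreement between independently trained checkpoints is often used as a rough confidence signal; a sketch with placeholder checkpoint paths, not the method of any paper above:

```python
import numpy as np
from ase.build import fcc111
from fairchem.core import OCPCalculator

# Placeholder paths -- any committee of independently trained
# FAIR-Chem checkpoints will do.
calcs = [
    OCPCalculator(checkpoint_path=p, cpu=True)
    for p in ("ckpt_a.pt", "ckpt_b.pt", "ckpt_c.pt")
]

slab = fcc111("Cu", size=(2, 2, 3), vacuum=10.0)
preds = []
for calc in calcs:
    slab.calc = calc
    preds.append(slab.get_potential_energy())

# Large spread flags structures worth verifying with DFT.
print(np.mean(preds), np.std(preds))
```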

Properties beyond energies and forces

[1] Xiang Fu, Andrew Rosen, Kyle Bystrom, Rui Wang, Albert Musaelian, Boris Kozinsky, Tess Smidt, and Tommi Jaakkola. A recipe for charge density prediction. arXiv preprint arXiv:2405.19276, 2024.

[2] Rohan Yuri Sanspeur and John R Kitchin. Circumventing data imbalance in magnetic ground state data for magnetic moment predictions. Machine Learning: Science and Technology, 5(1):015023, 2024.

[3] Ethan M Sunshine, Muhammed Shuaibi, Zachary W Ulissi, and John R Kitchin. Chemical properties from graph neural network-predicted electron densities. The Journal of Physical Chemistry C, 127(48):23459–23466, 2023.