cyrusyc committed
Commit bdab21e
1 Parent(s): 58b84ea

Update README.md

Files changed (1): README.md +12 -3
README.md CHANGED
@@ -4,12 +4,14 @@ tags:
   - Machine Learning Interatomic Potential
 ---
 
-# Model Card for mace-universal
-
-[MACE](https://github.com/ACEsuit/mace) (Multiple Atomic Cluster Expansion) is a machine learning interatomic potential (MLIP) with higher order equivariant message passing. For more information about MACE formalism, please see authors' [paper](https://arxiv.org/abs/2206.07697).
-
-[2023-08-14-mace-universal.model](https://huggingface.co/cyrusyc/mace-universal/blob/main/2023-08-14-mace-universal.model) was trained with MPTrj data, [Materials Project](https://materialsproject.org) relaxation trajectories compiled by [CHGNet](https://arxiv.org/abs/2302.14231) authors to cover 89 elements and 1.6M configurations. The checkpoint was used for materials stability prediction on [Matbench Discovery](https://matbench-discovery.materialsproject.org/) and the associated [preprint](https://arXiv.org/abs/2308.14920).
+# Model Card for mace-universal / mace-mp
+
+MACE-MP is a pretrained general-purpose foundational interatomic potential published with the [preprint arXiv:2401.00096](https://arxiv.org/abs/2401.00096).
+
+This repository archives the pretrained checkpoints for manual loading with `MACECalculator` or for further fine-tuning. The easiest way to use the models is now to follow the **[documentation for foundational models](https://mace-docs.readthedocs.io/en/latest/examples/foundation_models.html)**.
+All the models were trained on MPTrj data, the [Materials Project](https://materialsproject.org) relaxation trajectories compiled by the [CHGNet](https://arxiv.org/abs/2302.14231) authors, covering 89 elements and 1.6M configurations. The checkpoint was used for materials stability prediction on [Matbench Discovery](https://matbench-discovery.materialsproject.org/) and in the associated [preprint](https://arXiv.org/abs/2308.14920).
+
+[MACE](https://github.com/ACEsuit/mace) (Multiple Atomic Cluster Expansion) is a machine learning interatomic potential (MLIP) with higher-order equivariant message passing. For more information about the MACE formalism, please see the authors' [paper](https://arxiv.org/abs/2206.07697).
 
 # Usage
@@ -54,6 +56,13 @@ dyn.run(steps)
 If you use the pretrained models in this repository, please cite all the following:
 
 ```
+@article{batatia2023foundation,
+  title={A foundation model for atomistic materials chemistry},
+  author={Batatia, Ilyes and Benner, Philipp and Chiang, Yuan and Elena, Alin M and Kov{\'a}cs, D{\'a}vid P and Riebesell, Janosh and Advincula, Xavier R and Asta, Mark and Baldwin, William J and Bernstein, Noam and others},
+  journal={arXiv preprint arXiv:2401.00096},
+  year={2023}
+}
+
 @inproceedings{Batatia2022mace,
   title={{MACE}: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields},
   author={Ilyes Batatia and David Peter Kovacs and Gregor N. C. Simm and Christoph Ortner and Gabor Csanyi},