acesuit / mace-mp
MACE-MP models
Home Page: https://arxiv.org/abs/2401.00096
License: MIT License
Hello,
I am working on developing E(3)-equivariant neural networks, and the MACE code and papers have been invaluable for me. Thank you for the effort you have evidently put into the clarity of both code and ideas.
I'd like to understand how the MACE-MP-0 training runs went in more detail so I can see how my exploratory runs, trained for far fewer epochs, compare to a strong baseline.
To that end, do you have logs of the training and validation losses over time for any of the MACE-MP-0 foundation model training runs? Figure 57 (p. 113) of https://arxiv.org/pdf/2401.00096 has plots, but it would be great if I could work off of something more reliable than my eyesight. Additionally, are there any caveats about those training runs that I should understand if I'm comparing my own work?
Thanks again.
Hello! I am going through your paper and was looking for the data files in each of the "Similarity Statement" sections, i.e., the files that can be viewed on https://chemiscope.org.
I can't find these files online; is there a link for them? Thank you!
Hi, thanks for uploading the training scripts. They're very helpful.
Is the statistics file referenced in the training shell scripts (e.g., mptrj-gga-ggapu-statistics.json mentioned here) available for download anywhere?
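In case it is helpful while waiting for an answer: my understanding is that this file holds the dataset statistics (isolated-atom energies, energy mean/std, average neighbor count) that mace_run_train would otherwise compute from the training set. Below is a minimal sketch of regenerating it, assuming the mace_prepare_data preprocessing CLI and its --compute_statistics flag behave the way the multi-GPU training docs suggest; the file name, r_max, and E0s handling are placeholders.

    # Hedged sketch, not an official recipe: the values below are assumptions
    # and would have to match the actual MP-0 training setup.
    mace_prepare_data \
        --train_file="mptrj-gga-ggapu.xyz" \
        --r_max=6.0 \
        --E0s="average" \
        --compute_statistics \
        --h5_prefix="processed_data/" \
        --seed=123
    # This should write a statistics JSON that can then be passed to
    # mace_run_train via --statistics_file.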
Hi,
Thank you for this model; it performs great.
I am having some problems with an amorphous oxide, for which I also have some DFT configurations. I would like to use these to retrain the model and improve my results. Is there a script or API that would enable me to do that?
Thank you very much for your help!
Giuliana
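EDIT: For concreteness, here is the kind of invocation I had in mind. This is only a sketch based on my reading of the docs, assuming mace_run_train supports a --foundation_model argument for fine-tuning from the pre-trained weights; the file names and loss weights are placeholders, not recommendations.

    # Hedged sketch of fine-tuning from a foundation model.
    mace_run_train \
        --name="mace_mp_finetuned_oxide" \
        --foundation_model="medium" \
        --train_file="oxide_dft_configs.xyz" \
        --valid_fraction=0.05 \
        --energy_weight=1.0 \
        --forces_weight=100.0 \
        --batch_size=8 \
        --max_num_epochs=50 \
        --device=cuda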
Hi,
Thanks for building these models. I noticed that the training scripts for the MP pre-trained models use a small batch size of 16. What was the reasoning behind this choice?
My application requires training on graphs with hundreds to a few thousand nodes, and I was hoping that MACE's avoidance of explicit triplet-angle computation (as in DimeNet or GemNet) would offer more favorable memory scaling. Any insights would be greatly appreciated.
Thanks,
Rees
Could you please add the training command line or parameters to this repo?
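For context, what I am hoping for is something of the following shape; the values below are placeholders I made up, not the actual MACE-MP-0 settings, which is what this issue asks for.

    # Illustrative shape of a mace_run_train invocation only; flags and
    # values are my assumptions, NOT the MACE-MP-0 hyperparameters.
    mace_run_train \
        --name="my_model" \
        --model="ScaleShiftMACE" \
        --train_file="train.xyz" \
        --valid_fraction=0.05 \
        --r_max=5.0 \
        --batch_size=16 \
        --max_num_epochs=200 \
        --device=cuda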
Hi folks,
It's very exciting to see a foundation model for multi-element systems. :)
I was wondering if there is a model readily available to load into LAMMPS for structural minimization.
I see from the tutorial here that one needs a .pt file to load into LAMMPS: https://mace-docs.readthedocs.io/en/latest/guide/lammps.html#id2
But the publicly available MACE model is in the .model format, from here: https://github.com/ACEsuit/mace-mp/releases/tag/mace_mp_0.
Are these two the same file?
EDIT: adding @cortner, since I work with him.
EDIT2: Also, I don't see an ML-MACE package available in LAMMPS; has it not been publicly released yet?
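EDIT3: For what it's worth, my current reading of the linked tutorial is that the released .model file has to be converted before LAMMPS can read it. A sketch, assuming the mace_create_lammps_model helper and the pair_style mace described there; "mace-mp-0.model" is a placeholder for whichever released file you downloaded.

    # Hedged sketch based on the linked tutorial: convert the released
    # .model file into a TorchScript .pt for the LAMMPS-MACE pair style.
    mace_create_lammps_model mace-mp-0.model
    # This should produce mace-mp-0.model-lammps.pt, which can then be
    # loaded in the LAMMPS input script (element list is a placeholder):
    #   pair_style  mace no_domain_decomposition
    #   pair_coeff  * * mace-mp-0.model-lammps.pt O Si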
Hi,
Where could I find the data splits for this model?