A Graph Neural Network that uses a variant of self-attention to process 3D point clouds and graphs.
The model consists of stacked layers of equivariant graph self-attention and equivariant normalization, followed by a final Tensor Field Network (TFN) convolution that produces invariant features. Graph pooling (a mean or max over the nodes) is applied to these features, and the result is fed to a final MLP to produce scalar predictions.
In this setup, the model is a graph-to-scalar network. Removing the pooling step yields a graph-to-graph network, and the final TFN can be modified to output features of any type (invariant scalars, 3D vectors, ...).
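The graph-to-scalar readout described above (pool the invariant node features, then apply an MLP) can be sketched as follows. This is a minimal NumPy illustration, not the actual implementation: the feature dimensions, layer sizes, and function names are illustrative assumptions, and in practice the invariant node features would come from the final TFN convolution.

```python
import numpy as np

rng = np.random.default_rng(0)

def pool_nodes(node_feats, mode="mean"):
    # node_feats: (num_nodes, feat_dim) invariant per-node features,
    # e.g. produced by the final TFN convolution
    if mode == "mean":
        return node_feats.mean(axis=0)
    if mode == "max":
        return node_feats.max(axis=0)
    raise ValueError(f"unknown pooling mode: {mode}")

def mlp_head(pooled, w1, b1, w2, b2):
    # two-layer MLP with ReLU that maps the pooled graph
    # representation to a single scalar prediction
    hidden = np.maximum(0.0, pooled @ w1 + b1)
    return hidden @ w2 + b2

# toy graph: 5 nodes with 16 invariant features each (illustrative sizes)
node_feats = rng.normal(size=(5, 16))
w1, b1 = rng.normal(size=(16, 32)), np.zeros(32)
w2, b2 = rng.normal(size=(32, 1)), np.zeros(1)

prediction = mlp_head(pool_nodes(node_feats, "mean"), w1, b1, w2, b2)
print(prediction.shape)  # one scalar per graph
```

Because the pooled features are invariant, the scalar prediction is unchanged under rotations and translations of the input points; swapping `"mean"` for `"max"` changes only how node information is aggregated.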
The following datasets were used to train this model:
- Quantum Machines 9 (QM9) - a database of quantum-chemical properties for a relevant, consistent, and comprehensive chemical space of small organic molecules.
Performance numbers for this model are available in NGC.