
SE(3)-Transformer checkpoint (PyTorch, AMP, QM9 homo task)

Description: SE(3)-Transformer checkpoint trained on QM9 homo task
Publisher: NVIDIA Deep Learning Examples
Latest Version: 21.07.0
Modified: April 4, 2023
Size: 106.36 MB

Model Overview

A graph neural network that uses a variant of self-attention for processing 3D points and graphs.

Model Architecture

The model consists of stacked layers of equivariant graph self-attention and equivariant normalization. Lastly, a Tensor Field Network convolution is applied to obtain invariant features. Graph pooling (mean or max over the nodes) is applied to these features, and the result is fed to a final MLP to get scalar predictions.

In this setup, the model is a graph-to-scalar network. The pooling can be removed to obtain a graph-to-graph network, and the final TFN can be modified to output features of any type (invariant scalars, 3D vectors, ...).

Figure: High-level model architecture
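
The pipeline above amounts to a small graph-to-scalar network. The sketch below mirrors that structure in plain PyTorch; the module names and the use of ordinary linear layers are placeholders for illustration only, not the repository's actual SE(3)-equivariant implementations.

```python
import torch
import torch.nn as nn

class AttentionBlock(nn.Module):
    """Stand-in for one equivariant graph self-attention + equivariant norm layer.
    The real blocks operate on degree-typed SE(3)-equivariant features over a
    molecular graph; a plain linear layer keeps this sketch runnable."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):
        return self.norm(torch.relu(self.lin(x)))

class GraphToScalar(nn.Module):
    """Pipeline described above: stacked attention layers, a final TFN-style
    projection to invariant features, graph pooling, and an MLP head."""
    def __init__(self, dim=64, num_layers=7, pooling="mean"):
        super().__init__()
        self.layers = nn.ModuleList(AttentionBlock(dim) for _ in range(num_layers))
        self.to_invariant = nn.Linear(dim, dim)  # stand-in for the final TFN convolution
        self.pooling = pooling
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, node_feats):
        for layer in self.layers:
            node_feats = layer(node_feats)
        invariant = self.to_invariant(node_feats)       # per-node invariant features
        if self.pooling == "mean":
            pooled = invariant.mean(dim=0)              # mean over the nodes
        else:
            pooled = invariant.max(dim=0).values        # max over the nodes
        return self.mlp(pooled)                         # scalar prediction for the graph

# Toy usage: a molecule with 20 nodes and 64-dim features yields one scalar.
print(GraphToScalar()(torch.randn(20, 64)).shape)       # torch.Size([1])
```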

Training

This model was trained using the training scripts available on NGC and in the NVIDIA Deep Learning Examples GitHub repository.
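
Once downloaded, the checkpoint can be restored with standard PyTorch calls. In the sketch below, the file name and the "state_dict" key are assumptions about how the training script saved the weights; adjust them to match the actual artifact.

```python
import torch

# File name and checkpoint layout are assumptions; inspect the downloaded
# archive to confirm the actual keys.
ckpt = torch.load("se3-transformer_qm9_homo.pth", map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt)  # some checkpoints wrap the weights
print(list(state_dict)[:5])                # peek at a few parameter names
# model.load_state_dict(state_dict)        # `model` built from the repository's code
```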

Dataset

The following datasets were used to train this model:

  • Quantum Machines 9 (QM9) - a database providing quantum-chemical properties for a relevant, consistent, and comprehensive chemical space of small organic molecules.
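
The homo target used by this checkpoint is one of the QM9 regression labels (the energy of the highest occupied molecular orbital). As a minimal sketch, QM9 can be loaded with DGL's built-in dataset class; the repository's training pipeline uses its own QM9 wrapper, so the exact class and preprocessing may differ.

```python
from dgl.data import QM9EdgeDataset

# Minimal sketch using DGL's built-in QM9 dataset; label handling and graph
# construction in the actual training script may differ.
dataset = QM9EdgeDataset(label_keys=["homo"])  # HOMO energy as the regression target
graph, label = dataset[0]
print(graph.num_nodes(), label)                # atom count and HOMO label of the first molecule
```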

Performance

Performance numbers for this model are available on NGC.

License

This model was trained using open-source software available in the Deep Learning Examples repository. For terms of use, please refer to the licenses of the scripts and of the datasets from which the model was derived.