
BERT for TensorFlow

Description

BERT is a method of pre-training language representations which obtains state-of-the-art results on a wide array of NLP tasks.

Publisher

NVIDIA

Use Case

NLP

Framework

TensorFlow

Latest Version

20.06.17

Modified

November 12, 2021

Compressed Size

1.37 MB