NGC | Catalog

Megatron-BERT 345M Uncased


Description

345M parameter BERT Megatron model with uncased vocab

Publisher

NVIDIA NeMo

Use Case

Natural Language Processing

Framework

PyTorch

Latest Version

1

Modified

May 17, 2022

Size

1.25 GB

Overview

This is a .nemo checkpoint file for Megatron-BERT 345M with the uncased BERT vocabulary.

Please be sure to download the latest version in order to ensure compatibility with the latest NeMo release.

Model Architecture

NeMo Megatron is a capability in the NeMo framework that allows developers to effectively train and scale language models to billions of parameters. Unlike standard BERT, the positions of the layer normalization and the residual connection in each Transformer block are swapped (similar to the GPT-2 architecture), which allows the models to continue improving as they are scaled up. This model reaches higher scores than BERT on a range of Natural Language Processing (NLP) tasks.

This 345M parameter model has 24 layers (Transformer blocks), 1024 hidden units, and 16 attention heads.
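As a back-of-the-envelope check, the stated architecture roughly accounts for the quoted parameter count. The sketch below is an approximation, not the exact Megatron accounting: the vocabulary size and maximum sequence length are assumed (standard uncased BERT values), and the pooler and LM head are omitted.

```python
# Rough parameter count for a BERT-style model with the card's dimensions.
hidden = 1024      # hidden units (from the model card)
layers = 24        # Transformer blocks (from the model card)
ffn = 4 * hidden   # standard BERT feed-forward width (assumption)
vocab = 30522      # assumed uncased BERT WordPiece vocabulary size
max_pos = 512      # assumed maximum sequence length

# Token, position, and token-type embeddings, plus the embedding LayerNorm.
embeddings = (vocab + max_pos + 2) * hidden + 2 * hidden

# Per layer: Q, K, V, and output projections (weights + biases) ...
attention = 4 * (hidden * hidden + hidden)
# ... two feed-forward projections (weights + biases) ...
feed_forward = (hidden * ffn + ffn) + (ffn * hidden + hidden)
# ... and two LayerNorms (scale + bias each).
layer_norms = 2 * 2 * hidden

per_layer = attention + feed_forward + layer_norms
total = embeddings + layers * per_layer
print(f"~{total / 1e6:.0f}M parameters")  # in the same ballpark as the quoted 345M
```

The pooler, LM head, and Megatron's vocabulary padding add a few million more parameters, which is why the headline figure is quoted as 345M.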

For more information about NeMo Megatron, visit https://github.com/NVIDIA/NeMo.

Dataset

This model was trained on text sourced from Wikipedia, RealNews, OpenWebText, and CC-Stories. We offer versions of this model pretrained with both cased and uncased vocabularies.

How to use this Model

NVIDIA NeMo can be used for easy fine-tuning on a number of different tasks. Tutorial notebooks on fine-tuning the model for Named Entity Recognition and Relation Extraction can be found on the tutorials page of NeMo.

Source code and the developer guide are available at https://github.com/NVIDIA/NeMo. Refer to the documentation at https://docs.nvidia.com/deeplearning/nemo/neural-modules-release-notes/index.html
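As a minimal sketch, the downloaded .nemo checkpoint can be restored in Python via NeMo's standard `restore_from` API. The file name below is a placeholder for the downloaded artifact, and the class path follows the NeMo r1.x layout; `nemo_toolkit` with its NLP dependencies must be installed.

```python
# Minimal sketch: restore this checkpoint with NeMo (r1.x-style API).
def load_megatron_bert(nemo_path: str = "megatron_bert_345m_uncased.nemo"):
    """Restore a Megatron-BERT .nemo checkpoint.

    The path is a placeholder for the file downloaded from NGC.
    """
    # Deferred import so the sketch can be read without NeMo installed.
    from nemo.collections.nlp.models.language_modeling.megatron_bert_model import (
        MegatronBertModel,
    )
    return MegatronBertModel.restore_from(nemo_path)
```

The restored model can then be used as the pretrained encoder when following the fine-tuning tutorials below.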

The following examples show how to fine-tune BioMegatron on different downstream tasks.

Usage example 1: Fine-tune on the relation extraction (RE) dataset ChemProt: https://github.com/NVIDIA/NeMo/blob/r1.7.2/tutorials/nlp/Relation_Extraction-BioMegatron.ipynb

Usage example 2: Fine-tune on the named entity recognition (NER) dataset NCBI: https://github.com/NVIDIA/NeMo/blob/r1.7.2/tutorials/nlp/Token_Classification-BioMegatron.ipynb

Limitations

No known limitations at this time.

License

The license to use this model is covered by the NGC Terms of Use unless another license, terms of use, or EULA is clearly specified. By downloading the public release version of the model, you accept the terms and conditions of the NGC Terms of Use.