
NMT Multilingual De/Es/Fr En Transformer6x6


Description

Multilingual Neural Machine Translation model to translate from German/Spanish/French to English

Publisher

NVIDIA

Use Case

Other

Framework

Other

Latest Version

1.2.0

Modified

September 16, 2021

Size

972.59 MB

Model Overview

This model can be used to translate text in a source language (De/Es/Fr) into text in the target language (En).

Model Architecture

The model is based on the Transformer "Big" architecture originally presented in the "Attention Is All You Need" paper [1]. It uses the SentencePiece tokenizer [2].
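For orientation, the "6x6" in the model name indicates 6 encoder and 6 decoder layers; the remaining values below are the standard Transformer "Big" hyperparameters from [1] and are an assumption, not confirmed by this card:

```python
# Assumed hyperparameters, for orientation only: "6x6" comes from the model
# name (6 encoder / 6 decoder layers); the rest are the standard Transformer
# "Big" settings from Vaswani et al. [1] and are NOT confirmed by this card.
transformer_big_6x6 = {
    "num_encoder_layers": 6,
    "num_decoder_layers": 6,
    "hidden_size": 1024,        # d_model in [1]
    "inner_size": 4096,         # feed-forward width
    "num_attention_heads": 16,
}
```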

Training

This model was trained on a collection of many publicly available datasets comprising millions of parallel sentences. It was trained with the NeMo toolkit [5] for roughly 700k steps.

Datasets

While training this model, we used the following datasets:

German

Spanish

French

Tokenizer Construction

We used the SentencePiece tokenizer [2] with a byte-pair-encoding (BPE) vocabulary shared between the encoder and decoder.
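To illustrate the idea behind BPE vocabulary construction, the toy sketch below repeatedly merges the most frequent adjacent symbol pair in a word list. This is illustrative only; SentencePiece's actual trainer works on raw text, handles whitespace specially, and supports other algorithms such as the unigram language model.

```python
from collections import Counter

def bpe_merges(words, num_merges):
    """Toy byte-pair encoding: repeatedly merge the most frequent
    adjacent symbol pair. Illustrative only; SentencePiece differs."""
    # Start from character-level symbols for each word.
    vocab = Counter(tuple(w) for w in words)
    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Re-segment every word with the new merge applied.
        new_vocab = Counter()
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] += freq
        vocab = new_vocab
    return merges
```

For example, `bpe_merges(["low", "low", "lower"], 2)` first merges `('l', 'o')` and then `('lo', 'w')`, since those pairs are the most frequent.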

Performance

The accuracy of translation models is often measured using BLEU scores [3]. This model achieves the following sacreBLEU [4] scores on WMT test sets:

De
WMT13 - 34.6
WMT14 - 36.3

Es
WMT12 - 40.3
WMT13 - 36.9

Fr
WMT13 - 36.2
WMT14 - 40.6
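To make the metric concrete, the sketch below computes a sentence-level BLEU score from modified n-gram precisions and a brevity penalty. It is a minimal illustration of the core formula; sacreBLEU [4] adds standardized tokenization and smoothing, so its corpus-level scores will differ.

```python
import math
from collections import Counter

def ngram_counts(tokens, n):
    """Count all n-grams of order n in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU (0-100) with uniform n-gram weights.
    `candidate` and `reference` are lists of tokens."""
    precisions = []
    for n in range(1, max_n + 1):
        cand = ngram_counts(candidate, n)
        ref = ngram_counts(reference, n)
        # Clip each n-gram's count by its count in the reference.
        overlap = sum(min(count, ref[g]) for g, count in cand.items())
        total = max(sum(cand.values()), 1)
        # Crude smoothing so log(0) never occurs.
        precisions.append(max(overlap / total, 1e-9))
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Brevity penalty punishes candidates shorter than the reference.
    bp = 1.0 if len(candidate) >= len(reference) else math.exp(
        1 - len(reference) / len(candidate))
    return 100 * bp * geo_mean
```

A candidate identical to the reference scores 100; shorter or mismatched candidates score lower.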

How to Use this Model

The model is available for use in the NeMo toolkit [5], and can be used as a pre-trained checkpoint for inference or for fine-tuning on another dataset.

Automatically load the model from NGC

import nemo.collections.nlp as nemo_nlp

nmt_model = nemo_nlp.models.machine_translation.MTEncDecModel.from_pretrained(
    model_name="mnmt_deesfr_en_transformer6x6"
)

Translating text with this model

python [NEMO_GIT_FOLDER]/examples/nlp/machine_translation/nmt_transformer_infer.py --model=mnmt_deesfr_en_transformer6x6.nemo --srctext=[TEXT_IN_SRC_LANGUAGE] --tgtout=[WHERE_TO_SAVE_TRANSLATIONS] --target_lang en --source_lang [SOURCE_LANGUAGE]

where [SOURCE_LANGUAGE] can be 'de', 'es', or 'fr'.

Input

The translate method of the NMT model accepts a list of de-tokenized strings.

Output

The translate method outputs a list of de-tokenized strings in the target language.
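Concretely, a minimal sketch, assuming `nmt_model` has been loaded as shown above and that `MTEncDecModel.translate` takes the language codes as keyword arguments:

```python
# Input: a list of de-tokenized source-language strings.
# Output: a list of de-tokenized English strings of the same length.
# Requires the NeMo toolkit and the downloaded checkpoint.
translations = nmt_model.translate(
    ["Hallo, wie geht es dir?", "Das Wetter ist heute schön."],
    source_lang="de",
    target_lang="en",
)
print(translations)
```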

Limitations

No known limitations at this time.

References

[1] Vaswani, Ashish, et al. "Attention is all you need." arXiv preprint arXiv:1706.03762 (2017).

[2] https://github.com/google/sentencepiece

[3] https://en.wikipedia.org/wiki/BLEU

[4] https://github.com/mjpost/sacreBLEU

[5] NVIDIA NeMo Toolkit, https://github.com/NVIDIA/NeMo

License

License to use this model is covered by the NGC TERMS OF USE unless another License/Terms Of Use/EULA is clearly specified. By downloading the public and release version of the model, you accept the terms and conditions of the NGC TERMS OF USE.