
Riva ASR Mandarin LM

Description: Base Mandarin 4-gram LM
Publisher: NVIDIA
Latest Version: deployable_v2.1
Modified: October 6, 2023
Size: 7.88 GB

Speech Recognition: Mandarin N-Gram Language Models
===================================================

Model Overview
--------------

When deployed, the ASR engine can optionally condition the transcript output on n-gram language models.

Model Architecture
------------------

These models are simple 4-gram language models trained with Kneser-Ney smoothing using KenLM.
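As a toy illustration of how an n-gram LM conditions the probability of a word on the preceding words, the sketch below scores 4-grams with "stupid backoff" instead of Kneser-Ney smoothing, over made-up counts (not the actual Riva training data):

```python
import math

# Made-up 4-gram/3-gram counts for illustration only; the shipped Riva
# models are trained on large corpora with Kneser-Ney smoothing in KenLM.
COUNTS = {
    ("<s>", "你", "好", "吗"): 3,
    ("<s>", "你", "好"): 4,
    ("你", "好", "吗"): 2,
    ("你", "好"): 3,
}

def ngram_log10_prob(history, word, counts=COUNTS):
    """Estimate log10 P(word | last 3 words) with 'stupid backoff'.

    Deliberately simpler than Kneser-Ney: on a zero count we just back
    off to a shorter context with a fixed 0.4 penalty.
    """
    penalty = 1.0
    for order in range(min(len(history), 3), -1, -1):
        ctx = tuple(history[len(history) - order:])
        num = counts.get(ctx + (word,), 0)
        den = counts.get(ctx, 0)
        if num > 0 and den > 0:
            return math.log10(penalty * num / den)
        penalty *= 0.4  # shorten the context and pay a backoff penalty
    return float("-inf")  # word never observed in any context
```

A real Kneser-Ney model also discounts counts and uses continuation probabilities, but the context-shortening backoff structure is the same.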

Intended Use
------------

The primary intended use case for these models is automatic speech recognition.

Input: Sequence of zero or more words.

Output: Likelihood of the word sequence.

How to Use This Model
---------------------

This model archive contains the language model in several formats:

ARPA-formatted Language Models:

  • riva_zh_asr_set_2.1_4gram.arpa
  • riva_zh_asr_set_2.1_4gram_pruned_0_0_0_1.arpa

KenLM-formatted Binary Language Models:

  • riva_zh_asr_set_2.1_4gram.bin
  • riva_zh_asr_set_2.1_4gram_pruned_0_0_0_1.bin

Both the ARPA-formatted and the KenLM binary files can be used directly by the CTC CPU Decoder.
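The ARPA files are plain text, so they are easy to inspect. Below is a minimal sketch of extracting the unigram table from an ARPA file; the embedded fragment uses hypothetical values, not entries from the shipped models:

```python
# A tiny hand-written ARPA fragment (hypothetical values, not taken from
# the Riva models). Real ARPA files list \1-grams: through \4-grams:
# sections of the form: log10_prob <tab> ngram [<tab> log10_backoff].
ARPA = """\
\\data\\
ngram 1=3

\\1-grams:
-0.30103\t你\t-0.3
-0.47712\t好\t-0.2
-1.00000\t吗

\\end\\
"""

def parse_unigrams(arpa_text):
    """Return {word: (log10_prob, log10_backoff)} from the \\1-grams: section."""
    unigrams, in_section = {}, False
    for line in arpa_text.splitlines():
        line = line.strip()
        if line == "\\1-grams:":
            in_section = True
            continue
        if in_section:
            if not line or line.startswith("\\"):
                break  # blank line or next section header ends the 1-grams
            parts = line.split("\t")
            prob, word = float(parts[0]), parts[1]
            backoff = float(parts[2]) if len(parts) > 2 else 0.0
            unigrams[word] = (prob, backoff)
    return unigrams
```

The binary files are a KenLM-specific serialization of the same model and load much faster, which is why the decoder accepts either.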

Training Information
--------------------

NA

Limitations
-----------

Currently, TLT cannot train LMs for ASR inference. To train custom LMs for ASR inference, use KenLM and consult the Riva Documentation.
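A sketch of the usual KenLM command-line flow for a custom LM, assuming KenLM's `lmplz` and `build_binary` are built and on `PATH`, and that `corpus.txt` is a placeholder for your own pre-tokenized text corpus (Mandarin text must be word-segmented first):

```shell
# Train a 4-gram Kneser-Ney LM from a whitespace-tokenized corpus.
lmplz -o 4 < corpus.txt > custom_4gram.arpa

# Optionally prune singleton 4-grams, mirroring the pruned model above:
# lmplz -o 4 --prune 0 0 0 1 < corpus.txt > custom_4gram_pruned.arpa

# Convert the ARPA file to KenLM binary for faster loading by the decoder.
build_binary custom_4gram.arpa custom_4gram.bin
```

The `--prune 0 0 0 1` thresholds appear to correspond to the `pruned_0_0_0_1` naming of the packaged models; consult the Riva documentation for deployment details.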

License
-------

By downloading and using the models and resources packaged with TLT Conversational AI, you accept the terms of the Riva license.

Ethical AI
----------

NVIDIA’s platforms and application frameworks enable developers to build a wide array of AI applications.

Consider potential algorithmic bias when choosing or creating the models being deployed. Work with the model's developer to ensure that it meets the requirements for the relevant industry and use case; that the necessary instruction and documentation are provided to understand error rates, confidence intervals, and results; and that the model is being used under the conditions and in the manner intended.