NGC Catalog: Models
The NGC catalog offers hundreds of pre-trained models for computer vision, speech, recommendation, and more. Bring AI to market faster by using these models as-is, or quickly build proprietary models with only a fraction of the custom data you would otherwise need.
RIVA Conformer ASR Japanese
Japanese (ja-JP) Conformer ASR model trained on ASR set 4.0
RIVA Conformer ASR Italian
Italian (it-IT) Conformer ASR model trained on ASR set 1.0
RIVA Punctuation
For each word in the input text, the model predicts the punctuation mark, if any, that should follow the word.
RIVA Punctuation and Capitalization for Brazilian Portuguese
For each word in the input text, the model 1) predicts the punctuation mark, if any, that should follow the word (supported marks: commas, periods, and question marks) and 2) predicts whether the word should be capitalized.
RIVA Punctuation and Capitalization for Korean
For each word in the input text, the model 1) predicts the punctuation mark, if any, that should follow the word (supported marks: commas, periods, and question marks) and 2) predicts whether the word should be capitalized.
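These punctuation-and-capitalization models are normally deployed behind a Riva server and called through the Riva Python client (nvidia-riva-client). A minimal sketch, assuming a server at localhost:50051 and a model deployed under the name riva_punctuation; both values are assumptions, not taken from these catalog entries:

    import riva.client

    # Connect to a running Riva server (the address is an assumption).
    auth = riva.client.Auth(uri="localhost:50051")
    nlp = riva.client.NLPService(auth)

    # punctuate_text sends raw, unpunctuated text and receives it back
    # with punctuation and capitalization restored by the deployed model.
    response = nlp.punctuate_text(
        input_strings="ola tudo bem como foi a viagem",
        model_name="riva_punctuation",  # assumed deployment name
    )
    print(response.text[0])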
EfficientDet-D0 checkpoint (PyTorch, AMP, COCO17)
EfficientDet-D0 PyTorch checkpoint trained on COCO with a batch size of 1200
Joint Intent and Slot Classification DistilBert
Intent and slot classification of queries for the Misty bot, using a DistilBERT model trained on weather, small talk, and POI (places of interest) data.
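Architecturally, a joint model like this one puts two heads on a shared encoder: a sequence-level head that classifies the whole query into an intent, and a token-level head that tags each token with a slot label. A minimal PyTorch sketch of that pattern, using the Hugging Face distilbert-base-uncased checkpoint as a stand-in encoder (an illustration of the technique, not the Riva model's actual code):

    import torch
    from torch import nn
    from transformers import DistilBertModel, DistilBertTokenizerFast

    class JointIntentSlot(nn.Module):
        def __init__(self, num_intents: int, num_slots: int):
            super().__init__()
            self.encoder = DistilBertModel.from_pretrained("distilbert-base-uncased")
            hidden = self.encoder.config.dim
            self.intent_head = nn.Linear(hidden, num_intents)  # one label per query
            self.slot_head = nn.Linear(hidden, num_slots)      # one label per token

        def forward(self, input_ids, attention_mask):
            states = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
            intent_logits = self.intent_head(states[:, 0])  # [CLS] position summarizes the query
            slot_logits = self.slot_head(states)            # per-token slot labels
            return intent_logits, slot_logits

    # Hypothetical label counts, e.g. weather / smalltalk / POI intents.
    tokenizer = DistilBertTokenizerFast.from_pretrained("distilbert-base-uncased")
    model = JointIntentSlot(num_intents=3, num_slots=10)
    batch = tokenizer("will it rain in Santa Clara tomorrow", return_tensors="pt")
    intent_logits, slot_logits = model(batch["input_ids"], batch["attention_mask"])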
Riva ASR Mandarin Inverse Normalization Grammar
RIVA Citrinet ASR Hindi
Hindi Citrinet ASR model trained on RIVA ASR set
RIVA Citrinet-1024 ASR Brazilian Portuguese
Brazilian Portuguese (pt-BR) Citrinet-1024 ASR model trained on ASR set 1.0
RIVA Citrinet-1024 ASR Spanish EMEA
Spanish EMEA (es-ES) Citrinet-1024 ASR model trained on ASR set 1.0
RIVA Conformer ASR Japanese
Japanese (ja-JP) Conformer ASR model trained on ASR set 3.0
RIVA Diarizer Neural VAD
Neural VAD model used in Riva Speaker Diarization
RIVA EnglishUS Hifigan
HifiGAN is a neural vocoder model for text-to-speech applications. It is intended as the second part of a two-stage speech synthesis pipeline, with a mel-spectrogram generator such as FastPitch as the first stage.
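This two-stage pipeline can be reproduced with the NeMo releases of the same pair of models. A minimal sketch, assuming the public NeMo checkpoint names tts_en_fastpitch and tts_en_hifigan and the 22.05 kHz sample rate of the common English checkpoints (all three are assumptions, not taken from this entry):

    import soundfile as sf
    import torch
    from nemo.collections.tts.models import FastPitchModel, HifiGanModel

    # Stage 1: text -> mel-spectrogram (FastPitch).
    # Stage 2: mel-spectrogram -> waveform (HifiGAN vocoder).
    fastpitch = FastPitchModel.from_pretrained("tts_en_fastpitch").eval()
    hifigan = HifiGanModel.from_pretrained("tts_en_hifigan").eval()

    with torch.no_grad():
        tokens = fastpitch.parse("Two stage speech synthesis with FastPitch and HifiGAN.")
        spec = fastpitch.generate_spectrogram(tokens=tokens)
        audio = hifigan.convert_spectrogram_to_audio(spec=spec)

    # 22050 Hz matches the common English checkpoints (an assumption).
    sf.write("speech.wav", audio.squeeze().cpu().numpy(), samplerate=22050)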
RIVA Named Entity Recognition
For each word in the input text, the model identifies the category (entity type) the word belongs to.
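As with the punctuation models, the NER model is typically queried through the Riva NLP service; token classification goes through classify_tokens. A sketch under the same assumptions as above (the server address and the model name riva_ner are guesses, not part of this entry):

    import riva.client

    auth = riva.client.Auth(uri="localhost:50051")  # assumed server address
    nlp = riva.client.NLPService(auth)

    # classify_tokens returns one entity label per recognized token/span.
    response = nlp.classify_tokens(
        input_strings="Jensen Huang founded NVIDIA in Santa Clara in 1993.",
        model_name="riva_ner",  # assumed deployment name
    )
    for sequence in response.results:
        for token in sequence.results:
            print(token.token, token.label[0].class_name)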
RIVA Punctuation and Capitalization for French
For each word in the input text, the model 1) predicts the punctuation mark, if any, that should follow the word (supported marks: commas, periods, hyphens, and question marks) and 2) predicts whether the word should be capitalized.
RIVA Punctuation and Capitalization for Spanish
For each word in the input text, the model 1) predicts the punctuation mark, if any, that should follow the word (supported marks: commas, periods, and question marks) and 2) predicts whether the word should be capitalized.
RIVA Punctuation for Hindi
For each word in the input text, the model predicts the punctuation mark, if any, that should follow the word (supported marks: commas, poorna virams, exclamation marks, and question marks).
Wide&Deep TF2 checkpoint (Base, 128k, AMP, NVTabular, Multihot)
Wide&Deep Base TensorFlow 2 checkpoint trained with AMP on an NVTabular-preprocessed dataset with multi-hot embeddings
BART PyT checkpoint (Summarization, XSum)
BART PyTorch checkpoint for summarization on the XSum dataset
BERT PaddlePaddle checkpoint (Large, Pretraining, AMP, LAMB)
BERT Large PaddlePaddle checkpoint pretrained with the LAMB optimizer using AMP
BERT PyTorch checkpoint (Dist-4L-288D, SQuAD1, seqLen384, AMP)
BERT Distilled 4L-288D PyTorch checkpoint distilled on the SQuAD v1.1 dataset using AMP
BERT PyTorch checkpoint (Dist-6L-768D, Pretraining, AMP)
BERT Distilled 6L-768D PyTorch Phase 2 checkpoint pretrained for 67K steps at sequence length 128 and 6K steps at sequence length 512