NGC | Catalog
Models
The NGC catalog offers hundreds of pre-trained models for computer vision, speech, recommendation, and more. Bring AI to market faster by using these models as-is, or quickly build proprietary models with a fraction of your custom data.
STT Eo Conformer-Transducer Large
Conformer-Transducer-Large model for Esperanto automatic speech recognition, fine-tuned from an English SSL model on the Mozilla Common Voice Esperanto 11.0 dataset.
Modulus Darcy FNO
Fourier neural operator (FNO) model. FNOs are a family of resolution-invariant network architectures that use spectral convolutions to learn mappings between function spaces.
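The spectral-convolution idea behind FNOs can be sketched in a few lines of NumPy. This is an illustrative toy, not NVIDIA's Modulus implementation: the function name, mode count, and weights are all hypothetical, and a real FNO learns the per-mode weights by gradient descent.

```python
import numpy as np

def spectral_conv_1d(x, weights, n_modes):
    """One spectral convolution: FFT, scale the lowest modes, inverse FFT."""
    x_ft = np.fft.rfft(x)                        # to Fourier space
    out_ft = np.zeros_like(x_ft)
    out_ft[:n_modes] = x_ft[:n_modes] * weights  # per-mode (learnable) weights
    return np.fft.irfft(out_ft, n=x.size)        # back to physical space

# Resolution invariance: the same mode weights apply at any grid size.
rng = np.random.default_rng(0)
w = rng.standard_normal(8) + 1j * rng.standard_normal(8)
coarse = spectral_conv_1d(np.sin(np.linspace(0, 2 * np.pi, 64, endpoint=False)), w, 8)
fine = spectral_conv_1d(np.sin(np.linspace(0, 2 * np.pi, 256, endpoint=False)), w, 8)
```

Because the weights live on Fourier modes rather than grid points, the same layer maps a 64-point input to a 64-point output and a 256-point input to a 256-point output without retraining.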
RIVA Punctuation and Capitalization for Korean
For each word in the input text, the model: 1) predicts the punctuation mark, if any, that should follow the word (commas, periods, and question marks are supported), and 2) predicts whether the word should be capitalized.
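The per-word output format these punctuation-and-capitalization models describe can be made concrete with a small sketch. The helper below is hypothetical (not the Riva API), and the English tokens and labels are invented for illustration; the model's job is to produce the label pairs, and a post-processing step like this reassembles the text.

```python
def apply_labels(tokens, labels):
    """Reassemble text from per-token (punctuation_after, capitalize) labels."""
    out = []
    for tok, (punct, cap) in zip(tokens, labels):
        out.append((tok.capitalize() if cap else tok) + punct)
    return " ".join(out)

# Hypothetical model output: one (punctuation, capitalize) pair per token.
tokens = ["hello", "how", "are", "you", "fine", "thanks"]
labels = [("", True), ("", False), ("", False), ("?", False), ("", True), (".", False)]
restored = apply_labels(tokens, labels)  # "Hello how are you? Fine thanks."
```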
Joint Intent and Slot Classification DistilBert
Intent and slot classification of queries for the Misty bot, using a DistilBert model trained on weather, small-talk, and POI (places of interest) data.
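Joint intent-and-slot models emit one intent label per utterance plus one slot tag per token. The sketch below shows that output shape with a common BIO tagging scheme; the query, labels, and helper function are illustrative assumptions, not the actual Misty bot schema.

```python
# Hypothetical model output for one query: an utterance-level intent
# plus a BIO slot tag per token.
query = ["what's", "the", "weather", "in", "santa", "clara"]
intent = "weather.query"
slots = ["O", "O", "O", "O", "B-location", "I-location"]

def extract_slots(tokens, tags):
    """Collect contiguous B-/I- spans into slot-name -> values pairs."""
    spans, current = {}, None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            current = tag[2:]
            spans.setdefault(current, []).append([tok])
        elif tag.startswith("I-") and current == tag[2:]:
            spans[current][-1].append(tok)
        else:
            current = None
    return {name: [" ".join(v) for v in values] for name, values in spans.items()}

filled = extract_slots(query, slots)  # {"location": ["santa clara"]}
```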
LangID PearlNet
PearlNet LangID model for spoken language identification.
Riva ASR Mandarin Inverse Normalization Grammar
RIVA Citrinet-1024 ASR Spanish EMEA
Spanish EMEA (es-ES) Citrinet-1024 ASR model trained on ASR set 1.0.
RIVA Conformer ASR Italian
Italian (it-IT) Conformer ASR model trained on ASR set 1.0.
RIVA Conformer ASR Japanese
Japanese (ja-JP) Conformer ASR model trained on ASR set 3.0.
RIVA Punctuation
For each word in the input text, the model predicts the punctuation mark, if any, that should follow the word.
RIVA Punctuation and Capitalization for Brazilian Portuguese
For each word in the input text, the model: 1) predicts the punctuation mark, if any, that should follow the word (commas, periods, and question marks are supported), and 2) predicts whether the word should be capitalized.
RIVA Punctuation and Capitalization for French
For each word in the input text, the model: 1) predicts the punctuation mark, if any, that should follow the word (commas, periods, hyphens, and question marks are supported), and 2) predicts whether the word should be capitalized.
RIVA Punctuation and Capitalization for Spanish
For each word in the input text, the model: 1) predicts the punctuation mark, if any, that should follow the word (commas, periods, and question marks are supported), and 2) predicts whether the word should be capitalized.
RIVA Punctuation for Hindi
For each word in the input text, the model predicts the punctuation mark, if any, that should follow the word; commas, poorna virams, exclamation marks, and question marks are supported.
Wide&Deep TF2 checkpoint (Base, 128k, AMP, NVTabular, Multihot)
Wide&Deep Base TensorFlow2 checkpoint trained with AMP on an NVTabular-preprocessed dataset with multi-hot embeddings.
BERT PaddlePaddle checkpoint (Large, Pretraining, AMP, LAMB)
BERT Large PaddlePaddle checkpoint pretrained with the LAMB optimizer using AMP.
BERT PyTorch checkpoint (Dist-4L-288D, SQuAD1, seqLen384, AMP)
BERT Distilled 4L-288D PyTorch checkpoint distilled on the SQuAD v1.1 dataset using AMP.
BERT-Base checkpoint (PyTorch, AMP, SST-2, seqLen128)
BERT-Base PyTorch checkpoint fine-tuned on GLUE/SST-2 using seqLen=128.
BioBERT TF checkpoint (Base, Uncased, NER, ChemProt, AMP)
BioBERT Base Uncased checkpoint fine-tuned for named entity recognition on the BC5CDR Chemical dataset.
BioBERT TF checkpoint (Base, Uncased, RE, ChemProt, AMP)
BioBERT Base Uncased checkpoint fine-tuned for relation extraction on the ChemProt dataset.
EfficientDet-D0 checkpoint (PyTorch, AMP, COCO17)
EfficientDet-D0 PyTorch checkpoint trained on COCO using batch size 1200.
EfficientDet-D0 checkpoint (TensorFlow2, AMP, COCO17)
EfficientDet-D0 TensorFlow2 checkpoint trained on COCO using batch size 1600.
efficientnet-b0 pretrained weights (PyTorch, AMP, ImageNet)
efficientnet-b0 ImageNet pretrained weights.