NGC Catalog: Resources
The NGC catalog offers step-by-step instructions and scripts through Jupyter Notebooks for various use cases, including machine learning, computer vision, and conversational AI. These resources help you examine, understand, customize, test, and build AI faster, while taking advantage of best practices.
An SSD Detection Model for Endoscopy Surgical Tools
Endoscopy Out of Body Sample App Data
Holoscan Sample App data for endoscopy out-of-body detection
Model for HoloHub Sample App for MONAI Endoscopic Tool Segmentation
Holoscan Sample App Data for AI-based Endoscopy Tool Tracking
Holoscan Sample App Data for AI Ultrasound Segmentation for Scoliosis
Holoscan Sample App Data for Multi-AI Ultrasound Pipeline
Holoscan Sample App Data for AI Colonoscopy Segmentation of Polyps
NGC CLI
NVIDIA's command-line interface for the NGC catalog
GXF (x86_64)
Headers and x86_64 libraries for GXF, for use with the Holoscan SDK
Riva Skills Embedded Quick Start
Scripts and utilities for getting started with Riva Speech Skills on embedded platforms
Riva Skills Quick Start
Scripts and utilities for getting started with Riva Speech Skills
GXF (arm64)
Headers and arm64 libraries for GXF, for use with the Holoscan SDK
NVIDIA Time Series Prediction Platform
NVIDIA Time Series Prediction Platform is a tool for easily comparing and experimenting with arbitrary combinations of forecasting models, time-series datasets, and other configurations.
TAO Toolkit Getting Started
Quick start guide for TAO Toolkit.
TAO Converter
Binary to decrypt a .etlt from TAO Toolkit and generate a TensorRT engine
DOCA Container Resources
Set of container configuration files for the various DOCA containers.
Holoscan Debian Packages
The Debian packages for the Holoscan SDK
TAO Classification-tf2 Notebook
This notebook shows an example use case for classification using the Train Adapt Optimize (TAO) Toolkit. We will be using the Pascal VOC dataset for the tutorial.
BERT for PaddlePaddle
BERT is a method of pre-training language representations that obtains state-of-the-art results on a wide array of NLP tasks.
TFT PyTorch codebase
PyTorch codebase for training and using the TFT model
Tacotron2 and WaveGlow PyTorch codebase
PyTorch codebase for training and using the Tacotron2 and WaveGlow models
SIM TensorFlow2 codebase
TensorFlow2 codebase for training and using the SIM model
ConvNets PyTorch codebase
PyTorch codebase for training and using ConvNet models
BERT TensorFlow1 codebase
TensorFlow1 codebase for training and using the BERT model
BERT PaddlePaddle codebase
PaddlePaddle codebase for training and using the BERT model