NGC | Catalog

Triton Inference Server

Description
Triton Inference Server is open-source software that lets teams deploy trained AI models from any framework, from local or cloud storage, on any GPU- or CPU-based infrastructure in the cloud, in the data center, or on embedded devices.
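As a sketch of typical usage, the container can be pulled from NGC and started against a local model repository. The tag below is the latest tag listed on this page; the registry path follows NVIDIA's standard `nvcr.io/nvidia/tritonserver` convention, and the model-repository path is a placeholder you would replace with your own:

```shell
# Pull the image from the NGC registry (tag taken from this page)
docker pull nvcr.io/nvidia/tritonserver:24.02-py3-igpu

# Run the server, exposing the standard HTTP (8000), gRPC (8001),
# and metrics (8002) ports; /path/to/model_repository is illustrative
docker run --rm --gpus all \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v /path/to/model_repository:/models \
  nvcr.io/nvidia/tritonserver:24.02-py3-igpu \
  tritonserver --model-repository=/models
```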
Publisher
NVIDIA
Latest Tag
24.02-py3-igpu
Modified
March 2, 2024
Compressed Size
5.14 GB
Multinode Support
Yes
Multi-Arch Support
Yes
24.02-py3-igpu (Latest) Security Scan Results

Linux / arm64
