TensorRT-LLM Release

Publisher: NVIDIA
Latest Tag: 0.21.0rc1
Modified: June 11, 2025
Compressed Size: 28.7 GB
Multinode Support: No
Multi-Arch Support: Yes

Description

TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and supports state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT-LLM also contains components to create Python and C++ runtimes that orchestrate the inference execution in a performant way.

Overview

TensorRT-LLM Release Container

The TensorRT-LLM Release container provides a pre-built environment for running TensorRT-LLM.

Visit the official GitHub repository for more details.

Running TensorRT-LLM Using Docker

A typical command to launch the container is:

docker run --rm -it --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 --gpus=all \
            nvcr.io/nvidia/tensorrt-llm/release:x.xx.x

where x.xx.x is the version of the TensorRT-LLM container to use.
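For example, with the latest tag listed above:

docker run --rm -it --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 --gpus=all \
            nvcr.io/nvidia/tensorrt-llm/release:0.21.0rc1

To sanity-check the installation, run the following command inside the container: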

python3 -c "import tensorrt_llm"

This command prints the TensorRT-LLM version if everything is working correctly. After verification, you can try the example scripts included in /app/tensorrt_llm/examples.
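As a quick illustration of the Python API, a minimal generation script might look like the sketch below. It follows the LLM API quickstart pattern from the TensorRT-LLM documentation; the model name and sampling settings are illustrative assumptions, not requirements of this container.

from tensorrt_llm import LLM, SamplingParams

# Load a Hugging Face checkpoint; the model name below is only an example.
llm = LLM(model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

# Illustrative sampling settings; tune temperature/top_p for your use case.
sampling_params = SamplingParams(max_tokens=64, temperature=0.8, top_p=0.95)

# Generate a completion for a single prompt and print the text.
for output in llm.generate(["Hello, my name is"], sampling_params):
    print(output.outputs[0].text)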

Alternatively, if you have already cloned the TensorRT-LLM repository, you can use the following convenient command to run the container:

make -C docker ngc-release_run LOCAL_USER=1 DOCKER_PULL=1 IMAGE_TAG=x.xx.x

This command pulls the specified container from the NVIDIA NGC registry, sets up the local user's account within the container, and launches it with full GPU support.
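For example, with the latest tag listed above:

make -C docker ngc-release_run LOCAL_USER=1 DOCKER_PULL=1 IMAGE_TAG=0.21.0rc1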

For comprehensive information about TensorRT-LLM, including documentation, source code, examples, and installation guidelines, visit the following official resources:

  • TensorRT-LLM GitHub Repository
  • TensorRT-LLM Online Documentation

Security CVEs

To review known CVEs on this image, refer to the Security Scanning tab on this page.

License

By pulling and using the container, you accept the terms and conditions of this End User License Agreement and Product-Specific Terms.