Supported architectures: Linux / arm64, Linux / amd64
The core of NVIDIA TensorRT is a C++ library that facilitates high-performance inference on NVIDIA graphics processing units (GPUs). TensorRT takes a trained network, which consists of a network definition and a set of trained parameters, and produces a highly optimized runtime engine that performs inference for that network.
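For example, the trtexec command-line tool that ships with TensorRT performs exactly this conversion: it takes a trained model and produces a serialized engine. The sketch below is a minimal, illustrative invocation run from inside the container described in the procedure that follows; the model file name is a placeholder, and if trtexec is not already on your PATH it is built alongside the C++ samples shown later.
# Build a serialized engine from a trained ONNX model (model.onnx is a placeholder)
trtexec --onnx=model.onnx --saveEngine=model.engine
# Optionally allow reduced precision when building the engine
trtexec --onnx=model.onnx --saveEngine=model_fp16.engine --fp16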
Before you can run an NGC deep learning framework container, your Docker environment must support NVIDIA GPUs. To run a container, issue the appropriate command as explained in the Running A Container chapter in the NVIDIA Containers And Frameworks User Guide and specify the registry, repository, and tags. For more information about using NGC, refer to the NGC Container User Guide.
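As an optional sanity check (assuming you have already pulled the TensorRT image as described in the procedure below), you can confirm that Docker exposes the GPUs to containers by running nvidia-smi inside the image:
# List the GPUs visible inside the container; xx.xx is the container version as described below
docker run --rm --gpus all nvcr.io/nvaie/tensorrt:xx.xx-py3 nvidia-smi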
Procedure
Select the Tags tab and locate the container image release that you want to run.
In the Pull Tag column, click the icon to copy the docker pull command.
Open a command prompt and paste the pull command. The container image pull begins. Ensure the pull completes successfully before proceeding to the next step.
Run the container image.
docker run --gpus all -it --rm -v local_dir:container_dir nvcr.io/nvaie/tensorrt:xx.xx-py3
Where:
--gpus all makes all of the host's GPUs available inside the container
-it runs the container in interactive mode
--rm deletes the container when it exits
-v local_dir:container_dir mounts the host directory local_dir into the container at container_dir
xx.xx is the container version. For example, 21.07.
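For example, using the 21.07 release mentioned above and mounting a host directory into the container (the host and container paths here are purely illustrative):
# Mount $HOME/models from the host at /workspace/models inside the container
docker run --gpus all -it --rm -v $HOME/models:/workspace/models nvcr.io/nvaie/tensorrt:21.07-py3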
You can build and run the TensorRT C++ samples from within the image. For details on how to run each sample, see the TensorRT Developer Guide.
cd /workspace/tensorrt/samples
make -j4
cd /workspace/tensorrt/bin
./sample_mnist
You can also execute the TensorRT Python samples.
cd /workspace/tensorrt/samples/python/introductory_parser_samples
python caffe_resnet50.py -d /workspace/tensorrt/python/data
See /workspace/README.md inside the container for information on customizing your image.
To save space, some dependencies of the Python samples are not pre-installed in the container. To install them, run the following command before you run these samples:
/opt/tensorrt/python/python_setup.sh
For the latest TensorRT container Release Notes, see the TensorRT Container Release Notes website.
For the latest TensorRT product Release Notes, Developer and Installation Guides, see the TensorRT Product Documentation website.
Please review the Security Scanning tab to view the latest security scan results. For certain open-source vulnerabilities listed in the scan results, NVIDIA provides a response in the form of a Vulnerability Exploitability eXchange (VEX) document. The VEX information can be reviewed and downloaded from the Security Scanning tab.
Collection of ftrace events may not work correctly; a newer version of Nsight Systems, such as Nsight Systems 2024.5.4 from JetPack 6.1 or JetPack 5.1, can be used instead to collect ftrace events. Profiling from the Nsight Systems GUI on IGX with a discrete GPU might not work, and neither might connecting to such a devkit from an Ubuntu x86_64 host over SSH. In these cases, use the Nsight Systems command line (nsys) directly on the target.
By pulling and using the container, you accept the terms and conditions of this End User License Agreement.