NGC | Catalog

DeepStream

Description
DeepStream SDK delivers a complete streaming analytics toolkit for AI-based video and image understanding and multi-sensor processing. This container is for NVIDIA Enterprise GPUs.
Publisher: NVIDIA
Latest Tag: 6.4-samples-multiarch
Modified: December 15, 2023
Compressed Size: 6.63 GB
Multinode Support: No
Multi-Arch Support: Yes
6.4-samples-multiarch (Latest) Security Scan Results

Linux / amd64


Linux / arm64


Before You Start

We have introduced a new NGC DeepStream SDK Collection.

This collection serves as a hub for all DeepStream assets. Make sure you check it out!

DeepStream containers for Jetson-based devices

The table below describes the different container options offered for Jetson-based devices:

deepstream:6.4-triton-multiarch
  Architecture: Multi-Arch (x86 + Jetson)
  License Type: Deployment
  Notes: The DeepStream Triton container enables inference using Triton Inference Server. With Triton, developers can run inference natively using TensorFlow, TensorFlow-TensorRT, PyTorch, and ONNX-RT. Inference with Triton is supported in the reference application (deepstream-app).

deepstream:6.4-samples-multiarch
  Architecture: Multi-Arch (x86 + Jetson)
  License Type: Deployment
  Notes: The DeepStream samples container extends the base container to also include the sample applications shipped with the DeepStream SDK, along with associated config files, models, and streams. This container is ideal for understanding and exploring the DeepStream SDK using the provided samples.

Notes:

- DeepStream dockers, or dockers derived from releases before DeepStream 6.1, will need to update their CUDA GPG key to perform software updates. You can find additional details here.

- These containers use the NVIDIA Container Runtime for Jetson to run DeepStream applications. The NVIDIA Container Toolkit seamlessly exposes specific parts of the device (i.e., the BSP) to the DeepStream container, giving applications the resources they need to run.

- Starting with JetPack 6.0 Developer Preview, the NVIDIA Container Runtime no longer mounts user-level libraries such as CUDA, cuDNN, and TensorRT from the host. These are instead installed inside the containers.

Getting Started

Prerequisites:

Ensure these prerequisites are installed on your system before proceeding to the next step:

JetPack 6.0 Developer Preview
  A Jetson device running L4T BSP r36.2.

Codecs script
  DeepStream dockers no longer package libraries for certain multimedia operations such as audio data parsing, CPU decode, and CPU encode, which translates into limited functionality with MP4 files. We provide a script to install these components; make sure to execute it within the container:
  /opt/nvidia/deepstream/deepstream/user_additional_install.sh

Fix for RTSP EOS issue
  With RTSP streams, the application sometimes gets stuck on reaching EOS because of an issue in the rtpjitterbuffer component. To fix this, a script, update_rtpmanager.sh, is provided at /opt/nvidia/deepstream/deepstream/ with the details required to update the gstrtpmanager library. Execute the script after the packages mentioned above have been installed.
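Taken together, the two prerequisite scripts are run in sequence from a shell inside the container. A minimal sketch, using the paths documented on this page (this only works inside a running DeepStream 6.4 container, not on the host):

```shell
# Run inside a DeepStream 6.4 container, not on the host.
DS=/opt/nvidia/deepstream/deepstream

# 1. Install the multimedia components (audio parsing, CPU decode/encode)
#    that are no longer packaged in the docker image.
"$DS/user_additional_install.sh"

# 2. Apply the rtpjitterbuffer fix so RTSP streams reach EOS cleanly.
#    Must run after the packages from step 1 are installed.
"$DS/update_rtpmanager.sh"
```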

Pull the container:

1. From the top-right corner of this page, select the Get Container pull-down and copy the URL of the default container. Alternatively, click View all tags to select a different container.

2. Open a command prompt on your Linux-compatible system and run the following command. Ensure the pull completes successfully before proceeding to the next step.

docker pull nvcr.io/nvidia/deepstream:6.4-triton-multiarch

Run the container:

  1. Allow external applications to connect to the host's X display:
xhost +
  2. Run the docker container (use the desired container tag in the command line below):
docker run -it --rm --net=host --runtime nvidia -e DISPLAY=$DISPLAY -w /opt/nvidia/deepstream/deepstream-6.4 -v /tmp/.X11-unix/:/tmp/.X11-unix nvcr.io/nvidia/deepstream:6.4-triton-multiarch
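Once the container is running, the reference application can be tried against one of the bundled sample configurations. A hedged sketch of a session inside the container (the exact config filename varies by release, so the one below is an illustrative assumption; list the directory first to see what your container ships):

```shell
# Inside the container (the -w option above sets the working directory).
cd /opt/nvidia/deepstream/deepstream-6.4/samples/configs/deepstream-app

# See which sample configurations this container actually ships.
ls *.txt

# Launch the reference app with one of them (the name below is an example only).
deepstream-app -c source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt
```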

 

Docker command line options explained:

-it
  Run the container in interactive mode.

--runtime nvidia
  Use an alternative runtime (the NVIDIA container runtime).

--rm
  Delete the container when it exits.

--privileged
  Grants the container access to host resources. This flag is needed to run Graph Composer from the -devel container.

-v
  Specifies a mount directory; used here to mount the host's X11 display into the container filesystem to render output videos. Users can mount additional directories with -v as needed to easily access configuration files, models, and other resources (e.g., -v /home:/home mounts the home directory into the container filesystem).

--cap-add SYSLOG
  Needs to be included to enable use of the nvds_logger functionality inside the container.

-p
  Maps a network port from the container to the host to enable incoming connections, e.g. for RTSP out: -p 8554:8554.
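Putting several of the options above together: the sketch below extends the earlier docker run command with the optional logging, RTSP port, and home-directory mounts described in the table (the tag and port choices are illustrative, not required):

```shell
# Optional flags added to the base command: --cap-add SYSLOG (nvds_logger),
# -p 8554:8554 (RTSP out), and -v /home:/home (easy access to configs/models).
docker run -it --rm --net=host --runtime nvidia \
  -e DISPLAY=$DISPLAY \
  -v /tmp/.X11-unix/:/tmp/.X11-unix \
  -v /home:/home \
  --cap-add SYSLOG \
  -p 8554:8554 \
  -w /opt/nvidia/deepstream/deepstream-6.4 \
  nvcr.io/nvidia/deepstream:6.4-triton-multiarch
```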

 

See /opt/nvidia/deepstream/deepstream-6.4/README inside the container for deepstream-app usage information.

To access a CSI camera from Docker, add -v /tmp/argus_socket:/tmp/argus_socket to the docker command above. For a USB camera, add the --device /dev/video argument.
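For example, the camera arguments slot into the same docker run command. The device path /dev/video0 is an assumption; check yours with ls /dev/video* on the host:

```shell
# CSI camera: mount the Argus socket. USB camera: pass the video device through
# (/dev/video0 is an assumed path; substitute your actual device).
docker run -it --rm --net=host --runtime nvidia \
  -e DISPLAY=$DISPLAY \
  -v /tmp/.X11-unix/:/tmp/.X11-unix \
  -v /tmp/argus_socket:/tmp/argus_socket \
  --device /dev/video0 \
  nvcr.io/nvidia/deepstream:6.4-triton-multiarch
```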

 


Known Limitations

- The DeepStream containers for Jetson are intended to be a deployment vehicle. Please refer to the “Docker Containers” section of the DeepStream 6.4 Plugin Guide for instructions on how to build custom containers based on DeepStream from either a Jetson device or your workstation.

 

DeepStream 6.4 Triton containers (x86 and Jetson) have the following CVE(s):

 

CVE-2022-29501
  This impacts both the libslurm37 and libpmi2-0-dev packages. SchedMD Slurm 21.08.x through 20.11.x has Incorrect Access Control that leads to Escalation of Privileges and code execution. There is no public fix available for Ubuntu 22.04; there is a patch for those who have an Ubuntu Pro subscription, and a fix is available upstream. This is not a CVE specific to DeepStream.

CVE-2022-29500
  This impacts both the libslurm37 and libpmi2-0 packages. SchedMD Slurm 21.08.x through 20.11.x has Incorrect Access Control that leads to Information Disclosure. There is no public fix available for Ubuntu 22.04; there is a patch for those who have an Ubuntu Pro subscription, and a fix is available upstream. This is not specific to DeepStream.

CVE-2023-4563
  This is a duplicate of CVE-2023-4244, which impacts the Linux kernel: a use-after-free vulnerability in the kernel's netfilter: nf_tables component can be exploited to achieve local privilege escalation. This was discovered late in the development cycle. There is a patch available that users can apply. This impacts the systems on which DeepStream containers run on top of a Linux kernel.

CVE-2023-6176
  This impacts the Linux kernel. A null pointer dereference flaw was found in the Linux kernel API for the cryptographic algorithm scatterwalk functionality. This issue occurs when a user constructs a malicious packet with a specific socket configuration, which could allow a local user to crash the system or escalate their privileges on the system. This was discovered during the release cycle. There is a patch for those who have an Ubuntu Pro subscription, and a fix is available upstream. This impacts the systems on which DeepStream containers run on top of a Linux kernel.

 

License

The following licenses apply to the DeepStream SDK assets:

SDK
  Applicable EULA: DeepStream SDK EULA
  A copy of the license is available in the SDK at /opt/nvidia/deepstream/deepstream-6.4/LicenseAgreement.pdf

Containers
  Applicable EULA: DeepStream NGC License
  The license grants redistribution rights, allowing developers to build applications on top of the DeepStream containers.

Development Containers
  Applicable EULA: DeepStream NGC Development License
  A development-only license; does not allow redistribution of the container.

TAO Models
  Applicable EULA: NVIDIA AI Product License
  All TAO pre-trained models included in the DeepStream SDK are covered by the NVIDIA AI Product License.

NOTE: By pulling, downloading, or using the DeepStream SDK, you accept the terms and conditions of the EULA licenses listed above.

Please note that all container images come with the following packages installed:

The software listed below is provided under the terms of GPLv3.

To obtain source code for software provided under licenses that require redistribution of source code, including the GNU General Public License (GPL) and GNU Lesser General Public License (LGPL), contact oss-requests@nvidia.com. This offer is valid for a period of three (3) years from the date of the distribution of this product by NVIDIA CORPORATION.

autoconf: GPL 3.0
libtool: GPL 3.0
libglvnd-dev: GPL 3.0
libgl1-mesa-dev: GPL 3.0
libegl1-mesa-dev: GPL 3.0
libgles2-mesa-dev: GPL 3.0

Ethical AI

NVIDIA’s platforms and application frameworks enable developers to build a wide array of AI applications. Consider potential algorithmic bias when choosing or creating the models being deployed. Work with the model’s developer to ensure that it meets the requirements for the relevant industry and use case; that the necessary instruction and documentation are provided to understand error rates, confidence intervals, and results; and that the model is being used under the conditions and in the manner intended.