
Holoscan Container





Latest Tag: v0.6.0-dgpu (December 1, 2023)

Compressed Size: 4.93 GB

Multi-Arch Support: Linux / arm64, Linux / amd64


The Holoscan container is part of NVIDIA Holoscan, the AI sensor processing platform that combines hardware systems for low-latency sensor and network connectivity, optimized libraries for data processing and AI, and core microservices to run streaming, imaging, and other applications, from embedded to edge to cloud. It can be used to build streaming AI pipelines for a variety of domains, including Medical Devices, High Performance Computing at the Edge, Industrial Inspection and more.

In previous releases, the prefix Clara was used because Holoscan was initially designed as a platform for medical devices. As Holoscan has grown, its potential to serve other areas has become apparent. With version 0.4.0, we're proud to announce that the Holoscan SDK is now officially built to be domain-agnostic and can be used to build sensor AI applications in multiple domains. Note that some of the content of the SDK (sample applications) or the documentation might still appear to be healthcare-specific pending additional updates. Going forward, domain-specific content will be hosted on the HoloHub repository.

The Holoscan container includes the Holoscan libraries, GXF extensions, headers, example source code, and sample datasets, as well as all the dependencies that were tested with Holoscan. It is the recommended way to run the Holoscan examples, while still allowing you to create your own C++ and Python Holoscan application.

Getting Started

Visit the Holoscan User Guide to get started with the Holoscan SDK.


  • Prerequisites for each supported platform are documented in the user guide.
  • Additionally, on x86_64, you'll need the NVIDIA Container Toolkit with Docker. This should already be installed on NVIDIA developer kits with HoloPack or JetPack.
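Before pulling the image, it can help to confirm that Docker sees the NVIDIA runtime. This quick sanity check uses standard Docker/NVIDIA tooling only (nothing Holoscan-specific):

```shell
# Check that the NVIDIA runtime is registered with Docker (the `docker run`
# step below relies on it). Standard tooling only; nothing Holoscan-specific.
if docker info 2>/dev/null | grep -qi 'nvidia'; then
  echo "NVIDIA runtime detected"
else
  echo "NVIDIA runtime not detected: install/configure the NVIDIA Container Toolkit" >&2
fi
```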

Running the container

  1. Log in to the NGC docker registry

    docker login nvcr.io
  2. Press the Copy Image Path button at the top of this webpage and choose the version you want to test:

    • select v<version>-dgpu for x86_64 systems or a holoscan developer kit configured with a discrete GPU
    • select v<version>-igpu for holoscan developer kits configured with an integrated GPU

    Set it as your NGC_CONTAINER_IMAGE_PATH in your terminal.

    # For example
    export NGC_CONTAINER_IMAGE_PATH="nvcr.io/nvidia/clara-holoscan/holoscan:v0.6.0-dgpu"
  3. Ensure that X11 is configured to allow commands from docker:

    xhost +local:docker
  4. Start the container

    • Add --device /dev/ajantv20:/dev/ajantv20 in the docker run command if you also have an AJA capture card you'd like to access from the container.
    • Similarly, add --device /dev/video0:/dev/video0 (and/or video1, etc...) and --group-add video to make your V4L2 video devices (USB, HDMI IN) available in the container.
    # Find the nvidia_icd.json file, which could reside at different paths
    nvidia_icd_json=$(find /usr/share /etc -path '*/vulkan/icd.d/nvidia_icd.json' -type f,l -print -quit 2>/dev/null | grep .) || (echo "nvidia_icd.json not found" >&2 && false)
    # --ipc=host, --cap-add=CAP_SYS_PTRACE, and --ulimit memlock=-1 are needed for distributed applications using UCX to work.
    # Run the container
    docker run -it --rm --net host \
      --runtime=nvidia \
      -v /tmp/.X11-unix:/tmp/.X11-unix \
      -v $nvidia_icd_json:$nvidia_icd_json:ro \
      -e DISPLAY=$DISPLAY \
      -e NVIDIA_DRIVER_CAPABILITIES=graphics,video,compute,utility,display \
      --ipc=host \
      --cap-add=CAP_SYS_PTRACE \
      --ulimit memlock=-1 \
      ${NGC_CONTAINER_IMAGE_PATH}

    For iGPU containers, you might need to add --privileged to run vulkan/holoviz applications at this time.
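The steps above can be collected into one helper. This is only a sketch: `launch_holoscan` is not part of the SDK, and the flags mirror step 4 (it assumes you already ran `docker login nvcr.io` and exported NGC_CONTAINER_IMAGE_PATH in step 2):

```shell
# Sketch: steps 2-4 wrapped in a single shell function. Assumes you already
# ran `docker login nvcr.io` and exported NGC_CONTAINER_IMAGE_PATH (step 2).
launch_holoscan() {
  : "${NGC_CONTAINER_IMAGE_PATH:?export this first (see step 2)}"
  # Allow X11 commands from docker
  xhost +local:docker
  # Locate the Vulkan ICD manifest for the NVIDIA driver
  local nvidia_icd_json
  nvidia_icd_json=$(find /usr/share /etc -path '*/vulkan/icd.d/nvidia_icd.json' \
    -type f,l -print -quit 2>/dev/null | grep .) \
    || { echo "nvidia_icd.json not found" >&2; return 1; }
  docker run -it --rm --net host \
    --runtime=nvidia \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    -v "$nvidia_icd_json:$nvidia_icd_json:ro" \
    -e DISPLAY="$DISPLAY" \
    -e NVIDIA_DRIVER_CAPABILITIES=graphics,video,compute,utility,display \
    --ipc=host --cap-add=CAP_SYS_PTRACE --ulimit memlock=-1 \
    "$NGC_CONTAINER_IMAGE_PATH" "$@"
}
```

Any extra arguments are passed through as the container command; add the `--device`/`--group-add` flags from step 4 inside the function if you need AJA or V4L2 devices.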

Using the installed libraries and headers

The Holoscan SDK is installed under /opt/nvidia/holoscan. It includes a CMake configuration file inside lib/cmake/holoscan, allowing you to import holoscan in your CMake project (link libraries + include headers):

find_package(holoscan REQUIRED CONFIG PATHS "/opt/nvidia/holoscan")
target_link_libraries(yourTarget PUBLIC holoscan::core)

Alternatives to hardcoding PATHS inside find_package in CMake are listed under the Config Mode Search Procedure documentation.


Python, C++, and GXF examples are installed in /opt/nvidia/holoscan/examples alongside their source code, and run instructions (also available on the GitHub repository).

Running the examples

For example, to run the Hello World example:

# Python
python3 /opt/nvidia/holoscan/examples/hello_world/python/hello_world.py

# C++
cd /opt/nvidia/holoscan/examples
./hello_world/cpp/hello_world

Make sure to edit any relative paths in the yaml config files if you want to run from a different working directory.

Building the examples

You can rebuild the C++ and GXF examples as-is or copy them anywhere on your system to experiment with.

For example, to build all the C++ and GXF examples:

export src_dir="/opt/nvidia/holoscan/examples/" # Add "<example_of_your_choice>/cpp" to build a specific example
export build_dir="</path/of/your/choice/>"
cmake -S $src_dir -B $build_dir -G Ninja \
  -D Holoscan_ROOT="/opt/nvidia/holoscan"
cmake --build $build_dir -j
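After the build, each binary lands in the build tree at a path mirroring the examples layout. This is an assumption based on standard CMake out-of-source builds; adjust the names to the example you built:

```shell
# Where a built example binary lands, assuming the standard CMake layout
# where the build tree mirrors the source tree (hello_world as the example).
build_dir="/tmp/holoscan_examples_build"   # use the build_dir you chose above
example="hello_world"
binary="$build_dir/$example/cpp/$example"
echo "$binary"   # run this after `cmake --build` completes
```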

Also see the HoloHub repository for a collection of Holoscan operators and applications which you can use in your pipeline or for reference.


By pulling and using the container, you accept the terms and conditions of this End User License Agreement.