TAO Toolkit Getting Started


Description: Quick start guide for TAO Toolkit.
Publisher: NVIDIA
Latest Version: 5.0.0
Modified: September 27, 2023
Compressed Size: 13.6 MB

TAO Toolkit Quick Start

The NVIDIA TAO Toolkit, built on TensorFlow and PyTorch, simplifies and accelerates the model training process by abstracting away the complexity of AI models and the deep learning framework. You can use the power of transfer learning to fine-tune NVIDIA pretrained models with your own data and optimize the model for inference throughput — all without the need for AI expertise or large training datasets.

TAO quick start video.

TAO Toolkit 5.0

TAO Toolkit 5.0 packages containers, models, Jupyter notebooks, a start-up script, and a Helm chart for Kubernetes (K8s) deployment. The assets included in TAO 5.0 are listed below.

Containers

The containers for TAO Toolkit 5.0 are hosted on NGC:

| Container Name | Description | Tag |
| --- | --- | --- |
| TAO TensorFlow 1 | TensorFlow 1.15.x container for training DNNs | nvcr.io/nvidia/tao/tao-toolkit:5.0.0-tf1.15.5 |
| TAO TensorFlow 2 | TensorFlow 2.11.x container for training DNNs | nvcr.io/nvidia/tao/tao-toolkit:5.0.0-tf2.11.0 |
| TAO PyTorch | PyTorch container for training DNNs | nvcr.io/nvidia/tao/tao-toolkit:5.0.0-pyt |
| TAO Deploy | TensorRT container for optimization | nvcr.io/nvidia/tao/tao-toolkit:5.0.0-deploy |
| TAO Data Service | Container for AI-assisted annotation and a few other data services | nvcr.io/nvidia/tao/tao-toolkit:5.0.0-dataservice |
| TAO Services | Container for TAO services | nvcr.io/nvidia/tao/tao-toolkit:5.0.0-api |
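
To pull one of these containers manually, log in to the NGC registry and pull by tag. A minimal sketch, assuming your NGC API key is exported as NGC_API_KEY (the PyTorch tag is taken from the table above; substitute the tag you need):

# Log in to nvcr.io; the username is the literal string $oauthtoken, the password is your NGC API key
echo "$NGC_API_KEY" | docker login nvcr.io -u '$oauthtoken' --password-stdin
# Pull the TAO PyTorch training container listed in the table above
docker pull nvcr.io/nvidia/tao/tao-toolkit:5.0.0-pyt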

Resources

| Resource Name | Description | Location |
| --- | --- | --- |
| TAO Toolkit Getting Started | Resource to help you get started with TAO Toolkit. | nvidia/tao/tao-getting-started |

Helm Chart

The TAO Toolkit Services Helm chart is hosted on NGC at nvidia/tao/tao-toolkit-api.
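
If you want to retrieve the chart by hand, it can be fetched from the NGC Helm repository. A minimal sketch, assuming Helm 3 and an NGC API key in NGC_API_KEY; the URL follows the standard NGC chart layout, but confirm the exact chart version on NGC:

# Fetch the TAO Toolkit API chart from the NGC Helm repository (chart version is an example)
helm fetch https://helm.ngc.nvidia.com/nvidia/tao/charts/tao-toolkit-api-5.0.0.tgz \
  --username='$oauthtoken' --password="$NGC_API_KEY"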

Models

The following models are released as part of the TAO Toolkit 5.0.

| Model Name | Description | NGC Instance |
| --- | --- | --- |
| Pre-trained NVImageNet Backbone weights | Backbone weights trained on NVImageNet to facilitate transfer learning using TAO | nvidia/tao/pretrained_nvimagenet_backbones |
| Pre-trained ImageNet Backbone weights | Backbone weights trained on ImageNet to facilitate transfer learning using TAO | nvidia/tao/pretrained_imagenet_backbones |
| Optical Character Recognition | Model to recognize characters from a preceding OCDNet model | nvidia/tao/ocrnet |
| Optical Character Detection | Network to detect characters in an image | nvidia/tao/ocdnet |
| Mask Auto Label | Pretrained model to generate semantic segmentation labels | nvidia/tao/mask_auto_label |

Requirements

The following system configuration is recommended to achieve reasonable training performance with TAO Toolkit and the supported models:

  • 16 GB system RAM
  • 16 GB of GPU RAM
  • 8 core CPU
  • 1 NVIDIA GPU
  • 100 GB of SSD space

TAO Toolkit is supported on discrete GPUs, such as H100, A100, A40, A30, A2, A16, A100x, A30x, V100, T4, Titan-RTX and Quadro-RTX.

Note: TAO Toolkit is not supported on GPUs earlier than the Pascal generation.

Software requirements

| Software | Version | Comment |
| --- | --- | --- |
| Ubuntu LTS | 20.04 | |
| python | >=3.6.9, <3.7 | Not needed if you are using TAO API (see #3 below) |
| docker-ce | >19.03.5 | Not needed if you are using TAO API (see #3 below) |
| docker-API | 1.40 | Not needed if you are using TAO API (see #3 below) |
| nvidia-container-toolkit | >1.3.0-1 | Not needed if you are using TAO API (see #3 below) |
| nvidia-container-runtime | 3.4.0-1 | Not needed if you are using TAO API (see #3 below) |
| nvidia-docker2 | 2.5.0-1 | Not needed if you are using TAO API (see #3 below) |
| nvidia-driver | >525.85 | Not needed if you are using TAO API (see #3 below) |
| python-pip | >21.06 | Not needed if you are using TAO API (see #3 below) |
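
Before installing TAO, it is worth confirming that the driver, Docker, and the NVIDIA container runtime pieces of this table are in place. A minimal sanity-check sketch (the CUDA base image tag is only an example):

# Driver and GPU visible on the host
nvidia-smi
# Docker installed and running
docker --version
# Containers can see the GPU through the NVIDIA container toolkit (image tag is an example)
docker run --rm --gpus all nvidia/cuda:11.8.0-base-ubuntu20.04 nvidia-smi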

Package Content

Download the TAO package, which contains the startup scripts, Jupyter notebooks, and config files.
TAO is supported on Google Colab; if you want to try it on Colab, you can skip this step and scroll directly down to #4 in the "How to run TAO" section.

ngc registry resource download-version "nvidia/tao/tao-getting-started:5.0.0" --dest ./
cd ./getting_started_v5.0.0
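
The ngc command above assumes the NGC CLI is already installed and configured with your API key. If it is not, a minimal one-time setup looks like this:

# Configure the NGC CLI; you will be prompted for your NGC API key, org, and team
ngc config set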

File Hierarchy

setup
    |--> quickstart_launcher.sh
    |--> quickstart_api_bare_metal
    |--> quickstart_api_aws_eks
    |--> quickstart_api_azure_aks
    |--> quickstart_api_gcp_gke
notebooks
    |--> tao_api_starter_kit
        |--> api
            |--> automl
            |--> end2end
            |--> dataset_prepare
        |--> client
            |--> automl
            |--> end2end
            |--> dataset_prepare
    |--> tao_launcher_starter_kit
        |--> dino
        |--> deformable_detr
        |--> classification_pyt
        |--> ocdnet
        |-->  ...
    |--> tao_data_services
        |--> data
        |-->  ...

How to run TAO?

TAO is available as a Docker container or as a collection of Python wheels.

There are four ways to run TAO, depending on your preference and setup. The full list is below.

1. Launcher CLI

The TAO Launcher is a lightweight Python-based CLI application for running TAO. The launcher acts as a front end for the TAO Toolkit containers built on PyTorch and TensorFlow; the appropriate container is launched automatically based on the type of model you plan to use for your computer vision or conversational AI use case.


TAO Launcher

To get started, use setup/quickstart_launcher.sh to validate your setup and install the TAO launcher. Jupyter notebooks for training with the launcher are provided under notebooks/tao_launcher_starter_kit.

Detailed instructions for installing the prerequisites and setting up the launcher are provided in the TAO documentation - Launcher.
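
If you prefer to install the launcher by hand instead of through the quickstart script, a minimal sketch looks like the following. The commented training invocation mirrors the container example in section 2; the exact sub-command layout can vary between launcher versions, so check tao --help for your installation:

# Install the TAO launcher into a Python 3 environment
pip3 install nvidia-tao
# List the tasks and underlying containers the launcher knows about
tao info --verbose
# Example training run through the launcher (verify the sub-command syntax for your version)
# tao model detectnet_v2 train -e /path/to/experiment/spec.txt -r /path/to/results/dir --gpus 4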

2. Directly from Container

You also have the option of running TAO directly from the Docker container. To use a container directly, you need to know which container to pull: there are multiple containers under TAO, and you will need the appropriate one for the model you want to train. This is not required when using the Launcher CLI.

export DOCKER_REGISTRY="nvcr.io"
export DOCKER_NAME="nvidia/tao/tao-toolkit"
export DOCKER_TAG="***" ## for TensorFlow docker
export DOCKER_TAG="***" ## for PyTorch docker
export DOCKER_CONTAINER=$DOCKER_REGISTRY/$DOCKER_NAME:$DOCKER_TAG

docker run -it --rm --gpus all -v /path/in/host:/path/in/docker $DOCKER_CONTAINER \
detectnet_v2 train -e /path/to/experiment/spec.txt -r /path/to/results/dir -k $KEY --gpus 4

More information about running directly from Docker is provided in the TAO documentation - Container.

3. TAO APIs

TAO Toolkit API is a Kubernetes service that enables building end-to-end AI models using REST APIs. The API service can be installed on a Kubernetes cluster (local or AWS EKS) using a Helm chart with minimal dependencies. TAO Toolkit jobs can run on the GPUs available in the cluster and can scale to a multi-node setting. You can use the TAO client CLI to interact with the TAO services remotely, or integrate them into your own apps and services directly through the REST APIs.


TAO API

To get started, use the provided one-click deploy scripts to deploy either on a bare-metal setup or on a managed Kubernetes service like Amazon EKS. Jupyter notebooks for training through the APIs directly or through the client app are provided under notebooks/tao_api_starter_kit.

setup/quickstart_api_bare_metal
setup/quickstart_api_aws_eks

More information about setting up the API services and about the API itself is provided in the TAO documentation - API.
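
Once the deploy script completes, you can confirm the service is up on the cluster before running the notebooks. A minimal sketch, assuming kubectl is pointed at the cluster; the grep pattern is only illustrative, since the release and service names depend on how the chart was installed:

# Confirm the TAO API pods are running
kubectl get pods
# Locate the service that exposes the REST API (name depends on the Helm release)
kubectl get services | grep -i tao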

4. Python Wheel

You can also run TAO directly on bare metal without Docker or K8s, and you can deploy TAO notebooks directly on Google Colab without having to configure any infrastructure. The full instructions are provided in the Colab notebooks linked below; a minimal wheel-install sketch follows the table.

| CV Task | Model Arch | One-click Deploy |
| --- | --- | --- |
| Classification | ResNet18 | Train on Colab |
| Multi-task Classification | ResNet18 | Train on Colab |
| Object Detection | DSSD | Train on Colab |
| Object Detection | EfficientDet | Train on Colab |
| Object Detection | RetinaNet | Train on Colab |
| Object Detection | SSD | Train on Colab |
| Object Detection | YOLOv3 | Train on Colab |
| Object Detection | YOLOv4 | Train on Colab |
| Object Detection | YOLOv4 Tiny | Train on Colab |
| Action Recognition | ActionRecognition | Train on Colab |
| OCR | LPRNet | Train on Colab |
| Pose Action Classification | PoseClassificationNet | Train on Colab |
| Emotion Recognition | EmotionNet | Train on Colab |
| Gesture Recognition | GestureNet | Train on Colab |
| Heart Rate Estimation | HeartRateNet | Train on Colab |
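
For a bare-metal setup without Docker, the TAO Deploy module, for example, is also distributed as a Python wheel. A minimal sketch, assuming Python 3 and pip; the package name reflects the TAO 5.0 release, so confirm the exact wheel names and versions on PyPI before relying on them:

# TensorRT-based TAO Deploy module installed as a plain Python wheel
pip3 install nvidia-tao-deploy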

After starting the TAO service locally or remotely, start Jupyter notebook:

jupyter notebook --ip 0.0.0.0 --port 8888 --allow-root

Open a web browser on the local machine and navigate to the following URL: http://0.0.0.0:8888

Open the notebook that you are interested in and start training.

Note: All the instructions to train, prune, optimize and download pretrained models are provided in the notebook.

Jupyter notebooks

All notebooks and the required spec files are provided in this package. The tables below map which notebook to use for fine-tuning either a purpose-built model like PeopleNet or an open model architecture like YOLO.

| Purpose-built Model | Launcher CLI notebook |
| --- | --- |
| PCB Classification | notebooks/tao_launcher_starter_kit/classification_pyt/classification.ipynb |
| Retail Object Recognition | notebooks/tao_launcher_starter_kit/metric_learning_recognition/metric_learning_recognition.ipynb |
| Optical Inspection | notebooks/tao_launcher_starter_kit/optical_inspection/OpticalInspection.ipynb |
| Mask Auto Label | notebooks/tao_launcher_starter_kit/mal/mal.ipynb |
| OCRNet | notebooks/tao_launcher_starter_kit/ocrnet/ocrnet.ipynb |
| OCDNet | notebooks/tao_launcher_starter_kit/ocdnet/ocdnet.ipynb |
| PeopleSemSegFormer | notebooks/tao_launcher_starter_kit/segformer/segformer.ipynb |
| PeopleNet | notebooks/tao_launcher_starter_kit/detectnet_v2/detectnet_v2.ipynb |
| TrafficCamNet | notebooks/tao_launcher_starter_kit/detectnet_v2/detectnet_v2.ipynb |
| DashCamNet | notebooks/tao_launcher_starter_kit/detectnet_v2/detectnet_v2.ipynb |
| FaceDetectIR | notebooks/tao_launcher_starter_kit/detectnet_v2/detectnet_v2.ipynb |
| VehicleMakeNet | notebooks/tao_launcher_starter_kit/classification/classification.ipynb |
| VehicleTypeNet | notebooks/tao_launcher_starter_kit/classification/classification.ipynb |
| PeopleSegNet | notebooks/tao_launcher_starter_kit/mask_rcnn/mask_rcnn.ipynb |
| PeopleSemSegNet | notebooks/tao_launcher_starter_kit/unet/unet_isbi.ipynb |
| Bodypose Estimation | notebooks/tao_launcher_starter_kit/bpnet/bpnet.ipynb |
| License Plate Detection | notebooks/tao_launcher_starter_kit/detectnet_v2/detectnet_v2.ipynb |
| License Plate Recognition | notebooks/tao_launcher_starter_kit/lprnet/lprnet.ipynb |
| Facial Landmark | notebooks/tao_launcher_starter_kit/fpenet/fpenet.ipynb |
| FaceDetect | notebooks/tao_launcher_starter_kit/facenet/facenet.ipynb |
| ActionRecognitionNet | notebooks/tao_launcher_starter_kit/action_recognition_net/actionrecognitionnet.ipynb |
| PoseClassificationNet | notebooks/tao_launcher_starter_kit/pose_classification_net/pose_classificationnet.ipynb |
| ReIdentificationNet | notebooks/tao_launcher_starter_kit/re_identification_net/reidentificationnet.ipynb |

| Open model architecture | Jupyter notebook |
| --- | --- |
| Deformable DETR | notebooks/tao_launcher_starter_kit/deformable_detr/deformable_detr.ipynb |
| DINO | notebooks/tao_launcher_starter_kit/dino/dino.ipynb |
| Image Classification (PyTorch) | notebooks/tao_launcher_starter_kit/classification_pyt/classification_pyt.ipynb |
| Image Classification (TF2) | notebooks/tao_launcher_starter_kit/classification_tf2/classification.ipynb |
| Image Classification (TF1) | notebooks/tao_launcher_starter_kit/classification_tf1/classification.ipynb |
| Optical Inspection | notebooks/tao_launcher_starter_kit/optical_inspection/optical_inspection.ipynb |
| Metric Learning Recognition | notebooks/tao_launcher_starter_kit/metric_learning_recognition/metric_learning_recognition.ipynb |
| Segformer | notebooks/tao_launcher_starter_kit/segformer/segformer.ipynb |
| DetectNet_v2 | notebooks/tao_launcher_starter_kit/detectnet_v2/detectnet_v2.ipynb |
| FasterRCNN | notebooks/tao_launcher_starter_kit/faster_rcnn/faster_rcnn.ipynb |
| YOLOv3 | notebooks/tao_launcher_starter_kit/yolo_v3/yolo_v3.ipynb |
| YOLOv4 | notebooks/tao_launcher_starter_kit/yolo_v4/yolo_v4.ipynb |
| YOLOv4-Tiny | notebooks/tao_launcher_starter_kit/yolo_v4_tiny/yolo_v4_tiny.ipynb |
| SSD | notebooks/tao_launcher_starter_kit/ssd/ssd.ipynb |
| DSSD | notebooks/tao_launcher_starter_kit/dssd/dssd.ipynb |
| RetinaNet | notebooks/tao_launcher_starter_kit/retinanet/retinanet.ipynb |
| MaskRCNN | notebooks/tao_launcher_starter_kit/mask_rcnn/mask_rcnn.ipynb |
| UNET | notebooks/tao_launcher_starter_kit/unet/unet_isbi.ipynb |
| EfficientDet | notebooks/tao_launcher_starter_kit/efficientdet/efficientdet.ipynb |
| Mask Auto Label | notebooks/tao_launcher_starter_kit/mal/mal.ipynb |

Blogs

Train like a 'pro' with AutoML in TAO
Deploy TAO on Azure ML
Synthetic Data and TAO
Action Recognition Blog
Real-time License Plate Detection
2D Pose Estimation: Part 1
2D Pose Estimation: Part 2
Building ConvAI with TAO Toolkit

License

The license for the TAO containers is included in the banner of each container. Licenses for the pre-trained models are available with the model cards on NGC. By pulling and using the Train Adapt Optimize (TAO) Toolkit container to download models, you accept the terms and conditions of these licenses.