NGC | Catalog

TAO Image Classification


Description

In this notebook, you will learn how to leverage the simplicity and convenience of TAO to take a pretrained ResNet-18 model and fine-tune it on a sample dataset converted from Pascal VOC.

Publisher

NVIDIA

Use Case

Other

Framework

Other

Latest Version

v1

Modified

May 24, 2022

Compressed Size

3.27 MB

Image Classification Jupyter Notebook

Image classification is the task of categorizing an image into one of several predefined classes, often also giving a probability of the input belonging to a certain class. This task is crucial in understanding and analyzing images, and it comes quite effortlessly to human beings with our complex visual systems.

This sample Jupyter notebook contains a pretrained classification model that you can retrain on Google Vertex AI simply by deploying it with the NGC One-Click Deploy feature.


About this Model

The model used is a pretrained classification model built on the ResNet-18 architecture. The resource ships with pretrained weights for many popular classification architectures, which can be used as a starting point to customize the model for your use case with the NVIDIA TAO Toolkit.

We will be using the Pascal VOC dataset to retrain the model. You’ll need to manually download the dataset from the link shown here: http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar

Once downloaded, move the data to the $DATA_DOWNLOAD_DIR as prescribed in the Jupyter notebook.
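The two steps above (download, then move into place) can be sketched as a small shell helper. This is an illustrative sketch, not taken from the notebook itself: the default download path is an assumption, and `$DATA_DOWNLOAD_DIR` is presumed to match the variable the notebook uses.

```shell
# Sketch of staging the Pascal VOC 2012 data for the notebook.
# The default path below is illustrative; the notebook defines its own
# $DATA_DOWNLOAD_DIR.
VOC_URL="http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar"
DATA_DOWNLOAD_DIR="${DATA_DOWNLOAD_DIR:-$HOME/tao-experiments/data}"
TARBALL="$(basename "$VOC_URL")"

stage_voc_data() {
    mkdir -p "$DATA_DOWNLOAD_DIR"
    # Skip the download if the (roughly 2 GB) tarball is already in place.
    [ -f "$DATA_DOWNLOAD_DIR/$TARBALL" ] || \
        wget -O "$DATA_DOWNLOAD_DIR/$TARBALL" "$VOC_URL"
    # Unpack the archive next to the tarball.
    tar -xf "$DATA_DOWNLOAD_DIR/$TARBALL" -C "$DATA_DOWNLOAD_DIR"
}
```

Call `stage_voc_data` before running the notebook cells that expect the data under `$DATA_DOWNLOAD_DIR`.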


Customizing the model with the TAO Toolkit

To fine-tune and customize the model, you’ll be using the TAO (Train, Adapt and Optimize) Toolkit. The TAO Toolkit, a low-code AI model development solution, leverages the power of transfer learning to help fine-tune pretrained models with your own data. Transfer learning is the process of transferring learned features from one application to another: a model trained on one task is retrained to perform a different but related task. With the TAO Toolkit, you can customize models for tasks in computer vision, natural language processing, and speech.

Once you have customized the model, you can then use the built-in optimization techniques such as model pruning and quantization to optimize the model for inference on the target GPU, without sacrificing accuracy.

All the training steps are covered in the Jupyter notebook.
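At a high level, the notebook's workflow is fine-tune, then prune and export for inference. The dry-run sketch below prints the shape of those TAO Toolkit commands rather than executing them; all paths, file names, and the `$KEY` value are illustrative assumptions, and the notebook's own cells set the real ones.

```shell
# Dry-run sketch of the TAO Toolkit steps covered in the notebook.
# Paths, checkpoint names, and $KEY are assumptions for illustration only.
SPECS_DIR="${SPECS_DIR:-/workspace/tao-experiments/specs}"
EXP_DIR="${EXP_DIR:-/workspace/tao-experiments/classification}"
KEY="${KEY:-tlt_encode}"   # key used to load/save the encrypted model files

# 1. Fine-tune the pretrained ResNet-18 on the prepared Pascal VOC data.
echo "tao classification train -e $SPECS_DIR/classification_spec.cfg -r $EXP_DIR/output -k $KEY"

# 2. Prune the trained model to shrink it for inference.
echo "tao classification prune -m $EXP_DIR/output/weights/resnet_final.tlt -o $EXP_DIR/output/resnet_pruned.tlt -k $KEY"

# 3. Export the retrained, pruned model for deployment on the target GPU.
echo "tao classification export -m $EXP_DIR/output/resnet_pruned.tlt -o $EXP_DIR/export/final_model.etlt -k $KEY"
```

Printing the commands first lets you review the paths before running them inside the TAO environment, where the `tao` launcher is available.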

Get Started with Training

To help you get started, we have created a sample Jupyter Notebook that can be easily deployed on Vertex AI using NGC’s One-Click Deploy feature. This feature automatically sets up the Vertex AI instance with an optimal configuration, preloads the dependencies, and runs the software from NGC, with no need to set up the infrastructure yourself.

Simply click on the button that reads “Deploy to Vertex AI” and follow the instructions.

Note: A customized Jupyter Notebook kernel, built on the TAO Toolkit container, is used as the primary mechanism for deployment. For more information on the container itself, please refer to this link:

https://catalog.ngc.nvidia.com/orgs/nvidia/teams/tao/containers/tao-toolkit-tf

The container version is:

nvcr.io/nvidia/tao/tao-toolkit-tf:v3.21.11-tf1.15.4-py3
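If you prefer to work with the container directly instead of using One-Click Deploy, the snippet below prints the pull command for the exact tag listed above (printed rather than executed, since pulling requires Docker and NGC credentials on your machine):

```shell
# The exact container tag behind the notebook's kernel, from the page above.
TAO_IMAGE="nvcr.io/nvidia/tao/tao-toolkit-tf:v3.21.11-tf1.15.4-py3"
# Print the pull command for review; run it on a Docker-equipped host
# that is logged in to nvcr.io.
echo "docker pull $TAO_IMAGE"
```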