
Merlin Jupyter Notebook Examples

Description
This resource is a collection of Jupyter notebooks that provide end-to-end examples for NVIDIA Merlin.
Publisher
NVIDIA
Latest Version
1
Modified
April 4, 2023
Compressed Size
170.36 KB

These example notebooks demonstrate how to use NVTabular with TensorFlow, PyTorch, and HugeCTR. Each example provides additional details about the end-to-end workflow, which includes ETL, Training, and Inference.

Each example is structured as follows:

  1. 01-Download-Convert.ipynb: Demonstrates how to download the dataset and convert it into the correct format so that it can be consumed.
  2. 02-ETL-with-NVTabular.ipynb: Demonstrates how to execute the preprocessing and feature engineering pipeline (ETL) with NVTabular on the GPU (see the ETL sketch after this list).
  3. 03-Training-with-TF.ipynb: Demonstrates how to train a model with TensorFlow based on the ETL output (see the training sketch after this list).
  4. 03-Training-with-PyTorch.ipynb: Demonstrates how to train a model with PyTorch based on the ETL output.
  5. 03-Training-with-HugeCTR.ipynb: Demonstrates how to train a model with HugeCTR based on the ETL output.
  6. 04-Triton-Inference-with-TF.ipynb: Demonstrates how to serve the trained model and run inference with the Triton Inference Server (the exact inference notebook depends on the deep learning framework used for training; see the client sketch after this list).
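
To give a rough idea of what the ETL notebooks do, here is a minimal sketch of an NVTabular preprocessing workflow. The column names, operator choices, and file paths are hypothetical placeholders, not the ones used in the notebooks:

    import nvtabular as nvt
    from nvtabular import ops

    # Encode categorical columns to contiguous integer ids and
    # impute/normalize continuous columns (placeholder column names).
    cat_features = ["user_id", "item_id"] >> ops.Categorify()
    cont_features = ["price"] >> ops.FillMissing() >> ops.Normalize()

    workflow = nvt.Workflow(cat_features + cont_features + ["label"])

    # Fit statistics (category mappings, means, stds) on the training data,
    # then write the transformed data back out as Parquet.
    train_ds = nvt.Dataset("train.parquet")
    workflow.fit(train_ds)
    workflow.transform(train_ds).to_parquet(output_path="train_processed/")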
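
The training notebooks consume the ETL output through the NVTabular/Merlin GPU data loaders. The following is a framework-only sketch using plain Keras with pandas input and hypothetical column names and vocabulary sizes; it is not the data loader or model architecture used in the notebooks:

    import pandas as pd
    import tensorflow as tf

    df = pd.read_parquet("train_processed/")   # hypothetical ETL output path
    labels = df.pop("label").values
    features = {name: df[name].values for name in ["user_id", "item_id", "price"]}

    # Tiny embedding-based binary classifier over the Categorify-encoded ids.
    inputs = {
        "user_id": tf.keras.Input(shape=(1,), name="user_id", dtype=tf.int64),
        "item_id": tf.keras.Input(shape=(1,), name="item_id", dtype=tf.int64),
        "price": tf.keras.Input(shape=(1,), name="price", dtype=tf.float32),
    }
    user_emb = tf.keras.layers.Flatten()(tf.keras.layers.Embedding(10_000, 16)(inputs["user_id"]))
    item_emb = tf.keras.layers.Flatten()(tf.keras.layers.Embedding(10_000, 16)(inputs["item_id"]))
    x = tf.keras.layers.Concatenate()([user_emb, item_emb, inputs["price"]])
    x = tf.keras.layers.Dense(64, activation="relu")(x)
    output = tf.keras.layers.Dense(1, activation="sigmoid")(x)

    model = tf.keras.Model(inputs=inputs, outputs=output)
    model.compile(optimizer="adam", loss="binary_crossentropy")
    model.fit(features, labels, batch_size=1024, epochs=1)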
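
The inference notebooks deploy the trained model behind the Triton Inference Server and query it from a client. Below is a minimal sketch of a request with the Triton Python HTTP client, assuming a server on localhost:8000 and hypothetical model, input, and output names:

    import numpy as np
    import tritonclient.http as httpclient

    client = httpclient.InferenceServerClient(url="localhost:8000")

    # Send already-preprocessed feature values (placeholder names and shapes).
    user_ids = np.array([[1], [2]], dtype=np.int64)
    inputs = [httpclient.InferInput("user_id", list(user_ids.shape), "INT64")]
    inputs[0].set_data_from_numpy(user_ids)

    outputs = [httpclient.InferRequestedOutput("output_1")]
    response = client.infer(model_name="merlin_model", inputs=inputs, outputs=outputs)
    print(response.as_numpy("output_1"))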

Deploying the notebooks

To deploy these notebooks to the cloud with the optimal configuration, please click the "deploy" button at the top right of the NGC page.

Reading the notebook without leaving NGC

If you want to read through the notebook example without leaving our website, follow these steps:

  1. Navigate to the File Browser tab of the asset in NGC
  2. Select the version you'd like to see
  3. Next to the .ipynb file select "View Jupyter"
  4. There you have it! You can read the notebook for documentation and copy code samples without ever leaving NGC.

All the instructions you need to get started are included in the resource, so head over and see how to get up and running.