Linux / arm64
The l4t-pytorch Docker image contains PyTorch and torchvision pre-installed in a Python 3.6 environment, so you can get up and running quickly with PyTorch on Jetson. These containers support the following releases of JetPack for Jetson Nano, TX1/TX2, Xavier NX, and AGX Xavier:
For additional machine learning containers for Jetson, see the
l4t-tensorflow images. Note that the PyTorch pip wheel installers for aarch64 used by these containers are available to download independently from the Jetson Zoo.
Depending on your version of JetPack-L4T, different tags of the
l4t-pytorch container are available, each with support for Python 3.6. Be sure to pull a tag that matches the version of JetPack-L4T that you have installed on your Jetson.
JetPack 4.6 (L4T R32.6.1)
JetPack 4.5 (L4T R32.5.0)
JetPack 4.4.1 (L4T R32.4.4)
JetPack 4.4 (L4T R32.4.3)
JetPack 4.4 Developer Preview (L4T R32.4.2)
l4t-pytorch containers require JetPack 4.4 or newer
First pull one of the
l4t-pytorch container tags from above, corresponding to the version of JetPack-L4T that you have installed on your Jetson. For example, if you are running the latest JetPack 4.6 (L4T R32.6.1) release:
sudo docker pull nvcr.io/nvidia/l4t-pytorch:r32.6.1-pth1.9-py3
Then to start an interactive session in the container, run the following command:
sudo docker run -it --rm --runtime nvidia --network host nvcr.io/nvidia/l4t-pytorch:r32.6.1-pth1.9-py3
You should then be able to start a Python3 interpreter and import torch and import torchvision.
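To confirm that the container's PyTorch install is working, a short sanity check like the following can be run inside the container (a minimal sketch; it falls back to the CPU if CUDA is not visible, and torchvision can be imported and checked the same way):

```python
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())

# Run a small matrix multiply on the GPU if available, otherwise on the CPU,
# to confirm that tensor operations work end to end.
device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.rand(3, 3, device=device)
y = x @ x
print("Result shape:", tuple(y.shape))
```

On a correctly configured Jetson, `torch.cuda.is_available()` should report `True` when the container is started with `--runtime nvidia` as shown above.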
To mount scripts, data, etc. from your Jetson's filesystem to run inside the container, use Docker's
-v flag when starting your Docker instance:
sudo docker run -it --rm --runtime nvidia --network host -v /home/user/project:/location/in/container nvcr.io/nvidia/l4t-pytorch:r32.6.1-pth1.9-py3
To access or modify the Dockerfiles and scripts used to build this container, see this GitHub repo.
The l4t-pytorch container includes various software packages, with their respective licenses included within the container.