The l4t-ml docker image contains TensorFlow, PyTorch, JupyterLab, and other popular ML and data science frameworks such as scikit-learn, scipy, and Pandas pre-installed in a Python 3 environment. These containers support the following releases of JetPack for Jetson Nano, TX1/TX2, Xavier NX, AGX Xavier, AGX Orin, Orin NX, and Orin Nano.
For additional machine learning containers for Jetson, see the l4t-pytorch and l4t-tensorflow images. Note that the TensorFlow and PyTorch pip wheel installers for aarch64 are available to download independently from the Jetson Zoo.
Depending on your version of JetPack-L4T, different tags of the l4t-ml container are available, each with support for Python 3. Be sure to pull a tag that matches the version of JetPack-L4T installed on your Jetson (a quick way to check your L4T release is shown after the list below).
JetPack 5.1.1 (L4T R35.3.1): l4t-ml:r35.3.1-py3
JetPack 5.1 (L4T R35.2.1): l4t-ml:r35.2.1-py3
JetPack 5.0.2 (L4T R35.1.0): l4t-ml:r35.1.0-py3
JetPack 5.0.1 Developer Preview (L4T R34.1.1): l4t-ml:r34.1.1-py3
JetPack 5.0.0 Developer Preview (L4T R34.1.0): l4t-ml:r34.1.0-py3
JetPack 4.6.1 (L4T R32.7.1): l4t-ml:r32.7.1-py3
JetPack 4.6 (L4T R32.6.1): l4t-ml:r32.6.1-py3
JetPack 4.5 (L4T R32.5.0): l4t-ml:r32.5.0-py3
JetPack 4.4.1 (L4T R32.4.4): l4t-ml:r32.4.4-py3
JetPack 4.4 (L4T R32.4.3): l4t-ml:r32.4.3-py3
JetPack 4.4 Developer Preview (L4T R32.4.2): l4t-ml:r32.4.2-py3
Note: the l4t-ml containers require JetPack 4.4 or newer.
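If you are not sure which L4T release your Jetson is running, one common way to check is to read the release file on the device (the exact output format varies between releases):

cat /etc/nv_tegra_release

The R number reported there (for example, R35) corresponds to the L4T version in the container tags above.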
First pull one of the l4t-ml container tags from above, corresponding to the version of JetPack-L4T that you have installed on your Jetson. For example, if you are running JetPack 5.1 (L4T R35.2.1):
sudo docker pull nvcr.io/nvidia/l4t-ml:r35.2.1-py3
Then to start an interactive session in the container, run the following command:
sudo docker run -it --rm --runtime nvidia --network host nvcr.io/nvidia/l4t-ml:r35.2.1-py3
You should then be able to start a Python 3 interpreter and import the packages above.
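For example, here is a quick sanity check you might run from the container's shell (a sketch only; the exact version strings will differ by tag):

# confirm PyTorch is present and can see the GPU
python3 -c "import torch; print(torch.__version__, torch.cuda.is_available())"
# confirm TensorFlow imports
python3 -c "import tensorflow as tf; print(tf.__version__)"
# confirm the data science stack imports
python3 -c "import sklearn, scipy, pandas; print(sklearn.__version__, scipy.__version__, pandas.__version__)"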
Unless a user-provided run command overrides the default, a JupyterLab server instance is automatically started along with the container. You can connect to it by navigating your browser to http://localhost:8888 (or substitute the IP address of your Jetson device if you wish to connect from a remote host, e.g. with the Jetson in headless mode). Note that the default password used to log in to JupyterLab is nvidia.
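If you would rather skip the JupyterLab server and drop straight into a shell, you can override the container's default command by appending it to the run command (standard Docker behavior, shown here as a sketch):

sudo docker run -it --rm --runtime nvidia --network host nvcr.io/nvidia/l4t-ml:r35.2.1-py3 /bin/bash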
To mount scripts, data, etc. from your Jetson's filesystem to run inside the container, use Docker's -v flag when starting the container:
sudo docker run -it --rm --runtime nvidia --network host -v /home/user/project:/location/in/container nvcr.io/nvidia/l4t-ml:r35.2.1-py3
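You can also combine the mount with a command to run a mounted script directly; the script name below is only a placeholder for your own code:

sudo docker run -it --rm --runtime nvidia --network host -v /home/user/project:/location/in/container nvcr.io/nvidia/l4t-ml:r35.2.1-py3 python3 /location/in/container/train.py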
To access or modify the Dockerfiles and scripts used to build this container, see this GitHub repo.
The l4t-ml container includes various software packages with their respective licenses included within the container.
If you have any questions or need help, please visit the Jetson Developer Forums.