Linux / arm64
This container is used in the NVIDIA Deep Learning Institute course Getting Started with AI on Jetson Nano and should be run on an NVIDIA Jetson Nano. This course is also an element of the Jetson AI Fundamentals and the Jetson AI Certification Program. If you have not done so yet, we highly recommend you take the full free course, and check out other self-paced online courses and instructor-led workshops available from the NVIDIA Deep Learning Institute.
The following are required to run this container.
If you've never used Docker before, we recommend Docker's Orientation and Setup guide.
The data collected during the course is stored in a mounted directory on the host device, so the data and trained models aren't lost when the container shuts down. The commands below assume the mounted directory is ~/nvdli-data, so make sure you create it first:
mkdir ~/nvdli-data
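Because this directory lives on the host, anything the notebooks save under /nvdli-nano/data inside the container remains in ~/nvdli-data after the container exits. You can confirm this from the host at any time, for example:
ls -l ~/nvdli-data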
Run the container using the tag that corresponds to the version of JetPack-L4T installed on your Jetson.
JetPack Release | Container Version Tag | Language |
---|---|---|
4.4 | v2.0.0-r32.4.3 | en-US |
4.4.1 | v2.0.1-r32.4.4 | en-US |
4.4.1 | v2.0.1-r32.4.4zh | zh-CN |
4.5 | v2.0.1-r32.5.0 | en-US |
4.5 | v2.0.1-r32.5.0zh | zh-CN |
4.6 | v2.0.1-r32.6.1 | en-US |
4.6 | v2.0.1-r32.6.1zh | zh-CN |
4.6 | v2.0.1-r32.6.1tw | zh-TW |
4.6 | v2.0.1-r32.6.1ja | ja-JP |
4.6 | v2.0.2-r32.6.1kr | ko-KR |
4.6.1 | v2.0.2-r32.7.1 | en-US |
4.6.1 | v2.0.2-r32.7.1zh | zh-CN |
4.6.1 | v2.0.2-r32.7.1tw | zh-TW |
4.6.1 | v2.0.2-r32.7.1ja | ja-JP |
4.6.1 | v2.0.2-r32.7.1kr | ko-KR |
6.0 | v2.0.3-r36.3.0 | en-US |
6.0 | v2.0.3-r36.3.0zh | zh-CN |
6.0 | v2.0.3-r36.3.0tw | zh-TW |
6.0 | v2.0.3-r36.3.0ja | ja-JP |
6.0 | v2.0.3-r36.3.0kr | ko-KR |
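If you aren't sure which JetPack-L4T release is installed, one way to check on the Jetson itself is to read the L4T release file (a minimal sketch; the exact wording of the output varies slightly between releases):
cat /etc/nv_tegra_release
Output beginning with something like # R32 (release), REVISION: 7.1 corresponds to the r32.7.1 container tags.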
The docker run command will automatically pull the container if it is not already on your system. For example, to run the JetPack 6.0 (L4T r36.3.0) container:
sudo docker run --runtime nvidia -it --rm --network host \
--volume ~/nvdli-data:/nvdli-nano/data \
--device /dev/video0 \
nvcr.io/nvidia/dli/dli-nano-ai:v2.0.3-r36.3.0
Or, for JetPack 4.6.1 (L4T r32.7.1), which additionally mounts the Argus socket so the CSI camera can be used inside the container:
sudo docker run --runtime nvidia -it --rm --network host \
--volume ~/nvdli-data:/nvdli-nano/data \
--volume /tmp/argus_socket:/tmp/argus_socket \
--device /dev/video0 \
nvcr.io/nvidia/dli/dli-nano-ai:v2.0.2-r32.7.1
Note: if you have both a CSI camera (original 4GB Nano and 2GB Nano only) and a USB camera plugged in (or multiple USB cameras), also add --device /dev/video1 to the command above. Then, in the DLI notebooks, set the capture_device number to 1 (the CSI camera will be /dev/video0 and the USB camera /dev/video1; don't use the CSI camera through V4L2).
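If you aren't sure which /dev/video* node corresponds to which camera, listing the devices on the host before launching the container can help (v4l2-ctl comes from the v4l-utils package and may need to be installed separately):
ls /dev/video*
v4l2-ctl --list-devices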
When the container is launched, the JupyterLab server is started automatically and text similar to the following is printed to the console:
allow 10 sec for JupyterLab to start @ http://192.168.55.1:8888 (password dlinano)
JupyterLab logging location: /var/log/jupyter.log (inside the container)
You can then navigate your PC's browser to the URL shown above (http://192.168.55.1:8888) and log in to JupyterLab with the password dlinano, then proceed with the DLI course as normal.
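If JupyterLab doesn't come up, one way to inspect its log is from a second terminal on the Jetson, using docker exec against the running container (a sketch; substitute the container ID or name reported by docker ps):
sudo docker ps
sudo docker exec -it <container-id> tail /var/log/jupyter.log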
If you have any questions or need help, please visit the Jetson Developer Forums.
Copyright 2020-2024 NVIDIA
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
Also used in this container, and with its own licensing: