Linux / arm64
The Nova Extrinsics Sensor Calibration Tool is a containerized application that provides extrinsic calibration between sensors on a Nova Carter robot or Nova Orin Developer Kit. On initial setup or when recalibration is required, this application is used to generate a URDF calibration file with the relative position of different sensors on the robot.
The URDF calibration file is used for robotics functions that combine information from multiple sensor modalities. This calibration process achieves highly consistent results by using calibration targets and a mobile device connected to the robot over a web browser, which guides you through the procedure of placing the calibration targets in different positions around the robot.
Visuals of the calibration tool guiding the operator to position a calibration target in different locations.
This tool is self-contained and performs both data collection and calibration optimization on the robot. It does not require external services or an internet connection after the container is deployed to the robot. By default, the resulting URDF calibration file is deployed to /etc/nova/calibration/isaac_calibration.urdf.
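As a quick sanity check (standard shell usage, not part of the tool itself), you can list the calibration directory on the robot to see whether a calibration file and the CAD nominals file are present:
ls -l /etc/nova/calibration/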
If Isaac Cloud Services are available, this tool also provides automatic Over-The-Air (OTA) backup of calibration files, which can later be re-deployed using Isaac Cloud Services.
The Nova Extrinsics Sensor Calibration Tool requires the following components and space considerations:
The Nova Extrinsics Sensor Calibration Tool uses two different checkerboards with the same pattern, each of them for different calibration tasks:
Both of the PDF files above include the design of the checkerboards (page 1) and printing instructions (page 2) that must be followed for the tool to work properly.
Illustration of the checkerboard. Both the large and small checkerboards have the same pattern in different dimensions: 100mm squares vs 45mm squares respectively.
The checkerboards must have the following qualities:
There are different ways to find a service to print calibration targets, for example:
The calibration tool runs on the robot but requires that the operator use a mobile device (tablet recommended) attached to the back of the large calibration target for visualization of the tool's web UI. Otherwise, images might look flipped.
It is necessary to attach three holders to the back of the large calibration target: one for each hand and one for the mobile device (a tablet case is sufficient).
Use strong velcro (able to hold 15 lb or more) to attach the mobile device and the holders at the back of the large calibration target, as shown in the image below.
NOTE: Do not hold the large calibration board by its sides. Any occlusion or modification of the checkerboard borders may lead to a non-optimal or invalid calibration result.
View of the back of the large checkerboard with tablet and holders attached with velcro.
You must perform the calibration process in a bright room with diffuse light, to avoid glare on the calibration target.
During the full calibration process the robot must be on flat ground. 4 m - 6 m of empty space around the robot and 2 m of empty space around the large calibration board are required at all times. Space requirements are per camera.
During the calibration process, you must:
Empty space requirement around the robot: 4 m in each direction and 6 m in the direction of the camera currently being calibrated.
Empty space requirement around the large target board during calibration: 2 m in each direction.
If the app guides you to poses that are close to objects or unreachable, reposition the robot, wait for it to be static in the new position, and continue following the steps in the UI. Do not move the robot while you are following the steps in the UI.
The calibration process uses a web-based app running on a Nova Carter robot or Nova Orin Developer Kit, a calibration target, and a mobile device. Make sure you have all the components before starting the calibration process.
The web-based app supports the calibration of the following tasks:
FOUR_FISHEYES: Front fisheye camera - left fisheye camera - right fisheye camera - rear fisheye camera (Nova Carter only)
FISHEYES_FRONT_LEFT_RIGHT: Front fisheye camera - left fisheye camera - right fisheye camera
STEREO_FISHEYE_FRONT: Front stereo camera - front fisheye camera
STEREO_FISHEYE_REAR: Rear fisheye camera - rear stereo camera (Nova Carter only)
STEREO_FISHEYE_LEFT: Left fisheye camera - left stereo camera
STEREO_FISHEYE_RIGHT: Right fisheye camera - right stereo camera
GROUND --ground-sensor front_stereo_camera: Robot/ground - front stereo camera
GROUND --ground-sensor front_3d_lidar: Robot/ground - 3D Lidar
FRONT_STEREO_LIDAR: 3D Lidar - front stereo camera
FRONT_REAR_STEREO_LIDAR: 3D Lidar - front stereo camera - rear stereo camera (Nova Carter only)
LEFT_RIGHT_STEREO_LIDAR: 3D Lidar - left stereo camera - right stereo camera (Nova Carter only)
The calibration process involves the setup of the environment and calibration steps following the instructions in the web app.
Set up the robot by following the Nova Carter getting started guide or Nova Orin Developer Kit getting started guide. This step installs Nova Orin Init and sets up Docker. Nova Orin Init configures the robot and deploys the CAD nominals file to /etc/nova/calibration/isaac_nominals.urdf. Verify that Nova Orin Init is at the latest version.
Open a terminal, ssh to the robot and pull the Docker image (one-time setup):
ssh nvidia@<robot_ip>
docker pull nvcr.io/nvidia/isaac/nova_extrinsics_sensor_calibration_tool:release_3.0.1-aarch64
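If you want to confirm that the image was pulled successfully, a quick check with the standard Docker CLI (not specific to this tool) is:
docker images nvcr.io/nvidia/isaac/nova_extrinsics_sensor_calibration_tool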
Create a file named run_calibration_app.sh on the robot and add the following content (one-time setup):
#!/bin/bash
IMAGE_PATH=nvcr.io/nvidia/isaac/nova_extrinsics_sensor_calibration_tool:release_3.0.1-aarch64
docker run -it --rm --privileged --network=host \
--entrypoint /usr/bin/python3 \
--mount type=bind,source=/dev,target=/dev \
--mount type=bind,source=/etc/nova,target=/etc/nova \
--mount type=bind,source=/sys/devices,target=/sys/devices \
--mount type=bind,source=/tmp/argus_socket,target=/tmp/argus_socket \
--mount type=bind,source=/sys/bus/iio/devices,target=/sys/bus/iio/devices \
--mount type=bind,source=/mnt/nova_ssd/recordings,target=/mnt/nova_ssd/recordings \
$IMAGE_PATH /app/extensions/calibration/apps/calibration_app "$@"
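Once the script is saved, the trailing "$@" forwards any arguments to the calibration app inside the container, so the wrapper is invoked with the same flags documented in the following sections, for example:
bash run_calibration_app.sh --calibration-type STEREO_FISHEYE_FRONT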
Set the 3D Lidar to FIRST return mode by running the command below. This setting allows the calibration software to receive the Lidar pulses corresponding to the calibration target instead of their potential reflections.
/opt/nvidia/nova/python/venv/bin/python3 -c '
from nova_init.hesai.hesai_utils import *
l = HesaiLidar("192.168.1.201")
print(f"Previous setting: {l.get_return_mode()}")
l.set_return_mode(HesaiReturnMode.FIRST)
if l.get_return_mode() == HesaiReturnMode.FIRST:
print("SUCCESS. Lidar set to FIRST return mode")'
NOTE: This is a persistent change; make sure to set the Lidar back to its default return mode (for example, FIRST_STRONGEST) after calibrating the robot.
NOTE: Not performing this step may lead to slowdowns in the calibration app and inaccurate results.
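For convenience, a sketch of the corresponding reset command to run after calibration is shown below. It mirrors the snippet above and assumes that the default return mode for your unit is FIRST_STRONGEST and that HesaiReturnMode exposes it under that name; verify the correct value for your Lidar before running it.
/opt/nvidia/nova/python/venv/bin/python3 -c '
from nova_init.hesai.hesai_utils import *
# Sketch: restore the default return mode after calibration (enum value assumed).
l = HesaiLidar("192.168.1.201")
print(f"Previous setting: {l.get_return_mode()}")
l.set_return_mode(HesaiReturnMode.FIRST_STRONGEST)
if l.get_return_mode() == HesaiReturnMode.FIRST_STRONGEST:
    print("SUCCESS. Lidar set back to FIRST_STRONGEST return mode")'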
The following sequence of steps describes the process to calibrate all the sensors in Nova Carter, using the front stereo camera to calibrate with the ground (also known as the anchor sensor in the remainder of this document):
Front stereo camera to ground and front stereo camera to 3D Lidar:
Place the small calibration target flat on the ground. Make sure that the large checkerboard is not in sight of the camera stream when performing the camera-to-ground step. Make sure the small calibration target is close enough to the camera for detection. Try to position the board such that it's as close as possible to the camera without it being occluded or partially out of the camera frame.
After the ground plane is estimated, the tool switches to Lidar-camera calibration mode. Then, it guides the operator to place the large checkerboard in different locations in the scene to calibrate the Lidar with respect to the front stereo camera.
Front stereo camera to front fisheye camera: From this step onwards, the tool uses camera-camera calibration to estimate the remaining transformations between sensors.
Front fisheye camera to left, right, and rear fisheye cameras.
Rear fisheye camera to rear stereo camera.
Left fisheye camera to left stereo camera.
Right fisheye camera to right stereo camera.
After following the previous steps, the hierarchy of calibrations between sensors is described by the following transformation tree:
Robot
│
Front Stereo Camera
│
├─ Lidar
│
└─ Front Fisheye Camera
│
├─ Rear Fisheye Camera
│ └─ Rear Stereo Camera
│
├─ Left Fisheye Camera
│ └─ Left Stereo Camera
│
└─ Right Fisheye Camera
└─ Right Stereo Camera
NOTE: You can perform the following calibration without a 3D Lidar. Skip the first step and make sure to perform the ground calibration (--ground-sensor) in the second step along with the front stereo - front fisheye camera calibration. To perform only the ground calibration and nothing else, choose --calibration-type GROUND and specify the --ground-sensor.
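For reference, a ground-only calibration run with the front stereo camera as the ground sensor (the same command used in the Calibrating Camera to Ground section) looks like:
bash run_calibration_app.sh --calibration-type GROUND --ground-sensor front_stereo_camera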
Verify that the robot is on flat ground and with the required empty space around it.
Open a terminal, ssh to the robot, and run the commands below step by step. After launching each command, follow the instructions from step 3 onwards until calibration succeeds, then return to this step to launch the next command:
a. Front stereo camera - 3D Lidar step with ground calibration with respect to the front stereo camera (anchor). This step is optional (depending on whether the 3D Lidar is present and needs to be calibrated):
bash run_calibration_app.sh --calibration-type FRONT_STEREO_LIDAR --ground-sensor front_stereo_camera
NOTE: The app asks the operator to put the board on the floor near the robot. Use the small board, which does not have handles on the back, so that the board lies flush with the ground. If the board is not detected, move it around and closer to the robot to perform the ground calibration step.
b. Front stereo camera (anchor) - front fisheye camera. This step has two options, depending on whether the operator executed the first step or not:
If the first step was executed, launch the step without the ground calibration flag:
bash run_calibration_app.sh --calibration-type STEREO_FISHEYE_FRONT
If the first step was skipped, perform the ground calibration in this step by adding the --ground-sensor front_stereo_camera flag:
bash run_calibration_app.sh --calibration-type STEREO_FISHEYE_FRONT --ground-sensor front_stereo_camera
c. Front fisheye camera (anchor) - rear fisheye camera - left fisheye - right fisheye step:
bash run_calibration_app.sh --calibration-type FOUR_FISHEYES
d. Right stereo camera - right fisheye camera (anchor):
bash run_calibration_app.sh --calibration-type STEREO_FISHEYE_RIGHT
e. Left stereo camera - left fisheye camera (anchor):
bash run_calibration_app.sh --calibration-type STEREO_FISHEYE_LEFT
f. Rear stereo camera - rear fisheye camera (anchor):
bash run_calibration_app.sh --calibration-type STEREO_FISHEYE_REAR
Open Google Chrome on the mobile device attached behind the large calibration board and navigate to the following address:
http://<robot_ip>:3000/calibration.html
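If the page does not load, a quick way to confirm that the calibration web server is up (assuming curl is available on the robot and the app was launched with --network=host as shown above) is to query it locally from the robot's terminal:
curl -I http://localhost:3000/calibration.html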
If any calibration step fails, you are prompted to re-launch the calibration app (rerun the same command to launch the app again).
Screenshot of the calibration tool web UI guiding the operator to position the checkerboard towards their right.
You must:
Operator correctly rotating the calibration target guided by the calibration tool.
The following sequence of steps describes the process to calibrate all the sensors in Nova Orin Developer Kit, using the front stereo camera to calibrate with the ground (also known as the anchor sensor):
Place the Nova Orin Developer Kit on a flat surface around 30 cm (12 inches) from the ground
Camera-camera calibration for all cameras in the Nova Orin Developer Kit: The web tool guides the operator to place the large checkerboard in different locations in the scene for each of the following calibration tasks. Open a terminal, ssh to the robot, and run the commands below step by step. After launching each command, follow the instructions from steps 3 and 4 of Calibrating Nova Carter Sensors until calibration succeeds, then return to this step to launch the next command:
bash run_calibration_app.sh --calibration-type STEREO_FISHEYE_FRONT
bash run_calibration_app.sh --calibration-type FISHEYES_FRONT_LEFT_RIGHT
bash run_calibration_app.sh --calibration-type STEREO_FISHEYE_LEFT
bash run_calibration_app.sh --calibration-type STEREO_FISHEYE_RIGHT
Assemble the Nova Orin Developer Kit on the robot such that the cameras' fields of view are unobstructed and the small calibration target is fully visible when placed close to the robot and flat on the ground. Make sure to orient the Nova Orin Developer Kit such that the front camera points in the robot's main direction of travel and the bottom plate is parallel to the ground.
Front stereo camera to ground calibration: Follow the instructions in the Calibrating Camera to Ground section, with one important difference: instead of creating a new URDF file, edit the file /etc/nova/calibration/isaac_calibration.urdf (or isaac_nominals.urdf if the calibration file is not present). In particular, edit the line:
<origin xyz="-0.016000 0.075000 0.012500" rpy="0.000000 0.000000 0.000000"/>
In the code block:
<joint name="front_stereo_camera_joint" type="fixed">
<origin xyz="-0.016000 0.075000 0.012500" rpy="0.000000 0.000000 0.000000"/>
<parent link="base_link"/>
<child link="front_stereo_camera"/>
</joint>
For example, if you are mounting the Nova Orin Developer Kit such that its center-front point is at a height of Z = 0.4 m above the robot origin and X = -0.1 m behind it, the front_stereo_camera joint should be adjusted accordingly to reflect Z = 0.4125 m above the robot origin, X = -0.116 m behind it, and Y = 0.075 m to its left (the mounting offsets are added to the nominal values: Z = 0.0125 + 0.4 = 0.4125 m and X = -0.016 - 0.1 = -0.116 m):
<joint name="front_stereo_camera_joint" type="fixed">
<origin xyz="-0.116000 0.075000 0.412500" rpy="0.000000 0.000000 0.000000"/>
<parent link="base_link"/>
<child link="front_stereo_camera"/>
</joint>
After following the previous steps, the hierarchy of calibrations between sensors is described by the following transformation tree:
Robot
│
Front Stereo Camera
│
└─ Front Fisheye Camera
│
├─ Left Fisheye Camera
│ └─ Left Stereo Camera
│
└─ Right Fisheye Camera
└─ Right Stereo Camera
This section describes the camera-robot/ground calibration process, and it is not limited to Nova Carter or Nova Orin Developer Kit. It supports custom locations of the front_stereo_camera, and it has been tested with elevations (Z) ranging between 0.15 m and 0.75 m with the camera parallel to the ground.
NOTE: The calibration target on the ground needs to be visible by the front_stereo_camera at a close distance to the robot. The exact distance depends on the camera position and orientation.
View of the Camera-robot/ground calibration step, showing the small calibration target flat on the ground, in front of a robot with a Nova Orin Developer Kit.
Camera-robot/ground calibration involves the following steps after mounting the sensors on the robot:
Measure the pose of the camera (front_stereo_camera) with respect to base_link, using the CAD model of your robot or a measuring tape. Note that the origin of the front_stereo_camera is at its left principal point, and base_link is on the ground.
Edit the nominal values of the camera pose in /etc/nova/calibration/isaac_nominals.urdf. For example, in the particular case of a robot with only one front_stereo_camera (isaac_nominals_single_camera.urdf), edit the line:
<origin xyz="0.000000 0.000000 0.000000" rpy="0.000000 0.000000 0.000000"/>
In the code block:
<joint name="front_stereo_camera_joint" type="fixed">
<origin xyz="0.000000 0.000000 0.000000" rpy="0.000000 0.000000 0.000000"/>
<parent link="base_link"/>
<child link="front_stereo_camera"/>
</joint>
For example, if you are mounting the front_stereo_camera such that its principal point is at a height of Z = 0.4 m above the robot origin and X = -0.1 m behind it, the front_stereo_camera joint should be adjusted accordingly:
<joint name="front_stereo_camera_joint" type="fixed">
<origin xyz="-0.100000 0.000000 0.400000" rpy="0.000000 0.000000 0.000000"/>
<parent link="base_link"/>
<child link="front_stereo_camera"/>
</joint>
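To double-check the edit, you can print the joint entry back from the file with standard grep (adjust the path if you edited a different URDF):
grep -A 4 'front_stereo_camera_joint' /etc/nova/calibration/isaac_nominals.urdf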
Launch the ground calibration app:
bash run_calibration_app.sh --calibration-type GROUND --ground-sensor front_stereo_camera
Open http://<robot_ip>:3000/calibration.html in a web browser and click the button in the web UI to perform ground calibration. If the board is not detected, adjust the position of the calibration target following the previous instructions. The resulting calibration is written to /etc/nova/calibration/isaac_calibration.urdf.
Calibration file OTA backup and deploy is available to users of Isaac Cloud Services. See the OTA File Service page for documentation and deployment instructions.
After the OTA service is configured, calibration files are automatically backed up at the end of the calibration execution if the calibration app is launched with the following additional command-line argument:
--upload-bucket <bucket_name>
NOTE: bucket_name must be the name you give when creating your bucket.
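As an illustration, with a hypothetical bucket named my-calibration-bucket, a calibration run with OTA backup enabled could be launched as:
bash run_calibration_app.sh --calibration-type FOUR_FISHEYES --upload-bucket my-calibration-bucket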
The repeatability error is measured as the transformation between a given sensor calibration and the mean transformation from a set of repeated calibrations for the same sensor.
For stereo Camera-Lidar calibration, rotation is expected to be under 2.7 mrad and translation under 1.5 cm of repeatability error.
For Camera-Camera calibration, rotation is expected to be under 1.0 mrad and translation under 0.7 cm of repeatability error.
For Camera-Ground calibration, rotation is expected to be under 3.4 mrad and translation under 0.7 cm of repeatability error.
The re-projection error is measured as the pixel distance between estimated keypoints and detected keypoints for each pair of cameras with overlapping field of view. The re-projection of 99% of keypoints is expected to be under 1.1 pixels.
Extrinsic calibration of all available sensors is expected to take approximately 30 minutes on Nova Carter and 15 minutes on Nova Orin Developer Kit.
Data collection and calibration time varies per sensor modality, approximately taking:
The extrinsic calibration tool is designed specifically for the Nova Carter robot, Nova Orin Developer Kit, and their sensors (3D Lidar and cameras). Different robots and sensor configurations are not supported at the moment.
The calibration tool performs Lidar-robot/ground calibration, camera-robot/ground calibration, Lidar-stereo camera calibration, and camera-camera calibration for all fisheye and stereo cameras in Nova Carter. Lidar-robot/ground and camera-robot/ground calibration refer to estimating the relative pose of the sensor (3D Lidar or camera) with respect to the ground and includes 3 degrees of freedom: pitch, roll, and translation in Z axis.
The calibration takes multiple incremental steps to calibrate all the desired sensors in the robot, where each of the steps includes a subset of sensors. The URDF calibration file is modified with each calibration step and its content is used in remaining calibration tasks. See the Calibrating Nova Carter Sensors section for details on the specific calibration steps.
Steps may involve calibration of multiple sensors (sub-steps). If any calibration sub-step fails (for example, Lidar-front stereo camera), the operator is directed to restart the calibration process from the beginning of that step. Nominals are provided for the remaining sensors.
Follow the instructions for space and calibration target requirements to minimize the failure rate of the tool.
Camera-ground calibration supports custom locations of the front_stereo_camera in custom configurations of Nova Sensors. Camera-ground calibration improves low-lying object perception for Isaac Perceptor.
Nvidia Isaac ROS Software License.