
Camera calibration and validation

Modified 2018-06-22 by Andrea Censi

Here is an updated, more practical extrinsic calibration and validation procedure.

Place the robot on the pattern

Arrange the Duckiebot and checkerboard according to Figure 16.2. Note that the axis of the wheels should be aligned with the y-axis (Figure 16.2).

Figure 16.4 shows the calibration checkerboard as seen from the Duckiebot. For a proper calibration there should be no clutter in the background, and the two A4 sheets of the pattern should be aligned next to each other.

Extrinsic calibration procedure

Run the following on the Duckiebot:

duckiebot $ rosrun complete_image_pipeline calibrate_extrinsics

That’s it!

No laptop is required.

You can also inspect the output files produced, to make sure they look reasonable. They should resemble Figure 16.6.
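Beyond eyeballing the images, you can sanity-check the learned homography programmatically. The sketch below is illustrative only: the file path and the `homography` key are assumptions, so check where `calibrate_extrinsics` actually writes its output on your robot.

```python
import numpy as np
import yaml  # PyYAML

# Assumed output location and file layout; adjust to match what
# calibrate_extrinsics writes on your Duckiebot.
CALIB_PATH = "/data/config/calibrations/camera_extrinsic/default.yaml"

def load_homography(path):
    """Read a 3x3 homography stored as a flat list under 'homography'."""
    with open(path) as f:
        data = yaml.safe_load(f)
    return np.array(data["homography"], dtype=float).reshape(3, 3)

def looks_reasonable(H):
    """A usable homography must be finite and not near-degenerate."""
    return bool(np.isfinite(H).all()) and np.linalg.cond(H) < 1e8
```

A badly degenerate matrix (huge condition number, NaNs) usually means the pattern was not detected correctly and the calibration should be rerun.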

Note the difference between the two types of rectification:

  1. In bgr_rectified, the rectified frame coordinates are chosen so that the frame is filled entirely. Note that the image is stretched: the AprilTags are not square. This is the rectification used in the lane localization pipeline. The stretching does not matter there, because the learned homography accounts for the deformation.

  2. In rectified_full_ratio_auto, the image is not stretched and the camera matrix is preserved, so the aspect ratio is unchanged; in particular, the AprilTags are square. If you do anything with AprilTags, you need this rectification.
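The reason the stretching in bgr_rectified is harmless is that the homography is learned on the already-stretched pixels, so any deformation is folded into the matrix. A minimal numpy sketch of how a rectified pixel is mapped to the ground plane (the matrix H here is made up for illustration; the real one is produced by calibrate_extrinsics):

```python
import numpy as np

# Hypothetical ground-projection homography mapping rectified pixel
# coordinates (u, v) to ground-plane coordinates (x, y) in meters.
H = np.array([
    [0.001, 0.0,   -0.32],
    [0.0,   0.001, -0.24],
    [0.0,   0.004,  1.0],
])

def pixel_to_ground(u, v, H):
    """Map a rectified pixel to the ground plane and dehomogenize."""
    p = H @ np.array([u, v, 1.0])
    return p[:2] / p[2]

# By construction of this toy H, the center of a 640x480 frame maps
# to the origin of the ground frame.
x, y = pixel_to_ground(320, 240, H)
```

Any affine stretching of the pixel coordinates would simply be absorbed into a different H during calibration, which is why the lane localization pipeline can use the filled (stretched) frame.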

Camera validation by simulation

You can run the following command to make sure that the camera calibration is reasonable:

duckiebot $ rosrun complete_image_pipeline validate_calibration

This command simulates what the robot should see if the models were correct (Figure 16.8).

Result of validate_calibration.

It then also tries to localize on the simulated data (Figure 16.10). It usually achieves impressive calibration results!

Simulations are doomed to succeed.

Output of validate_calibration: localization in simulated environment.
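The "doomed to succeed" remark can be made concrete: when the same model is used both to simulate the observation and to localize on it, the round-trip error is zero up to floating point, regardless of whether the model matches reality. A hedged numpy sketch (H is again a made-up homography, not the tool's actual output):

```python
import numpy as np

# Made-up ground-projection homography (pixel -> ground); the real one
# is produced by calibrate_extrinsics.
H = np.array([
    [0.001, 0.0,   -0.32],
    [0.0,   0.001, -0.24],
    [0.0,   0.004,  1.0],
])

# "Simulate": project a known ground point into the image with the
# inverse model, as validate_calibration renders what the robot
# should see.
ground = np.array([0.10, 0.05, 1.0])   # 10 cm ahead, 5 cm to the left
pixel = np.linalg.inv(H) @ ground
pixel /= pixel[2]

# "Localize": map the simulated observation back to the ground plane
# using the same model. The simulator and the localizer share the
# model, so the recovered point matches exactly.
recovered = H @ pixel
recovered /= recovered[2]
error = np.linalg.norm(recovered[:2] - ground[:2])
```

This is why a successful simulated localization only confirms internal consistency; the one-shot localization on a real image (next section) is the test against reality.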

Camera validation by running one-shot localization

Place the robot in a lane.

Run the following command:

duckiebot $ rosrun complete_image_pipeline single_image_pipeline

This command takes one snapshot and performs localization on that single image. The output is useful for checking that everything is OK.
