Camera Calibration with MATLAB

Camera calibration is a technique for estimating a camera's parameters so that you can improve the quality of captured images by correcting for lens distortion, or measure object dimensions in world units. A calibrated camera is an essential component in applications like machine vision, for measuring actual object sizes, and robotics, for navigation and 3D scene reconstruction.

Camera calibration involves determining two sets of camera characteristics: intrinsic and extrinsic parameters. Intrinsic parameters define the internal characteristics of the camera, such as the focal length of the lens, the optical center, and the lens distortion coefficients. Knowing these parameters allows us to improve image quality, correct for lens distortion, and map real-world distances to pixels. Extrinsic parameters define the location and orientation of the camera in space with reference to a fixed object, and these parameters are essential to stereo calibration and structure from motion. In this video, you will see how easy it is to perform camera calibration using MATLAB for standard cameras, fisheye lenses, and stereo vision.

Computer Vision Toolbox provides both MATLAB functions and an interactive app for performing camera calibration. The Camera Calibrator app is an easy and interactive interface to complete the calibration workflow.

First, add calibration images of a checkerboard calibration pattern. A checkerboard is used because its regular pattern makes it easy to detect automatically. It is recommended to use between 10 and 20 images for accurate calibration results.

Next, enter the size of the checkerboard square in world units, such as millimeters, centimeters, or inches. This is a necessary step to find the mapping between world units and image pixels. The app then automatically detects the checkerboard calibration pattern in the provided images.
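The detection and world-point steps above can also be done programmatically. Here is a minimal sketch using Computer Vision Toolbox functions; the image folder name and square size are assumed values for illustration:

```matlab
% Gather the calibration images (folder name is hypothetical).
imageFiles = imageDatastore('calibrationImages').Files;

% Automatically detect the checkerboard corners in all images.
[imagePoints, boardSize] = detectCheckerboardPoints(imageFiles);

% Generate the corresponding world coordinates of the corners,
% using the measured square size in world units (assumed 25 mm here).
squareSize = 25;  % millimeters
worldPoints = generateCheckerboardPoints(boardSize, squareSize);
```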

Then, you can check the accuracy of the checkerboard detector by zooming in to inspect the results. This helps with finding incorrect detections and removing bad images. Under Options, you can also specify the number of radial distortion coefficients calculated. Radial distortion occurs when light rays bend a greater amount near the edges of a lens than they do at the optical center. Typically, two coefficients are enough, but for severe distortion, as in the case of a wide-angle lens, three coefficients might be necessary. You can also enable the estimation of tangential distortion. This distortion occurs when the lens and camera sensor are not parallel.
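The same options are available programmatically as name-value arguments to `estimateCameraParameters`. A sketch, assuming `imagePoints` and `worldPoints` have already been obtained from checkerboard detection:

```matlab
% Three radial coefficients and tangential distortion are enabled here
% to illustrate the options; two radial coefficients are the default.
cameraParams = estimateCameraParameters(imagePoints, worldPoints, ...
    'NumRadialDistortionCoefficients', 3, ...
    'EstimateTangentialDistortion', true);
```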

Now, press the Calibrate button to solve for camera parameters. Once calibration is done, you can evaluate calibration results by visualizing reprojection errors. Reprojection errors are a global measure of calibration error and are the difference between points detected in the image and points reprojected back onto the image using the camera parameters that you just calculated. This is helpful to identify bad images that you can remove and recalibrate for better results.
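The reprojection-error view has a programmatic counterpart; a sketch, assuming `cameraParams` was returned by `estimateCameraParameters`:

```matlab
% Bar graph of mean reprojection error per image; images with
% unusually high bars are candidates for removal and recalibration.
showReprojectionErrors(cameraParams);

% The overall mean reprojection error, in pixels.
meanError = cameraParams.MeanReprojectionError;
```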

You can also visualize the extrinsic parameters to see which angles the calibration images were taken from. This is useful for finding out whether the calibration images were captured from enough angles, and whether more images might be needed to improve calibration results.
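This visualization is available from the command line as well; a sketch, again assuming a `cameraParams` object from calibration:

```matlab
% Pattern-centric view: the checkerboard is fixed and the estimated
% camera poses are drawn around it, revealing angle coverage.
figure;
showExtrinsics(cameraParams, 'PatternCentric');
```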

Now that we have seen the calibration workflow for a standard camera, let’s look at the same for a fisheye or wide-angle lens.

Unlike standard camera lenses, fisheye lenses use a complex series of lens elements to enlarge the camera's field of view, enabling it to capture wide panoramic or hemispherical images. However, the lenses achieve this extremely wide-angle view by distorting the lines of perspective in the images. The Computer Vision Toolbox calibration algorithm uses the fisheye camera model proposed by Scaramuzza, where the intrinsic parameters account for the extreme distortion and stretching.

In the app, choose the Camera Model option as “Fisheye.” Under Options, you can now choose to enable the estimation of alignment between the sensor and the image plane. After running calibration, you can view the undistorted images that have been compensated for lens distortion. Lens distortion is a common problem and causes straight lines to appear curved. Knowing the camera’s intrinsic parameters lets us apply an undistortion routine that removes the lens distortion, and the edges that appeared curved are now straightened out. Correcting for lens distortion is very useful in computer vision applications, like stitching images together to form a panorama, that require undistorted images to work well.
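A minimal programmatic sketch of the fisheye workflow, assuming `imagePoints`, `worldPoints`, and `imageFiles` from a checkerboard detection step:

```matlab
% Fisheye calibration uses the Scaramuzza model; the image size is a
% required input because the model depends on it.
I = imread(imageFiles{1});
imageSize = [size(I, 1), size(I, 2)];
fisheyeParams = estimateFisheyeParameters(imagePoints, worldPoints, imageSize);

% Remove the fisheye distortion so that straight edges appear straight,
% then compare the original and undistorted images side by side.
J = undistortFisheyeImage(I, fisheyeParams.Intrinsics);
imshowpair(I, J, 'montage');
```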

Here is an example available with the Computer Vision Toolbox that shows how to measure the diameter of a couple of pennies shown in the image on the right here.
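In the spirit of that example, measuring a planar object reduces to mapping image points back to world coordinates on the plane. A hedged sketch, where the checkerboard lies on the same plane as the coins and the pixel coordinates of the coin edges are hypothetical:

```matlab
% Recover the plane's pose from the detected checkerboard.
[R, t] = extrinsics(imagePoints, worldPoints, cameraParams);

% Map two image points on opposite edges of a coin to world
% coordinates (pixel coordinates here are made up for illustration).
coinEdges = [891 218; 1165 222];
worldPts  = pointsToWorld(cameraParams, R, t, coinEdges);

% The coin's diameter, in the same world units as the square size.
diameter = norm(worldPts(1, :) - worldPts(2, :));
```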

Finally, let’s look at the calibration workflow for stereo cameras using MATLAB. Stereo vision is the process of recovering depth from camera images by comparing two or more views of the same scene. The output of this computation can be used to produce a 3D point cloud, where each 3D point corresponds to a pixel in one of the images. The Stereo Camera Calibrator app in MATLAB allows you to estimate the geometric parameters of each camera in a stereo camera pair. You can also estimate the translation and rotation between the camera pair. In the app, load calibration checkerboard images for the two cameras separately and then follow the same steps as before to perform calibration and analyze the results.

The reprojection error bar graph here displays the mean reprojection error per image, along with the overall mean error. Clicking the Show Rectified option in the View section shows the effects of stereo rectification. If the calibration was accurate, the images become undistorted and row-aligned.
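The stereo workflow has a similar programmatic form; a sketch, where the file lists, square size, and image pair are assumptions:

```matlab
% Detect the checkerboard in both cameras' image sets at once.
[imagePoints, boardSize] = detectCheckerboardPoints( ...
    leftImageFiles, rightImageFiles);
worldPoints = generateCheckerboardPoints(boardSize, squareSize);

% Estimate the stereo parameters, including the rotation and
% translation between the two cameras.
stereoParams = estimateCameraParameters(imagePoints, worldPoints);

% Rectify a pair of images; if the calibration was accurate, the
% rectified images are undistorted and row-aligned.
[J1, J2] = rectifyStereoImages(I1, I2, stereoParams);
```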

Refer to the link in the description for a detailed example on Depth Estimation from Stereo Vision in the documentation.

Thank you for watching this video and please visit mathworks.com for more information on camera calibration.