Autonomous vehicles and Advanced Driver Assistance Systems (ADAS) rely on a variety of sensors, including thermal imaging, to build an accurate 3D perception of the environment. That 3D perception is created by combining the images of two cameras as a stereo pair, a technique known as stereoscopic vision. The paired cameras can be visible-light, thermal, or another modality; depth is recovered by triangulating between the cameras and objects in the field of view to measure distance.
Measuring distance and creating a 3D model of the environment is a necessary component of any machine vision system that must comprehend a scene. Combined with a convolutional neural network, that comprehension enables the system to identify both objects and their respective distances, and to make the most appropriate decision for the situation, in conjunction with other sensor modalities such as LIDAR and radar.
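The triangulation described above can be sketched with the standard rectified-stereo relationship, depth Z = f · B / d, where f is the focal length in pixels, B the baseline between the two cameras, and d the horizontal disparity of a matched point. This is a generic illustration of the geometry, not Foresight's implementation; all names and values are illustrative.

```python
# Minimal sketch of stereo depth-from-disparity (illustrative only).
# Assumes a rectified stereo pair: a point appears at horizontal pixel
# x_left in one image and x_right in the other; disparity d = x_left - x_right.

def depth_from_disparity(x_left: float, x_right: float,
                         focal_px: float, baseline_m: float) -> float:
    """Return depth in meters for a matched point.

    focal_px   -- focal length expressed in pixels
    baseline_m -- distance between the two camera centers in meters
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_px * baseline_m / disparity

# Example: 1000 px focal length, 0.3 m baseline, 15 px disparity -> 20 m
print(depth_from_disparity(515.0, 500.0, 1000.0, 0.3))  # 20.0
```

Note how depth scales inversely with disparity: distant objects produce small disparities, which is why pixel-accurate calibration matters so much for long-range measurement.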
3D perception system
To aid that crucial 3D depth perception, Foresight, an automotive vision systems innovator and part of the Thermal by FLIR® program, has created a cost-effective 3D perception system: QuadSight®, a multi-spectral vision solution. QuadSight combines data from two FLIR Boson® thermal imaging cameras and two visible-light cameras, each set operating as a stereo pair.
The data from these four cameras merges the strengths of each imaging technology into a comprehensive 3D scene. The two wavelength bands provide effective vision in harsh weather and poor lighting, 24 hours a day.
Automatic calibration system
Now, Foresight has taken that technology to a new level with its industry-first, patent-pending automatic calibration system, which enables stereo cameras across the visible and infrared spectrums to remain calibrated at all times. This ensures the accuracy of the images used for 3D perception and distance measurement of the surrounding environment.
This technological breakthrough gives vehicle manufacturers the flexibility to mount stereoscopic cameras, in visible or thermal pairs, either on a single base or asymmetrically as separate units. That offers more options for embedding stereo camera systems within a vehicle's design, all without sacrificing perception or detection capabilities.
Managing roadway conditions in real-time
One of the key hurdles to successfully leveraging stereoscopic imaging in an automotive context is accounting for the dynamic changes that every vehicle experiences on the road: small vibrations, bumps, and temperature changes. These phenomena can reduce accuracy and can even cause a complete loss of calibration.
These minute changes require stereoscopic vision systems to continuously recalibrate the cameras in order to maintain clear, accurate stereoscopic 3D perception, including accounting for parallax issues that arise in a rapidly changing scene. A miscalibration can lead to inaccurate depth estimation, which in turn affects the decision-making mechanism of any automated system, especially ADAS and autonomous vehicles.
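One common way such drift can be detected, sketched below purely for illustration (this is not Foresight's patent-pending method), exploits the fact that in a properly rectified stereo pair, matched feature points lie on the same image row. A growing mean vertical offset between matches signals that calibration has been lost; the threshold and function names here are assumptions.

```python
# Illustrative sketch of calibration-drift detection for a rectified
# stereo pair (not Foresight's actual algorithm). Matched points should
# share the same image row; the mean absolute row difference (the
# vertical "epipolar error") rises as vibration or thermal expansion
# knocks the rig out of calibration.

def mean_vertical_error(matches):
    """matches: list of ((x_l, y_l), (x_r, y_r)) matched feature points.
    Returns the mean absolute row difference in pixels."""
    if not matches:
        return 0.0
    return sum(abs(yl - yr) for (_, yl), (_, yr) in matches) / len(matches)

def needs_recalibration(matches, threshold_px=0.5):
    """Flag the rig for recalibration when the mean vertical error
    exceeds a pixel threshold (threshold chosen for illustration)."""
    return mean_vertical_error(matches) > threshold_px

# Two matches, each ~0.7 px off their expected row
matches = [((120.0, 40.2), (100.0, 40.9)),
           ((300.0, 210.1), (281.5, 210.8))]
print(needs_recalibration(matches))  # True
```

A continuous-calibration system would run a check like this in the background and feed the residual errors back into an updated camera model, rather than simply raising a flag.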
With this technology, automakers and autonomous vehicle developers gain a redundant system and an additional layer of 3D perception data with which to refine ADAS and autonomous systems for greater safety.