Heads Up: Correcting Distortion of 3D AR HUD Images

Author:
Anne Corning

With the rise of head-up display (HUD) technology, more automakers are incorporating these systems in vehicles. The next generation of HUD technology is the application of three-dimensional augmented reality (AR-HUD) digital imaging. Compared to previous 2D systems, which display static information in the driver’s line of sight, AR-HUD systems integrate images with the background environment outside the vehicle via dual-plane or multi-plane projections. For example, navigation arrows can appear overlaid on the roadway, or moving vehicles can be highlighted.

Embedding these systems into today’s automobiles while ensuring the clarity of visual images can be a challenge, however. Each manufactured component, including the windshield and the display system, typically undergoes rigorous testing at multiple stages of production to ensure visual quality.


Watch HUD windshield quality inspection at CP Industries using a Radiant ProMetric imager and integrated inspection solution.

But in the final assembly stage, when both components are installed in the vehicle, issues such as image distortion that had been eliminated during prior QA stages can reappear. For example, slight variability in component alignment, or differences between a test projector and the car’s installed projection system, can introduce new variation.

The resulting distortion can be measured to generate a correction matrix, which the manufacturer can then apply to calibrate the image. However, identifying, measuring, and correcting the distortion is time-consuming and difficult to accomplish without slowing the assembly line.
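To make the measure-then-correct idea concrete, here is a minimal sketch, not Radiant's implementation: ideal test-pattern points are compared against their measured positions in the projected image, a homography modeling the distortion is fitted with OpenCV, and its inverse is used to pre-warp the HUD source frame. The point values and file names are hypothetical.

```python
# Hypothetical sketch of distortion measurement and correction.
# A generic homography-based example, not Radiant's algorithm.
import cv2
import numpy as np

# Ideal test-pattern dot positions (pixels in the HUD source image)
# and the positions actually measured in the projected virtual image.
ideal_pts = np.array([[100, 100], [500, 100], [500, 400], [100, 400]], dtype=np.float32)
measured_pts = np.array([[104, 97], [507, 103], [503, 409], [96, 405]], dtype=np.float32)

# Fit a transform mapping ideal -> measured; this models the distortion.
distortion_H, _ = cv2.findHomography(ideal_pts, measured_pts)

# The correction is the inverse: pre-warp the source so the projected
# image lands on the ideal positions.
correction_H = np.linalg.inv(distortion_H)

source = cv2.imread("hud_frame.png")  # hypothetical HUD frame
h, w = source.shape[:2]
corrected = cv2.warpPerspective(source, correction_H, (w, h))
cv2.imwrite("hud_frame_corrected.png", corrected)
```

In practice, windshield-induced distortion is not a simple projective transform, so a production system would typically fit a denser correction over many measured points; the homography above only illustrates the measure-and-invert principle.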

A New Solution: In-Line Measurement & Correction

This paper outlines the challenges of AR HUD system distortion and presents a new approach for efficient in-line inspection and distortion calibration. Radiant's solution uses an inspection system that a robot positions inside the already-assembled vehicle on the production line. Automatic alignment and specialized algorithms provide rapid distortion measurement and generation of correction coefficients. The solution then delivers the coefficients to the vehicle's HUD engine, completing the correction process.
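As a rough illustration of the coefficient-generation step described above (Radiant's actual algorithms and data formats are not shown here), the sketch below fits low-order polynomial surfaces to measured test-grid displacements and treats the fitted terms as the correction coefficients handed to a HUD warping engine. The grid, the simulated distortion, and the coefficient format are assumptions for illustration only.

```python
# Hypothetical sketch: turning measured test-grid displacements into a
# compact set of correction coefficients. Not Radiant's actual method
# or data format; the distortion below is simulated.
import numpy as np

# Test-pattern grid (pixels) and the displacement of each dot
# (ideal position minus measured position).
grid_x, grid_y = np.meshgrid(np.linspace(0, 800, 9), np.linspace(0, 300, 5))
dx = 0.01 * (grid_x - 400) + 2.0           # simulated horizontal error
dy = 0.00005 * (grid_x - 400) ** 2 - 1.5   # simulated vertical error

# Fit low-order 2D polynomials dx(x, y) and dy(x, y) by least squares.
x, y = grid_x.ravel(), grid_y.ravel()
basis = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
coeff_x, *_ = np.linalg.lstsq(basis, dx.ravel(), rcond=None)
coeff_y, *_ = np.linalg.lstsq(basis, dy.ravel(), rcond=None)

# These twelve numbers stand in for the "correction coefficients" that
# would be delivered to the HUD engine, which evaluates them per pixel
# to pre-warp the source image.
print("x-correction coefficients:", coeff_x)
print("y-correction coefficients:", coeff_y)
```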

To learn more, read the white paper: Correcting AR-HUD Distortion and Defects in Final Assembly.
