Head-up displays (HUDs) are the next evolution in creating a better driving experience. By displaying key information and functionality on the driver's windshield, they can improve the driver's situational awareness.
For example, displaying a car's speed in the driver's line of sight can reduce eyes-off-the-road time. And when integrated with sensors and advanced driver-assistance systems (ADAS), HUD technology also helps drivers notice threats and warnings more easily so that they can react much more quickly. For example, instead of only a red blinking light or an alarm inside the car, drivers can see a graphic that highlights real-world objects such as a pedestrian or a roadblock.
HUD system designs are starting to move away from traditional display technology, which uses thin film transistor (TFT) panels, and are instead adopting augmented reality (AR) to project images farther out in front of drivers, in their natural line of sight. This projection also allows the images to be overlaid onto the real world and to provide more useful information, such as navigation cues and threat identification.
Why AR HUDs?
While traditional HUD systems already use advanced automotive display technology, they have limitations. They offer a small field of view (FoV) from the driver's vantage point: the projected image spans only about 5 to 7 degrees horizontally and is typically displayed 2 to 3 m in front of the driver, which places the image near the front bumper of the car. This limited FoV constrains the types of images that can be displayed and where those images can appear in the driver's view.
Additionally, the limited virtual image distance (VID) makes it difficult to align, or overlay, conformal graphics onto real-world objects. As a result, the information displayed on traditional HUDs is primarily a duplicate of what is already visible elsewhere from the driver's seat, and adds little new capability.
An AR HUD reduces re-accommodation time through a longer VID, from around 2 m with traditional HUDs to 7, 10, or even 20 m with AR HUDs. Because the virtual image appears at roughly the same distance as the road scene, the driver's eyes don't have to shift focus between the real world and the HUD symbols, allowing the driver to process and understand the information more easily. Figures 1 and 2 illustrate these FoV and VID differences between traditional (past and present) HUDs and AR (future) HUDs.
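To get a feel for what these numbers mean, the apparent width of the virtual image can be estimated from the VID and the horizontal FoV. The sketch below uses assumed, illustrative values rather than figures for any specific HUD design.

```python
import math

def virtual_image_width(vid_m: float, hfov_deg: float) -> float:
    """Approximate width (in meters) of the virtual image for a given
    virtual image distance (VID) and horizontal field of view (FoV)."""
    return 2.0 * vid_m * math.tan(math.radians(hfov_deg) / 2.0)

# Assumed examples: a traditional HUD (~6 deg at 2.5 m) vs. an AR HUD (~15 deg at 10 m)
print(f"Traditional HUD: {virtual_image_width(2.5, 6.0):.2f} m wide")   # ~0.26 m
print(f"AR HUD:          {virtual_image_width(10.0, 15.0):.2f} m wide") # ~2.6 m
```

Under these assumptions, the AR HUD's virtual image is wide enough to span most of a traffic lane at 10 m, which is what makes conformal overlays on real-world objects practical.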
Decreasing the driver's eye re-accommodation time becomes even more important with the ADAS features of next-generation cars, because critical information can be shown naturally and efficiently on the display. With the adoption of AR HUDs, car designers now have the option to place cluster and traditional center-stack information directly in the driver's line of sight, which could provide more dashboard design flexibility in future automobiles.
Design Challenges
During the design process, AR HUDs require different features and raise different concerns than traditional HUD systems. AR HUDs provide a wider FoV and also offer a larger eye box, which is the area where the driver's head and eyes have to be in order to see the virtual image. Larger eye boxes accommodate both taller and shorter drivers and allow for more head movement (up/down and left/right) without compromising the visibility of the virtual image. A larger eye box also simplifies the HUD design, since the graphics don't need to be adjusted in software; the overlay on a real-world object remains correct regardless of driver height.
There is a trade-off when increasing the FoV and enlarging the eye box: the light source must supply more lumens. When the FoV doubles in size, the amount of light required also doubles; likewise, doubling the area of the eye box doubles the amount of light needed. Moving from a traditional HUD with its small FoV to an AR HUD with a much larger one therefore demands a light source that is not only efficient but can also produce a high lumen output.
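A back-of-the-envelope estimate makes this scaling concrete. Treating the required luminous flux as proportional to both the FoV area and the eye box area, as described above, gives a quick multiplier; the reference values below are assumptions chosen only for illustration.

```python
def relative_lumens(fov_area_deg2: float, eyebox_area_cm2: float,
                    ref_fov_area_deg2: float = 7.0 * 2.5,  # assumed traditional HUD FoV area
                    ref_eyebox_area_cm2: float = 130.0     # assumed traditional eye box area
                    ) -> float:
    """Light required relative to a reference HUD, assuming lumens scale
    linearly with FoV area and with eye box area."""
    return (fov_area_deg2 / ref_fov_area_deg2) * (eyebox_area_cm2 / ref_eyebox_area_cm2)

# Example: a 15 x 5 degree AR HUD with an eye box twice the reference area
print(round(relative_lumens(fov_area_deg2=15 * 5, eyebox_area_cm2=260.0), 1))  # ~8.6x more light
```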
Another design challenge is solar load. A simple analogy is focusing sunlight with a magnifying glass: the small dot it creates carries a lot of solar energy (solar load). Because an AR HUD has a long VID, its projection optics are very powerful in terms of magnification, on the order of 25x to 30x, and sunlight entering the HUD is concentrated by those same optics in reverse. It is critical to handle this high solar load without the imaging components absorbing all of that heat.
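To put the magnifying glass analogy into numbers: the concentration ratio is simply the collecting area divided by the focused spot area. The values below are assumptions meant only to show how quickly the focused spot heats up.

```python
SOLAR_IRRADIANCE_W_PER_M2 = 1000.0  # typical clear-sky value at ground level

def concentrated_irradiance(aperture_diameter_m: float, spot_diameter_m: float) -> float:
    """Irradiance at the focused spot, assuming all collected sunlight
    lands on the spot (the magnifying glass analogy)."""
    concentration = (aperture_diameter_m / spot_diameter_m) ** 2
    return SOLAR_IRRADIANCE_W_PER_M2 * concentration

# Assumed example: a 5 cm lens focusing sunlight down to a 5 mm spot
print(f"{concentrated_irradiance(0.05, 0.005):,.0f} W/m^2")  # 100,000 W/m^2, or 100x ambient
```

The same effect runs in reverse through an AR HUD's high-magnification projection path, which is why the design must reject or spread that concentrated sunlight rather than let the imaging components absorb it.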
Another design consideration for AR HUD technology is performance consistency: display brightness, color, and image quality must hold over a wide operating temperature range (hot or cold, day or night), and performance must be reliable regardless of driving conditions. Driving-critical information such as collision warnings and lane-departure alerts must be visible to the driver at all times, whether at night, in a dark tunnel, through rain, or in bright sunlight.
Polarized sunglasses, worn to reduce glare, have also been known to block the images produced by TFT-based HUDs. AR HUD systems need to keep their images viewable through polarized sunglasses, as shown in Figure 3.
The HUD must also consistently produce colors that look bright, vibrant, and saturated, which requires a wide color palette. High saturation is particularly important for red, which is often used as a warning color. With the right design, AR HUDs can support high brightness and wide dimming ranges, and can automatically adjust based on surrounding conditions so that information remains consistently visible and the driver can see and react to it in all driving conditions.
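One way to picture the automatic adjustment is as a dimming curve that maps measured ambient light to display luminance. The curve and numbers below are an illustrative assumption, not a description of any particular product.

```python
import math

def target_luminance_cd_m2(ambient_lux: float,
                           min_lum: float = 2.0,      # assumed night-time floor (cd/m^2)
                           max_lum: float = 15000.0,  # assumed bright-sunlight ceiling (cd/m^2)
                           ) -> float:
    """Map ambient illuminance to a target HUD luminance on a log scale,
    covering a wide dimming range from dark tunnels to direct sunlight."""
    ambient_lux = max(ambient_lux, 1.0)
    # Interpolate in log space between ~1 lux (night) and ~100,000 lux (full sun)
    t = min(math.log10(ambient_lux) / 5.0, 1.0)
    return min_lum * (max_lum / min_lum) ** t

for lux in (1, 100, 10_000, 100_000):  # night, dusk, overcast day, direct sun
    print(lux, round(target_luminance_cd_m2(lux)))
```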
Finally, another design consideration is diagnostics. As HUDs get larger and position critical information in the driver's direct line of sight, it's crucial that the display never malfunctions. In particular, the HUD should not display excessively bright images or corrupted (static) images, which could temporarily blind the driver. Reliable system monitoring and diagnostics of the HUD image have become increasingly important, and the system must be able to shut down immediately if it ever malfunctions. The HUD image should never obstruct the driver's view and thereby interfere with the operation of the vehicle.
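A minimal sketch of the kind of monitoring loop this implies is shown below. The checks, names, and thresholds are hypothetical placeholders; a production system would rely on hardware-level frame integrity checks, light sensors, and a safety-rated shutdown path.

```python
from dataclasses import dataclass

@dataclass
class FrameStatus:
    crc_ok: bool               # did the rendered frame pass its integrity check?
    measured_luminance: float  # cd/m^2 reported by an output light sensor
    frame_age_ms: float        # time since the frame buffer was last updated

MAX_LUMINANCE = 20000.0   # assumed never-exceed brightness limit
MAX_FRAME_AGE_MS = 100.0  # assumed limit before the image counts as frozen

def hud_is_safe(status: FrameStatus) -> bool:
    """Return False if the image is corrupted, frozen, or dangerously bright."""
    return (status.crc_ok
            and status.measured_luminance <= MAX_LUMINANCE
            and status.frame_age_ms <= MAX_FRAME_AGE_MS)

def monitor_step(status: FrameStatus, shutdown) -> None:
    """One iteration of the diagnostic loop: blank the HUD immediately on any fault."""
    if not hud_is_safe(status):
        shutdown()  # disable the light source / blank the display right away
```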
Future Design Approaches
One of the biggest challenges in AR HUD design is the physical size of the unit: it must fit effectively into modern cars without compromising other important vehicle features. The HUD package can become quite large, on the order of 15 to 20 L (0.5 to 0.7 ft³), so the real test is figuring out a way to fit it into a car. To address this challenge, companies are looking at different and newer technologies, such as waveguides and holographic films, to help reduce HUD package size. Because DLP technology works with lasers, it supports both waveguides and holographic films, which can further shrink the size of the unit.
The industry is starting to replace traditional HUD optics such as mirrors with newer, smaller technologies. As an added benefit, these technologies allow for much bigger FoVs, making it possible to target a 15 x 5 degree FoV HUD or larger. This alternative design approach is allowing more carmakers to fit AR units into automobiles.
In addition, HUD designs need to support real-time vehicle sensor data and human-machine interaction software to accurately overlay symbols on a dynamic environment. Taking sensor data from multiple systems and representing it visually in the real world so that drivers can understand it and take action requires complex systems to communicate and process vast amounts of data. Automotive system designers are working on developing effective and efficient ways to accurately capture, process, and display this sensor data in real time.
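At its core, the overlay problem is a coordinate transformation: an object position reported by the vehicle sensors has to be re-expressed in the HUD's virtual image plane for the driver's current eye position. The pinhole-style sketch below is a simplified illustration; the frames and parameters are assumptions, and a real system also has to handle latency, vehicle motion, and eye tracking.

```python
import numpy as np

def world_to_hud(point_vehicle: np.ndarray,
                 eye_position: np.ndarray,
                 vid_m: float = 10.0,     # assumed virtual image distance
                 hfov_deg: float = 15.0,  # assumed horizontal FoV
                 vfov_deg: float = 5.0):  # assumed vertical FoV
    """Project a 3D point (vehicle frame, meters: x forward, y left, z up)
    onto normalized HUD image coordinates in [-1, 1]; None if outside the FoV."""
    rel = point_vehicle - eye_position  # vector from the driver's eye to the object
    if rel[0] <= 0:                     # object is behind the eye point
        return None
    # Intersect the eye-to-object ray with the virtual image plane at the VID
    scale = vid_m / rel[0]
    y_img, z_img = rel[1] * scale, rel[2] * scale
    half_w = vid_m * np.tan(np.radians(hfov_deg) / 2)
    half_h = vid_m * np.tan(np.radians(vfov_deg) / 2)
    u, v = y_img / half_w, z_img / half_h
    return (u, v) if abs(u) <= 1 and abs(v) <= 1 else None

# Example: a pedestrian 30 m ahead and 1.5 m to the left, seen from an assumed eye point
print(world_to_hud(np.array([30.0, 1.5, 0.2]), np.array([0.0, 0.0, 1.2])))
```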
Source: ecnmag.com