360-degree head-up display provides real-time warnings for drivers
28.12.2023 - New system sees through objects to project holographic representations of road obstacles.
Researchers have developed an augmented reality head-up display that could improve road safety by displaying potential hazards as high-resolution three-dimensional holograms directly in a driver’s field of vision in real time. Current head-up display systems are limited to two-dimensional projections onto the windshield of a vehicle, but researchers from the universities of Cambridge, Oxford and University College London developed a system that uses 3D laser scanning and lidar data to create a fully 3D representation of London streets.
The system they developed can effectively see through objects to project holographic representations of road obstacles that are hidden from the driver’s field of view, aligned with the real object in both size and distance. For example, a road sign blocked from view by a large truck would appear as a 3D hologram so that the driver knows exactly where the sign is and what information it displays. The 3D holographic projection technology keeps the driver’s focus on the road instead of the windscreen, and could improve road safety by projecting road obstacles and potential hazards in real time from any angle.
“The idea behind a head-up display is that it keeps the driver’s eyes up, because even a fraction of a second not looking at the road is enough time for a crash to happen,” said Jana Skirnewskaja from Cambridge’s Department of Engineering. “However, because these are two-dimensional images, projected onto a small area of the windscreen, the driver can be looking at the image, and not actually looking at the road ahead of them.” For several years, Skirnewskaja and her colleagues have been working to develop alternatives to conventional head-up displays (HUDs) that could improve road safety by providing more accurate information to drivers while keeping their eyes on the road.
“We want to project information anywhere in the driver’s field of view, but in a way that isn’t overwhelming or distracting,” said Skirnewskaja. “We don’t want to provide any information that isn’t directly related to the driving task at hand.” The team developed an augmented reality holographic point cloud video projection system that displays objects aligned with their real-life counterparts in size and distance within the driver’s field of view. The system combines a 3D holographic setup with lidar (light detection and ranging) data. Lidar uses a pulsed light source to illuminate an object; the time taken for the reflected pulses to return is then measured to calculate how far the object is from the light source.
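The ranging step itself is simple time-of-flight arithmetic. The sketch below illustrates only that general principle and is not code from the study; the pulse timing in the example is an invented value.

```python
# Illustrative sketch of lidar time-of-flight ranging (not the researchers' code).
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def pulse_time_to_distance(round_trip_time_s: float) -> float:
    """Convert the round-trip time of a reflected pulse into a range in metres.

    The pulse travels to the object and back, so the one-way distance
    is half the total path covered at the speed of light.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: a pulse returning after 200 nanoseconds comes from an object ~30 m away.
print(f"{pulse_time_to_distance(200e-9):.1f} m")
```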
The researchers tested the system by scanning Malet Street on the UCL campus in central London. Information from the lidar point cloud was transformed into layered 3D holograms consisting of as many as 400,000 data points. Careful processing of this data made it possible to project a 360° assessment of obstacles around the driver, with the depth of each object clearly visible. The researchers also sped up the scanning process so that the holograms were generated and projected in real time. Importantly, the scans can provide dynamic information, since busy streets change from one moment to the next.
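As a rough illustration of how a point cloud can become a hologram, the sketch below sums a spherical wavefront from each scanned point onto a hologram plane (the classic point-source method); points could be grouped by depth to build the layered holograms described above. The wavelength, pixel pitch and resolution are assumed values, and this is not the team’s implementation.

```python
# Hedged sketch: point-source computation of a phase hologram from a point cloud.
# Wavelength, pixel pitch and resolution are assumptions, not values from the study.
import numpy as np

WAVELENGTH = 532e-9          # assumed green laser
PIXEL_PITCH = 8e-6           # assumed spatial light modulator pixel pitch (metres)
K = 2 * np.pi / WAVELENGTH   # optical wavenumber

def point_cloud_to_hologram(points_xyz: np.ndarray, res: int = 512) -> np.ndarray:
    """Sum the spherical wave from every (x, y, z) point onto the hologram plane
    and return the phase pattern a phase-only display could show."""
    ys, xs = np.indices((res, res))
    plane_x = (xs - res / 2) * PIXEL_PITCH   # hologram-plane coordinates in metres
    plane_y = (ys - res / 2) * PIXEL_PITCH

    field = np.zeros((res, res), dtype=np.complex128)
    for x, y, z in points_xyz:
        r = np.sqrt((plane_x - x) ** 2 + (plane_y - y) ** 2 + z ** 2)
        field += np.exp(1j * K * r) / r      # spherical wavefront from this point
    return np.angle(field)                   # keep only the phase

# Example: three synthetic points half a metre in front of the hologram plane.
cloud = np.array([[0.0, 0.0, 0.5], [1e-3, 0.0, 0.5], [0.0, 1e-3, 0.5]])
hologram = point_cloud_to_hologram(cloud)
```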
“The data we collected can be shared and stored in the cloud, so that any drivers passing by would have access to it – it’s like a more sophisticated version of the navigation apps we use every day to provide real-time traffic information,” said Skirnewskaja. “This way, the system is dynamic and can adapt to changing conditions, as hazards or obstacles move on or off the street.” While collecting more data from diverse locations would improve accuracy, the researchers say the unique contribution of their study lies in enabling a 360° view by judiciously selecting data points from single scans of specific objects, such as trucks or buildings, which allows a comprehensive assessment of road hazards.
“We can scan up to 400,000 data points for a single object, but obviously that is quite data-heavy and makes it more challenging to scan, extract and project data about that object in real time,” said Skirnewskaja. “With as little as 100 data points, we can know what the object is and how big it is. We need to get just enough information so that the driver knows what’s around them.” Earlier this year, Skirnewskaja and her colleagues conducted a demonstration at the Science Museum in London using virtual reality headsets loaded with the system’s lidar data. User feedback from the sessions helped the researchers make the design more inclusive and user-friendly. For example, they have fine-tuned the system to reduce eye strain and have accounted for visual impairments.
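One common way to reduce a dense scan to a handful of representative points while preserving an object’s identity and extent is voxel-grid downsampling. The sketch below is an illustration of that general idea on assumed synthetic data, not the researchers’ point-selection method.

```python
# Hedged sketch: voxel-grid downsampling of a dense lidar scan to ~100 points.
# The synthetic "object" and target count are assumptions for illustration only.
import numpy as np

def voxel_downsample(points: np.ndarray, target_count: int = 100) -> np.ndarray:
    """Average the points falling in each coarse voxel, keeping roughly
    `target_count` representative points that still span the object."""
    mins, maxs = points.min(axis=0), points.max(axis=0)
    extent = maxs - mins
    # Choose a voxel edge so the grid contains on the order of `target_count` cells.
    voxel = (np.prod(extent) / target_count) ** (1.0 / 3.0)
    keys = np.floor((points - mins) / voxel).astype(np.int64)

    buckets = {}
    for key, p in zip(map(tuple, keys), points):
        buckets.setdefault(key, []).append(p)
    return np.array([np.mean(ps, axis=0) for ps in buckets.values()])

# Example: a dense 400,000-point scan of a 2 m x 2 m x 4 m box-shaped object.
dense = np.random.rand(400_000, 3) * [2.0, 2.0, 4.0]
sparse = voxel_downsample(dense)
print(len(sparse), np.ptp(sparse, axis=0))  # ~100 points still spanning the full extent
```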
“We want a system that is accessible and inclusive, so that end users are comfortable with it,” said Skirnewskaja. “If the system is a distraction, then it doesn’t work. We want something that is useful to drivers, and improves safety for all road users, including pedestrians and cyclists.” The researchers are currently collaborating with Google to develop the technology so that it can be tested in real cars. They hope to carry out road tests, on either public or private roads, in 2024. (Source: U. Cambridge)
Link: Centre for Molecular Materials, Photonics and Electronics, University of Cambridge, Cambridge, UK
Further reading: New kind of a 3D holographic head-up display, Wiley Industry News, 28 April 2021