Enhanced radar technology to help autonomous vehicles navigate in poor weather
Electrical engineers at the University of California (UC) San Diego have developed a new type of radar that could make it possible for self-driving vehicles to navigate safely in poor weather.
The new and improved system consists of two radar sensors placed on the hood, spaced approximately 1.5 metres apart.
By arranging the sensors in this manner, the system captures substantially more detail than a single radar, allowing it to accurately predict the shape and size of objects in view.
“By having two radars at different vantage points with an overlapping field of view, we create a region of high resolution, with a high probability of detecting the objects that are present,” reported Kshitiz Bansal, computer science and engineering PhD student at UC San Diego.
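The geometry behind this can be illustrated with a minimal sketch: two radars mounted roughly 1.5 metres apart each report detections in their own local frame, and translating both into a shared vehicle frame yields a denser, overlapping point cloud. The function names, coordinates, and mounting geometry below are illustrative assumptions, not the researchers' actual implementation.

```python
import numpy as np

BASELINE_M = 1.5  # assumed lateral spacing between the two hood-mounted radars

def to_vehicle_frame(points, sensor_offset):
    """Shift one radar's local (x, y) detections by its mounting offset."""
    return points + sensor_offset

def merge_detections(left_points, right_points):
    """Combine both radars' detections into a single vehicle-frame point cloud."""
    left = to_vehicle_frame(left_points, np.array([-BASELINE_M / 2, 0.0]))
    right = to_vehicle_frame(right_points, np.array([BASELINE_M / 2, 0.0]))
    return np.vstack([left, right])

# A target seen by both sensors contributes detections from two vantage
# points, sampling its shape more densely than either radar alone.
cloud = merge_detections(np.array([[0.8, 10.0]]), np.array([[-0.7, 10.1]]))
print(cloud.shape)  # (2, 2)
```

In the region where the two fields of view overlap, every object is sampled twice, which is what creates the high-resolution zone Bansal describes.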
Current autonomous vehicles rely on technologies such as LiDAR or single radar systems to navigate their surroundings.
LiDAR systems operate by bouncing laser beams off nearby objects. From the information gathered, the system can then create a high-resolution 3D picture. While this is suitable for clear days, the technology cannot ‘see’ once the weather deteriorates.
Radar technology transmits radio waves, which continue to work across a range of weather conditions but capture only a partial picture of the road scene.
The new technology developed by UC San Diego compensates for these shortcomings with what is described as “LiDAR-like radar” by Dinesh Bharadia, a professor of electrical and computer engineering at UC San Diego Jacobs School of Engineering.
“Fusing LiDAR and radar can also be done with our techniques, but radars are cheap. This way, we don’t need to use expensive LiDARs.”
Utilising two radars in this way is an inexpensive route to poor-weather perception for autonomous vehicles.
In tests simulating foggy weather, the system performed as well as it did in clear-day and clear-night trials. The team concealed another vehicle with a fog machine, and the system accurately predicted its 3D geometry, outperforming a LiDAR system.
The team also developed new algorithms to overcome the problem of radar noise. The new algorithm fuses information from both radars to produce a single image free of noise.
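One plausible way such cross-sensor fusion can suppress noise is to keep only detections confirmed by both radars, since random noise rarely appears in both sensors at the same location. The sketch below is a hedged illustration of that general idea under assumed names and thresholds, not the authors' published algorithm.

```python
import numpy as np

def cross_validate(points_a, points_b, tol=0.5):
    """Keep points from radar A that have a radar-B neighbour within tol metres."""
    kept = []
    for p in points_a:
        # Distance from this radar-A detection to every radar-B detection
        dists = np.linalg.norm(points_b - p, axis=1)
        if dists.min() <= tol:
            kept.append(p)
    return np.array(kept)

radar_a = np.array([[5.0, 2.0], [30.0, -8.0]])  # second point: isolated noise
radar_b = np.array([[5.1, 2.1]])                # confirms only the first point
print(cross_validate(radar_a, radar_b))  # [[5. 2.]]
```

Because genuine objects sit inside the overlapping field of view of both sensors, requiring agreement between the two radars discards spurious returns while preserving real detections.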
“There are currently no publicly available data sets with this kind of data, from multiple radars with an overlapping field of view,” Bharadia said. “We collected our own data and built our own dataset for training new algorithms and for testing.”
The team is currently working with Toyota to fuse the new radar technology with cameras. Researchers suggest this technology could potentially replace the LiDAR system altogether.
“Radar alone cannot tell us the colour, make or model of the car. These features are also important for improving perception in self-driving cars,” Bharadia said.
Source: AZO Sensors | Upgraded Radar System can Help Self-Driving Cars to Navigate Safely in Bad Weather
20 November 2020