Apr 11, 2024

Accurately constructing free space maps using radar only

Vehicles, cyclists and pedestrians are obstacles that are to be expected on the road. While it is possible to train models to precisely classify these obstacles using on-board sensors, unexpected obstacles like roadworks, fallen trees or traffic accidents are much more difficult to train for due to the rarity of suitable training data. In addition, High Definition (HD) maps lack the real-time data to represent such unexpected obstacles, so a real-time freespace map must be constructed in order to detect them.

Not only is it possible to create freespace maps using radar alone, but there are also many cases where radar is superior. In this blog, we will explore how we create freespace maps and how this approach performs across a number of common but challenging environments.

Our Architecture

Before freespace can be estimated, radar pointclouds must be processed and filtered. Microservices pre-process the radar data for downstream tasks in our 5D Perception® system. Each microservice involved in freespace estimation can be seen in Figure 1:

  • Odometry calculates the position, velocity, orientation angle and rate of orientation angle change
  • The Dynamic Target Filter removes dynamic targets so that only the static scene remains
  • SLAM accumulation increases pointcloud density by overlapping past pointclouds with new pointclouds
Figure 1: High-level radar microservices architecture diagram
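Two of the steps above, the Dynamic Target Filter and SLAM accumulation, can be sketched in a few lines of NumPy. This is an illustrative toy, not Provizio's implementation: the array layout, the 0.5 m/s Doppler tolerance and the function names are all assumptions. The key physical idea is that a static target's measured Doppler must match the projection of the ego velocity onto the beam direction.

```python
import numpy as np

def filter_dynamic_targets(points, ego_velocity, tol=0.5):
    """Remove dynamic targets so only the static scene remains.

    points: (N, 4) array of [x, y, z, doppler_mps] in the sensor frame.
    ego_velocity: (3,) ego velocity in the sensor frame, m/s.
    A static point's Doppler equals the negated projection of the ego
    velocity onto the beam direction; anything outside `tol` is moving.
    """
    xyz = points[:, :3]
    rng = np.linalg.norm(xyz, axis=1)
    beam = xyz / np.maximum(rng[:, None], 1e-9)   # unit beam directions
    expected = -beam @ ego_velocity               # Doppler of a static point
    return points[np.abs(points[:, 3] - expected) < tol]

def accumulate_clouds(history, current, poses_to_current):
    """SLAM accumulation: transform past static clouds into the current
    frame and stack them with the new cloud to increase density.

    history: list of (N_i, 4) past clouds (sensor frame at capture time).
    poses_to_current: list of 4x4 transforms from each past frame to now.
    """
    moved = []
    for cloud, T in zip(history, poses_to_current):
        xyz1 = np.c_[cloud[:, :3], np.ones(len(cloud))]
        moved.append(np.c_[(xyz1 @ T.T)[:, :3], cloud[:, 3:]])
    return np.vstack(moved + [current])
```

With forward ego motion, a detection straight ahead should report a Doppler of roughly minus the ego speed; anything else on that beam is flagged as dynamic and removed before accumulation.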

The freespace estimation algorithm uses the pre-processed pointcloud to create an accurate freespace map. Figure 2 outlines the individual steps of the algorithm. The goal of our freespace estimation algorithm is to identify freespace beyond what can be seen by ray tracing. A generalised approach to freespace estimation allows us to incorporate multiple detections along each beam. This makes it possible to estimate freespace that might be out of direct line-of-sight but that is still visible to the radar.

Figure 2: Illustration of processing steps required for freespace estimation
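The benefit of incorporating multiple detections along each beam can be illustrated with a toy polar grid. Everything here (the beam count, cell size and `last_return` switch) is a simplifying assumption rather than the actual algorithm in Figure 2, but it shows the core difference: classic ray tracing marks cells free only up to the first return on each beam, while using later returns extends freespace past obstacles the radar can still partially see through.

```python
import numpy as np

def freespace_polar(detections, n_az=360, max_range=100.0, cell=0.5,
                    last_return=True):
    """Toy polar freespace grid.

    detections: (N, 2) array of [azimuth_rad, range_m].
    Returns an (n_az, n_cells) boolean grid; True means "free".
    With last_return=True, freespace extends to the furthest detection
    on each beam instead of stopping at the first one.
    """
    n_cells = int(max_range / cell)
    grid = np.zeros((n_az, n_cells), dtype=bool)
    az_idx = ((detections[:, 0] % (2 * np.pi)) / (2 * np.pi) * n_az).astype(int)
    for beam in range(n_az):
        rngs = detections[az_idx == beam, 1]
        if len(rngs) == 0:
            continue
        limit = rngs.max() if last_return else rngs.min()
        grid[beam, :int(limit / cell)] = True
    return grid
```

For a beam with returns at 10 m and 40 m, the first-return grid stops at 10 m while the multi-return grid declares freespace out to 40 m.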

Performance in Challenging Environments

Curved Roads and Complex Junctions

Freespace estimation algorithms like Inverse Sensor Modelling (ISM) struggle to estimate freespace around corners. This is largely due to the data loss incurred when 3D pointcloud data is converted into a 2D Bird's Eye View (BEV) representation, which can obscure detail at longer ranges. Figure 3 shows a road scenario where ISM has reduced range. Our freespace estimate follows the curvature of the road beyond line-of-sight to 50m+.

Figure 3: Scene with roads diverging to the left and the right. The satellite view with position and direction of the vehicle is shown on the left. An ISM freespace estimate (yellow) using LiDAR data is shown in the centre. Our freespace estimate (green) is on the right.
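A minimal sketch of why the 2D conversion is lossy: flattening to BEV discards height, so overhead structure and genuine ground-level obstacles collapse into the same cells. The grid size and function name below are illustrative assumptions, not part of any particular ISM implementation.

```python
import numpy as np

def to_bev(points, cell=0.5, extent=100.0):
    """Flatten a 3D pointcloud into a 2D BEV occupancy grid.

    points: (N, 3) array of [x, y, z] with x, y in [0, extent).
    The z coordinate is simply discarded, so an overhead gantry and a
    wall land in identical BEV cells -- one source of the detail (and
    hence range) loss that 2D grids suffer on curved roads.
    """
    n = int(extent / cell)
    grid = np.zeros((n, n), dtype=bool)
    ix = (points[:, 0] / cell).astype(int)
    iy = (points[:, 1] / cell).astype(int)
    ok = (ix >= 0) & (ix < n) & (iy >= 0) & (iy < n)
    grid[ix[ok], iy[ok]] = True
    return grid
```

Two detections at the same (x, y) but very different heights occupy a single BEV cell, so the grid cannot tell a drivable underpass from a blocked lane.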

Obstacles at Long Range

Detection range is important for fast moving vehicles like cars and drones because large distances can be covered in a small amount of time. Figure 4 shows a vehicle that has stopped in the hard shoulder on a motorway. Using our sensors, this vehicle is first detected at 300m and an accurate freespace map is created at 125m. The motorway speed limit is 100km/h (27.8m/s), so a vehicle travelling at that speed has 10.8 seconds from first detection to process the obstacle and 4.5 seconds to avoid it using the freespace map.

Figure 4: Truck circled in red on freespace map (left) and camera (right).
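The time budget above is straightforward to verify; this small, hypothetical helper just makes the arithmetic explicit for an ego vehicle closing on a stationary obstacle at constant speed.

```python
def reaction_budget(detection_range_m, freespace_range_m, speed_kmh):
    """Seconds available to react, at a constant closing speed.

    Returns (time_from_first_detection, time_with_freespace_map).
    """
    speed_mps = speed_kmh / 3.6          # km/h -> m/s
    return (detection_range_m / speed_mps,
            freespace_range_m / speed_mps)
```

At 100 km/h, a 300 m detection range gives 10.8 s and a 125 m freespace range gives 4.5 s, matching the figures quoted above.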

Figure 5 shows the camera and freespace estimate overlaid. The vehicle is closer to the obstacle here to make the freespace easier to see. Freespace narrows as expected beside the obstacle, and the truck itself is precisely removed from the freespace map.

Figure 5: Freespace overlaid over original camera, showing freespace area narrowing around truck

Indoor Car Park

Indoor environments can be challenging to map because they are often dimly lit, cannot avail of GPS and have targets at close range and long range. Figure 6 shows an indoor car park environment which has harsh lighting and clutter above the radar from the car park ceiling. Detections from the ceiling are filtered out correctly and do not block the freespace estimate. Using our technique, spaces between vehicles on the left and a junction to turn right are both detected at greater than 20m.

Figure 6: Free parking space detected beyond vehicles (orange) and right turn junction detected (blue)
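One simple way to discard such ceiling returns is a height gate on the pointcloud before freespace estimation. This sketch assumes z is height above the road surface; the 2.5 m cut-off and the function name are illustrative assumptions, not tuned values from our stack.

```python
import numpy as np

def filter_ceiling_clutter(points, max_height_m=2.5):
    """Drop detections above the drivable band so ceiling clutter
    cannot block the freespace estimate.

    points: (N, 3+) array of [x, y, z, ...], z in metres above the road.
    """
    return points[points[:, 2] <= max_height_m]
```

Detections from the car park ceiling sit well above the cut-off and are removed, while returns from vehicles and walls pass through untouched.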

Conclusion

It is clear that our new class of MIMSO® imaging radars are more than adequate for freespace estimation. Estimate accuracy and range are improved by looking beyond line-of-sight. Many freespace algorithms fail around curved and complex road structures, but our estimate does not degrade in these conditions.

We have shown how our radar-only freespace estimation algorithm performs indoors and outdoors at a variety of ranges. Furthermore, unlike LiDAR and camera systems, its performance is unaffected by lighting and weather conditions. The low cost of the radar and processing stack makes radar-only freespace a viable option for industries such as automotive, mining and agriculture.

To learn more, read our white paper.

