rFpro's ray tracing capability transforms the realism of simulation: it is the first to truly represent what cameras ‘see’, accelerating the development of autonomous vehicles.

The new ray tracing rendering technology from rFpro, an AB Dynamics Group company, significantly reduces the industry’s dependence on real-world testing for the development of autonomous vehicles (AVs) and Advanced Driver Assistance Systems (ADAS). The technology is the first to accurately simulate how a vehicle’s sensor system perceives the world.

Matt Daley, Operations Director at rFpro, said:

“The industry has widely accepted that simulation is the only way to safely and thoroughly subject AVs and autonomy systems to a substantial number of edge cases to train AI and prove they are safe. However, up until now, the fidelity of simulation hasn’t been high enough to replace real-world data. Our ray tracing technology is a physically modelled simulation solution that has been specifically developed for sensor systems to accurately replicate the way they ‘see’ the world.”

Ray tracing is rFpro’s software-in-the-loop (SIL) solution aimed at generating synthetic training data. It traces multiple light rays through the scene to accurately capture all the nuances of the real world. As a multi-path technique, it can reliably simulate the huge number of reflections that occur around a sensor. This is critical in low-light scenarios or environments with multiple light sources, where reflections and shadows must be portrayed accurately. Examples include multi-storey car parks, illuminated tunnels with bright ambient daylight at their exits, and urban night driving under multiple street lights.
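The multi-path idea described above can be illustrated with a deliberately simplified sketch: at each bounce a path picks up the direct contribution of every light source, plus an attenuated contribution from further reflections. This is a toy model with no geometry, not rFpro's implementation; the function names and the reflectivity parameter are illustrative assumptions.

```python
def trace_path(depth, lights, reflectivity=0.5, max_depth=4):
    """Accumulate radiance along one multi-bounce path (toy model).

    At each bounce, add the direct contribution of every light source,
    then recurse for the reflected (indirect) contribution, attenuated
    by the surface reflectivity. Real renderers trace geometric rays;
    here the scene is abstracted away to show only the accumulation.
    """
    if depth >= max_depth:
        return 0.0
    direct = sum(lights)  # direct illumination from all sources at this bounce
    indirect = reflectivity * trace_path(depth + 1, lights, reflectivity, max_depth)
    return direct + indirect

# Two light sources, four bounces: the indirect term is why scenes such as
# tunnels and car parks need many reflections to look right.
radiance = trace_path(0, lights=[1.0, 0.5])
```

A single-bounce (rasterisation-style) model would return only the first `direct` term; the recursion is what captures inter-reflections between surfaces.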

Modern cameras used in the automotive industry capture multiple exposures of varying duration per frame, for example a short, a medium and a long exposure. To simulate this accurately, rFpro has introduced its multi-exposure camera API. This ensures that the simulated images contain accurate blurring, caused by fast vehicle motion or road vibration, alongside physically modelled rolling shutter effects.
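The multi-exposure behaviour can be sketched as follows. This is a minimal model under stated assumptions, not rFpro's API: the exposure times, the unit full-well capacity and the per-row readout offset are illustrative values, and real sensors add noise, gain and tone mapping.

```python
def capture_exposure(radiance, exposure_time, full_well=1.0):
    """Integrate constant scene radiance over one exposure,
    clipping at sensor saturation (full-well capacity)."""
    return min(radiance * exposure_time, full_well)

def multi_exposure_frame(radiance, times=(0.001, 0.008, 0.033)):
    """One frame = short, medium and long exposures of the same scene.
    Bright regions saturate the long exposure but survive in the short one,
    which is why HDR automotive cameras combine all three."""
    return [capture_exposure(radiance, t) for t in times]

def rolling_shutter_rows(radiance_fn, n_rows, exposure_time, row_readout):
    """Toy rolling shutter: each row starts integrating row_readout
    seconds after the previous one, so a scene changing over time
    (radiance_fn of start time) appears skewed across rows."""
    return [min(radiance_fn(r * row_readout) * exposure_time, 1.0)
            for r in range(n_rows)]

frame = multi_exposure_frame(40.0)  # bright scene: long exposure saturates
```

A simulator that rendered only a single instantaneous exposure would miss exactly the saturation, blur and row-skew effects this models, which is the gap the multi-exposure camera API is described as closing.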

“Simulating these phenomena is critical to accurately replicating what the camera ‘sees’, otherwise the data used to train ADAS and autonomous systems can be misleading,” said Daley. “This is why traditionally only real-world data has been used to develop sensor systems. Now, for the first time, ray tracing and our multi-exposure camera API are creating engineering-grade, physically modelled images enabling manufacturers to fully develop sensor systems in simulation.

“Ray tracing provides such high-quality simulation data that it enables sensors to be trained and developed before they physically exist,” explains Daley. “As a result, it removes the need to wait for a real sensor before collecting data and starting development. This will significantly accelerate the advancement of AVs and sophisticated ADAS technologies and reduce the requirement to drive so many developmental vehicles on public roads.”

