rFpro have released a video showing rFpro being used to train, test and validate Deep Learning Autonomous Driving (DLAD) models.
The video shows some of the ways in which rFpro is being used to test DLAD models. Some clips show the view from the camera sensors feeding the DLAD systems. Others show the view from human-controlled cars driven in simulation, sharing the virtual world with vehicle models under the control of autonomous systems. We have also included some ‘corner cases’, such as scenes lit by a low sun that obscures the white lines and creates challenging conditions. In such conditions, the radar and LiDAR feeds become more important.
The virtual environment used is a model of the centre of Paris. Increasingly, OEMs and Tier 1 suppliers are requesting more complex urban environments with mixed traffic and pedestrians.
rFpro is being used to train, test and validate DLAD systems. It is also being used to let human drivers test vehicles fitted with ADAS and autonomous systems in simulation, to let humans ride in autonomous vehicles as ‘passengers’, and to let human drivers share a virtual world with autonomous vehicles, so that they can subjectively evaluate their behaviour and drive in ways designed to provoke a response, for example by veering into their lane or pulling out in front of them at an intersection.