Missing Rare Events in Autonomous Vehicle Simulation

Missing Rare Events in Simulation: A highly accurate simulation and system model doesn't solve the problem of what scenarios to simulate. If you don't know what edge cases to simulate, your system won't be safe. It is common, and generally desirable, to use vehicle-level simulation...

Dealing with Edge Cases in Autonomous Vehicle Validation

Dealing with Edge Cases: Some failures are neither random nor independent. Moreover, safety is typically more about dealing with unusual cases. This means that brute force testing is likely to miss important edge case safety issues. A significant limitation to a field testing argument...
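A rough back-of-the-envelope sketch of why brute force testing misses rare events (the event rate and mileage figures here are illustrative assumptions, not numbers from the post): if a hazardous edge case arises about once per 10 million miles and occurrences are modeled as a Poisson process, a fleet that logs a million test miles will most likely never see it even once.

```python
import math

def prob_at_least_one(rate_per_mile: float, miles: float) -> float:
    """Probability of observing >= 1 occurrence of a rare event,
    modeling occurrences as a Poisson process: 1 - exp(-rate * miles)."""
    return 1.0 - math.exp(-rate_per_mile * miles)

# Assumed (illustrative) edge-case rate: once per 10 million miles.
rate = 1e-7

for miles in (1e5, 1e6, 1e7, 1e8):
    p = prob_at_least_one(rate, miles)
    print(f"{miles:>13,.0f} miles -> P(event seen at least once) = {p:.1%}")
```

Under these assumptions, even 10 million miles of driving gives only about a 63% chance of encountering the scenario a single time, let alone often enough to characterize and fix it.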

The Fly-Fix-Fly Antipattern for Autonomous Vehicle Validation

Fly-Fix-Fly Antipattern: Brute force mileage accumulation to fix all the problems you see on the road won't get you all the way to being safe, for so very many reasons... This A400M crash was caused by corrupted software calibration parameters. In practice, autonomous vehicle field...

The Insufficient Testing Pitfall for Autonomous System Safety

The Insufficient Testing Pitfall: Testing for less than the target failure rate doesn't prove you are safe. In fact, you probably need to test for about 10x the target failure rate to be reasonably sure you've met it. For life-critical systems this means too much testing to be feasible. In...
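The scale of the problem can be sketched with the standard Poisson zero-failure bound (the "rule of three"); the target rate below is an illustrative assumption, not a figure from the post. Even in the best case of zero observed failures, demonstrating a failure rate below the target at 95% confidence takes about 3x the corresponding exposure, and every failure actually observed pushes the required exposure higher, toward the ~10x rule of thumb.

```python
import math

def exposure_needed_zero_failures(target_rate: float,
                                  confidence: float = 0.95) -> float:
    """Failure-free exposure (e.g., miles) needed to show the failure rate
    is below target_rate at the given confidence, via the exact Poisson
    zero-failure bound: N = -ln(1 - confidence) / target_rate."""
    return -math.log(1.0 - confidence) / target_rate

# Assumed (illustrative) target: one catastrophic failure per 100M miles.
target = 1e-8
miles = exposure_needed_zero_failures(target)
print(f"~{miles:,.0f} failure-free miles for 95% confidence")
```

With this target, roughly 300 million failure-free miles are required, and that is the floor: any failure found during the campaign (and fixed, restarting the statistical clock) multiplies the total, which is why the testing burden for life-critical rates is infeasible in practice.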

The Human Filter Pitfall for Autonomous System Safety

The Human Filter Pitfall: Data from human-driven vehicles has gaps corresponding to situations a human driver knows to avoid getting into in the first place, but which an autonomous system might experience. The operational history (and thus the failure history) of many systems is filtered...