I expect speed limits to hinder the adoption of robocars, without improving any robocar-related safety.
There’s a simple way to make robocars err in the direction of excessive caution: hold the software company responsible for any crash it’s involved in, unless it can prove someone else was unusually reckless. I expect some rule resembling that will be used.
Having speed limits on top of that will cause problems: robocars will have to drive slower than humans do in practice (annoying both passengers and other drivers), even when it's sometimes safe for them to drive faster than humans. I'm unsure how important this effect will be.
Ideally, robocars will be programmed to have more complex rules about maximum speed than current laws are designed to handle.
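To make that concrete, here is a minimal sketch of what "more complex rules about maximum speed" could look like, with the speed cap depending on conditions rather than a single posted number. Everything here (the field names, the scaling factors, the 15% premium over the posted limit) is hypothetical illustration, not a claim about how any actual robocar or law works.

```python
from dataclasses import dataclass

@dataclass
class RoadConditions:
    """Hypothetical inputs a robocar might weigh when choosing a speed cap."""
    statutory_limit_mph: float   # the posted limit
    visibility_m: float          # estimated sight distance
    surface_friction: float      # 0.0 (ice) to 1.0 (dry pavement)
    pedestrian_density: float    # people per 100 m of roadside
    traffic_flow_mph: float      # prevailing speed of surrounding traffic


def max_safe_speed(c: RoadConditions) -> float:
    """Return a context-dependent speed cap in mph.

    Illustrative only: start from the posted limit, scale it down for poor
    visibility, low friction, or nearby pedestrians, and allow a modest
    premium over the posted limit when conditions are good and surrounding
    traffic is already moving faster.
    """
    cap = c.statutory_limit_mph
    cap *= min(1.0, c.visibility_m / 200.0)            # slow down when sight distance is short
    cap *= c.surface_friction                          # slow down on slick surfaces
    cap *= 1.0 / (1.0 + 0.5 * c.pedestrian_density)    # slow down near pedestrians
    if c.surface_friction > 0.9 and c.visibility_m > 200.0 and c.pedestrian_density == 0:
        # In good conditions, allow matching (but not exceeding) prevailing
        # traffic, up to a fixed premium over the posted limit.
        cap = max(cap, min(c.traffic_flow_mph, c.statutory_limit_mph * 1.15))
    return cap
```

A rule along these lines would let a robocar exceed the posted limit only in the easy cases, while being far more conservative than the posted limit whenever conditions degrade, which is the kind of flexibility current speed-limit laws aren't designed to accommodate.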