For better or worse, the future of travel in America will likely be defined by self-driving vehicles. Many tech enthusiasts are excited about the possibilities of cars that drive themselves, leaving all occupants – including those in the driver’s seat – free to focus on work or entertainment. Supporters of fully autonomous vehicles also predict that they will be safer than human drivers because human error will no longer be a factor increasing the risk of crashes.
While these predictions are rosy, the fact of the matter is that we simply don’t know how self-driving cars will perform, particularly as we move from where we are now toward greater levels of automation. Unfortunately, government regulators have taken a rather hands-off approach to ensuring that the driver-assist technologies on the market today are safe and reliable.
As just one example, consider the “Autopilot” feature available on Tesla vehicles. The name is misleading: while the feature partially automates the driving experience and allows drivers to go hands-free for extended periods of time, those drivers must remain ready to take over at a moment’s notice. Numerous crashes have stemmed from confusion over Tesla’s Autopilot feature, including a fatal one here in California.
The National Highway Traffic Safety Administration (NHTSA) has not previously put regulations in place governing the use and performance of partial-automation systems, relying instead on recalls if problems prove to be widespread. This approach has been criticized by other agencies, such as the National Transportation Safety Board, which is tasked with investigating crashes.
Perhaps in response to this criticism, the NHTSA recently announced that it will be implementing a rulemaking process to eventually adopt safety standards related to autonomous vehicles. This is a very small start, but it is at least a start.
In the meantime, all drivers need to understand that just because a new automobile feature is on the market doesn’t mean it has been proven safe. Placing too much faith in partial-automation systems or driver-assist features can be a serious, even fatal, mistake.