The World Is Not Yet Ready For Self-Driving Cars
For anyone who thinks self-driving cars will soon be on a highway near you, an incident reported by Ira Boudway of Bloomberg News should be disturbing. Last November in California, a highway patrol officer spotted a Tesla going 70 miles an hour past multiple exits with its turn signal blinking. It was a self-driving vehicle with an incapacitated driver. The police figured out a way to stop the vehicle safely, and the driver was charged with driving under the influence. (Will he plead in his defense that he was not actually driving?)
According to Boudway’s article (at https://www.dig-in.com/articles/someday-self-driving-cars-will-be-able-to-pull-over-for-police) neither Tesla nor the police are satisfied that this incident ended safely for all concerned. It shows that automakers, engineers, legislators and police must work through some difficult problems:
- How do police pull over an autonomous car?
- What should robot cars do after a collision?
- How do you program a vehicle to recognize human authorities – and distinguish them from civilians?
These “what if” issues will slow the development of driverless vehicles, but they are being addressed. Waymo, operating in a limited area, has a fleet that can distinguish between police and civilians, follow hand signals, and navigate construction zones. Police in Michigan have been working with Ford and its engineers. Software already exists that lets cities enter traffic rules and roadway markers into the maps used by autonomous vehicles, and it could someday enable police to update those maps from their patrol cars. Still, “edge cases” remain: recognizing police officers outside their vehicles, following and acknowledging hand signals, and the possible use of autonomous vehicles by terrorists.
The unanswered, and possibly unanswerable, question is: can autonomous vehicles be programmed to expect the unexpected?