"What can be learned from the Tesla fatality?Google's Driverless-Car Czar on Taking the Human Out of the Equation
Well, first of all, it’s a tragedy. I mean, Joshua Brown lost his life. A couple of key points, though. One is, he was one of probably a hundred or so people who died that day in automotive fatalities, in the U.S. alone. You know the statistics: 35,000 fatalities, up 7 percent from the year prior. Globally, it’s over 1.2 million. It’s as if a 737 was crashing every hour of every day all year. From a macro standpoint, it’s a very, very big problem.
But we need to make sure we’re using the right language when we talk about what happened with that accident, because that wasn’t a self-driving car, what we refer to as an L4, or fully autonomous car. That was a car with traffic-aware cruise control and a lane-keeping function—an L2, where, for better or worse, it was the responsibility of the driver to be cautious. We, as humans, are fallible creatures. [The crash] confirms our sense that the route to full autonomy, though much harder, is the right route. And we’ve learned from experience what happens when you put really smart people with really clear instructions inside a car with capabilities like that Tesla one.
Excerpt from a self-driving car reality check