It’s not that the car hit someone; that’s just a thing that happens (a pedestrian was hit by a different car and tossed in front of the self-driving car).
It’s that they tried to obscure details about the incident from authorities, which is a good way to kill trust. (The car stopped after the impact, and when it thought it was clear, it pulled forward and out of the normal traffic path to clear the scene. They tried to hide that it did that, because the pedestrian was actually stuck under the vehicle and was dragged some distance. The same thing could easily happen to a human driver, but you tell people what happened.)
Human drivers engage in hit-and-runs all the time. In 2016 there were around 750,000 estimated hit-and-run crashes, according to https://aaafoundation.org/wp-content/uploads/2018/04/18-0058_Hit-and-Run-Brief_FINALv2.pdf
One of the most annoying things about self-driving cars is that people expect them to be perfect before they trust them.
I only need them to be statistically better than humans.
We need data to figure that out, so these pilots are important. Throw some more safety systems in place and keep going.
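As a minimal sketch of what “statistically better than humans” could mean in practice: one common approach is to compare crashes per mile driven between the pilot fleet and a human baseline, and test whether the fleet’s rate is significantly lower. The counts and mileage figures below are made-up placeholders, not real data, and the conditional binomial test is just one reasonable choice of method.

```python
from scipy.stats import binomtest

# Hypothetical, illustrative numbers only -- not real fleet or human data.
av_crashes, av_miles = 20, 5_000_000            # pilot fleet
human_crashes, human_miles = 3_000, 500_000_000  # human-driver baseline

# Under the null hypothesis that both fleets crash at the same rate per mile,
# the AV share of all crashes (conditional on the total) is Binomial(n, p0),
# where p0 is the AV share of total miles driven.
p0 = av_miles / (av_miles + human_miles)
result = binomtest(
    av_crashes,
    av_crashes + human_crashes,
    p0,
    alternative="less",  # one-sided: is the AV crash rate lower?
)

print(f"AV rate:    {1e6 * av_crashes / av_miles:.2f} crashes per million miles")
print(f"Human rate: {1e6 * human_crashes / human_miles:.2f} crashes per million miles")
print(f"p-value (AV rate lower): {result.pvalue:.4f}")
```

The point of the sketch is that the answer depends heavily on how many miles the pilots accumulate: with only a few million miles, even a genuinely safer fleet may not show a statistically significant improvement, which is exactly why the pilots and their data matter.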
The best kind of self-driving car is one on rails.