
People put themselves in dangerous situations all the time. I don't believe for one second that you've never made a traffic mistake in your life, either as a pedestrian or as a driver. But you've been fortunate to be surrounded by others who pick up the slack for your mistakes, yielding even when they don't have to if it prevents an accident. The traffic rules don't matter nearly as much as the outcome of people not being killed. Traffic rules are just a means to an end.

So yes, it's quite possible for autonomous vehicles to cause vastly more deaths than human drivers even while being slaves to the rules. Hell, you could do it too as a human driver if you removed your conscience. Just don't hit the brakes the next time you have the right of way and a jaywalker steps out in front of you. You'll easily kill someone within a week and have it be their "fault".



You're onto an important theme: so much of driving is really about group communication. It's everything from signaling, to anticipating behavior based on past actions, to waving people through intersections, slowing down to let someone merge, speeding up to make your intent clear, hanging back when the traffic gets stupid, or avoiding drivers who seem irrational.

That's why I think the real self-driving car problem is a very special case of the Turing Test, and possibly one that's even harder to pass.


Most higher-end cars already have driver-assistance systems that prevent you from running into people or unexpected obstacles. The problem here seems to be that Uber disabled some of the systems the base Volvo normally ships with.


I’ve read about the incident but haven’t seen mention of why they did this. Was their own system supposed to provide the same safety or something?


Apparently every self-driving company does that; Waymo, for example, also disables such safety systems.



