Odd thoughts on driverless cars | Page 5 | GTAMotorcycle.com


NTSB preliminary findings are out.

Uber basically designed a killing machine, on purpose.

https://arstechnica.com/cars/2018/0...led-by-ubers-self-driving-software-ntsb-says/

Uber's decision to disable autobraking because it was activating inappropriately exactly mirrors my experience with a Subaru with EyeSight (or whatever they call it). It makes bad decisions and should never be allowed in public in its current state.

At the very least if the computer senses impending doom, it sure as hell should notify the driver.
 
Meh... I don't think drivers or AI should be responsible for jaywalkers.

I'm not placing all the blame on the car/driver as obviously much (and probably most) lies with the jaywalker. That being said, if you see someone doing something illegal, murdering them because you don't agree with their choice is not acceptable to me (and I would hope most others).

The car clearly identified, long before the accident, that there was an obstacle likely to intrude on the travelled path, but it failed to mitigate the collision. The law is pretty clear on this and has been for a long time: the driver has a duty to attempt to mitigate damage. Interestingly, the human driver in this situation may be found not guilty, as she had very little advance notice of the obstacle. By having more information available, the car places itself in a position to take more blame. It's an interesting dilemma for the courts to sort out. My guess is that in the near term, the sensor package will be used to provide supplemental info to the human driver, who can then make the rapid decisions required. For instance, in this case, the car could easily have high-beamed and precharged the brakes while giving audible and visual cues to the driver.
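The supplemental-alert idea above could be sketched as an escalation ladder keyed to time-to-collision. This is purely an illustrative sketch, not anything Uber or Subaru actually runs; every name, threshold, and action here is invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float        # range to obstacle reported by the sensor package
    closing_speed_ms: float  # how fast the car is approaching it (m/s)

def time_to_collision(o: Obstacle) -> float:
    """Seconds until impact if nothing changes; infinite if not closing."""
    if o.closing_speed_ms <= 0:
        return float("inf")
    return o.distance_m / o.closing_speed_ms

def driver_alerts(o: Obstacle) -> list[str]:
    """Escalate cues to the human driver as time-to-collision shrinks.

    Thresholds are arbitrary placeholders; a real system would tune them.
    """
    ttc = time_to_collision(o)
    actions = []
    if ttc < 6.0:
        actions.append("visual_warning")    # dash icon: something ahead
    if ttc < 4.0:
        actions += ["audible_warning", "high_beams"]
    if ttc < 2.5:
        actions.append("precharge_brakes")  # shave reaction time off braking
    return actions

# Pedestrian 36 m ahead, closing at 16 m/s -> about 2.25 s to impact:
print(driver_alerts(Obstacle(distance_m=36, closing_speed_ms=16)))
```

The point of the sketch is the thread's argument: even if the computer isn't trusted to brake on its own, it can hand the human driver everything it knows well before impact.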
 
You think this is enough to kill them? Or is that what you think should be done to companies using live humans as crash test dummies?

Probably both. Hopefully both. Uber has a history of being a terrible company in many ways. Tesla needs a good smack with respect to Autopilot as well.
 
If a person in the same situation was shown to have deliberately decided not to try to avoid the collision, as the software engineers did back in the office months earlier, he'd be charged with something like reckless endangerment or vehicular homicide. Why shouldn't the same justice be applied to Uber?
 