In the wake of the first fatal self-driving car accident earlier this week, a top executive for the manufacturer of the sensors used on the vehicle says she is "baffled" as to why the technology failed to recognize a pedestrian crossing the street.
<iframe width="560" height="315" src="https://www.youtube.com/embed/Cuo8eq9C3Ec" frameborder="0" allow="autoplay; encrypted-media" allowfullscreen></iframe>
Elaine Herzberg, 49, was fatally struck by a self-driving Uber vehicle last Saturday around 10 p.m. while crossing a road in Tempe, Arizona. Herzberg was not crossing in a designated area, but the self-driving car's failure to recognize a person in its path and brake has experts baffled.
Sgt. Ronald Elcock of the Tempe Police Department confirmed the car was in autonomous mode when the accident occurred and said the investigation is ongoing. Thoma Hall, president of Velodyne Lidar Inc., the company behind the laser-based sensors that let an autonomous car "see" its surroundings, says she doesn't believe the company's technology failed.
Related coverage: <a href="https://thegoldwater.com/news/20956-Self-Driving-Uber-Kills-A-Woman-In-Arizona">Self-Driving Uber Kills A Woman In Arizona</a>
Video of the incident captured on multiple dash cameras shows Herzberg pushing her bike across the street in near-total darkness. She doesn't become clearly visible in the video until the moment of impact. Hall said, "We are as baffled as anyone else. Certainly, our Lidar is capable of clearly imaging Elaine and her bicycle in this situation. However, our Lidar doesn’t make the decision to put on the brakes or get out of her way."
Hall's analysis implies the blame should be placed elsewhere, since the Lidar technology does not actually decide how to respond to a sudden obstacle. The company says it is cooperating with federal investigators to get to the bottom of what went wrong. Hall also said, "In addition to Lidar, autonomous systems typically have several sensors, including camera and radar to make decisions. We don’t know what sensors were on the Uber car that evening, if they were working, or how they were being used."
Related coverage: <a href="https://thegoldwater.com/news/18957-Uber-Eats-Driver-Suspected-of-Fatally-Shooting-a-Customer-Turns-Himself-In">Uber Eats Driver Suspected of Fatally Shooting a Customer Turns Himself In</a>
Because of how Lidar technology uses lasers to map its surroundings, it can effectively see in the dark. "However, it is up to the rest of the system to interpret and use the data to make decisions. We do not know how the Uber system of decision-making works," Hall said. She also implied that answers would be found at Uber, not at Velodyne.
"We at Velodyne are very sad, and sorry about the recent Uber car accident which took a life. David Hall, company CEO, inventor, and founder, believes the accident was not caused by Lidar. The problem lies elsewhere."
Tips? Info? Send me a message! Twitter: #Uber #LIDAR #Volvo #SUV #FatalCrash #Tempe #Arizona #Velodyne