
Uber Technologies Inc.'s self-driving test car that struck and killed a pedestrian last year wasn't programmed to recognize and react to jaywalkers, according to documents released by U.S. safety investigators.

The National Transportation Safety Board on Tuesday released more than 400 pages of reports and supporting documents on the March 2018 crash that killed 49-year-old Elaine Herzberg as she walked her bicycle across a road at night in Tempe, Ariz.

The documents painted a picture of safety and design lapses with tragic consequences but didn't assign a cause for the crash. The safety board is scheduled to do that at a Nov. 19 meeting in Washington.

"We deeply value the thoroughness of the [National Transportation Safety Board]'s investigation into the crash and look forward to reviewing their recommendations once issued after the NTSB's board meeting later this month," the company said in a statement. The company said it regrets the incident and has made critical improvements to prioritize safety.

The case is being closely watched in the emerging industry of self-driving vehicles, a technology that has attracted billions of dollars in investment from companies such as General Motors Co. and Alphabet Inc. in an attempt to transform transportation.

The safety board's data highlights the need for more rigorous standards on how self-driving vehicles can be tested on public roadways, Jason Levine, executive director of the Center for Auto Safety advocacy group, said in an interview. Currently, there are no federal rules, and individual states have set a variety of criteria.

"These are life and death consequences, not video game reset buttons for software developers," Levine said. "I think they were playing fast and loose with people's lives, and Elaine Herzberg has paid the price."

The report said the car's sensors detected Herzberg and her bicycle but its computer failed to recognize the hazard. "The system design did not include a consideration for jaywalking pedestrians," it said.

Herzberg was crossing the road outside of a crosswalk.

The Uber vehicle's radar sensors first observed Herzberg about 5.6 seconds before impact, before she entered the vehicle's lane of travel, and initially classified her as a vehicle. But the system reclassified her as different types of object several times and failed to predict that her path would cross the lane of the SUV, according to the board.
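The failure mode described above can be illustrated with a minimal sketch. It assumes, hypothetically, a tracker that discards an object's position history whenever its classification changes; under that assumption, an object that keeps being relabeled never accumulates enough observations for the system to estimate its velocity or predict its path. This is an illustration of the general pitfall, not Uber's actual code.

```python
# Illustrative sketch (NOT Uber's code): a tracker that resets an
# object's history on every label change cannot estimate its motion.

class Tracker:
    def __init__(self):
        self.label = None
        self.history = []  # past (time, position) observations

    def observe(self, t, position, label):
        if label != self.label:
            # Assumed behavior: a classification change resets the track.
            self.history = []
            self.label = label
        self.history.append((t, position))

    def predicted_velocity(self):
        # At least two observations under one label are needed.
        if len(self.history) < 2:
            return None
        (t0, p0), (t1, p1) = self.history[0], self.history[-1]
        return (p1 - p0) / (t1 - t0)

tracker = Tracker()
# A steadily moving object whose label flips on every observation:
for t, pos, label in [(0.0, 0.0, "vehicle"), (0.5, 1.2, "bicycle"),
                      (1.0, 2.4, "other"), (1.5, 3.6, "bicycle")]:
    tracker.observe(t, pos, label)

print(tracker.predicted_velocity())  # None: no trajectory, no predicted path
```

With a stable label, the same four observations would yield a velocity estimate and a projected path into the vehicle's lane.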

Uber made extensive changes to its self-driving system after several reviews of its operation and findings by board investigators. The company told the board that the new software would have been able to correctly identify Herzberg and trigger controlled braking to avoid her more than 4 seconds before the original impact, the board said.
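A rough back-of-the-envelope calculation shows what a 4-second lead time is worth. The speed and deceleration figures here are assumptions for illustration, not values from the board's documents:

```python
# Back-of-the-envelope check with ASSUMED numbers (not from the NTSB
# report): at 17 m/s (~38 mph) and a controlled braking rate of
# 4.5 m/s^2, how far does the car travel before stopping, versus
# coasting unbraked through the same 4-second window?

speed = 17.0    # m/s, assumed travel speed
decel = 4.5     # m/s^2, assumed controlled-braking deceleration
warning = 4.0   # s, lead time cited for the revised software

time_to_stop = speed / decel                 # roughly 3.8 s
braking_distance = speed ** 2 / (2 * decel)  # roughly 32 m
coasting_distance = speed * warning          # 68 m with no braking

print(f"stops in {time_to_stop:.1f} s over {braking_distance:.1f} m")
print(f"vs. {coasting_distance:.0f} m traveled with no braking at all")
```

Under these assumed figures, braking that begins 4 seconds out brings the car to a halt well short of the distance it would otherwise cover, which is why the lead time matters.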

The safety driver behind the wheel of the car was watching a video on a mobile device and didn't see Herzberg in time. Less than five months before the accident, Uber had cut back to a single safety driver in its test vehicles. Other companies, such as GM's Cruise affiliate, use two.

Uber vehicles operating in autonomous mode had been involved in 37 crashes before the fatal accident, the board said. The vast majority weren't the fault of the Uber technology. But in one case the car struck a bent post marking a bicycle lane and in another it didn't react to a rapidly approaching vehicle and the safety driver swerved and hit a parked car, the board said.

The safety driver involved in the accident told investigators that "sometimes the vehicle would swerve towards a bicycle."

The Uber Advanced Technologies Group unit that was testing self-driving cars on public streets in Tempe didn't have a stand-alone safety division, a formal safety plan, standard operating procedures or a manager focused on preventing accidents, according to the board.

Instead, Uber had companywide values it promoted to its employees, such as "do the right thing," the board said. The company, in its statement, said that it had also had safety policies and procedures though not a formal safety plan.

Business on 11/07/2019

Print Headline: Report on 2018 fatal self-driving crash finds lapses
