The NTSB today released the report into the fatal Uber Autonomous car accident.
TLDR;
The RADAR, LIDAR, and cameras DID detect and classify the pedestrian's bicycle correctly.
The system DID determine that emergency braking was required.
But Uber disabled the system's emergency braking feature in autonomous mode.
Uber also disabled Volvo's inbuilt pedestrian safety detection system.
There is also no system to alert the driver that the system detected
an emergency braking scenario.
Uber are rooted.
Previous videos:
https://www.youtube.com/watch?v=QCCmqosHT-o
https://www.youtube.com/watch?v=HjeR13u74Mg
The NTSB report: https://www.ntsb.gov/news/press-releases/Pages/NR20180524.aspx
https://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18MH010-prelim.pdf
Forum: http://www.eevblog.com/forum/blog/eevblog-1088-uber-autonomous-car-accident-report/
EEVblog Main Web Site: http://www.eevblog.com
The 2nd EEVblog Channel: http://www.youtube.com/EEVblog2
Support the EEVblog through Patreon!
http://www.patreon.com/eevblog
Stuff I recommend:
https://kit.com/EEVblog/
Donate With Bitcoin & Other Crypto Currencies!
https://www.eevblog.com/crypto-currency/
T-Shirts: http://teespring.com/stores/eevblog
💗 Likecoin – Coins for Likes: https://likecoin.pro/ @eevblog/dil9/hcq3

Hi. In two previous videos, which I'll link in at the end of this video and down below, we looked at the Uber autonomous self-driving car fatality that happened back in March. If you're not familiar with the incident, I'll briefly recap.

In the USA, an Uber self-driving car was driving along in fully autonomous mode. It's actually a Volvo XC90, and as you can see here, it's fully equipped with Uber's suite. They developed their own suite of autonomous self-driving systems: a lidar, a radar, and about half a dozen cameras covering 360 degrees.

It's supposed to detect oncoming people, other cars, obstacles, potential collisions and things like that, apply the brakes and avoid them, and do all that sort of jazz you'd expect a self-driving car to do. Anyway, the car was driving along at night with no paying passengers, in fully autonomous mode, basically going down a straight stretch of motorway, when it collided with a pedestrian who was walking their bicycle across the road. It was dark at the time.

There was a lot of speculation about what the camera actually saw, but pretty much everyone in the tech field agreed that the lidar and radar systems should have picked up this bicycle, because the bicycle was coming across the car side-on. I won't replay the accident video, that's in the previous videos, but really this was all speculation: did it actually detect them, and why didn't it brake? The driver was supposed to be in control of the car, but it was in fully autonomous mode, and as it turns out the driver was looking down at the autonomous driving systems at the time and wasn't paying attention to the road. They reacted at the last second and didn't have time to stop. Unfortunately, the pedestrian was struck and killed, and we've been waiting for the report to come out.

Well, the good news is that it's just come out today. Here it is: the National Transportation Safety Board has just released a preliminary report. It's basically "these are our conclusions pending any further information", and it's pretty damning.

So let's take a look at it, and of course I'll link it in down below so you can read it for yourself. This is just the summary; we'll open the full report in a minute, which doesn't have a whole lot more. But look at the important stuff here. Basically, they talked about how the bicycle did not have side reflectors, it was dark at the time, the pedestrian was crossing in an incorrect spot, and all that sort of stuff.

So really, the pedestrian shouldn't have been there. But that's kind of beside the point. The whole point of this video and all the hype surrounding this accident is that this autonomous system should have detected the bicycle. It's not about who's at fault; it should have detected the bicycle and the pedestrian crossing in practically ideal circumstances, apart from it being at night.
As I said, lidar and radar should have detected that and should have applied the brakes. But we'll get into that. And here's the first damning part, which we'll read in more detail in the report: the vehicle was factory equipped with several advanced driver assistance functions by the original manufacturer Volvo, including a collision avoidance function with automatic emergency braking, as well as functions for detecting driver alertness and road sign information. These Volvo functions were deliberately disabled when the Uber car was operating in autonomous computer-controlled mode.

So Uber deliberately disabled these systems that could potentially have detected this and stopped, had you been driving the original factory-fitted Volvo XC90. The report states that data obtained from the self-driving system shows the system first registered radar and lidar observations of the pedestrian about 6 seconds before impact. So this is what pretty much everyone in the industry expected it would do, and the radar and lidar systems look like they worked just fine. As the vehicle and pedestrian paths converged, with the vehicle traveling at 43 miles per hour, the self-driving software classified the pedestrian first as an unknown object, then as a vehicle, and then reclassified it as a bicycle, with varying expectations of its future travel path. But it did actually identify it as a bicycle.

So the system worked. Although to classify it as a bicycle, I would think they would have had to include some camera data in there as well, and everyone was saying it was too dark for the camera to see anything. Well, that's not the case. The dashcam footage that we saw was underexposed for that particular camera, while the camera-based detection systems might have seen something very different. As I've shown in a previous video, there's footage from people who have actually gone back there, and some older footage from people who've shot dashcam video at the exact same location at night.

And things are plenty visible. So that original dashcam footage from the accident really wasn't very indicative of what the cameras would have seen. But here is the most damning part of this: at 1.3 seconds before impact, the self-driving system determined that emergency braking was needed to mitigate a collision. It knew it was a bicycle, so therefore it knew that there was a human on that bicycle.

According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. And there you go: Uber have basically admitted that when it's in autonomous driving mode, they disable emergency braking. This is the big trade-off with autonomous car systems, and it's something I think all makers of autonomous car systems are struggling to get right: you can't just be braking and swerving for every little thing that you detect; there's got to be some sort of threshold. Is it real? As you saw, it detected something six seconds out, classified it as a car, then reclassified it.
And it's got to project the path that the object is traveling on. For a smooth ride, you can't have it stopping and starting, slowing down, speeding up, and jerking around all over the place.

So that's one of the things. But disabling emergency braking in autonomous car mode? Wow. The vehicle operator is relied on to intervene and take action. So is the driver culpable in this case, or is Uber culpable for disabling the emergency braking system in autonomous car mode? I thought that part and parcel of an autonomous car was having an emergency braking system enabled. Unbelievable.

And the system is not designed to alert the operator. So it detected this bicycle, it classified it as a bicycle (and by definition there's a human on a bicycle), it knew it was probably going to collide with it, and not only does it not apply the emergency brakes, it doesn't even alert the operator. Unbelievable.

In the report, the NTSB said the self-driving system data showed the vehicle operator engaged the steering wheel less than a second before the impact. So they did react. Maybe that was the point in the video where she looked up.

It seems like that was a second before the accident, but they didn't begin braking until less than a second after the impact. So it took around two seconds for the human mind to register that and actually apply the brakes, and that's probably understandable. The vehicle operator said in an NTSB interview that she had been monitoring the self-driving interface, and that while her personal and business phones were in the vehicle, neither was in use until after the crash.

So when we saw in the video, just before the incident happened, that the driver was looking down, we assumed it was a phone. But no, she was actually monitoring the self-driving systems, which were down in the center console, so it would have helped if they had a heads-up display, maybe. And all aspects of the self-driving system were operating normally; there were no faults or diagnostic messages. I'd hope there'd be a diagnostic message saying there was an impact and that it detected a bicycle.

Anyway, here is the reconstructed data that they've got. Here's the car, and here's the object detected as a bicycle. This was at 1.3 seconds before impact: it classified it as a bicycle, and it probably knew the path, because the system actually determined that emergency braking was required. But they disabled that, so the car couldn't emergency brake, and it didn't alert the operator.
So this is 25 meters out, and that's meters, not feet. So 25 meters out it detected and classified that bicycle, and that's pretty much what you'd expect of one of these autonomous self-driving systems.
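As a sanity check on the report's numbers, 1.3 seconds of travel at 43 miles per hour covers almost exactly 25 meters, so the time and distance figures line up. A quick sketch (assuming constant speed in a straight line):

```python
# Cross-check the NTSB figures: 1.3 s to impact vs. the 25 m detection distance.
MPH_TO_MS = 0.44704              # meters per second per mile per hour

speed_ms = 43 * MPH_TO_MS        # 43 mph is about 19.2 m/s
distance_m = speed_ms * 1.3      # distance covered in 1.3 s at constant speed

print(f"{distance_m:.1f} m")     # ~25.0 m, matching the report's diagram
```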

So the system worked, but it seems like the implementation is the issue here. And here's the full preliminary report, the PDF, which I'll link in down below, but it doesn't have a huge amount more; it's only four pages long. And here's the actual graphic of where it happened. It's pretty much almost exactly where I predicted based on the image data and Google Maps and things like that, so I think I got that pretty close.

And here's the car after the accident. You can see that there's not much damage there at all, which indicates that it was going relatively slowly when it impacted. And once again, here's the full comment on them disabling the Volvo functions. Volvo has an emergency braking system known as City Safety, which most likely could have detected and prevented this accident on its own in the factory-fitted Volvo, but of course they deliberately disabled that, presumably so it didn't interfere with the Uber systems when under computer control.

And according to Uber, the developmental self-driving system relies on an attentive operator to intervene if the system fails. In addition, the operator is responsible for monitoring diagnostic messages that appear on the interface in the center stack of the vehicle, and for tagging events of interest for subsequent review. So that's probably what the driver was doing; it sounds like she was doing what they are instructed to do.

But because the monitor is down in the center stack and they didn't have any sort of heads-up display, her eyes were off the road. And when you take your eyes off the road and you've got no emergency braking system enabled, no wonder this happened. As it turns out, they were actually driving a known test route here, so they weren't picking up paying passengers; they were testing the system, basically. So you'd expect the driver on a test route to be monitoring the system. And there's the data.

Again, it first detected something six seconds before impact, when the car was traveling at 43 miles per hour, and 1.3 seconds before impact it determined that an emergency braking maneuver was required. In Uber's self-driving system, an emergency braking maneuver refers to a deceleration greater than 6.5 meters per second squared, for those playing along at home. And there's that diagram again.

It detected them 25 meters out; that's when it determined that emergency braking was actually required. And if you're wondering whether the cameras on the Uber car (not the dashcam footage that we've all seen) actually picked up the bicycle, the answer is yes, they did. The forward-facing video shows the pedestrian coming into view and proceeding into the path of the vehicle.
So it saw the bicycle coming across, and there was more than enough light to do that. As I said, they probably used some of that video as part of the detection algorithm to determine that it was a bicycle and not a car. It originally thought it was a car, then it reclassified it, probably based on that camera footage. So could the car have stopped in time if it did have that emergency braking system enabled in autonomous driving mode, at 25 meters or 1.3 seconds before the impact? Well, the answer is no, it couldn't have, but it would have slowed down very significantly.
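That "no" is easy to verify with basic kinematics: even at the report's minimum emergency-braking deceleration of 6.5 m/s², the stopping distance from 43 mph exceeds the 25 m available. A rough sketch, ignoring reaction time and assuming constant deceleration:

```python
# Could the car have stopped within 25 m? Use v0^2 = 2*a*d at the report's
# minimum "emergency braking" deceleration of 6.5 m/s^2.
MPH_TO_MS = 0.44704
v0 = 43 * MPH_TO_MS                    # initial speed, ~19.2 m/s
a = 6.5                                # deceleration threshold from the report, m/s^2

stopping_distance = v0**2 / (2 * a)    # distance needed to reach a full stop
print(f"{stopping_distance:.1f} m")    # ~28.4 m: more than the 25 m available
```

Real emergency braking can exceed 6.5 m/s², but even so it's marginal, which is why the report's conclusion is about reducing impact speed rather than avoiding the collision entirely.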

That would have lessened the impact on the person, and I've had David crunch the numbers on this one. We actually got some braking data for the Volvo XC90 used here, and we've done some calculations. Once again, these are going to be rough calculations.

There are going to be fairly large error bars on this, because of things like the weight of the car; we don't know the weight of the autonomous driving systems and what they add. We don't know

the tire types, the tire condition, the tire pressures, all that sort of stuff. But we're fairly confident that this is a reasonable figure. Basically, the final answer: the car was originally traveling at 43 miles per hour, and after 1.2 seconds of braking (I've allowed 100 milliseconds for detection before the start of braking), it would have been traveling at roughly 22.7 miles per hour. So the speed would have been halved, from 43 to about 22 or thereabouts, and that would have meant a dramatically lesser impact if the emergency braking system had done its business and slammed the brakes on. So there you have it.
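Back-solving from the 22.7 mph figure, the braking deceleration used works out to roughly 7.6 m/s², a bit above the report's 6.5 m/s² threshold, which the XC90's brakes should manage. A rough sketch of the calculation; the 7.6 m/s² value is my back-solved assumption, not a number from the report:

```python
# Speed at impact after 1.2 s of hard braking (1.3 s of warning minus a
# 100 ms allowance before braking actually starts).
MPH_TO_MS = 0.44704
v0 = 43 * MPH_TO_MS          # initial speed, ~19.2 m/s
a = 7.6                      # assumed deceleration, m/s^2 (back-solved, not from the report)
t_braking = 1.3 - 0.1        # seconds of actual braking before impact

v_impact = v0 - a * t_braking
v_impact_mph = v_impact / MPH_TO_MS
print(f"{v_impact_mph:.1f} mph")   # ~22.6 mph, close to the 22.7 figure quoted
```

Since kinetic energy goes with the square of speed, roughly halving the impact speed cuts the impact energy to about a quarter.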

It's not looking very pretty for Uber, is it? Anyway, let us know what you think down below in the comments and also on the EEVblog forum, linked down below. At the end of this video, as I said, I'll link in the two previous videos; look somewhere up there. Catch you next time.

By YTB

20 thoughts on “EEVblog #1088 – Uber Autonomous Car Accident Report”
  1. fullmoon6661 says:

    What kind of safety team do they have at Uber that they allowed a design like this?

  2. Someone Random says:

    Uber: We don't trust the car enough to let it use the brakes
    Car: Dudes what the fuck if I had the brakes I could have stopped in time

  3. Argoon1981 says:

    That report shows that the car systems were not to blame, but rather Uber's incompetence in disabling the features that could have made the car stop. In a sense this car was not a true "autonomous" vehicle, as it had no control over the brakes. That at least shows it's not the tech that isn't ready, which makes me more hopeful for the future of autonomous driving.
    Uber should have been sued, not only by the family of the victim (unfortunately they settled with Uber before knowing these new facts), but also by the government, and Uber's autonomous vehicle test program should be supervised by a third party from now on.
    Uber even tried to blame the driver for the accident. This is like the airplane companies that blamed the pilots for accidents caused by malfunctioning airplanes due to negligent supervision and repair. This seems to be standard behavior for big companies.

  4. Sqweezel says:

    A heads-up FLIR & night-vision overlay that engages when objects are detected would have helped too.
    Even an audible beep for the driver to look at the road.

  5. Sqweezel says:

    So the system flashed a message not common to its normal operation, indicating some sort of action was needed, and the driver looks down at the system to read the message, realizes it's an emergency braking message, but by this time has no time to react? Sounds counterintuitive to me.

  6. Dark Fox says:

    "It's the operator's job to keep their eyes on the road at all times, and also keep their eyes off the road at all times to monitor the center console." Smart thinking, Uber. The lawsuit is so easy it'll undoubtedly be settled out of court.

  7. SysGhost says:

    It'll be the new Uber slogan: -"We won't stop for anything."

  8. Jar says:

    We were watching TV, watching TV…

  9. Fred dog says:

    uber sucks.

  10. lamelama22 says:

    I think they already settled with the family… but honestly, people should be held criminally liable for this and put in jail for the death. Probably not even the engineers, since they were apparently held to impossible deadlines and competing requirements. Management who pushed this out the door and made it happen should be held accountable… but… that's not how it works in the good ol' USA (or most of the world). If I personally made some dangerous thing and it killed someone, I'd be in jail; but if a company does it…

  11. lamelama22 says:

    I think it's worse than you described, as the software basically kept re-classifying it back and forth as various types of objects and projected paths (usually not getting it correct), got confused, and took no action b/c of that. It also should have been able to detect the projected path (regardless of object type) based on its detected movement, and taken action. I also like how the driver is in a catch-22: they are told to look down at some screen, but are expected to act in an emergency, yet aren't alerted… From what I've read, Uber's internal software dev is in complete disarray and this lends credence to that – multiple different groups making different decisions with competing instructions / goals… I work in safety critical software; they clearly don't (but should). This would make a good additional chapter to add to "Set Phasers On Stun", which is about these kinds of engineering design failures / oversights.

  12. Keenan says:

    I don't think the "Move Fast and Break Things" mantra works well with autonomous vehicles…

  13. Michael Kire Hansen says:

    One thing to mention: city emergency brake systems usually only work under 50 km/h, so it would most likely have been deactivated anyway.

  14. Federico Revello says:

    @EEVblog I'm writing about the incident and I would like to interview you for this. Could I interview you this week?

  15. townsend Assvlanche says:

    I turned off my lane departure warning because it was a pain on country roads. The difference is I am still driving my car.

  16. stephane sonneville says:

    The fact that Uber had disabled the Volvo system was known the day after this accident.

  17. Dazzwidd says:

    I'm a cab driver, and I honestly don't see why autonomous vehicles are necessary. What is the aim?

  18. Mark says:

    You'd also expect the cyclist to have picked up the vehicle by the big pair of lights coming straight for them and gone into "brain give way mode"!

  19. Michael Kincaid says:

    "This is just a rough estimate"
    goes out to ten decimal places

  20. EEVblog says:

    TLDR;

    The RADAR, LIDAR, and cameras DID detect and classify the pedestrian's bicycle correctly.

    The system DID determine that emergency braking was required.

    But Uber disabled the system's emergency braking feature in autonomous mode.

    Uber also disabled Volvo's inbuilt pedestrian safety detection system.

    There is also no system to alert the driver that the system detected
    an emergency braking scenario.

    Uber are rooted.
