14 Comments

  • shabby - Thursday, March 29, 2018 - link

    Thanks Uber, you incompetent fucks.
  • PeachNCream - Thursday, March 29, 2018 - link

    It's a shame that one incident may significantly slow the adoption of self-driving cars when so many deaths each year are attributable to distracted or unsafe driving by humans. It's absolutely terrible that someone was killed by an automated car, and we need to examine the technology along with the entire series of events that caused the death to understand what happened and how to do better. However, I'm far more afraid of humans driving with their emotions and distractions than I am of even current self-driving cars. Just look at the number of people who can't control themselves at an intersection and feel like they have to do burnouts constantly, without a passing thought for the extra danger they're putting everyone else in around them.
  • Yojimbo - Thursday, March 29, 2018 - link

    Yeah... people are afraid of anything new or unknown. We also live in an ultra safety-conscious society. But people can't expect self-driving cars to never be involved in fatalities. Driving is an inherently dangerous thing. I doubt it will cause much of a slowdown in adoption, though, unless lawmakers start to get in the way.
  • superunknown98 - Thursday, March 29, 2018 - link

    Whatever you think of Uber as a company notwithstanding, I have read several articles now claiming the fatal accident was the pedestrian's fault. The woman had suddenly bolted across the street with her bike and there wasn't enough time to brake. This is bound to happen again, as cars have to obey the laws of physics: they are heavy and cannot stop in an instant. If a car traveling 30 MPH takes 15 ft to come to a complete stop and a pedestrian jumps out from between two parked cars 10 ft ahead, guess what will happen? Pedestrian right-of-way laws make no sense to me; I always wait for traffic to clear before crossing a street.
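    The figures in the comment above are hypothetical, but the underlying physics is easy to check. A minimal sketch (assuming an idealized constant-deceleration stop on dry pavement with a friction coefficient of about 0.7, and ignoring driver or system reaction time):

    ```python
    # Ideal braking distance: d = v^2 / (2 * mu * g)
    # mu = 0.7 is an assumed dry-pavement friction coefficient.
    MPH_TO_MS = 0.44704   # miles per hour -> meters per second
    FT_PER_M = 3.28084    # meters -> feet

    def braking_distance_ft(speed_mph, mu=0.7, g=9.81):
        v = speed_mph * MPH_TO_MS          # speed in m/s
        d_m = v**2 / (2 * mu * g)          # stopping distance in meters
        return d_m * FT_PER_M

    print(round(braking_distance_ft(30)))  # -> 43
    ```

    Even with ideal braking and zero reaction time, a car at 30 MPH needs roughly 43 ft to stop, several times the 10 ft in the hypothetical, which is the commenter's point.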
  • Death666Angel - Thursday, March 29, 2018 - link

    All true: it was almost assuredly the fault of the pedestrian, who crossed the road without checking for traffic in a dark area. And I doubt any human driver could have prevented the accident or the outcome. But considering that the sensors should have better-than-human vision and the system should have better-than-human reflexes, it is still important to find out why it didn't notice the pedestrian and didn't react even when she was visible in the headlights.
    Do pedestrians have the right of way in the US in all circumstances? In Germany, you have the right of way when the pedestrian traffic light is green and on zebra crossings. The way the woman crossed that road, she would have been at fault for not obeying traffic law.
  • nathanddrews - Friday, March 30, 2018 - link

    Each state in the US has slightly different traffic laws regarding right of way for pedestrians, but generally speaking, if a pedestrian is in a crosswalk - in other words, not jaywalking - then they have the right of way. Someone crossing against a "do not walk/cross" signal or crossing at a random spot along the street does not have the right of way in most states. Pedestrian crashes are very rare overall, and most of them do not occur at legal crossings. Alcohol and distraction are cited as very common factors in pedestrians being hit. Our meat sack bodies don't do very well against massive moving objects. Be safe out there.
  • r3loaded - Thursday, March 29, 2018 - link

    The pedestrian was definitely at fault, but the car should have been able to spot her on lidar/radar even in the dark and taken defensive action (e.g. slamming on the brakes, returning control to the safety driver). That didn't happen, and we need to know why.
  • sl149q - Thursday, March 29, 2018 - link

    The issue is whether Uber lived up to the expectations of what an autonomous vehicle should be able to do: first, at least matching what a human driver would have done; second, exceeding that because of the additional capabilities.

    While it appears (at this point) that a human driver would possibly have killed or injured the pedestrian anyway, most people would have done some braking.

    If the Uber car had been operating at (minimally) the equivalent of a human driver, the brakes should have been applied in the last two seconds (and apparently were not).

    If the Uber car had been operating at our expectations of an autonomous vehicle, its lidar or radar systems should have "seen" the pedestrian more than five seconds out, and it should have slowed down. Again, it appears that did not happen.

    This means that while the pedestrian was at fault, it is possible that some fault may also lie with Uber in a civil suit. It is likely that Uber will endeavor to avoid a suit with a settlement.
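    As a rough check on why those last two seconds matter: hard braking sheds speed quickly. A minimal sketch (assuming constant deceleration at roughly 0.7 g, a commonly assumed dry-pavement limit):

    ```python
    # Speed shed by braking at constant deceleration for t seconds:
    # delta_v = mu * g * t
    MU, G = 0.7, 9.81        # assumed friction coefficient; gravity in m/s^2
    MS_TO_MPH = 2.23694      # meters per second -> miles per hour

    def speed_shed_mph(t_seconds, mu=MU, g=G):
        return mu * g * t_seconds * MS_TO_MPH

    print(round(speed_shed_mph(2)))  # -> 31
    ```

    So braking for even the final two seconds would have taken roughly 30 MPH off the impact speed, which is why the apparent absence of any braking is significant.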
  • K_Space - Thursday, March 29, 2018 - link

    Looking at the video, my sneaking suspicion is that the car DID in fact hand control back to the safety driver, who may have been distracted, but that's pure speculation on my part.
    I feel this to be an inherent problem with the system, though: handing control from a safer system to a less safe one. What for? To avoid litigation? To play devil's advocate, I would say that in cases like this the car should NOT hand control back to the driver; it causes unnecessary delay in a time-critical scenario before an action is taken.
  • jjj - Thursday, March 29, 2018 - link

    The fact that they stopped testing is very suspicious, though.
    It's the wrong thing to do, unless they know that something is off with their hardware or software.
  • BenSkywalker - Thursday, March 29, 2018 - link

    Someone died.

    If they did anything but suspend testing until the investigation is complete, they would be both callous and negligent, even from an advancement perspective. On the freak chance that another failure from the same fault resulted in tragedy while the investigation was ongoing, you would have legislatures everywhere creating draconian regulations and greatly impeding progress for the entire segment.
  • Samuel Lord - Saturday, March 31, 2018 - link

    @K_Space:

    There is no sense in any autonomous system handing over control to a human driver during any danger indication.

    The victim had already crossed the two opposite lanes and the median before being struck. The car didn't slow, so either the lidar and other sensors failed to detect the victim, or there was a software or hardware fault downstream. I could understand IR sensors having trouble: Phoenix is hot, and the asphalt would have been emitting an IR background similar to the victim's body. And lidar, operating roughly in the visual band as I understand it, could also have been fooled given the low light and the victim's black parka (IIRC).

    I'm always struck by the silence of most of the big players on the most vital safety point: the vehicle and environmental sensors need to fuse. Car-, road-, building-, and tree-mounted sensors need to continuously broadcast so there is always overlapping coverage and warning of distant conditions. Want to keep your kids safe? Track them! Ford is the only big player who has emphasized this fact.
  • mode_13h - Tuesday, April 3, 2018 - link

    What's ridiculous about this is all the shock and surprise.

    It was 100% predictable there would be serious injuries and even fatalities in the development & testing of autonomous cars. No matter how good these systems are, they're not perfect and they can't totally prevent accidents caused by the other party. It was only a matter of time.
