Uber autonomous car fatality

Discussion in 'Off Topic' started by Domenick, Mar 22, 2018.

  1. Domenick

    Domenick Administrator Staff Member

    As many of you probably know, an "autonomous" Uber car hit and killed a pedestrian the other night. Now that I've seen the video, there's no doubt in my mind that the driver bears the responsibility for the accident.

    I wrote up a thread on Twitter, so I thought I'd echo my thoughts here.

    Whole lot of victim blaming going on with this Uber fatality situation, but having watched the video, it's clear that the driver is at fault. And only the driver.

    Autonomous vehicles aren't really my beat, but I do have a lot of experience driving on dark rural roads — 16+ years/1.5 million miles, more or less. I've hit and missed a lot of wildlife. Sometimes there's nothing you can do. But that's rare.

    If you are watching the road, and it's clear the Uber operator wasn't, you can see deer ambling across an unlit road, or a pedestrian ploddingly pushing a bicycle, and, at 60 mph, have time to stop. At the very least, slow and swerve.

    This vehicle was moving at 38 mph. Piece of cake. Except the driver — and unless a car is certified level 5 autonomous, it has a driver — was looking down at their phone.

    Yeah, the car also totally failed, and Volvo needs to share exactly what went wrong with the public. Sure, the lady should have crossed at a crosswalk, with the lights. But driving is about expecting the unexpected.

    When autonomous vehicles are "perfected," there will still be crashes. Rare, but stuff happens, things go wrong. In the meantime, we can't fault the technology, or pedestrians, when a human fails to operate a vehicle responsibly.
     
  2. bwilson4web

    bwilson4web Well-Known Member Subscriber

    • Pedestrian in a ninja-black top and blue jeans; only her white tennis shoes were visible at first.
    • No reflective tape or paint on the bicycle frame, nor active LED lights, at night.
    • Both pedestrian and driver were inattentive.
    • Tempe street lights make small islands of light instead of projecting over a larger area.
    • The XC90 appears to be driving on low beams, overdriving the lights.
    • IIHS already rates the XC90's headlights as "Marginal."
    Most accidents are a series of faults and that appears to be the case here. The NHTSA report will teach us more.

    Personally, I think Uber self-driving should be limited to daylight until the NHTSA report comes out.

    Bob Wilson
     
    Pushmi-Pullyu and Domenick like this.
  3. Domenick

    Domenick Administrator Staff Member

    First, that's an awesome post, Bob. :)

    I know not everyone thinks of this the way I do, but I don't blame the pedestrian. Sure, she did something stupid by walking out into the street without looking in the direction of traffic. I don't know what mental or physical shape she was in, though possibly quite poor. But all kinds of things show up in streets unexpectedly -- people, bicycles, boxes, foxes, dogs, etc. -- and vehicle operators need to be watching for them.

    In the video, she is hard to see, but I'm quite confident the video image is far inferior to the human eye's ability to pick out dark shapes. Regardless, it doesn't matter how many reflectors she had or didn't have; the operator wasn't looking anyway.

    Now, previously, I laid the blame mainly at the operator's feet, but learning a bit more, I believe Uber holds some amount of responsibility. For starters, the operator had a number of moving violations that probably should have precluded them from being hired in the first place. Then there is the issue of training. Was there any? How extensive was it?

    Then there is operator monitoring. Was the company monitoring the camera feeds at all? I know that for the FedEx semi drivers I worked with, the camera, and the fear of being seen even glancing at a phone and getting fired, really helped them focus on doing their job properly.

    Pretty sure if Uber had checked in on this operator, they might well have caught them on their phone and fired them before this happened.

    I think limiting Uber to daylight is not enough. I'd want to know what their hiring, training, and supervisory policies were and how they will be improved. And of course, finding out what went wrong with the car and making sure it couldn't happen again.
     
  4. bwilson4web

    bwilson4web Well-Known Member Subscriber

    With respect, I'm not ready to go here:
    It is a hard job, sitting there and monitoring, not driving the car. At one time in the Marine Corps I was issued a pin-feed-platen electric typewriter that used boxes of perforated-edge paper and typed in an OCR font for unit diaries. No skill required and incredibly boring. In the Harry Potter movies they called it "doing lines." There is no training for a safety monitor that could mitigate such a soul-killing job.

    The only 'good news' is that it was the first step in burning out my lifer tendencies (a desire for a military career). I had other boring jobs in the Marine Corps that involved more standing around. I filled that time taking college courses, but by then I had realized that being able to say, "Take this job and shove it," had its own value.

    I'm not giving Uber a pass nor Tempe traffic engineering nor social services for the Tempe homeless. It is too early and we don't have the NHTSA report. Like 'stone soup,' each brought their contribution.

    Bob Wilson
     
    Charles Hall and Domenick like this.
  5. What I am about to say does not apply to the scenario where full automation is applied.

    But for the case of new models with any form of automated braking capability, there should be no reason why eye/iris sensors could not be included. The sensors would allow for a predetermined period of distraction; I'll leave the appropriate allowable period to someone with more research knowledge than I have. The allowable period could vary depending on vehicle speed and maybe even conditions (lighting, weather, etc.). After that period, the system could alert the driver, either by an audible alarm or by some physical method (vibrating the steering wheel or seat, etc.). If the distraction continues past a second period, the brakes should be gradually applied, assuming no obstacle has been detected. Given available technologies (some even available in smartphones), I don't see why this could not be implemented in even basic models.
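    The escalation described here (allowed glance period, then an alert, then gradual braking) could be sketched roughly as below. Every threshold is an invented placeholder for illustration, not a researched or calibrated value:

```python
# Sketch of an escalating driver-attention watchdog.
# All thresholds are illustrative assumptions, not calibrated values.

def allowed_glance_time(speed_mph: float, poor_conditions: bool) -> float:
    """Seconds of eyes-off-road tolerated before the first warning."""
    base = 2.0 if speed_mph < 30 else 1.0            # assumed: stricter at speed
    return base * (0.5 if poor_conditions else 1.0)  # stricter in rain/darkness

def watchdog_step(eyes_off_s: float, speed_mph: float,
                  poor_conditions: bool, obstacle_detected: bool) -> str:
    """Return the system's action for the current monitoring tick."""
    limit = allowed_glance_time(speed_mph, poor_conditions)
    if eyes_off_s <= limit:
        return "ok"
    if eyes_off_s <= 2 * limit:      # second period: warn the driver first
        return "alert"               # chime, or vibrate the wheel/seat
    if obstacle_detected:
        return "emergency_brake"     # obstacle ahead: brake hard
    return "brake_gradually"         # no obstacle: ease the car down
```

The exact periods, and how they scale with speed and conditions, are the parts the post explicitly leaves to people with more research behind them.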
     
    Domenick likes this.
  6. Oh and I also applaud the post from Bob.
     
  7. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    And this, EV fans, is why autonomous cars need better sensors than video cameras. Cameras, unless they are infrared cameras, can't see in the dark any better than the human eye can.

    However, my understanding is that this Uber car was equipped with a rotating lidar scanner. So why wasn't the pedestrian detected? Is the range or the resolution too low for the lidar detector to "see" a pedestrian walking a bicycle? That seems unlikely. My guess is that Uber's self-driving software just isn't that good compared to Waymo's or Tesla's systems.
    -
     
    Charles Hall likes this.
  8. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    First of all, I don't think we should fall into the trap of simplistically assigning all of the blame to one person or to one semi-autonomous car. I was raised to believe that adults should take responsibility for their own actions, and I still firmly believe that is a creed which everyone who is functionally an adult should live by. If someone makes a habit of walking or riding a bike on public roads at night, then he or she should take care to wear light-colored clothing, or better yet, wear something with reflective tape on it, and/or put large multi-directional reflectors on the bicycle.

    If someone is out walking on a public road at night, wearing dark clothing, then she or he surely bears at least some of the responsibility if she or he is hit by a car. In Drivers' Education we are taught to "drive defensively". Well, if you're out walking in traffic at night, and you're wearing dark clothing, then you'd darn well better "walk defensively"!

    But if the "driver" was looking at his cell phone rather than paying attention to the road, then it seems to me he is equally to blame.
    -
     
  9. I suspect the pedestrian was detected but it was the pedestrian's movement that caused the issue.

    Uber system: Object detected to the side, We're good to go. Object detected to the side, We're good to go. Object detected to the side, We're good to go. Object detected to the side, We're good to go. Oh crud, now the object is in front of us, too late!!!

    There is a video of the Tesla system having the opposite reaction. It detected pedestrians on the footpath (safely off the road), yet it still brought the car to a halt because of a perceived risk. In the case of that Tesla video, the pedestrians had no intention of crossing the road, so there was no issue, but we only know this in retrospect. So where to programmatically draw the line in these cases is not simple. This is why it will take a long while before I will be an advocate of totally autonomous cars. It is going to take a lot of real-world testing/data before these unusual cases are brought down to an acceptable level. I am, however, an advocate for autonomous systems that assist drivers; the combination has the potential to dramatically reduce road deaths and injuries.
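    The "object to the side, we're good" failure mode comes down to whether the system projects the object's motion forward or only looks at where it is right now. A minimal sketch of the projection idea, where the lane half-width, time horizon, and all distances are made-up illustrative numbers:

```python
# Project a tracked object forward and ask whether it will cross into the
# car's lane within a time horizon. All parameters are illustrative
# assumptions, not values from any real autonomy stack.

def will_enter_path(offset_m: float, closing_speed_mps: float,
                    horizon_s: float = 3.0, half_lane_m: float = 1.5) -> bool:
    """offset_m: sideways distance from the car's path;
    closing_speed_mps: speed toward the path (<= 0 means holding
    position or moving away)."""
    if offset_m <= half_lane_m:
        return True                   # already in the path: must react now
    if closing_speed_mps <= 0:
        return False                  # off to the side and not approaching
    # Time until the object reaches the lane edge at its current speed.
    time_to_edge = (offset_m - half_lane_m) / closing_speed_mps
    return time_to_edge <= horizon_s
```

A system that only checks `offset_m <= half_lane_m` reproduces the "to the side, we're good... oh crud" sequence exactly; one that projects too aggressively reproduces the Tesla video's phantom stop. Tuning that trade-off is the hard part.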
     
  10. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    If you're talking about the same Tesla semi-self-driving video I've seen -- the one posted below, see time mark 4:50 -- then the pedestrians were not "safely off the road", they were walking dangerously close to a road with no shoulder, and improperly facing away from oncoming traffic. If the one closer to the road had stumbled and fallen to her left, she (?) would certainly have been in the roadway.



    We have to accept the fact that if self-driving cars are going to reduce the accident rate drastically, then they are going to have to drive more safely, taking less risk, than humans do. In some cases, they are indeed going to drive in a "timid" fashion, which no doubt will exasperate some human passengers. IMHO that's a small price to pay for saving human lives!

    In the video in question, the problem as I see it is that Tesla's semi-self-driving system does not allow the car to deviate from the lane. If it was allowed to swerve into another empty lane briefly, to give a wide berth to moving obstacles (such as pedestrians), then the car wouldn't have needed to stop or substantially slow down.
    -
     
    Last edited: Mar 23, 2018
  11. Domenick

    Domenick Administrator Staff Member

    Cool video, I don't think I saw that when it came out. The two women are safely on a sidewalk, but because of the bend in the road, their trajectory makes it look like they might be about to cross.
     
  12. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    Here it is in the context of an InsideEVs news article: "Tesla Releases Self-Driving Demonstration With Recognition Feed – Video"

    I found it quite revealing to see all the false positives in the side views; the green outlines which, according to the caption, indicate "in-path objects". All those trees well to the side of the road, or even to the side and behind the car, which the software was painting as "in-path objects"!* Hopefully the state of the art of Tesla's software has advanced since then, but to me this really underscores why autonomous cars cannot rely on video cameras and fallible optical object recognition software. That approach to detecting obstacles is simply too unreliable. Robotics researchers have been working to improve optical object recognition software for decades, with rather limited success, and I think it is entirely unrealistic to believe that Tesla can increase the performance of such software in the very short time -- just 2-3 years -- that Elon Musk gives for development of Level 4-5 autonomous, fully self-driving cars.

    To me, that's just one piece in the overwhelming amount of evidence that Level 4 autonomy is going to require active scanning, using high-resolution radar, lidar, or both. And not merely the video cameras and low-resolution Doppler radar sensors that Tesla currently uses in its cars.

    *I am reminded of the possibly apocryphal story about the car insurance claim where the driver reported "I was driving along, minding my own business, when a tree jumped out in front of my car!" ;)
    -
     
  13. Feed The Trees

    Feed The Trees Active Member

    I am a big opponent of this technology. I don't think it will ever be in a position to actually drive itself more safely than a human; there is too much trust that it's solid when it isn't, and I can't see it ever handling the millions of situations a human can infer about that a car simply cannot.

    It can react to stuff a human doesn't see or can't react to quickly enough, so sure, collision avoidance is great. But other than that I think the manufacturers are being highly irresponsible in putting this on the streets. Even self-park is too much; too much room for disaster with nobody inside the car.

    And that doesn't even get to the very real possibility of hacking. I don't even want the sensors on board even in an off setting like the S has.
     
  14. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    I wonder what future generations will think about people trusting their lives to human-operated vehicles moving at highway speed? Especially with current drivers (and passengers) trusting their lives to the rather questionable safety of public roads, despite knowing that a significant percentage of drivers text while driving, or drive drunk, or drive when they're short on sleep... all forms of impaired driving?

    Here's a passage from Larry Niven's "Flatlander" (the short story, not to be confused with the collection of the same title), set a few centuries from now, about an interstellar traveller playing tourist on Earth, in an era when any "car" travel uses fully automated aircars:

    I remember the freeways.

    They were the first thing that showed coming in on Earth. If we'd landed at night, it would have been the lighted cities, but of course we came in on the day side. Why else would a world have three spaceports? There were the freeways and autostradas and autobahns, strung in an all-enclosing net across the faces of the continents.

    From a few miles up you still can't see the breaks. But they're there, where girders and pavement have collapsed. Only two superhighways are still kept in good repair. They are both on the same continent: the Pennsylvania Turnpike and the Santa Monica Freeway. The rest of the network is broken chaos.

    It seems there are people who collect old groundcars and race them. Some are actually renovated machines, fifty to ninety percent replaced; others are handmade reproductions. On a perfectly flat surface they'll do fifty to ninety miles per hour.

    I laughed when Elephant told me about them, but actually seeing them was different.

    The rodders began to appear about dawn. They gathered around one end of the Santa Monica Freeway, the end that used to join the San Diego Freeway. This end is a maze of fallen spaghetti, great curving loops of prestressed concrete that have lost their strength over the years and sagged to the ground. But you can still use the top loop to reach the starting line. We watched from above, hovering in a cab as the groundcars moved into line.

    "Their dues cost more than the cars," said Elephant. "I used to drive one myself. You'd turn white as snow if I told you how much it costs to keep this stretch of freeway in repair."

    "How much?"

    He told me. I turned white as snow.

    They were off. I was still wondering what kick they got driving an obsolete machine on flat concrete when they could be up here with us. They were off, weaving slightly, weaving more than slightly, foolishly moving at different speeds, coming perilously close to each other before sheering off -- and I began to realize things.

    Those automobiles had no radar.

    They were being steered with a cabin wheel geared directly to four ground wheels. A mistake in steering and they'd crash into each other or into the concrete curbs. They were steered and stopped by muscle power, but whether they could turn or stop depended on how hard four rubber balloons could grip smooth concrete. If the tires lost their grip, Newton's first law would take over; the fragile metal mass would continue moving in a straight line until stopped by a concrete curb or another groundcar.

    "A man could get killed in one of those."

    "Not to worry," said Elephant. "Nobody does, usually."

    "Usually?"

    The race ended twenty minutes later at another tangle of fallen concrete. I was wet through. We landed and met some of the racers. One of them, a thin guy with tangled, glossy green hair and a bony white face with a widely grinning scarlet mouth, offered me a ride. I declined with thanks, backing slowly away and wishing for a weapon. This joker was obviously dangerously insane.
     
    Cypress likes this.
  15. Feed The Trees

    Feed The Trees Active Member

    And I know this lady was not where she was supposed to be, but man, this should have been so glaringly obvious to the radar on board. Radar doesn't care about ambient light. This should have been a huge success story... driver not paying attention, lady in the middle of a dark road wearing dark clothes... look how this car comes to a screeching halt and saves her, or at least hits her a lot less hard than a human would have.

    Also, the camera on the car is going to exaggerate how dark it was and how dark she was. Good cameras can't reproduce all the light in a scene; they can capture maybe 8 of its 10 steps of brightness. A dash cam I would be surprised captured even 6 of the 10. With the headlights on, it's going to get the exposure right for the lit section, her lower waist, and then by the time it gets down to the darker regions of the frame, everything just goes to black. That doesn't mean it was that black, just that the camera couldn't capture any more degrees of shade. It's like ultraviolet on the other end: just because a camera can't show it to you in a picture doesn't mean it isn't there.

    So in this case I suspect the lady was a lot more obvious to the naked eye, which can see all 10 steps.
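    The clipping effect described here (a 10-step scene squeezed through a camera that records fewer steps) can be simulated in a couple of lines; the step counts are just the illustrative figures from the post, not measured values:

```python
# Toy model of camera dynamic range: a scene with 10 distinguishable steps
# of shadow detail, recorded by a camera that can only hold a few steps
# below its exposure point. Step counts are illustrative, not measured.

def capture(scene_steps, camera_range=6):
    """Steps within the camera's range are recorded; anything darker clips
    to pure black (None), even though the detail was really there."""
    return [s if s <= camera_range else None for s in scene_steps]

scene = list(range(10))       # step 0 = correctly exposed, step 9 = deepest shadow
recorded = capture(scene)     # the darkest steps are lost to black
```

The eye would still distinguish the clipped steps; the recording simply has nowhere to put them, which is the argument for not judging the scene's visibility from the dashcam footage.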
     
    Domenick likes this.
  16. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    I get the impression that you don't drive much, or at least not at night. I've certainly been startled at just how close I had to get to someone dressed in dark clothes before I noticed them, illuminated by nothing more than my car's headlights. In fact, the photo in the #2 post above gave me an immediate stomach-churning reaction, remembering just how close I've come a couple of times to hitting someone while driving, because they were walking in traffic while wearing dark clothes at night. This despite the fact that my night vision is noticeably better than that of many of my friends.

    Does the human eye have more sensitivity than a video camera? Yes. But that certainly doesn't mean you'll notice something with that very low contrast in time to react while driving down the road at a good speed. Especially if you're driving with low beams on. Bob Wilson was entirely right to bring up the point of overdriving the headlights.

    One advantage of self-driving cars is that they should be able to react almost instantly, far faster than human reflexes. That is, they should if they are programmed properly. I've been disturbed at the "hesitant" behavior described in some current semi-self-driving cars. If the software is hesitating before making a decision, that's an indicator to me that it needs more development before being taken out on public roads.
    -
     
  17. Feed The Trees

    Feed The Trees Active Member

    This is also exacerbated by HID headlamps. The light falloff is so sharp that it makes dark objects darker. Yes they are brighter where lit, but it makes it darker where dark. The old school halogen bulbs of the 80s maybe sucked *** in terms of downroad lighting, but they didn't cause the same falloff effect. I find it very bothersome to drive with HID, feels like driving with tunnel vision.

    Radar was still a spectacular fail in this case.
     
  18. Domenick

    Domenick Administrator Staff Member

    At 38 miles per hour, he was not "overdriving" the headlights.
     
  19. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    I think that's at least debatable. According to a post above, the accident occurred in a spot not illuminated by street lights, and the car had only its low beams on.

    From the Insurance Institute for Highway Safety website: "New ratings show most headlights need improvement"

    From the Insurance Institute for Highway Safety website: "Few drivers use their high beams, study finds"

    From the AAA NewsRoom website: "AAA Tests Shine High-Beam on Headlight Limitations"
    -
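    For what it's worth, a rough back-of-the-envelope check of the "overdriving the headlights" question, assuming a 1.5-second perception-reaction time, 0.7 g braking on dry pavement, and a roughly 160-foot usable low-beam reach (all assumed round numbers, not measurements from this incident):

```python
# Total stopping distance = reaction distance + braking distance,
# compared against an assumed low-beam reach. All inputs are round
# illustrative assumptions.

G = 9.81             # m/s^2
M_PER_FT = 0.3048

def stopping_distance_ft(speed_mph: float, reaction_s: float = 1.5,
                         decel_g: float = 0.7) -> float:
    v = speed_mph * 0.44704            # mph -> m/s
    react = v * reaction_s             # distance covered before braking starts
    brake = v * v / (2 * decel_g * G)  # v^2 / (2a) kinematic braking distance
    return (react + brake) / M_PER_FT

# At 38 mph this lands near 150 ft, close to the assumed 160 ft low-beam
# reach, so whether the car was "overdriving" its lights is a close call.
print(round(stopping_distance_ft(38.0)))
```

Under these assumptions neither side of the argument is clearly wrong: the margin between seeing distance and stopping distance at 38 mph is thin enough that headlight quality, attention, and reaction time all matter.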
     
  20. Feed The Trees

    Feed The Trees Active Member

    I have to suspect this car was equipped with auto high beams, which should be able to detect how much light is needed given the circumstances and oncoming traffic. If low beams were on, then either the onboard system failed or it was a low-beam-worthy situation.

    Sent from my Nexus 5X using Tapatalk
     
