Another Autopilot crash

Discussion in 'Tesla' started by bwilson4web, Dec 8, 2019.

  1. bwilson4web

    bwilson4web Well-Known Member Subscriber

    Source: https://abcnews.go.com/US/tesla-autopilot-slams-police-cruiser-driver-claims-checking/story?id=67570199

    A Tesla on Autopilot slammed into two vehicles on Saturday, one of which was a Connecticut State Police cruiser, officials said.

    The driver of the Tesla told police that he put the car on Autopilot because he was checking on his dog in the backseat, according to a statement from Connecticut State Police.

    The incident happened in the early morning hours Saturday on Interstate 95 in Norwalk.

    Bob Wilson
     
  3. Roy_H

    Roy_H Active Member

    It disturbs me greatly that Automatic Emergency Braking failed to activate. The car kept going even after the initial impact with the police cruiser, which set off the air bags, then hit the car the police were helping and carried on into a second cruiser. It seems an extreme oversight that the car keeps driving after an impact; it is bad enough that the AEB didn't work.

    The same has been true of other incidents, including one where the car drove under a semi trailer, sheared off its roof (killing the driver), and then continued down the highway. By that point it was blind, since the cameras in the windshield were gone, so it eventually drove off the highway and crashed into something that finally stopped it. How is the software written so that it continues driving after an impact, and after it no longer has any input from the cameras?

    Tesla has also allowed a Model X to drive in Summon mode with its doors raised, and then crash into the side of the garage door. Again, who decided it is safe to drive with the doors open? The software seems to be written with no concern for failure modes. I would have thought any professional programmer would take care to write fail-safe code, but it feels as if Tesla hired gamers who grew up playing games where, when you get killed, you just come back and try again, no big deal.
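    To make the point concrete, here is a minimal sketch of the kind of fail-safe check I mean. The signal names (airbag_deployed, cameras_online) are hypothetical and this is not Tesla's actual code; it just illustrates disengaging and stopping after a crash or a loss of camera input.

    ```python
    # Hypothetical fail-safe supervisor sketch; signal names are illustrative,
    # not Tesla's actual software or APIs.
    from dataclasses import dataclass


    @dataclass
    class VehicleState:
        airbag_deployed: bool   # any airbag deployment implies a collision
        cameras_online: bool    # False once the windshield cameras stop reporting
        speed_mph: float


    def autopilot_allowed(state: VehicleState) -> bool:
        """Automated driving should not continue after a crash or loss of perception."""
        return not state.airbag_deployed and state.cameras_online


    def supervisor_step(state: VehicleState) -> str:
        """Command for this control cycle: keep driving or bring the car to a stop."""
        if autopilot_allowed(state):
            return "CONTINUE"
        # Hazards on, ramp the brakes down to a controlled stop in the lane.
        return "CONTROLLED_STOP"


    # After an impact the supervisor should command a stop, not keep driving.
    print(supervisor_step(VehicleState(airbag_deployed=True, cameras_online=True, speed_mph=55.0)))
    ```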
     
    R P likes this.
  4. interestedinEV

    interestedinEV Well-Known Member

    There is a whole set of ethical debates that apply to driverless cars. One of the criticisms of Autopilot is that the driver can fool the system into thinking he/she has hands on the wheel, and other manufacturers have made that harder to do.

    But at the end of the day, this driver was stupid, and there is no cure for stupidity. He willfully disregarded the instructions.
     
  5. Roy_H

    Roy_H Active Member

    The fact that the driver is ultimately responsible is a poor excuse for writing software that has no fail-safe code.
     
    R P likes this.
  6. I have made the same point in several past posts and was quickly admonished by the Tesla lovers, who claim that the loss of some lives with Autopilot is an acceptable cost of furthering the testing of self-driving. I don't know of any other manufacturer that takes that position, especially when self-driving is nowhere near ready for public use. They need to test it better in a virtual AI environment, to the point where such obvious software deficiencies are fixed, before letting it loose on untrained drivers and without on-board monitors.

    If the needless accidents and deaths continue, I fear a regulatory backlash which could put all EVs in a bad light, just because of a few bad apples.
     
  8. interestedinEV

    interestedinEV Well-Known Member

    I make no excuses for Tesla or any manufacturer that does not do its best to prevent such accidents. They absolutely should do more. Other companies seem more proactive, and there is no reason why Tesla should not be as well.

    That said, my understanding is that you have to trick the system in a Tesla, by hanging a weight on the wheel or something like that, to circumvent the current checks. Other companies additionally use cameras to track eye position as a further check. Now, if this driver knew he was doing something to bypass the system, then it was deliberate. So if you have to apportion blame, you can: Tesla may be responsible to the extent that it allows the system to be bypassed easily, but in this case the driver has to bear responsibility too.
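    As a rough illustration of what such a layered check could look like (purely a sketch; the thresholds and signal names are made up, not any manufacturer's actual implementation):

    ```python
    # Illustrative layered driver-monitoring check; thresholds and signals are
    # hypothetical, not any manufacturer's real system.
    def driver_attentive(steering_torque_nm: float,
                         seconds_since_eyes_on_road: float,
                         torque_threshold_nm: float = 0.3,
                         max_eyes_off_road_s: float = 3.0) -> bool:
        """Require both a hands-on-wheel signal and a camera-based gaze signal."""
        hands_on = abs(steering_torque_nm) >= torque_threshold_nm
        eyes_on = seconds_since_eyes_on_road <= max_eyes_off_road_s
        return hands_on and eyes_on


    # A weight hung on the wheel can fake the torque check, but not the gaze check:
    print(driver_attentive(steering_torque_nm=0.5, seconds_since_eyes_on_road=12.0))  # False
    ```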
     
  9. bwilson4web

    bwilson4web Well-Known Member Subscriber

    There is a problem with stationary objects. I did some late-night testing approaching semi trailers with and without side skirts under the middle of the frame. It was obvious that the side skirts were detected and the car braked automatically, but without the side skirts I had to brake myself.

    Autopilot is not perfect, but it is perfectible. Earlier versions required accelerator input when the lead car pulled away at a light; now it handles that on its own. Changing lanes is much improved in heavy traffic.

    I am not defending this driver as much as sharing that Autopilot is getting better. Dealing with stationary objects in the road appears to be a harder problem than one might expect.

    Bob Wilson
     
  10. marshall

    marshall Well-Known Member

    I'm with you. I don't buy the hysteria RP is trying to sell.

    So far, I've seen 3 deaths in the last month due to auto accidents in the Seattle area. None of them were using Autopilot.
     
  11. This is exactly what I am talking about. It could be bad for all of us, not just Tesla. Here is a tweet from a senator.

    Richard Blumenthal (@SenBlumenthal) tweeted:

    "This crash could’ve been avoided. While autonomous vehicles are an exciting development, the tech is simply not ready to be deployed safely. Congress must act to protect the public from these vehicles until their safety can be assured."
    https://www.wfsb.com/news/state-police-cruiser-struck-by-tesla-in-autopilot-mode-on/article_03f661fa-18ff-11ea-968f-5bb6ecb31c72.html
     
  13. marshall

    marshall Well-Known Member

    Eight-car pile-up Thursday on I-5. Two women sent to the hospital. No Autopilot involved! Heck, no Teslas involved.
     
  14. Yeah, so? There are lots of accidents all the time. This thread is about Autopilot accidents, not accidents in general. We have enough of those and don't need to add to that carnage.

    What we don't want are needless extra accidents and deaths caused by unwitting, untrained drivers testing/using the self-driving features of Tesla's Autopilot (with hands off) before the software is safe for general public use.
    https://www.bloomberg.com/news/articles/2019-12-13/tesla-that-hit-parked-police-car-while-in-autopilot-being-probed
    Like I said before, Tesla's reckless actions may actually be jeopardizing the implementation and adoption of these capabilities by all manufacturers. More regulations always slow things down.

    Again, I'm all for driver-assist features that help prevent accidents, including those in Tesla's Autopilot. Both of my cars have them (and they work great), and I would never buy a car without them. But that's very different from letting regular drivers use buggy/incomplete software that could be a danger not only to themselves but also to others in their path.
     
  15. bwilson4web

    bwilson4web Well-Known Member Subscriber

    In about 3-6 weeks, we'll have the Q4 2019 Tesla safety report.

    I don't mind an investigation as long as it has real metrics comparing Autopilot to non-Autopilot accident rates. So if the numbers come in showing that Autopilot-operated cars are significantly safer than the others, would that change your mind?

    Bob Wilson
     
  16. Comparing Autopilot to non-Autopilot does not provide metrics for the safety of self-driving as it stands now. Autopilot includes the other driver-assist features (besides self-driving), which of course add to safety, just as they do in other cars. So of course those metrics should look better. I keep saying this over and over, but you guys just don't seem to understand that. I am not sure what else I can say.

    It is the self-driving (hands-off) part of Autopilot that is causing these extra, needless accidents. And the media don't help when they just say Autopilot and not specifically self-driving. That's why I say it is giving the whole industry a black eye. I wish Tesla would differentiate that, too. Actually, to be really safe, they should remove that feature from Autopilot completely until it is properly tested in-house and safe for public use. But we know that is one of Musk's selling points for Tesla, and a few lives lost here and there are not going to stop him from promoting it.

    Funny, but I have had non-EV friends bring these Tesla crashes up to me (including the fires), and I just explain to them that other manufacturers don't have this problem. When I explain the difference between Tesla's Autopilot driver-assist features (which are now pretty well standard in all new cars) and the self-driving part (which is the dangerous one), they get it pretty quickly. Not so, it seems, with some of the Tesla fanbois on this forum...
     
  17. interestedinEV

    interestedinEV Well-Known Member

    In ethics you have what is called the "Trolley Problem" or "Trolley Dilemma".

    Putting it in this context: if you do nothing (leave things as they are for now), 5 drivers will be in fatal accidents. If you do something (introduce autonomous driving that has been tested in labs but not thoroughly on the roads), those 5 people may not have their accidents, but one innocent person who drives carefully dies in a fatal accident due to a software glitch. This restates the "Trolley Problem," or "Trolley Dilemma," as it applies to autonomous driving.

    This is a much-debated ethical question, and unfortunately there is no consensus on it; it is a belief system. Some argue that the one innocent driver cannot be sacrificed to save the five who were destined to die, i.e. we cannot play god. Others would say the greater good dictates that we save the five people. While I may believe that the only way the software can improve is by putting it on the street, so long as the manufacturer takes proper precautions and does not allow known errors into production, @R P believes otherwise, and I appreciate that. There is no right and wrong here. You can read up on the "trolley problem" and see that there is plenty of impassioned debate on both sides. As a society we need to make a decision, knowing full well that not everyone will agree with it.

    Even if the numbers come out as you say, that does not answer @R P 's ethical dilemma, which is "does this entail the loss of an innocent life?". The question he is asking is related but a little different: he wants to know 1) that lives were saved by this technology and (repeat, and) 2) that no lives that were not previously in jeopardy were lost to this technology. The first will be answered by your numbers; the second is more difficult. So he may not change his mind unless the second is also answered. More specifically, in this case he believes that Tesla has not thoroughly tested fully autonomous driving and should not be putting out such software, as it can cause deaths in addition to preventing them.
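    Put crudely, the criterion has two parts, which a toy check makes explicit (the counts below are placeholders, not data from any study):

    ```python
    # Toy restatement of the two-part question above; counts are placeholders, not data.
    def satisfies_both_conditions(lives_saved: int, newly_endangered_lives_lost: int) -> bool:
        """Condition 1: the technology saved lives.
        Condition 2: no one who was not previously at risk was killed by it."""
        return lives_saved > 0 and newly_endangered_lives_lost == 0


    # Aggregate numbers can support condition 1, but condition 2 is about individual
    # cases, which is why statistics alone may not settle the argument.
    print(satisfies_both_conditions(lives_saved=5, newly_endangered_lives_lost=1))  # False
    ```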

    I may not agree with him fully (but I agree Tesla needs to do more) but I understand where he is coming from. I would leave it at that.


    Imagine you are standing beside some tram tracks. In the distance, you spot a runaway trolley hurtling down the tracks towards five workers who cannot hear it coming. Even if they do spot it, they won’t be able to move out of the way in time. As this disaster looms, you glance down and see a lever connected to the tracks. You realise that if you pull the lever, the tram will be diverted down a second set of tracks away from the five unsuspecting workers. However, down this side track is one lone worker, just as oblivious as his colleagues. So, would you pull the lever, leading to one death but saving five?

    This is the crux of the classic thought experiment known as the trolley dilemma.
     
    Last edited: Dec 14, 2019
    Gsbrryprk8 likes this.
  18. Are you a university prof? Haha. It's been a long time, but I do remember arguments/debates like this. I am way more practical and down-to-earth these days. :)
     
  19. interestedinEV

    interestedinEV Well-Known Member

    I used to be one many years back, but then I went into industry and am now a consultant. I did update the post to be a little more specific, so you may want to look at it again.

    You are practical and are in the "don't pull the lever" camp. As a consultant, I hedge my bets and say "it depends." :D
     
  20. And I don't really disagree with you either. At least you understand better what I am trying to say. We may disagree slightly about how we should get there, but I do respect your reasoning.
     
  21. interestedinEV

    interestedinEV Well-Known Member

    Ahh, it looks like others have used the trolley-problem analogy for autonomous vehicles.

    You may want to take this test
    http://moralmachine.mit.edu/



    https://en.wikipedia.org/wiki/Trolley_problem

    Implications for autonomous vehicles
    Problems analogous to the trolley problem arise in the design of software to control autonomous cars. Situations could occur in which a potentially fatal collision appears to be unavoidable, but in which choices made by the car's software, such as whom or what to crash into, can affect the particulars of the deadly outcome. For example, should the software value the safety of the car's occupants more, or less, than that of potential victims outside the car?[26][27][28][29][30]

    A platform called Moral Machine[31] was created by MIT Media Lab to allow the public to express their opinions on what decisions autonomous vehicles should make in scenarios that use the trolley problem paradigm. Analysis of the data collected through Moral Machine showed broad differences in relative preferences among different countries.[32] Other approaches make use of virtual reality to assess human behavior in experimental settings.[33][34][35][36] However, some argue that the investigation of trolley-type cases is not necessary to address the ethical problem of driverless cars, because the trolley cases have a serious practical limitation: it would need to be a top-down plan in order to fit the current approaches to addressing emergencies in artificial intelligence.[37]

    There is also the question of whether the law should dictate the ethical standards that all autonomous vehicles must use, or whether individual autonomous car owners or drivers should determine their car's ethical values, such as favoring safety of the owner or the owner's family over the safety of others. Although most people would not be willing to use an automated car that might sacrifice themselves in a life-or-death dilemma, some[who?] believe the somewhat counterintuitive claim that using mandatory ethics values would nevertheless be in their best interest. According to Gogoll and Müller, "the reason is, simply put, that [personalized ethics settings] would most likely result in a prisoner’s dilemma."[38]

    In 2016, the German government appointed a commission to study the ethical implications of autonomous driving.[39] The commission adopted 20 rules to be implemented in the laws that will govern the ethical choices that autonomous vehicles will make.
     
  22. bwilson4web

    bwilson4web Well-Known Member Subscriber

    Nonsense:
    We call that 'letting Perfect be the enemy of Good Enough.' The same has happened with every safety advance:
    • seat belts - 'you could be trapped in a burning car or drown in a river'
    • anti-lock brakes - 'pump the brakes yourself instead of letting the system handle a skid'
    • air bags - 'they kill pets and small people'
    In contrast, Tesla provides a quarterly safety report:
    • Tesla accidents with autopilot
    • Tesla accidents without autopilot but using built-in safety systems
    • Tesla accidents without built-in safety systems
    Given the sample is limited to Teslas, it is what we have.

    Tesla also includes the gross NHTSA accident rate BUT does not try to match vehicle class or accident type to the Tesla models. When (IF) NHTSA or NTSB does a proper study comparing cars of equivalent size and weight to the Teslas, that would work for me. But someone has to collect and report the data and, worse, deal with the small number of Tesla accidents. And there is one latent problem: Autopilot version.
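    For what it's worth, here is how the report's 'miles per accident' figures could be put on a common footing. The numbers below are placeholders, not figures from any actual Tesla or NHTSA report:

    ```python
    # Placeholder figures only, NOT numbers from any actual Tesla safety report or NHTSA data.
    # Converts "miles per accident" in each reporting category to accidents per million miles.
    MILES_PER_ACCIDENT = {
        "autopilot_engaged": 3_000_000,
        "active_safety_only": 2_000_000,
        "no_active_safety": 1_500_000,
        "nhtsa_all_vehicles": 500_000,
    }


    def accidents_per_million_miles(miles_per_accident: float) -> float:
        return 1_000_000 / miles_per_accident


    for category, mpa in MILES_PER_ACCIDENT.items():
        print(f"{category}: {accidents_per_million_miles(mpa):.2f} accidents per million miles")
    ```

    Even normalized this way, the comparison still does not control for vehicle class or how the miles are accumulated, which is the caveat above.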

    Tesla continues to upgrade its software to improve functions and operation. In the past 30 days, I've had two upgrades, including one enabling the current functions of the Full Self Driving package I bought two months ago. The latest version alerts you if you are about to run a stop sign ... which has nothing to do with auto-steering, which has also improved.

    The main advantage of auto-steering is in lane changes. With 8 cameras, 12 ultrasonic sensors, and a powerful computer, the car changes lanes more safely than I can. The car measures the gap, matches speed, signals, and smoothly changes lanes. This isn't theory but fact.
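    The sequence is roughly the following; this is a simplified sketch with made-up thresholds, not Tesla's actual logic:

    ```python
    # Simplified lane-change sequence; thresholds are made up, not Tesla's actual logic.
    def attempt_lane_change(gap_m: float, closing_speed_mps: float,
                            min_gap_m: float = 30.0, max_closing_mps: float = 2.0) -> list:
        """Return the ordered actions a simplified lane-change routine might take."""
        if gap_m < min_gap_m or closing_speed_mps > max_closing_mps:
            return ["hold_lane"]  # gap too small or traffic closing too fast
        return ["match_speed", "signal", "steer_into_gap", "center_in_lane"]


    print(attempt_lane_change(gap_m=45.0, closing_speed_mps=0.5))
    ```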

    Bob Wilson
     
    Last edited: Dec 14, 2019
  23. I am all for releasing driver-assist software (which is what you are describing) as long as it has been properly tested in-house and does not endanger the end user. But that is not full self-driving, and it shouldn't be called that. FSD is when you can take your hands off the wheel and not watch the road. Unfortunately, some naive drivers have been duped into believing they can do that already (they just have to trick the system). That's what is dangerous and should not be allowed.
     
