Sounds like Tesla engineers wanted to make Autopilot work like Supercruise

Discussion in 'Tesla' started by David Green, May 14, 2018.

  1. David Green

    David Green Active Member

  2. NeilBlanchard

    NeilBlanchard Active Member

    Why didn't they put more sensors on the car, I wonder?
     
  3. David Green

    David Green Active Member


    It is hard to understand the decision making process on some things at Tesla... Autopilot obviously needs some more safety features to keep people from abusing it so easily.
     
  4. NeilBlanchard

    NeilBlanchard Active Member

    So, why didn't they add more sensors, do you think?
     
  5. David Green

    David Green Active Member


    I am not sure why they did not put more safety into the system... Elon Musk being overconfident in the tech is a possibility, as is incompetence on the part of the engineers. It's hard to know for sure; obviously Tesla rushed this out and did not think through the ramifications of system abuse well enough.
     
  6. bwilson4web

    bwilson4web Active Member

    How hard can it be:
    [image]

    Bob Wilson
     
  7. David Green

    David Green Active Member

    Yes, it should be easier than landing a rocket booster... but these are different companies, and different engineers.
     
  8. NeilBlanchard

    NeilBlanchard Active Member

    How have you diagnosed the issue(s)? I strongly suspect it is software, and/or CPU grunt.
     
  9. David Green

    David Green Active Member


    I have no idea; no data to make assumptions from...
     
  10. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    I think the important question is why Tesla (or Elon Musk) is so adamant about not putting lidar scanners into their cars.

    As far as why not more sensors: Well obviously part of the reason they don't is the additional expense, and I assume that's the primary reason for refusing to put in lidar.

    But from a computer programmer's viewpoint, there is a downside to more and more sensors. Each sensor provides a stream of data input, and that input must be monitored and analyzed in real time. The more data that must be processed and analyzed simultaneously, the slower the entire software decision-making process gets. For example, a recent report said that in the case where the Uber car hit and killed a pedestrian, the car did detect the pedestrian, just not in time to stop.

    One thing self-driving car designers are going to have to experiment with is finding a sweet spot, a "happy medium", between too little sensor input, which would give an incomplete "picture" of the environment and the objects moving within it, and too much sensor input, which would result in the computer equivalent of sensory overload, slowing decision-making down so much that a slow reaction time by the car would become a danger.

    But with the current state of affairs, it's pretty silly to even talk about sensors providing a complete picture. As has been pointed out in a recent article about a second Tesla car hitting another fire truck parked on the highway, semi-self-driving cars currently in production are not built to even try to react to stationary objects in the lane the car is driving in. (From a lot of comments posted to this article, a great number of people are having a hard time believing that!)

    For reasonably reliable (that is, not perfect but pretty good) full autonomy -- Level 4 or Level 5 -- a car is going to have to have a SLAM* system. Waymo's experimental test fleet of self-driving cars does have a SLAM system, but nobody -- not even Tesla -- is making a production car that even attempts to create a SLAM.

    Tesla's Autopilot+AutoSteer qualifies as advanced Level 2 autonomy. They have a long way to go to get to Level 4, and development of a SLAM system will be part of what's necessary to get there. A SLAM would include detecting the presence and placement of stationary obstacles, up to perhaps 100-200 meters from the car, or at least that far in front of the car, perhaps not as far to the sides and/or behind.

    *SLAM stands for Simultaneous Localization And Mapping technology, a process whereby a robot or a device can create a 3D map of its surroundings, and orient itself properly within this map in real time.
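    To make the mapping half of that definition concrete, here is a toy illustration (my own sketch, not anything Tesla or Waymo actually runs): an occupancy-grid update that projects range-sensor returns into world coordinates and marks the cells they hit. For simplicity it assumes the vehicle's pose is already known -- real SLAM has to estimate that pose *simultaneously* with building the map, which is the hard part.

    ```python
    import math

    def update_grid(grid, pose, ranges, cell=1.0):
        """Mark grid cells hit by range-sensor returns as occupied.

        grid   : dict mapping (ix, iy) cell indices -> hit count
        pose   : (x, y, heading_rad) -- assumed known here; real SLAM
                 must estimate this at the same time as the map
        ranges : list of (bearing_rad, distance) sensor returns
        cell   : grid cell size in meters
        """
        x, y, heading = pose
        for bearing, dist in ranges:
            # Project each return from sensor frame into world coordinates.
            gx = x + dist * math.cos(heading + bearing)
            gy = y + dist * math.sin(heading + bearing)
            key = (int(gx // cell), int(gy // cell))
            grid[key] = grid.get(key, 0) + 1
        return grid

    # A car at the origin, facing +x, sees an obstacle 5 m dead ahead.
    grid = update_grid({}, (0.0, 0.0, 0.0), [(0.0, 5.0)])
    ```

    A stationary fire truck would show up as a persistent cluster of occupied cells in front of the car -- exactly the kind of obstacle the current production systems are not attempting to map.
    
    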

     
    Last edited: May 16, 2018
  11. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    Or maybe it's because, unlike you, Tesla realizes that even though it's a work in progress, far from complete, Tesla Autopilot+AutoSteer (Beta) can still be saving lives right now, even in its less-than-half-finished form.

    And it is saving lives right now. There is far too much attention focused on the very few (count 'em on the fingers of one hand) people killed in cars under the control of AutoSteer, and far too little attention focused on the number of people whose lives have been saved by someone using AutoSteer -- a number almost mathematically certain to be significantly higher than the few killed.

     
  12. Martin Williams

    Martin Williams Active Member

    So prove it!

    If you can't you have no idea whether this is true or not. Sounds like advertising bullshit to me.
     
  13. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    One can point to very definite statistics showing that Tesla cars equipped with Autopilot+AutoSteer have an approx. 40% lower accident rate than Tesla cars without AutoSteer.

    Extrapolating that to a significantly lower fatality rate is a straightforward application of logic... o_O

    "Tesla’s crash rate was reduced by 40% after introduction of [AutoSteer] based on data reviewed by NHTSA"
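    As for whether a drop like that could be statistically significant: since the full NHTSA dataset wasn't published, here's a back-of-the-envelope sketch with invented placeholder numbers (not the real data) showing how one would test a rate reduction, treating crashes as Poisson counts and applying a z-test to the log rate ratio.

    ```python
    import math

    def rate_ratio_z(crashes_before, miles_before, crashes_after, miles_after):
        """Z statistic for the log ratio of two Poisson crash rates.

        Inputs are raw crash counts and exposure (miles driven).
        All numbers passed in below are hypothetical placeholders,
        NOT the actual NHTSA data.
        """
        rate_before = crashes_before / miles_before
        rate_after = crashes_after / miles_after
        # Standard error of log(rate ratio) for Poisson counts.
        se = math.sqrt(1.0 / crashes_before + 1.0 / crashes_after)
        return math.log(rate_before / rate_after) / se

    # Hypothetical: 130 crashes per 100M miles before AutoSteer,
    # 80 crashes per 100M miles after -- roughly a 40% reduction.
    z = rate_ratio_z(130, 100.0, 80, 100.0)
    # |z| > 1.96 would indicate significance at the 5% level.
    ```

    With counts that large, a ~40% drop comes out well past the 1.96 threshold; with only a handful of events it wouldn't, which is exactly why the underlying counts matter, not just the percentage.
    
    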

     
    Last edited by a moderator: May 16, 2018
  14. Martin Williams

    Martin Williams Active Member

    Well, as I can't get your link to work I can't check whether the findings were statistically significant or not, so I have nothing to extrapolate from. I don't suppose you would bother with such niceties.

    Still sounds like advertising bullshit.
     
  15. David Green

    David Green Active Member

    Of course PP cannot prove anything. Tesla's BS on A/P safety is going to get blown out of the water with the latest investigations and lawsuits that are going to comb the real statistics. Tesla A/P is obviously seriously flawed (compared to Mobileye and Supercruise), and should be fixed so that it cannot be abused or turned off, as it puts not only Tesla drivers in danger, but the general public as well.
     
  16. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    Let's see, I've got facts and figures from government investigative agencies to back up what I'm saying; you've got inflammatory nonsense, FUD, extreme negative bias, and repeated outright lies.



     
    Last edited by a moderator: May 18, 2018 at 11:38 AM
  17. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    Apparently you've also forgotten how to use Google. It's not like the info is hard to find. More to the point, you want me to do your homework for you.

    ROTFL!!

    You've been posting all over the forum for the past week or so about three recent horrible accidents involving Tesla cars, and the conclusion you've jumped to about safety based on only those three. And now you're questioning whether or not the NHTSA's conclusions are based on statistically significant data!

    And this is just one more tired example of how you ignore actual evidence contrary to your preconceived ideas.

     
    Last edited by a moderator: May 18, 2018 at 11:38 AM
  18. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    I got a note from the moderator saying my link didn't work, and he substituted another.

    Thank you (Domenick? Jack?), I like that article better than the one I tried to link to! :)

    However, the correct title for this link should be "Tesla’s crash rate dropped 40 percent after [AutoSteer] was installed, Feds say"

    Unfortunately, we're still seeing journalists -- and Tesla itself -- using the term "Autopilot" when they mean "AutoSteer". Although the title incorrectly uses "Autopilot", the chart in the article is correctly labeled "AutoSteer". (Well, almost correctly. It should be "AutoSteer", not "Autosteer". But that's just Grammar Nazi quibbling. ;) )

    [image]
     
  19. Martin Williams

    Martin Williams Active Member

    Knowing your propensity for half-truths, dissembling, misinterpretation and other similar tricks, I'd be a fool to take your word for anything without looking into it.
     
  20. David Green

    David Green Active Member
