Looks like this Model S went after a Police car instead of a Firetruck?

Discussion in 'General' started by David Green, May 29, 2018.

  1. David Green

    David Green Well-Known Member

  3. TeslaInvestors

    TeslaInvestors Active Member

    I think there are actually a lot more. But only the firetruck/police-related collisions are making the news, since those agencies have a voice.
    I can't think of any reason why Tesla AP would have a crush on law enforcement and emergency vehicles in particular.

    But what can we do? Elon is out there saving the polar bears with his "twice as safe" Autocrash system. :(
    I suppose he thinks some journalists, whom he seems to hate now for some obscure reason, made up that picture and created fake news. And the LA Times is getting paid by Shell or BP.
     
    David Green likes this.
  4. JyChevyVolt

    JyChevyVolt Active Member

    Can't blame the gun manufacturer when the crazies go nuts.
     
  5. Martin Williams

    Martin Williams Active Member

    Probably still too early to say, but it's looking as if AI is not quite as smart as it's cracked up to be.

    As Elon Musk is reported to have remarked recently about the smart production equipment he junked, 'people have been underestimated.' Evidently that was true for him, but not for those of us who never thought much of the idea of driverless cars anyway.
     
  6. David Green

    David Green Well-Known Member

    It's hard to say, but more details on this one are emerging; it seems the car turned into the police cruiser.
     
  8. Martin Williams

    Martin Williams Active Member

    Are you perhaps suggesting that not only do Teslas crash into stationary vehicles that happen to be in their way, they actually go looking for stationary vehicles to crash into?
     
  9. David Green

    David Green Well-Known Member

    Not really... I am just making light of the situation; Teslas have developed a reputation for hitting solid things recently.
     
  10. Martin Williams

    Martin Williams Active Member

    Well, I wasn't being entirely serious either!

    However, on further reflection, if neural networks are being used, nobody - including the designers - knows what is going on in the system's 'mind', because nobody really understands its inner workings in any solid engineering sense.
     
    David Green and bwilson4web like this.
  13. bwilson4web

    bwilson4web Well-Known Member Subscriber

    That makes a lot of sense, as I've already noticed the dynamic cruise control in both of our cars has some distance and speed-difference limitations. If the obstacle is too far away, the car does not brake early enough. In fact, it seems like the braking comes on lightly and then increases in a non-linear way.
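
    As a purely illustrative sketch of why braking might come on lightly and then ramp up, here is a toy gap-keeping controller in Python whose brake command grows quadratically as the time gap to the lead car shrinks. The gains and thresholds are invented and have nothing to do with any production cruise control system.

```python
# Illustrative only: a toy gap-keeping controller that brakes gently at first
# and harder as the time gap to the lead car shrinks. All gains and thresholds
# are made up; no production adaptive cruise control is this simple.

def brake_command(gap_m: float, ego_speed_mps: float, closing_speed_mps: float,
                  desired_time_gap_s: float = 2.0) -> float:
    """Return a brake command in [0, 1]."""
    if ego_speed_mps <= 0.0:
        return 0.0
    time_gap_s = gap_m / ego_speed_mps
    # Error grows as we fall below the desired time gap or close rapidly.
    gap_error = max(0.0, desired_time_gap_s - time_gap_s)
    closing_term = max(0.0, closing_speed_mps) / ego_speed_mps
    # Quadratic ramp: light braking for small errors, much harder as error grows.
    raw = (0.6 * gap_error + 0.8 * closing_term) ** 2
    return min(1.0, raw)

# 50 m gap at 30 m/s, closing at 3 m/s: light braking (~0.08)
print(round(brake_command(50.0, 30.0, 3.0), 2))
# 20 m gap at 30 m/s, closing at 8 m/s: clamped to full braking (1.0)
print(round(brake_command(20.0, 30.0, 8.0), 2))
```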

    Bob Wilson
     
    silversod likes this.
  14. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    I agree with the objective stated in the article: that people really do need to be much better educated about what semi-self-driving cars can't do. Sadly, the video fails to accomplish that task. It says the car doesn't have time to stop after it "sees" the dummy car parked in its lane.

    This is simply wrong. It's not that the car doesn't have time to stop; it's that the car is not engineered to stop for stationary obstacles. I see a lot of disbelief in comments from people when this is pointed out. What, semi-self-driving cars (including those under the control of Tesla Autopilot + AutoSteer) don't even try to see large stationary objects in the car's lane? Unbelievable, they say!

    The problem is two-fold:

    1. Automatic braking systems depend on Doppler radar, which only detects relative differences in speed. Anything that's stationary relative to the unmoving background is ignored. It has to be ignored, because if a system dependent on Doppler radar didn't ignore the background, then it would constantly be detecting things everywhere which needed to be braked for -- false positives everywhere -- and the car would never go anywhere! (See the sketch after this list.)

    2. Tesla cars use video cameras and optical object recognition software to "see" things such as other vehicles and obstacles in the vehicle's path. Unfortunately, that software is so unreliable at placing the obstacles it "sees" that Tesla couldn't possibly rely on it. In a promo video that Tesla put out in Nov 2016, we see literally hundreds of objects being "painted" by Autopilot as "in-path" objects, even when they are stationary objects well to the side of the road -- including hundreds of trees! This is again a case of an overwhelming number of false positives. Obviously if Autopilot stopped the car every time it detected such an object, then -- again -- the car would never go anywhere.
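
    To make point 1 concrete, here is a toy Python filter showing how a pipeline keyed on Doppler relative speed ends up discarding a parked fire truck along with the rest of the stationary background. The threshold and all the numbers are invented for illustration; this is not any real radar vendor's code.

```python
# Toy illustration of point 1: a radar pipeline that keys on relative velocity
# discards returns from anything moving at (or near) the speed of the stationary
# background, which includes a truck parked in the lane. Numbers are invented.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float          # distance to the reflection
    rel_speed_mps: float    # closing speed measured via Doppler shift

def brake_worthy(returns: list[RadarReturn], ego_speed_mps: float,
                 min_rel_speed_mps: float = 2.0) -> list[RadarReturn]:
    """Keep only returns whose closing speed differs from the background's.

    A stationary object ahead closes at exactly ego_speed_mps -- the same as
    guard rails, signs, and bridges -- so it gets filtered out along with them.
    """
    background_closing = ego_speed_mps
    return [r for r in returns
            if abs(r.rel_speed_mps - background_closing) > min_rel_speed_mps]

ego = 30.0  # m/s
returns = [
    RadarReturn(range_m=80.0, rel_speed_mps=30.0),  # parked fire truck: ignored
    RadarReturn(range_m=60.0, rel_speed_mps=30.0),  # overhead sign: ignored
    RadarReturn(range_m=40.0, rel_speed_mps=8.0),   # slowing lead car: kept
]
print(brake_worthy(returns, ego))  # only the slowing lead car survives the filter
```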

    Of course, we can hope Tesla has improved the software somewhat since then, but -- based on what I know about the history of robotic R&D and the fallibility of optical object recognition software -- I think it's extremely unlikely that this can be improved, over the next few or several years, to the point that it will be reliable enough for human lives to depend on it. If roboticists have not been able to make truly reliable optical object recognition software despite working hard on the problem for decades, then it's very unlikely Tesla is going to solve that problem within the next few years.

    In my opinion, self-driving cars, or even reliably operating semi-self-driving cars, are going to need better sensors -- much better sensors -- than the low-res Doppler radar units that current cars with automatic emergency braking use. They need sensors which actively "ping" the environment to get a positive signal return, rather than relying on passive sensors such as video cameras.

    In other words, they need active scanning with lidar or a high-res radar array, or both.
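
    As a concrete illustration of what "actively pinging the environment" means, here is a tiny Python sketch of the basic time-of-flight calculation a lidar or radar unit performs; the 400 ns echo time is just an example number.

```python
# Minimal sketch of what "active" sensing buys you: a lidar or radar pulse is
# emitted, the round-trip time of the echo is measured, and range follows
# directly -- no object recognition is needed just to know something is there.

SPEED_OF_LIGHT_MPS = 299_792_458.0

def range_from_echo(round_trip_time_s: float) -> float:
    """Range to a reflecting surface from the echo's round-trip time."""
    return SPEED_OF_LIGHT_MPS * round_trip_time_s / 2.0

# An echo arriving 400 nanoseconds after the pulse left:
print(f"{range_from_echo(400e-9):.1f} m")  # ~60.0 m
```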

     
    silversod likes this.
  15. bwilson4web

    bwilson4web Well-Known Member Subscriber

    Not owning a Tesla and only having dynamic cruise control, I don't have a dog in this:
    If this were accurate, 30,000+ Teslas would be crashing much more frequently. The video shows the lead car changing lanes, which exposes a stopped car and the failure of the Tesla to stop in time. It is not a risk I'm exposed to, as dynamic cruise control is an assistant; as the driver I remain in charge and have to steer my cars myself.

    Source: https://electrek.co/2018/06/10/tesla-version-9-software-update-fully-self-driving-features-elon-musk/

    Musk announced on Twitter last night that the version 9.0 is coming in August and that it would even include the first “full self-driving features”:
    . . .
    Last week, Musk said that Tesla should be releasing the Enhanced Autopilot’s On-Ramp/Off-Ramp feature in the next couple of months.

    Enhanced Autopilot is a $5,000 package available for Autopilot 2.0 cars and Tesla also offers an additional $3,000 package called ‘Full Self-Driving Capability’.
    . . .

    Bob Wilson
     
    Last edited: Jun 13, 2018
  16. David Green

    David Green Well-Known Member

    The way I understand it, they are saying Tesla's system needs improvement, which I agree with. We need firetrucks and police cars to be able to stop safely and render aid without the worry of being smashed by a Tesla whose driver is not paying attention.
     
  17. David Green

    David Green Well-Known Member

    On automatic emergency braking, other manufacturers seem to be able to engineer a better system. Autonomous braking is going to be required soon... all the OEMs are working on it.

    On LIDAR... Ding Ding Ding... That's why every self-driving program other than Tesla's claims to need it, in one way or another. GM is taking that a step further: they are high-res LIDAR mapping all of the areas where they allow either Supercruise or the Cruise Automation to operate, for extra safety. This is what any responsible company should do for anything beyond Level 2.
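
    To illustrate the "only where it's been mapped" approach in the abstract, here is a toy Python geofence check; the road segments, coordinates, and tolerance are made up and have nothing to do with GM's actual map data.

```python
# Rough sketch of the "only where we've mapped it" idea: the hands-free feature
# is offered only when the car's position falls inside a pre-surveyed road
# segment. The segment list and tolerance are placeholders, not GM data.

from math import hypot

# Pre-mapped highway segments as (start, end) points in a local x/y frame, metres.
MAPPED_SEGMENTS = [((0.0, 0.0), (1000.0, 0.0)),
                   ((1000.0, 0.0), (1800.0, 600.0))]

def distance_to_segment(p, a, b) -> float:
    """Perpendicular distance from point p to segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    t = 0.0 if seg_len_sq == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    return hypot(px - (ax + t * dx), py - (ay + t * dy))

def assist_available(position, tolerance_m: float = 5.0) -> bool:
    """Allow hands-free assist only on (or very near) a mapped segment."""
    return any(distance_to_segment(position, a, b) <= tolerance_m
               for a, b in MAPPED_SEGMENTS)

print(assist_available((500.0, 2.0)))    # True: on the mapped highway
print(assist_available((500.0, 250.0)))  # False: off the mapped network
```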
     
    silversod likes this.
  18. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    Perhaps I should have more fully explained: if the system "sees" the car it's following slowing down, then it does detect that change in speed, and it will stop behind the car it's following even after that car comes to a full stop. (I'm not exactly sure how TACC (Traffic Aware Cruise Control) systems, including Tesla's, "mark" the position of those cars even after they've stopped, but obviously they do.) The problem comes when a vehicle is already parked in the lane when it comes into view. A vehicle such as a parked fire truck... which is why we've seen two reports of Tesla cars under Autopilot + AutoSteer crashing into parked fire trucks.
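
    Since the mechanism is explicitly a guess in the paragraph above, the sketch below is also only a guess: one plausible way a tracker could keep braking for a lead car it has watched slow to a stop, while still rejecting a never-before-seen stationary return as background clutter. The class, thresholds, and object IDs are invented for illustration.

```python
# Speculative sketch: keep a track on anything we have seen moving relative to
# the background, so we still brake for it after it stops; but a return that
# has only ever looked like background (e.g. a fire truck that was already
# parked when it came into view) is never promoted to an obstacle.

class ObstacleTracker:
    def __init__(self):
        self.tracked_moving = set()   # object ids we have seen in motion

    def should_brake_for(self, obj_id: str, rel_speed_mps: float,
                         ego_speed_mps: float) -> bool:
        closing_like_background = abs(rel_speed_mps - ego_speed_mps) < 2.0
        if not closing_like_background:
            # Seen moving relative to the background: remember it.
            self.tracked_moving.add(obj_id)
            return True
        # Stationary-looking return: only trust it if we already had a track.
        return obj_id in self.tracked_moving

tracker = ObstacleTracker()
# Lead car slows from moving to stopped: the track is kept, braking continues.
print(tracker.should_brake_for("lead_car", rel_speed_mps=10.0, ego_speed_mps=25.0))   # True
print(tracker.should_brake_for("lead_car", rel_speed_mps=25.0, ego_speed_mps=25.0))   # True (remembered)
# A fire truck that was parked before it ever came into view: no prior track.
print(tracker.should_brake_for("fire_truck", rel_speed_mps=25.0, ego_speed_mps=25.0)) # False
```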

    Do Tesla cars under control of AutoSteer crash into cars parked in the lane every time they encounter them with no intervening car? Well of course not! The driver is expected to take control to prevent such accidents from occurring, and of course usually does.

     
  20. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    I agree. I've been predicting for months now, if not over a year, that Tesla's advancement towards more capable self-driving will remain stalled until such time as Tesla (or more specifically, Elon) gives in and admits that cameras are not sufficient for dependable self-driving systems. Of course, since I'm not an expert on the subject and I don't know exactly what Tesla is doing, it's possible they will prove me wrong. But I haven't seen anything within the past several months that suggests I'm wrong, and I think it's quite noticeable that Tesla, after making rapid strides to what I think industry experts would describe as advanced Level 2 autonomy with some aspects of Level 3, has stalled out in those advancements. That is, Tesla has stalled out in the semi-self-driving tech it puts into its production cars. All we have seen for over a year now is incremental improvements in the tech it has already deployed in its mass production cars. I don't know if Tesla has stalled out in what its test fleet is doing; I hope not!

    But in the short term at least, it does appear to me that GM is now charging ahead of Tesla in advancing toward Level 3 and eventual Level 4/5 autonomy.

    Elon made some vague mention at the recent Tesla Stockholders' Meeting that Tesla would soon demonstrate a radically different approach to self-driving, or at least semi-self-driving. I'm very much looking forward to seeing what Tesla will demonstrate!

     
    silversod likes this.
  21. David Green

    David Green Well-Known Member

    Tesla's auto-steer seems to have more problems than others do, considering Mobileye is on at least 5X as many cars. Cadillac Supercruise uses Mobileye tech, but then GM added extra robustness on top, to get to a level they consider safe and prudent. The executive in charge of Supercruise was on Autoline After Hours last Thursday and had some great insight into the capabilities and liabilities.
     
  22. David Green

    David Green Well-Known Member

    You are not an expert? It sure seems like you are in a lot of your posts... Actually, when the GM self-driving exec comes out and publicly says Elon Musk is "full of crap" on self-driving, you can probably believe it. GM's executives are usually quiet and disciplined, so that was a serious comment.

    Tesla announced in the shareholder meeting that they are working to beef up the "easier system," but the "more difficult system" is not progressing well at all. Elon announced that the night before the meeting, at 1 AM, he was testing a system that drove from ramp to ramp on the freeway. Fundamentally I think the Tesla system is flawed, because now it nags you to keep your hands on the wheel every 18 seconds and does not always register your hands being on the wheel, meaning you have to add force. This has been talked about on most of the Tesla forums over the last 36 hours. What is the benefit of Autopilot if you have to pay attention, keep your hands on the wheel, and fight with the car to prove your hands are on the wheel? With Supercruise on a long drive, you just set it and pay attention, and let the car do the rest: no beta software or half-baked experience.
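
    For what it's worth, one plausible (and entirely speculative) reason lightly resting hands go undetected is a torque-based hands-on check, which only registers the driver when measurable counter-torque is applied to the wheel. The 18-second interval below echoes the complaint above; the torque threshold is invented, not Tesla's actual value.

```python
# A guess at why lightly resting hands might not register: if the hands-on
# check is torque-based, it only "sees" the driver when measurable
# counter-torque is applied to the wheel. All numbers are illustrative.

NAG_INTERVAL_S = 18.0       # interval complained about in the post above
TORQUE_THRESHOLD_NM = 0.5   # invented detection threshold

def nag_driver(measured_torque_nm: float, seconds_since_last_confirmation: float) -> bool:
    """Nag unless the driver has recently applied enough torque to be detected."""
    hands_detected = abs(measured_torque_nm) >= TORQUE_THRESHOLD_NM
    if hands_detected:
        return False
    return seconds_since_last_confirmation >= NAG_INTERVAL_S

print(nag_driver(0.1, 20.0))  # True: hands resting lightly, driver still gets nagged
print(nag_driver(0.8, 20.0))  # False: a deliberate tug on the wheel registers
```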
     
  23. Pushmi-Pullyu

    Pushmi-Pullyu Well-Known Member

    Fair warning: in this post, I'm stepping away from entirely factual statements into the realm of opinion. Hopefully informed opinion on my part, but I'm not an expert in this field, so what follows is based on what I've gleaned from my study of the subject, which may or may not be entirely correct.

    * * * * *

    I submit that's merely a matter of selective observation. The media puts a strong spotlight on Tesla, not so much on other auto makers. It's not that other auto makers' cars never run into parked fire trucks or concrete walls on the freeway, it's just that nobody finds it odd when such an everyday occurrence happens. People expect Tesla cars to avoid accidents, because of Tesla's well-publicized advancements in self-driving tech. Unfortunately, the general public has a vastly inflated idea of the ability of Autopilot+AutoSteer. There isn't much understanding of the wide gap in ability between Level 2 and Level 4 autonomy. Tesla's Autopilot+AutoSteer is rated by industry experts at Level 2; I'd argue it has some aspects of Level 3. But people expect Level 4, so they express disbelief when we tell them that Autopilot+AutoSteer can't detect stationary obstacles in the road!

    Also, it's widely reported that, for example, Cadillac's automated lane-keeping system gives a smoother ride than AutoSteer, and I do think it's odd that Tesla hasn't made much effort to reverse that situation. But despite the reports of Tesla cars under AutoSteer "lurching" from side to side in a lane, there's no question that AutoSteer significantly reduces the number of accidents. The human rider may find such lurching alarming, because that's not how humans drive. But keep in mind, the objective of those developing self-driving cars is not to slavishly try to re-create a human driver's habits. The objective is to design robot-driven cars to be far safer than human-driven cars!

    I think it's pretty clear that AutoSteer is significantly more advanced than other lane-keeping systems, insofar as utility is concerned, and probably significantly more advanced in safety, too. We haven't seen reports of other brands of automobiles having their accident rate sharply reduced by an automated lane-keeping system like AutoSteer! That's not where Tesla is falling behind.

    Other auto makers, most notably GM, are moving toward putting solid-state lidar scanners into their semi-self-driving cars. Such active sensors are what is needed for the car to have a SLAM* system, capable of real-time scanning of the environment... including stationary obstacles. That is where Tesla is falling behind. Of course, just putting lidar or high-res radar arrays into the car doesn't automatically mean the car has an adequate SLAM system, but I think it has been shown to be a requirement for that, despite Elon's protests that cameras are adequate. Active scanning with lidar or high-res radar is arguably a necessary step toward advancement, but the SLAM software also needs to be developed, and the onboard computer needs to be able to handle all that.

    *SLAM stands for Simultaneous Localization And Mapping technology, a process whereby a robot or a device can create a 3D map of its surroundings, and orient itself properly within this map in real time.
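
    For a rough feel of what the mapping half of SLAM involves, here is a toy Python sketch that folds range-sensor returns into a 2D occupancy grid with the vehicle pose assumed known. Real SLAM estimates the pose and the map together, and in 3D, so this is only the "M" in SLAM under very simplified assumptions.

```python
# Sketch of just the mapping half of SLAM: folding range-sensor hits into a
# 2D occupancy grid, with the vehicle pose assumed known. Real SLAM estimates
# the pose and the map together; this toy skips localization entirely.

import math

GRID_SIZE = 20          # cells per side
CELL_M = 1.0            # metres per cell
grid = [[0.0] * GRID_SIZE for _ in range(GRID_SIZE)]   # occupancy scores

def mark_hit(pose_xy, heading_rad, bearing_rad, range_m):
    """Mark the cell where a lidar/radar return landed as more likely occupied."""
    x = pose_xy[0] + range_m * math.cos(heading_rad + bearing_rad)
    y = pose_xy[1] + range_m * math.sin(heading_rad + bearing_rad)
    col, row = int(x / CELL_M), int(y / CELL_M)
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        grid[row][col] += 1.0   # crude evidence accumulation

# Vehicle at (2, 10) facing +x; three returns, one dead ahead at 8 m.
for bearing, rng in [(-0.2, 9.0), (0.0, 8.0), (0.2, 9.5)]:
    mark_hit((2.0, 10.0), 0.0, bearing, rng)

occupied = [(r, c) for r in range(GRID_SIZE) for c in range(GRID_SIZE) if grid[r][c] > 0]
print(occupied)   # cells the map now flags as obstacles ahead of the vehicle
```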

     
    Last edited: Jun 14, 2018
