Tesla Avoided a Recall, But That Doesn't Mean Its Autopilot Is Always Safe | Edmunds

Edmunds.com Member, Administrator, Moderator Posts: 10,135
edited January 2017 in Tesla

A NHTSA investigation into a fatal Tesla crash didn't find any cause for a recall, but that doesn't mean the system is completely safe.

Read the full story here


Comments

  • blood_donor Member Posts: 1
    This is a good demonstration of how partial autonomy can be very dangerous. It creates a moral hazard, encouraging some drivers not to pay attention, when the system may not be ready to handle all situations.
  • dokterv8 Member Posts: 8
    Thank you. If you've driven enough miles, you've had one of these experiences, whether in a Tesla or not. Driving aids like auto-braking, etc. are good for when you're not paying attention. But there is no substitute for your own instincts and experience as a driver; some things can't be programmed or coded into an algorithm. If Autopilot becomes ubiquitous, we will see just as many accidents, but of a different nature.
  • imispgh Member Posts: 1
    Lockheed Engineer/Whistleblower - NHTSA made two fatal mistakes regarding Tesla “Autopilot” and Joshua Brown’s death

    https://www.linkedin.com/pulse/nhtsa-made-two-fatal-mistakes-regarding-tesla-autopilot-dekort?trk=hp-feed-article-title-publish

    First – I want to make it clear that I want driverless vehicles to be a reality. But I want it done right. I also understand that the metrics show Tesla and others are already statistically saving lives. However, they are unnecessarily putting lives at risk with the approach they are taking. NHTSA’s decision will lead to needless injury, embolden and enable poor practice, and make it far more difficult to hold these companies accountable.

    Regarding the term “Autonomous”: until these vehicles are proven to handle all of the scenarios they need to handle to actually be autonomous and safe, they should not carry that name. They should be called driver assist. Regardless of the fine print, which I understand may be legally sufficient, the public’s trust is based on an assumption of rigor and quality that simply does not exist. Mr. Brown died because his car hit a tractor trailer right in front of him. Not hitting a very large object directly in front of you should be among the top ten scenarios in any set of use cases. Tesla added radar capability after the crash that should have been there from the start.

    Regarding the mistaken approach this industry is taking: there is absolutely no need to put the public at risk and waste time driving around to gather primary data inputs. There are ways to gather data, and to engineer and test to it, that do not require this. They involve simulation/simulators, engaging the right folks and the right engineering practices (which commercial IT has never used), and the creation of a scenario matrix. That large truck in the way, and Tesla only now releasing code that allows cars to exit a highway, show that the scenario matrix is nowhere near ready. These vehicles should not be on the road, except in controlled situations, until this matrix is complete, or at least progressively complete by situation. Gathering data by putting people at risk until human Guinea pigs stumble onto situations is the wrong way to do this.

    The problem here is that this industry and NHTSA have a limited experience set. They are far too enthralled with apps, websites and games. They think they are using best practice and that this is the best way to do this. They think simulation and simulators are not up to the task. None of that is accurate. The major airlines, FAA, DoD and NASA plowed most of these fields long ago. (Look at fighter jets' hands-free carrier landings, the sensor integration in an MH-53M helo, and FAA Level D simulators.) Multi-sensor use and integration. Creating massive capability scenarios built mostly around exception handling. (Those are non-desired situations; many lead to accidents.) Commercial IT rarely does any of this. They have little experience in it, and their practices not only don't accommodate it, they aren't remotely best practice. Yes, I get how crazy and arrogant this sounds. I am telling you that this is a perfect storm of assumptions: the assumption that the PayPals, Twitters and Googles of the world use best practice and make things so complex they can do anything. That is wrong.
    Tesla just hired a lead for their driverless division who made an OS for Apple. Uber hired someone who was at Twitter for eight years. Neither of these folks is remotely qualified for this (no more than the folks who understand this domain would be qualified to make an OS or PayPal). These folks live in a massive world where they think they are the be-all and end-all of technology and innovation. They have done great and cool things, but not this. They don't know what they don't know and are needlessly putting people at risk. When Elon Musk first sent his SpaceX code to NASA for review, they rejected it. They asked where the defects were and where the exception handling was. Elon said we have no defects; we tested and fixed them. NASA told them it is impossible to test this properly, especially with the vast number of exception-handling situations, and to have no defects on the books. This is not Twitter, PayPal, Facebook, etc. It is light years more difficult, more complex and larger. The solution is to use actual best practices, ask for help from folks who have plowed the autonomous fields before, especially with massive exception handling, create that matrix, and use simulation/simulators, test tracks and some controlled public data gathering. You will get there much faster and put very few at risk.

    I implore NHTSA to do their homework and revisit the decision. Start with that scenario matrix, and publish it. The public, the insurance industry, traffic engineers, the automotive industry, and those who have used best practice and done automation before, like DoD, NASA, Boeing etc., should be able to see it and weigh in. This area is far too important not to get opinions from as many qualified folks as possible.

    (The biggest problem out there is comma.ai selling an “autonomous” kit for some Hondas. That kit allows anyone to change the software. That is insane. Even if this product had a tested and complete scenario matrix, which I bet it does not, you cannot allow the public to tweak that software. Few are qualified, not counting the immense risk of hacking.)

    Please find more specifics in my running article here - https://www.linkedin.com/pulse/nhtsa-should-shut-down-all-auto-piloted-self-driving-cars-dekort?trk=mp-author-card