Did you recently take on (or consider) a loan of 84 months or longer on a car purchase?
A reporter would like to speak with you about your experience; please reach out to PR@Edmunds.com by 7/22 for details.
So this might be a case where autopilot will be a severe detriment to the lawsuit and/or insurance claim.
Pilot training must be at least 100X better than driver training, and some of us are aware that there are bad pilots out there.
So you can imagine the number of incompetent drivers out there who space out even without Autopilot, much less with it.
At a minimum it is a legal and moral responsibility to mitigate any damages caused by an illegal act (whether it was negligent is another matter). At a maximum, deliberately causing a collision could be considered attempted murder (although in the case of a Tesla hitting a semi, this one might be overboard). However, what if it was a Tesla deliberately hitting a motorcyclist?
Tesla says that before Autopilot can be used, drivers have to acknowledge that the system is an "assist feature" that requires a driver to keep both hands on the wheel at all times. Drivers are told they need to "maintain control and responsibility for your vehicle" while using the system, and they have to be prepared to take over at any time.
http://www.freep.com/story/money/cars/2016/07/05/southfield-art-gallery-owner-survives-tesla-crash/86712884/
And Musk is now trotting out the 'it's an anti-Tesla conspiracy' nonsense regarding the fact that Tesla didn't publicize the accident until after a huge stock sale.
If this makes you go "Wow" (or "Why?"), just wait until all cars have cap-less fuel fillers like Ford's and the car is fuelled robotically. This is projected to go right down to the dispenser linking to your cell phone or the car's Bluetooth for payment.
I mean, I like the idea of switching to autopilot in very low speed traffic crawl. At worst, when the autopilot makes an error (which it will, invariably), the result might be no worse than a bumper tap.
At this point, I don't even want to ride in a Tesla with the owner on autopilot beaming to his passengers: "Look Ma no hands!".
That being said, Tesla should be forced to rename its system to something other than "autopilot", as it isn't one.
My main concern is that all these "driver aids" are going to dumb-down more drivers than they are going to assist.
I know that one risks being branded a "luddite" if one resists any kind of innovation, but my counter argument is that today's "futurists" aren't like the ones who used to sit in think tanks and really THINK things out. Today's "futurists" who hang out at TED talks sometimes seem to be borderline product hucksters more than careful planners of future tech.
Many people are not fans of TED talks. I had a boss who was addicted to them, and some co-workers were enthralled, but I think the majority finally got bored and they stopped.
Well, let's see--we'll learn astrophysics from 9 to 10 AM, then move onto The Nature of Autonomous Driving Systems just before lunch.
Man, this sure beats 8 years of post-doc.
http://www.autonews.com/article/20160711/OEM05/307119921/hyundai-steps-up-ev-cadence
A Tesla spokesperson released a moment-by-moment description of what happened in the 40 seconds before the crash.
After 15 seconds of what was described by Tesla as "visual warnings and audible tones," the autopilot began to disengage because the driver's hands were still not on the wheel.
About 25 seconds before the crash, "Autosteer began a graceful abort procedure in which the music is muted, the vehicle begins to slow and the driver is instructed both visually and audibly to place their hands on the wheel," according to the company.
Tesla said the driver responded 11 seconds before the crash by retaking the wheel, turning it toward the left and pressing on the accelerator.
And really, in 11 seconds you can't avoid a collision?
Something isn't adding up with any of this.
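For perspective on what 11 seconds means on a highway, here's a quick back-of-the-envelope calculation. The 65 mph figure is an assumption for illustration; Tesla's account doesn't state the vehicle's speed:

```python
# How much road does a car cover in 11 seconds?
# 65 mph is an assumed speed, not a figure from Tesla's account.
MPH_TO_FPS = 5280 / 3600  # feet per second, per mph

speed_mph = 65
seconds = 11

feet_per_second = speed_mph * MPH_TO_FPS   # roughly 95 ft/s
distance_feet = feet_per_second * seconds  # roughly 1050 ft, about a fifth of a mile

print(f"{distance_feet:.0f} feet")
```

Even granting some reaction lag, a fifth of a mile is a lot of room in which to brake or steer clear, which is what makes the timeline hard to square.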
Just goes to show you how there's no such thing as a "foolproof" system. You provide the system, Nature provides the fool that'll prove you wrong.
PS: A little off the subject. Am I the only one who gets insulted when someone says: "You'll love this. It's foolproof!"?
As anyone who has watched Russian dash-cam wrecks on YouTube knows, not all accidents are preventable, no matter what the driver, or the machine, does.
The Tesla driver in FL is the one that got killed. And that was on autopilot as far as I can tell. Did not react to a semi crossing in front of him.
Best keep your hands on the wheel and play Pokemon at home.
If this was a 77 year old crashing his Avalon, it wouldn't be a surprise. I suspect this guy's VCR is blinking 12:00.
The Barnum line is a good one, although IIRC the "sucker" quote was actually about him. I remember when the PT Cruiser was new and selling for huge amounts over MSRP, a friend of mine said the "PT" was for P.T. Barnum.
You may be too young to know about W.C. Fields, but he had many good lines as did Groucho Marx.
This month, two Teslas equipped with Autopilot veered into barriers following disclosure of the first fatal wreck, a Model S slamming into an 18-wheeler crossing a Florida highway after the semi-autonomous car failed to distinguish the truck’s white trailer from the sky.
“The moment I saw Tesla calling it Autopilot, I thought it was a bad move,’’ said Lynn Shumway, a lawyer who specializes in product liability cases against carmakers. “Just by the name, aren’t you telling people not to pay attention?’’
http://www.bloomberg.com/news/articles/2016-07-15/tesla-won-t-be-able-to-put-crash-defense-on-autopilot
There are still plenty of second generation ones around. Are these more reliable than the first generation, or are they still around because they're newer and less miled up?