'Operational limitations' played a role in Tesla Autopilot crash
1:08 PM ET Tue, 12 Sept 2017
The "operational limitations" of Tesla's Autopilot system played a "major role" in a 2016 crash, the National Transportation Safety Board said on Tuesday.
NTSB investigators met to determine the cause of a fatal crash involving a Tesla on Autopilot in May 2016.
After meeting for roughly 2½ hours Tuesday, the board determined the probable cause of the accident was a combination of a semitrailer driver failing to yield the right of way to the Tesla driver, the Tesla driver's overreliance on the car's Autopilot system, and certain design limitations in the Autopilot system that failed to adequately warn the Tesla driver of the approaching truck.
I just want to touch on the fact that these tech companies like Tesla are pushing products onto the market that haven't been fully tested or may be dangerous to use in certain circumstances. These self-driving cars have a long way to go before I would deem them safe in my mind, but the government doesn't seem to be concerned with the risks and is pushing laws to allow self-driving cars all across the US. Clearly our politicians are being heavily lobbied by the tech companies to just ram legislation through the system as fast as possible with no forethought about the consequences.
So the main problem in the crash, according to Tesla, was that the type of road the Autopilot system was being used on wasn't suitable for engaging it. Which I totally understand if it's meant for interstate highways vs. back country roads, because there's a big difference between driving on one vs. the other. However, with all the technology built into these cars, and the known road limitations these autopilots have, why in the world isn't Tesla using a GPS system that tells the driver which roads Autopilot will work on and which roads it won't?
Sounds simple enough to have that type of safety system in place, but maybe Tesla isn't concerned about people's safety and is more concerned about selling cars. Sounds like a great case for a lawsuit against them.
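For what it's worth, the road-whitelist idea the commenter describes could be sketched in a few lines. This is purely illustrative: the road database, coordinates, road-type labels, and function names below are all invented for the example and have nothing to do with Tesla's actual implementation.

```python
# Hypothetical sketch of a GPS "road whitelist" gate for a driver-assist
# feature. All road data and names here are made up for illustration.

APPROVED_ROAD_TYPES = {"interstate", "divided_highway"}  # assumed policy

def road_type_at(lat, lon, road_db):
    """Look up the road type at a GPS fix in a (toy) map database."""
    # Round to ~100 m grid cells so nearby fixes hit the same map entry.
    return road_db.get((round(lat, 3), round(lon, 3)), "unknown")

def autopilot_allowed(lat, lon, road_db):
    """Permit engagement only on road types the system was designed for."""
    return road_type_at(lat, lon, road_db) in APPROVED_ROAD_TYPES

# Toy map: one interstate segment and one rural two-lane road.
road_db = {
    (29.651, -82.325): "interstate",
    (29.700, -82.400): "rural_two_lane",
}

print(autopilot_allowed(29.651, -82.325, road_db))  # True
print(autopilot_allowed(29.700, -82.400, road_db))  # False
```

Note that unknown roads default to "not allowed," which is the conservative choice the commenter seems to be arguing for.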
Never trust technology
I understand the desire for safety, but I'm against (most) regulation, especially in this field, since legislators are clueless about the technology.
Without testing it in real road conditions, I really don't think we'll ever get to a point where they are safe. And they may never be entirely safe. But I'd rather have the technology developed and perfected than legislated out of existence, so that we don't have to wait another 10, 20, 50 years to see it in real action.
It's also a problem of careless drivers. I'm pretty sure Tesla says at car startup that the Autopilot feature is in alpha/beta, that you should be careful about relying on it entirely, and that you still need to stay alert behind the wheel.
Ya, but if it's in beta testing, how hard is it for Tesla to put Autopilot restrictions on cars when they're driving on dangerous or untested roadways? Seems like a no-brainer to me.
They gotta still test those roadways somehow. I'd be interested in seeing what changes, if any, they've implemented from the mistakes the autopilot has made.
The story says Autopilot wasn't designed for that type of road. So they must have tested it and deemed it unsafe before the crash.
I understand the concern for safety, but Tesla gives full disclosure on the use of its Autopilot system. The driver must be fully aware of his surroundings at all times while the Autopilot system is engaged, much like an airline pilot paying attention to weather patterns, other airplanes, and directions from various control towers. The C117 can take off, fly to its destination, and land all by itself. That doesn't mean pilots hit go and take a nap. You must hold the owners of this technology responsible for not using it as instructed. I see no fault here for Tesla, which has given and continues to give adequate warning to its customers. In a similar vein, Dodge warns customers who buy its SRT performance cars, especially the Hellcat and Demon. The customer cannot blame the manufacturer for wrecking their car because they don't know how to drive a 700+ hp car.
In all wrecks the driver is at fault, but who's the driver when Autopilot is on? Sure, the person in the driver's seat can turn it on and off, but are they truly in control of the vehicle when it's on and their hands and feet are off the steering wheel and pedals, or is Tesla in control?
Whoever is in control of the vehicle at the time of the crash is to blame for it, so I tend to agree with the NTSB on this one.
The owner should have been paying attention to his surroundings instead of dozing off. All he had to do was brake (bringing the car out of Autopilot) and allow the truck to merge. You cannot blame Tesla for his own negligence.