A representative for Tesla sent Ars the following statement: “Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility.”
So, you admit that the company’s marketing has continued to lie for the past six years?
This is gonna get overturned on appeal.
The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.
Pressing your foot on the accelerator overrides any braking; the car even tells you it won’t brake while you’re doing it. That’s how it should be: the driver should always be able to override these systems in an emergency.
Maybe if he hadn’t done that (edit: held the accelerator down), the verdict would stick.
I think the bigger issue is that Tesla might be diminishing drivers’ sense of their own responsibility with its marketing/presentation of Autopilot.
I say that knowing very little about what it’s like to use Autopilot, but if there are changes that could be made that would result in fewer deaths, then maybe the guy’s lawyer has a point.
You gotta remember, this was back in 2019. Most of the talk then was about what it would be able to do once FSD was ready, but no one got access to that until 2020, and even then it was a very small invite-only group, and it stayed that way for years. I’d say the potential for confusion today is immensely greater.
I used AP back then, and it was good, but it clearly made lots of little mistakes and needed constant small adjustments. If you were paying attention, they were all easy to manage, and you even got to know when to expect problems and take corrective action in advance.
My big beef with this case is that he kept his foot on the accelerator, and the car tells you, while you do this, that it won’t brake. Keeping your foot on the accelerator is a common practice, since AP can be slow to start, or you need to pass someone, etc. So it’s really unfathomable to think that the first time this guy ever did it was when he decided to try to pick up his dropped phone and thought, “I should keep my foot on the accelerator while doing this!”

No amount of marketing should be able to override an active “Autopilot will not brake. Accelerator pedal pressed” warning, with the screen pulsing a color at him. He knew about those warnings, without any doubt in my mind. He chose to ignore them. What more could you write in that small space to warn people it will not brake?
That being said, the NHTSA found that Tesla’s driver-monitoring system was lacking, and Tesla has had to improve it because of that in recent years. People would attach oranges to the steering wheel back then to defeat the pay-attention nag, but this goes well beyond that, IMO. Even the current system won’t immediately shut down if you decide not to pay attention for some reason; it takes time before it pulls the car over, but you might get a strike against future use that will eventually prevent you from using it again.
Had his foot not been on the accelerator, and had the accident still occurred (which is still possible), this would have been a very different case.
Here’s the thing: Tesla’s marketing of Autopilot was very different from the reality. Sure, the fine print might have said that having your foot on the gas would shut down Autopilot, but the marketing made Autopilot sound much more powerful. This guy put his trust in how the vehicle was marketed, and somebody died as a result.
My car, for instance, does not have self-driving, but it will still brake if it detects I am going to hit something, even when my foot is on the gas. It is not unreasonable to think a car marketed the way Tesla’s was would have similar features.
Lastly, Tesla’s valuation as a company was based on this same marketing, not the fine print. So not only did the marketing put people in danger, but Tesla profited massively from it. They should be held responsible for this.
The car tells you it won’t brake WHILE you do it.
This isn’t a fine-print thing; it’s an active warning that you are overriding it. You must be able to override it; it’s a critical safety feature. You have to be able to override it to avoid any potential mistake it makes (critical or not). While a Level 2 system is active, human input > Level 2 input (rough sketch below).
It’s there every time you do it. It might have looked a little different in 2019, but here’s an example from the internet.
(edit: clarity + the fact that overriding with the accelerator is also explained to every user, in an on-screen tutorial of basic functionality, before they can enable Autopilot)
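To make the “human input > Level 2 input” point concrete, here is a rough sketch of that priority ordering. This is purely illustrative Python, not Tesla’s actual logic; the function name, inputs, and values are all invented for the example:

```python
# Purely illustrative sketch of Level 2 driver-override priority.
# Not Tesla's actual code; names and values are invented for this example.

def longitudinal_command(brake_pedal: float, accel_pedal: float,
                         l2_request: float) -> tuple[float, str]:
    """Pick an acceleration command, always letting the human win."""
    if brake_pedal > 0.0:
        # Any brake input disengages the system; the driver is in control.
        return -brake_pedal, "L2 disengaged: driver braking"
    if accel_pedal > 0.0:
        # Accelerator pressed: the system stays on but will not brake.
        # This is when the "will not brake" style warning is shown.
        return max(accel_pedal, l2_request), "warning: system will not brake"
    # No driver input: follow the Level 2 system's request (which may brake).
    return l2_request, "L2 active"
```

The only point of the sketch is the ordering: the brake beats everything, the accelerator beats automatic braking, and the automation only has full authority when the driver isn’t touching the pedals.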
On what grounds? Only certain things can be appealed, not “you’re wrong” gut feelings.
Just a further follow-up: you actually can appeal on the grounds that the jury was just outright wrong, but that would be a really hard, make that impossible, case to win here; I doubt that’s what they would try. But just as an FYI: https://www.law.cornell.edu/wex/judgment_notwithstanding_the_verdict_(jnov)
edit: Added emphasis there as well, which they could maybe try, I guess, given their “errors of law” comment.
Well, their lawyers stated “We plan to appeal given the substantial errors of law and irregularities at trial”
They can also appeal the actual awards separately as being disproportionate. The amount is pretty ridiculous given the circumstances even if the guilty verdict stands.
There was some racial discrimination suit Tesla lost, and the guy was awarded 137 million. Tesla appealed the amount and got it reduced to 15 million. The guy rejected the 15 million and wanted a retrial on the award, and then got 3.2 million.
That’s not a gut feeling. That’s how every cruise control has worked since it was invented. You press the brake or the accelerator? Cruise control (and Autopilot) = off.
That’s not a gut feeling; that’s what’s stated in the manual.
I’ve never had one that turns off if I accelerate.
They’ve all shut off if I tapped the brakes though.
Yep, can confirm it works that way for my car too. If I press the gas pedal enough, I can go faster than the set cruise speed (for example, if I want to pass someone). If I lightly tap the brakes, it turns off pretty much immediately.
What happens when you hit the gas while in cruise control? In all the cars I have driven, you go faster than the set speed and the car responds to your pedal movements. I guess we can debate whether we call that stopped or just paused, but it is certainly not ignoring your acceleration.
Well, yeah, you can call it “paused” if you want to. The cruise control definitely stays on though and resumes the set speed when you stop accelerating. It completely disengages when you brake though, so I’ve never thought of it as turning off when I accelerate, only when braking.
No. Press the brake and it turns off. Press the accelerator in lots of cars and it will speed up but return to the cruise control set speed when you release the accelerator. And further, Tesla doesn’t call it cruise control and the founder of Tesla has been pretty heavily misleading about what the system is and what it does. So.
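For what it’s worth, the conventional behavior being described can be sketched like this. It’s illustrative Python only, not any manufacturer’s actual implementation; the class and its fields are made up for the example:

```python
# Illustrative sketch of conventional cruise control pedal handling.
# Not based on any specific manufacturer's implementation.

class CruiseControl:
    def __init__(self, set_speed: float):
        self.set_speed = set_speed   # speed the driver set, e.g. in mph
        self.engaged = True

    def target_speed(self, current_speed: float,
                     accel_pressed: bool, brake_pressed: bool):
        """Return the speed to hold, or None if cruise is disengaged."""
        if brake_pressed:
            # Touching the brake turns cruise control off entirely.
            self.engaged = False
        if not self.engaged:
            return None
        if accel_pressed:
            # The accelerator temporarily overrides: the car follows the
            # pedal, then resumes the set speed once it is released.
            return max(current_speed, self.set_speed)
        return self.set_speed
```

In that model the accelerator only pauses the speed-holding, while the brake actually disengages it, which matches what the last few comments describe.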
Yeah, sure.
You sound like one of those people who are the reason why we find the following warning on microwave ovens:
WARNING: DO NOT TRY TO DRY PETS IN THIS DEVICE.
And on plastic bags:
WARNING: DO NOT PLACE OVER HEAD.
We both know that this is not what it’s for. And it (the Model S) has never been cleared ANYWHERE ON THIS GLOBE as an autonomous vehicle.
Adaptive cruise control (with lane assist and collision detection)/Autopilot on, foot on the accelerator, no eyes on the road, no hands on the steering wheel. That’s malice. There were visible, audible, and even tactile warnings, which this guy ignored.
No current-day vehicle (or anything from 2019) says in its manual that this is intended use. As a matter of fact, they all warn you not to do it.
And I get that you hate Tesla/Musk, don’t we all? But in this case only one person is responsible: the asshole driving it.
Nope. I’m correcting you because apparently most people don’t even know how their cruise control works. But feel however you feel.
That’s not how cruise control works, and I have never seen cruise control marketed in such a way that would make anyone believe it was smart enough to stop a car crash.