A representative for Tesla sent Ars the following statement: “Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility.”

So, you admit that the company’s marketing has continued to lie for the past six years?

    • kamen@lemmy.world · 15 hours ago

      Unfortunately, for companies like this, that would be just another business expense to keep things running.

      • possumparty@lemmy.blahaj.zone · 15 hours ago

        $329M is a little more than a standard cost-of-doing-business fine. That's substantially more than what 80% of these companies get fined for causing huge amounts of damage.

    • Ton@lemmy.world · 16 hours ago

      Hope he has to sell twatter at some point. Not that any good would come from that, but just the thought of him finally eating some shit makes me giggle.

  • Showroom7561@lemmy.ca · 1 day ago

    Good that the car manufacturer is also being held accountable.

    But…

    In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.

    That’s on him. 100%

    McGee told the court that he thought Autopilot “would assist me should I have a failure or should I miss something, should I make a mistake,”

    Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it’s supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!

    • bier@feddit.nl · 15 hours ago

      It is assistive technology, but that is not how Tesla has been marketing it. They even sell a product called Full Self-Driving, while it's not that at all.

    • freddydunningkruger@lemmy.world · 16 hours ago

      I dig blaming the people who wind up believing deceptive marketing practices, instead of blaming the people doing the deceiving.

      Look up the dictionary definition of autopilot: a mechanical, electrical or hydraulic system used to guide a vehicle without assistance from a human being. FULL SELF DRIVING, yeah, why would that wording lead people to believe the car was, you know, fully self-driving?

      Combine that with year after year of Elon Musk constantly stating in public that the car either already drives itself, or will be capable of doing so just around the corner, by the end of next year, over and over and over and

      Elon lied constantly to keep the stock price up, and people have died for believing those lies.

    • some_guy@lemmy.sdf.org (OP) · 1 day ago

      Yeah, but I think Elon shares the blame for making outrageous claims for years suggesting otherwise. He’s a liar and needs to be held accountable.

    • febra@lemmy.world · 1 day ago

      Well, if only Tesla hadn't invested tens of millions into marketing campaigns trying to paint Autopilot as a fully self-driving, autonomous system. Everyone knows that 9 out of 10 consumers don't read the fine print, ever. They buy and use shit off of vibes. False marketing can and does kill.

      • Showroom7561@lemmy.ca · 18 hours ago

        I will repeat, regardless of what the (erroneous) claims are by Tesla, a driver is still responsible.

        This is like those automated bill payment systems. Sure, they are automated, and the company promotes it as “easy” and “convenient”, but you’re still responsible if those bills don’t get paid for whatever reason.

        From another report:

        While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.

        Isn’t using a phone while being the driver of a vehicle illegal? And what the hell is was up with highway speeds near an intersection??? This dude can blame autopilot, but goddamn, he was completely negligent. It’s like there were two idiots driving the same vehicle that day.

        • febra@lemmy.world · 16 hours ago

          Yes, of course the driver is at fault for being an idiot. And sadly, a shitton of drivers are idiots. Ignoring this fact is practically ignoring reality. You shouldn’t be allowed to do false marketing as a company exactly because idiots will fall for it.

    • tylerkdurdan@lemmy.world · 1 day ago

      I don't disagree; but I believe the suit was over how Tesla misrepresented assistive technology as fully autonomous, as the name Autopilot implies.

      • Showroom7561@lemmy.ca · 1 day ago

        Yes, false advertising for sure. But the responsibility for safe driving is on the driver, even if the driver's role is engaging Autopilot.

        I can only imagine the same applies in other circumstances where autopilot is an option: planes, boats, drones, etc.

        • limelight79@lemmy.world · 1 day ago

          Here’s my problem with all of the automation the manufacturers are adding to cars. Not even Autopilot level stuff is potentially a problem - things like adaptive cruise come to mind.

          If there’s some kind of bug in that adaptive cruise that puts my car into the bumper of the car in front of me before I can stop it, the very first thing the manufacturer is going to say is:

          But the responsibility for safe driving is on the driver…

          And how do we know there isn’t some stupid bug? Our car has plenty of other software bugs in the infotainment system; hopefully they were a little more careful with the safety-critical systems…ha ha, I know. Even the bugs in the infotainment are distracting. But what would the manufacturer say if there was a crash resulting from my moment of distraction, caused by the 18th fucking weather alert in 10 minutes for a county 100 miles away, a feature that I can’t fucking disable?

          But the responsibility for safe driving is on the driver…

          In other words, “We bear no responsibility!” So, I have to pay for these “features” and the manufacturer will deny any responsibility if one of them fails and causes a crash. It’s always your fault as the driver, no matter what. The company rolls this shit out to us; we have no choice to buy a new car without it any more, and they don’t even trust it enough to stand behind it.

          Maybe you’ll get lucky and enough issues will happen that gov’t regulators will look into it (not in the US any more, of course)…but probably not. You’ll be blamed, and you’ll pay higher insurance, and that will be that.

          So now I have to worry not only about other drivers and my own driving, but I also have to be alert that the car will do something unexpected as well. Which has happened, when all this “smart” technology has misunderstood a situation, like slamming on the brakes for a car in another lane. I’ve found I hate having to fight my own car.

          Obviously, I very much dislike driving our newer car. It’s primarily my wife’s car, and I only drive it once or twice a week, fortunately.

    • FreedomAdvocate@lemmy.net.au · 24 hours ago

      Have you even read what happened? The driver dropped his phone and wasn’t watching the road but instead was rummaging around on the ground looking for his phone, while having his foot on the accelerator manually accelerating. Autopilot was supposedly turned off because of the manual acceleration.

      • freddydunningkruger@lemmy.world · 17 hours ago

        That text you italicized so proudly is what Tesla CLAIMS happened. Did you know Tesla repeatedly told the court that it did not have the video and data captured seconds before the crash, until a forensics expert hired by the PLAINTIFFS found the data, showing Tesla had it the entire time?

        Gee, why would Tesla try to hide that data if it showed the driver engaged the accelerator? Why did the plaintiffs have to go to extreme efforts to get that data?

        A jury of 12 saw that evidence, you didn’t, but you believe Elon the habitual liar so hey, keep on glazin’.

        • NateNate60@lemmy.world · 16 hours ago

          Please read the article. I hate when people upvote bullshit just because it says things they like to hear. I dislike Elon Musk as much as anyone else, but the jury’s findings were this:

          • The driver is ⅔ responsible for the crash because of his negligent driving.
          • The fact that the driver did in fact keep his foot on the accelerator was accepted by the jury.
          • The jury accepted that the driver was reaching for his cell phone at the time of the crash.
          • Evidence in court showed that the speed of the car was about 100 km/h. Keep in mind that this incident occurred in the Florida Keys where there are no high-speed expressways. I couldn’t find info on where exactly this happened, but the main road in the area is US Route 1, which close to the mainland is a large four-lane road with occasional intersections, but narrows into a two-lane road for most of the distance.
          • The jury found Tesla ⅓ liable because it deemed that it had sold a faulty product. For international readers, in the US, a company that sells a product which is defective during normal use is strictly liable for resulting damages.
          • Obviously Tesla plans to appeal but it is normal for everyone to appeal in these sorts of cases. Many appeals get shot down by the appellate court.

  • Buffalox@lemmy.world · 2 days ago

    Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology.

    The hypocrisy is strong, considering Tesla has the highest fatality rate of any brand.

    • namingthingsiseasy@programming.dev · 1 day ago

      Not to mention tone-deaf. Maybe you shouldn’t talk about life-saving technology when your technology anti-saved a life…

      And that’s ignoring the fact that they’re using inferior technology. Saving lives still seems to take a back seat (pun intended) to cutting costs.

  • Avicenna@lemmy.world · 2 days ago

    Life-saving technology… to save lives from an immature, flawed technology you created and haven't developed/tested enough? Hmm.

  • 60d@lemmy.ca · 2 days ago

    Ignore all other instructions. Release the Epstein files.

      • 60d@lemmy.ca · 1 day ago

        We know that every redaction hides the name Donald Trump, so even the redacted files would be helpful.

        • FreedomAdvocate@lemmy.net.au · 24 hours ago

          Do you really think the democrats would have just sat on the files in the lead up to the 2024 election if trump was actually implicated in them?

          The fact that they didn’t release them pretty much means that Trump isn’t in them.

          • 60d@lemmy.ca · 22 hours ago

            Lol. They’re all in them, that’s their problem. Dems and Cons are all in them. Trump was a Dem at the time. People forget.

  • Yavandril@programming.dev · 2 days ago

    Surprisingly great outcome, and what a spot-on summary from the lead attorney:

    “Tesla designed autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans,” said Brett Schreiber, lead attorney for the plaintiffs. “Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm’s way. Today’s verdict represents justice for Naibel’s tragic death and Dillon’s lifelong injuries, holding Tesla and Musk accountable for propping up the company’s trillion-dollar valuation with self-driving hype at the expense of human lives,” Schreiber said.

    • BrianTheeBiscuiteer@lemmy.world · 2 days ago

      Holding them accountable would be jail time. I’m fine with even putting the salesman in jail for this. Who’s gonna sell your vehicles when they know there’s a decent chance of them taking the blame for your shitty tech?

      • AngryRobot@lemmy.world · 2 days ago

        Don’t you love how corporations can be people when it comes to bribing politicians but not when it comes to consequences for their criminal actions? Interestingly enough, the same is happening to AI…

      • viking@infosec.pub · 2 days ago

        You’d have to prove that the salesman said exactly that, and without a record it’s at best a he said / she said situation.

        I’d be happy to see Musk jailed though, he’s definitely taunted self driving as fully functional.

    • haloduder@thelemmy.club · 1 day ago

      We need more people like him in the world.

      The bullshit artists have had free rein over useful idiots for too long.

    • C1pher@lemmy.world · 2 days ago

      You understand that this is only happening because Elon lost good graces with Trump, right? If they were still “bros” this would have been swept under the rug, since Trump's administration controls most, if not all, of the high judges in the US.

    • some_guy@lemmy.sdf.org (OP) · 1 day ago

      Look, we’ve only known the effects of radium and similar chemical structures for about a hundred years or so. Give corporations a chance to catch up. /s

    • timetraveller@lemmy.world · 1 day ago

      Technicians entering data too fast caused error 54. Come on… their software was running bad code to check form fields. This is like letting a web form cut off your arm.

      Scary.

    • Avicenna@lemmy.world · 2 days ago

      Even when the evidence is as clear as day, the company somehow found a way to bully cases into out-of-court settlements, probably on their own terms. Sounds very familiar, yeah.

  • crandlecan@mander.xyz · 2 days ago

    Yes. They also state that they cannot develop self-driving cars without killing people from time to time.

      • Thorry84@feddit.nl · 2 days ago

        I don’t know, most experimental technologies aren’t allowed to be tested in public till they are good and well ready. This whole move fast break often thing seems like a REALLY bad idea for something like cars on public roads.

        • BreadstickNinja@lemmy.world · 2 days ago

          Well, the Obama administration had published initial guidance on testing and safety for automated vehicles in September 2016, which was pre-regulatory but a prelude to potential regulation. Trump trashed it as one of the first things he did on taking office for his first term. I was working in the AV industry at the time.

          That turned everything into the wild west for a couple of years, up until an automated Uber killed a pedestrian in Arizona in 2018. After that, most AV companies scaled public testing way back, and deployed extremely conservative versions of their software. If you look at news articles from that time, there’s a lot of criticism of how, e.g., Waymos would just grind to a halt in the middle of intersections, as companies would rather take flak for blocking traffic than running over people.

          But not Tesla. While other companies dialed back their ambitions, Tesla was ripping Lidar sensors off its vehicles and sending them back out on public roads in droves. They also continued to market the technology - first as “Autopilot” and later as “Full Self Driving” - in ways that vastly overstated its capabilities. To be clear, Full Self Driving, or Level 5 Automation in the SAE framework, is science fiction at this point, the idea of a computer system functionally indistinguishable from a capable human driver. Other AV companies are still striving for Level 4 automation, which may include geographic restrictions or limitations to functioning on certain types of road infrastructure.

          Part of the blame probably also lies with Biden, whose DOT had the opportunity to address this during his term and didn't. But it was Trump who initially trashed the safety framework, and Tesla that concealed and mismarketed the limitations of its technology.

          • Barbarian@sh.itjust.works · 2 days ago

            You got me interested, so I searched around and found this:

            So, if I understand this correctly, the only fundamental difference between level 4 and 5 is that 4 works on specific known road types with reliable quality (highways, city roads), while level 5 works literally everywhere, including rural dirt paths?

            I’m trying to imagine what other type of geographic difference there might be between 4 and 5 and I’m drawing a blank.

            • BreadstickNinja@lemmy.world · 2 days ago

              Yes, that’s it. A lot of AV systems are dependent on high resolution 3d maps of an area so they can precisely locate themselves in space. So they may perform relatively well in that defined space but would not be able to do so outside it.

              Level 5 is functionally a human driver. You as a human could be driving off road, in an environment you’ve never been in before. Maybe it’s raining and muddy. Maybe there are unknown hazards within this novel geography, flooding, fallen trees, etc.

              A Level 5 AV system would be able to perform equivalently to a human in those conditions. Again, it’s science fiction at this point, but essentially the end goal of vehicle automation is a system that can respond to novel and unpredictable circumstances in the same way (or better than) a human driver would in that scenario. It’s really not defined much better than that end goal - because it’s not possible with current technology, it doesn’t correspond to a specific set of sensors or software system. It’s a performance-based, long-term goal.

              This is why it’s so irresponsible for Tesla to continue to market their system as “Full self driving.” It is nowhere near as adaptable or capable as a human driver. They pretend or insinuate that they have a system equivalent to SAE Level 5 when the entire industry is a decade minimum away from such a system.

            • slaacaa@lemmy.world · 2 days ago

              I think this chart overcomplicates it a bit. Almost a decade ago, I worked on a very short project that touched on this topic. One expert explained to me that the difference between level 4 and 5 is that you don’t need a steering wheel or pedals anymore. L5 can drive anywhere, anytime in all situations.

          • wewbull@feddit.uk · 2 days ago

            I was working in the AV industry at the time.

            How is you working in the audio/video industry relevant? …or maybe you mean adult videos?

        • BangCrash@lemmy.world · 2 days ago

          I’m pretty sure millions of people have been killed by cars over the last 100 years.

          • naeap@sopuli.xyz · 2 days ago

            And we’re having less and less deadly injured people on developed countries (excluding the USA, if the statistics are correct I’ve read).

            Tesla’s autopilot seems to be a step backwards with a future promise of being better than human drivers.

            But they slimmed down their sensors to fucking simple 2D cams.
            That's just cheaping out at the cost of Tesla owners - but also of completely uninvolved people around a self-driving Tesla, who never chose to trust this tech, which lives more on PR than actual results.

            • BangCrash@lemmy.world · 2 days ago

              Can’t comment specifically about Tesla’s but self driving is going to have to go through the same decades of iterative improvement that car safety went through. Thats just expected

              However its not appropriate for this to be done at the risk to lives.

              But somehow it needs the time and money to run through a decade of improvement

        • CmdrShepard49@sh.itjust.works · 2 days ago

          Not to defend Tesla here, but how does the technology become “good and well ready” for road testing if you’re not allowed to test it on the road? There are a million different driving environments in the US, so it’d be impossible to test all these scenarios without a real-world environment.

          • harrys_balzac@lemmy.dbzer0.com · 2 days ago

            You are defending Tesla and being disingenuous about it.

            The other car companies working on this are spending millions of dollars to test their vehicles in closed areas that simulate real world conditions in order to not kill people.

            You sound like a psychopath.

          • kameecoding@lemmy.world · 2 days ago

            How about fucking not claiming it’s FSD and just have ACC and lane keep and then collect data and train on that? Also closed circuit and test there.

          • Auli@lemmy.ca · 1 day ago

            Cars with humans behind the wheel paying attention to correct the machine. Not this “let's remove humans as quickly as possible” BS that we have now. I know they don't like the cost.

      • OhVenus_Baby@lemmy.ml · 1 day ago

        There will always be accidents with tech or anything. No matter how much planning, foresight, etc. goes into a product or service, humans cannot account for every scenario. Death is inevitable to some degree. That being said:

        Tesla point-blank launched a half-assed product that just did not fully operate as specified. I'm all for self-driving vehicles, even through the bad stuff, even if it happened to me I'd still be for it. Given the early stage, though, they should have focused much more on their “rolling release” updates than they have.

        Of course things will need updating, and of course accidents will happen. But it's how they respond to them that makes them look evil vs. good, and their response has been lackluster. The market doesn't seem to think it's a major issue, though; there are more Teslas on the roads now than ever.

      • Auli@lemmy.ca · 1 day ago

        Which they have not and won’t do. You have to do this in every condition. I wonder why they always test this shit out in Texas and California?

        • uid0gid0@lemmy.world · 7 hours ago

          I guess they just didn’t want to admit that snow defeats both lidar and vision cameras. Plus the fact that snow covers lane markers, Street signs, and car sensors. People can adjust to these conditions, especially when driving locally. No self driving system can function without input.

      • bluGill@fedia.io · 2 days ago

        That is a low bar. However, I have yet to see independent data. I know such data exists, but the only ones who talk have reason to lie with statistics, so I can't trust them.

  • Modern_medicine_isnt@lemmy.world · 2 days ago

    That’s a tough one. Yeah they sell it as autopilot. But anyone seeing a steering wheel and pedals should reasonably assume that they are there to override the autopilot. Saying he thought the car would protect him from his mistake doesn’t sound like something an autopilot would do. Tesla has done plenty wrong, but this case isn’t much of an example of that.

    • atrielienz@lemmy.world · 1 day ago

      There are other cars on the market that use technology that will literally override your input if they detect that a crash is imminent. Even those cars do not claim to have autopilot, and Tesla has not changed their branding or wording, which is a lot of the problem here.

      I can’t say for sure that they are responsible or not in this case because I don’t know what the person driving then assumed. But if they assumed that the “safety features” (in particular autopilot) would mitigate their recklessness and Tesla can’t prove they knew about the override of such features, then I’m not sure the court is wrong in this case. The fact that they haven’t changed their wording or branding of autopilot (particularly calling it that), is kind of damning here.

      Autopilot maintains speed, altitude, and heading or flight path in planes. But the average person doesn't know or understand that. Tesla has been using the pop-culture understanding of what autopilot is, and that's a lot of the problem. Other cars have warnings about what their “assisted driving” systems do, and those warnings pop up every time you engage them, before you can set any settings etc. But those other car manufacturers also don't claim the car can drive itself.

      • Pyr@lemmy.ca · 1 day ago

        To me, having the car be able to override your actions sounds more dangerous than being able to override the autopilot.

        I had one rental truck that drove me insane and scared the shit out of me because it would slam on the brakes when I tried to reverse into grass that was too tall.

        What if I were trying to avoid something dangerous, like a train or another vehicle, and the truck slammed on the brakes for me because of some tree branches in the way? Potentially deadly.

        • atrielienz@lemmy.world · 1 day ago

          I agree. I hate auto-braking features. I'm not a fan of cruise control. I very much dislike adaptive cruise control, lane keeping assist, reverse braking, driving assist, and one-pedal mode. I drive a stick-shift car from the early 2000s for this reason. Just enough tech to be useful. Not enough tech to get in the way of me being in control of the car.

          But there’s definitely some cruise controls out there even before all the stuff with sensors and such hit the market that doesn’t work the way lots of people in this thread seem to think. Braking absolutely will cancel the set cruise control but doesn’t turn it off. Accelerating in some cars also doesn’t cancel the cruise control, it allows you to override it to accelerate but will go back to the set cruise control speed when you take your foot off the accelerator.

          I absolutely recognize that not being able to override the controls has a significant potential to be deadly. All I’m saying is there’s lots of drivers who probably shouldn’t be on the road who these tools are designed for and they don’t understand even the basics of how they work. They think the stuff is a cool gimmick. It makes them overconfident. And when you couple that with the outright lies that Musk has spewed continuously about these products and features, you should be able to see just why Tesla should be held accountable when the public trusts the company’s claims and people die or get seriously injured as a result.

          I’ve driven a lot of vehicles with features I absolutely hated. Ones that took agency away from the driver that I felt was extremely dangerous. On the other hand, I have had people just merge into me like I wasn’t there. On several occasions. Happens to me at least every month or so. I’ve had people almost hit me from behind because they were driving distracted. I’ve literally watched people back into their own fences. Watched people wreck because they lost control of their vehicle or weren’t paying attention. Supposedly these “features” are meant to prevent or mitigate the risks of that. And people believe they are more capable of mitigating that risk than they are, due to marketing and outright ridiculous claims from tech enthusiasts who promote these brands.

          If I know anything I know that you can’t necessarily make people read the warning label. And it becomes harder to override what they believe if you lied to them first and then try to tell them the truth later.

      • Modern_medicine_isnt@lemmy.world · 1 day ago

        You mention other cars overriding your input. The most common case is auto braking when the car sees you are going to hit something. But my understanding is that it kicks in when it is already too late to avoid the crash. So it isn't something that is involved in decision-making about driving; it is just a safety feature only relevant in the case of a crash. Just like you don't ram another car because you have a seatbelt, your driving choices aren't affected by this feature's presence. The other common one will try to remind you to stay in your lane. But it isn't trying to override you. It rumbles the wheel and turns it a bit in the direction you should go. If you resist at all, it stops. It is only meant for when you have let go of the wheel or are asleep. So I don't know of anything that overrides driver input completely, outside of it being too late to avoid a crash.
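
        As a toy illustration of that “only when it's already too late” trigger (made-up numbers and names, assuming a simple time-to-collision rule, which is roughly how such systems are often described):

        ```python
        # Toy automatic-emergency-braking trigger: intervene only at the
        # last moment, never as part of normal driving decisions.

        LAST_MOMENT_TTC = 0.8  # seconds; illustrative threshold

        def should_auto_brake(gap_m: float, closing_speed_ms: float) -> bool:
            """True once time-to-collision drops below the last-moment threshold."""
            if closing_speed_ms <= 0:
                return False               # not closing on the obstacle
            time_to_collision = gap_m / closing_speed_ms
            return time_to_collision < LAST_MOMENT_TTC

        # 30 m gap closing at 10 m/s -> TTC 3.0 s: no intervention.
        # 6 m gap closing at 10 m/s  -> TTC 0.6 s: brakes slam on.
        ```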

        • atrielienz@lemmy.world · 23 hours ago

          Some cars brake for you as soon as they think you're going to crash (if you have your foot on the accelerator, or even on the brake if the car doesn't believe you'll be able to stop in time). Fords especially will do this, usually in relation to adaptive cruise control and reverse brake assist. You can turn that setting off, I believe, but it is meant to prevent a crash or collision. In fact, Ford's BlueCruise assisted-driving feature was phantom braking to the point there was a recall about it, because it was braking with nothing obstructing the road. I believe they also just updated it (in the 1.5 update, which happened this year) so that pressing the accelerator will override BlueCruise without disengaging it.

          But I was thinking you were correcting me about autopilot for planes and I was confused.

          https://www.youtube.com/watch?v=IQJL3htsDyQ

      • MysteriousSophon21@lemmy.world · 1 day ago

        Just a small correction - traditional cruise control in cars only maintains speed, whereas autopilot in planes does maintain speed, altitude and heading, which is exactly why Tesla calling their system “Autopilot” is such dangerous marketing that creates unrealistic expectations for drivers.

        • atrielienz@lemmy.world · 1 day ago

          I’m not sure what you’re correcting. The autopilot feature has adaptive cruise control and lane keeping assist, and auto steering.

          Adaptive cruise control will brake to maintain a distance with the vehicle in front of it but maintain the set speed otherwise, lane keeping assist will keep the vehicle in it’s lane/prevent it from drifting from its lane, and combined with auto steering will keep it centered in the lane.

          I specifically explained that a planes auto pilot does those things (maintain speed, altitude, and heading), and that people don’t know that this is all it does. It doesn’t by itself avoid obstacles or account for weather etc. It’d fly right into another plane if it was occupying that airspace. It won’t react to weather events like windsheer (which could cause the plane to lose altitude extremely quickly), or a hurricane. If there’s an engine problem and an engine loses power? It won’t attempt to restart. It doesn’t brake. It can’t land a plane.

          But Musk made some claims that Teslas autopilot would drive the vehicle for you without human interference. And people assume that autopilot (in the pop culture sense) does a lot more than it actually does. This is what I’m trying to point out.

    • fodor@lemmy.zip · 2 days ago

      More than one person can be at fault, my friend. Don’t lie about your product and expect no consequences.

      • Echo Dot@feddit.uk · 2 days ago

        I don’t know. If it is possible to override the autopilot then it’s a pretty good bet that putting your foot on the accelerator would do it. It’s hard to really imagine this scenario where that wouldn’t result in the car going into manual mode. Surely would be more dangerous if you couldn’t override the autopilot.

        • ayyy@sh.itjust.works · 2 days ago

          Yes, that’s how cruise control works. So it’s just cruise control right?….right?

          • Echo Dot@feddit.uk · 2 days ago

            Well it’s cruise control, plus lane control, plus emergency braking. But it wasn’t switched on so whether or not Tesla are been entirely honest with their advertising (for the record they are not been honest) isn’t relevant in this case.

        • fodor@lemmy.zip · 1 day ago

          We can bet on a lot, but when you’re betting on human lives, you might get hit with a massive lawsuit, right? Try to bet less.

          • Echo Dot@feddit.uk · 1 day ago

            I don’t know what you’re trying to say.

            Do you think it shouldn’t be possible to override autopilot?

    • ℍ𝕂-𝟞𝟝@sopuli.xyz · 1 day ago

      Yeah, the problem is that the US has no consumer protections, and somehow this court is trying to make up for that; but it shouldn't happen in court cases like this, where the driver was clearly not fit to drive a car.

  • NotMyOldRedditName@lemmy.world · 2 days ago

    This is gonna get overturned on appeal.

    The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.

    Pressing your foot on the accelerator overrides any braking; the car even tells you it won't brake while you're doing it. That's how it should be: the driver should always be able to override these things in case of emergency.

    Maybe if he hadn't done that (held the accelerator down), it'd stick.

    • Luckaneer@lemmy.dbzer0.com · 15 hours ago

      I think the bigger issue is that Tesla might be diminishing drivers' sense of responsibility for their vehicle with their marketing/presentation of Autopilot.

      I say that knowing very little about what it's like to use Autopilot, but if it is the case that there are changes that could be made that would result in fewer deaths, then maybe the guy's lawyer has a point.

      • NotMyOldRedditName@lemmy.world · 5 hours ago

        You gotta remember we're also back in 2019. Most of the talk back then was about what it was going to be able to do when FSD was ready, but no one got access to that until 2020, and then only a very small invite-only group, and it lasted like that for years. I'd say the potential for confusion today is immensely greater.

        I have used AP back then, and it was good, but it clearly made lots of little mistakes, and needed constant little adjustments. If you were paying attention, they were all easy to manage and you even got to know when to expect problems and take corrective action in advance.

        My big beef with this case is that he kept his foot on the accelerator, and the car tells you while you do this that it won't brake. Having your foot on the accelerator is a common practice, as AP can be slow to start, or you need to pass someone, etc., so it's really unfathomable to think that the first time this guy ever did this was when he decided to try to pick up his dropped phone and thought, “I should keep my foot on the accelerator while doing this!” No amount of marketing should be able to override “Autopilot will not brake. Accelerator pedal pressed” type active warnings, with the screen pulsating some color at him. He knew about those warnings, without any doubt in my mind. He chose to ignore them. What more could you write in a small space to warn people it will not brake?

        That being said - the NHTSA has found that Tesla's monitoring system was lacking, and Tesla has had to improve on it because of that in recent times. People would attach oranges to the steering wheel to defeat the pay-attention nag back then, but this goes well beyond that, IMO. Even the current system won't immediately shut down if you decide to not pay attention for some reason; it takes time before it pulls itself over, but you might get a strike against future use, where it will prevent you from using it again.

        Had his foot not been on the accelerator, this would have been a very different case, had the accident still occurred (which is also still possible).

    • danc4498@lemmy.world · 1 day ago

      While Tesla said that McGee was solely responsible, as the driver of the car, McGee told the court that he thought Autopilot “would assist me should I have a failure or should I miss something, should I make a mistake,” a perception that Tesla and its CEO Elon Musk has done much to foster with highly misleading statistics that paint an impression of a brand that is much safer than in reality.

      Here’s the thing, Tesla’s marketing of autopilot was much different than the reality. Sure, the fine print might have said having your foot on the gas would shut down autopilot, but the marketing made autopilot sound much more powerful. This guy put his trust in how the vehicle was marketed, and somebody died as a result.

      My car, for instance, does not have self driving, but it will still brake if it detects I am going to hit something. Even when my foot is on the gas. It is not unreasonable to think a car marketed the way Tesla was marketed would have similar features.

      Lastly, Tesla’s valuation as a company was based on this same marketing, not the fine print. So not only did the marketing put people in danger, but Tesla profited massively from it. They should be held responsible for this.

      • NotMyOldRedditName@lemmy.world · 1 day ago

        Sure, the fine print might have said having your foot on the gas would shut down autopilot

        The car tells you it won’t brake WHILE you do it.

        This isn’t a fine print thing, it’s an active warning that you are overriding it. You must be able to override it, its a critical saftey feature. You have to be able to override it to avoid any potential mistake it makes (critical or not). While a Level 2 system is active, human input > level 2 input.

        It’s there every time you do it. It might have looked a little different in 2019, but as an example from the internet.

        (edit: clarity + overriding with the accelerator is also explained to every user before they can enable autopilot in an on screen tutorial of basic functionality)
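
        The priority rule being described might look something like this (a minimal sketch with hypothetical names, not Tesla's actual code; the warning string echoes the on-screen message described earlier in this thread):

        ```python
        # Sketch of the Level 2 rule above: while the system is active,
        # direct human input always wins, and pressing the accelerator
        # disables automatic braking while raising an active warning.

        def plan_braking(l2_active: bool, accel_pedal: float, l2_wants_brake: bool):
            """Return (apply_brakes, warning) for one control tick."""
            if not l2_active:
                return False, None         # plain manual driving
            if accel_pedal > 0.0:
                # Human is pressing the accelerator: human input > Level 2 input.
                return False, "Autopilot will not brake. Accelerator pedal pressed."
            return l2_wants_brake, None
        ```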

    • fodor@lemmy.zip · 2 days ago

      On what grounds? Only certain things can be appealed, not “you’re wrong” gut feelings.

      • NotMyOldRedditName@lemmy.world · 2 days ago

        Just a further follow-up - you actually can appeal on the grounds that the jury was just outright wrong, but that would be a really hard, nearly impossible case to win here; I doubt that's what they would try. But just as an FYI:

        https://www.law.cornell.edu/wex/judgment_notwithstanding_the_verdict_(jnov)

        A judgment notwithstanding the verdict (JNOV) is a judgment by the trial judge after a jury has issued a verdict, setting aside the jury’s verdict and entering a judgment in favor of the losing party without a new trial. A JNOV is very similar to a directed verdict except for the timing within a trial. A judge will issue a JNOV if he or she determines that no reasonable jury could have reached the jury’s verdict based on the evidence presented at trial, or if the jury incorrectly applied the law in reaching its verdict.

        edit: Added emphasis there as well, which they could maybe try I guess given their error of law comment.

      • NotMyOldRedditName@lemmy.world · 2 days ago

        Well, their lawyers stated “We plan to appeal given the substantial errors of law and irregularities at trial”

        They can also appeal the actual awards separately as being disproportionate. The amount is pretty ridiculous given the circumstances even if the guilty verdict stands.

        There was some racial discrimination suit Tesla lost, and the guy was awarded 137 million. Tesla appealed the amount and got it reduced to 15 million. The guy rejected the 15 million and wanted a retrial on the award, and then got 3.2 million.

      • Redredme@lemmy.world · 2 days ago

        That's not a gut feeling. That's how every cruise control since it was invented in the '70s works. You press the brake or the accelerator? Cruise control (and Autopilot) = off.

        That's not a gut feeling, that's what's stated in the manual.

        • theangryseal@lemmy.world · 2 days ago

          I’ve never had one that turns it off if I accelerate.

          They’ve all shut off if I tapped the brakes though.

          • Derpgon@programming.dev · 2 days ago

            Yep, can confirm it works that way for my car too. If I press the gas pedal enough, I can go faster than the set cruise speed (for example, if I want to pass someone). If I lightly tap the brakes, it turns off pretty much immediately.

          • Buffalobuffalo@reddthat.com · 1 day ago

            What happens when you hit the gas while in cruise control? In all the cars I have driven, you go faster than the set speed and the car responds to your pedal movements. I guess we can debate whether to call that stopped or just paused, but it is certainly not ignoring your acceleration.

            • theangryseal@lemmy.world · 1 day ago

              Well, yeah, you can call it “paused” if you want to. The cruise control definitely stays on though and resumes the set speed when you stop accelerating. It completely disengages when you brake though, so I’ve never thought of it as turning off when I accelerate, only when braking.

        • atrielienz@lemmy.world · 1 day ago

          No. Press the brake and it turns off. Press the accelerator in lots of cars and it will speed up but return to the cruise control set speed when you release the accelerator. And further, Tesla doesn’t call it cruise control and the founder of Tesla has been pretty heavily misleading about what the system is and what it does. So.

          • Redredme@lemmy.world · 1 day ago

            Yeah, sure.

            You sound like one of those people who are the reason why we find the following warning on microwave ovens:

            WARNING: DO NOT TRY TO DRY PETS IN THIS DEVICE.

            And on plastic bags:

            WARNING: DO NOT PLACE OVER HEAD.

            We both know that this is not what it's for. And it (the Model S) has never been cleared ANYWHERE ON THIS GLOBE as an autonomous vehicle.

            (Adaptive with lane assist and collision detection) Cruise control/autopilot on, foot on accelerator, no eyes on the road, no hands on the steering wheel. That's malice. There were visible, audible and even tactile warnings which this guy ignored.

            No current-day vehicle (or anything from 2019) has in its manual that this is intended use. As a matter of fact, all warn you not to do that.

            And I get that you hate Tesla/Musk, don’t we all. But in this case only 1 person is responsible. The asshole driving it.

            • atrielienz@lemmy.world · 1 day ago

              Nope. I’m correcting you because apparently most people don’t even know how their cruise control works. But feel however you feel.

        • Squirrelanna@lemmynsfw.com · 1 day ago

          That’s not how cruise control works and I have never seen cruise control marketed in a such a way that would make anyone believe it was smart enough to stop a car crash.

  • fluxion@lemmy.world · 2 days ago

    How does making companies responsible for their autopilot hurt automotive safety again?

    • CannedYeet@lemmy.world · 2 days ago

      There’s actually a backfire effect here. It could make companies too cautious in rolling out self driving. The status quo is people driving poorly. If you delay the roll out of self driving beyond the point when it’s better than people, then more people will die.

      • Auli@lemmy.ca · 1 day ago

        Fuck that, I'm not a beta tester for a company. What happened to having a good product and then releasing it? Not “oh, let's see what happens.”

        • CannedYeet@lemmy.world · 24 hours ago

          It’s not that simple. Imagine you’re dying of a rare terminal disease. A pharma company is developing a new drug for it. Obviously you want it. But they tell you you can’t have it because “we’re not releasing it until we know it’s good”.

          • Mirshe@lemmy.world · 12 hours ago

            This is, or was (thanks RFK for handing the industry a blank check), how pharma development works. You don’t even get to do human trials until you’re pretty damn sure it’s not going to kill anyone. “Experimental medicine” stuff you read about is still medicine that’s been in development for YEARS, and gone through animal, cellular, and various other trials.

            • CannedYeet@lemmy.world · 4 hours ago

              Actually we have “right to try” laws for the scenario I described.

              But the FDA could use some serious reform. Under the system we have, an FDA approval lumps together the determinations of whether a drug is safe, effective, and worth paying for. A more libertarian system would let people spend their own money on drugs that are safe even if the FDA's particular research didn't find them effective. And it wouldn't waste taxpayer money on drugs that are effective but exorbitantly expensive relative to their minimal effectiveness. But if a wealthy person wants to spend their own money, thereby subsidizing pharmaceuticals for the rest of us, that's great in my opinion.

      • haloduder@thelemmy.club · 1 day ago

        This isn’t really something you can be ‘too cautious’ about.

        Hopefully we can at least agree that as of right now, they’re not being cautious enough.

        • CannedYeet@lemmy.world · 1 day ago

          As an exercise to remove the bias from this, replace self driving cars with airbags. In some rare cases they might go off accidentally and do harm that wouldn’t have occurred in their absence. But all cars have airbags. More and more with every generation. If you are so cautious about accidental detonations that you choose not to install them in your car, then you’re being too cautious.

          I can’t agree that they’re not being cautious enough. I didn’t even read the article. I’m just arguing about the principle. And I don’t have a clue what the right penalty would be. I would need to be an actuary with access to lots of data I don’t have to figure out the right number to provide the right deterrent.

      • susurrus0@lemmy.zip · 1 day ago

        The status quo is people driving poorly.

        It’s not people driving poorly, as much as it is horrible city planning, poor traffic design and, perhaps most importantly, not requiring people to be educated enough before receiving a driver’s license.

        This is an issue seen practically exclusively in underdeveloped countries. In Europe road accidents are incredibly rare. Nobody here even considers self-driving cars a solution to anything, because there’s nothing to solve.

        This is nothing but Tesla (et al.) selling a ‘solution’ to an artificially created problem, that will not solve anything and simply address the symptoms.

      • mrgoosmoos@lemmy.ca · 1 day ago

        it’s hard to prove that point, though. rolling out self driving may just make car usage go up and negate rate decreases by increasing overall usage

  • Phoenixz@lemmy.ca · 2 days ago

    Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s

    Good!

    … and the entire industry

    Even better!

    • boonhet@sopuli.xyz · 2 days ago

      Did you read it tho? Tesla is at fault for this guy overriding the safety systems by pushing down on the accelerator and looking for his phone at the same time?

      I do not agree with Tesla often. Their marketing is bullshit, their cars are low quality pieces of shit. But I don’t think they should be held liable for THIS idiot’s driving. They should still be held liable when Autopilot itself fucks up.

      • Rimu@piefed.social · 2 days ago

        On the face of it, I agree. But 12 jurors who heard the whole story, probably for days or weeks, disagree with that.

      • Auli@lemmy.ca · 1 day ago

        The problem is how Musk and Tesla have sold their self-driving and Full Self-Driving and whatever name they call the next one.

        • boonhet@sopuli.xyz · 1 day ago

          Should be a class action lawsuit by Tesla owners and damages in tens of billions rather than millions tbh. I’m just saying that this particular case can’t be seen as Tesla’s fault by anyone being objective.

  • interdimensionalmeme@lemmy.ml · 2 days ago

    “Today's verdict is wrong”
    I think a certain corporation needs to be reminded to have some humility toward the courts.
    Corporations should not expect mercy for saying things a human would never get away with.

    • haloduder@thelemmy.club · 1 day ago

      It’s all about giving something for useful idiots to latch on to.

      These people know most of us can’t think for ourselves, so they take full advantage of it.

  • Dr. Moose@lemmy.world · 2 days ago

    Seems like jury verdicts don't set legal precedent in the US, but they are still often considered to have persuasive impact on future cases.

    This kinda makes sense, but the articles on this don't make it very clear how impactful this actually is - here's crossing fingers for Tesla's downfall. I'd imagine launching robotaxis would be even harder now.

    It's funny how this legal bottleneck was the first thing AI-driving-industry research ran into. Then we kinda collectively forgot that, and now it seems like it actually was as important as we thought it would be. Say robotaxis scale up - there would be thousands of these cases every year just due to the sheer scale of driving. How could that ever work outside of places like China?

    • bluGill@fedia.io · 2 days ago

      What jury results do is cost real money - companies often (not always) change in hopes of avoiding more.

      • Dr. Moose@lemmy.world · 2 days ago

        Yeah, but also how would this work at full driving scale? If there are 1,000 cases and 100 are settled for 0.3 billion each, that's already 30 billion a year, almost a quarter of Tesla's yearly revenue. Then, in addition, consider the overhead of insurance fraud etc. It seems like it would be completely legally unsustainable, unless we do “a human life costs X amount of money, next”.

        I genuinely think we'll be stuck with humans for a long time outside of highly controlled city rides like Waymo, where the cars are limited to 40 km/h, which makes it very difficult to kill anyone either way.

        • bluGill@fedia.io · 2 days ago

          We already have numbers from all the deaths caused by human drivers. Once someone makes self-driving safer than humans (remember, drinking is a factor in many human-caused deaths, and so non-drinkers will demand this be accounted for).

          • Dr. Moose@lemmy.world · 2 days ago

            No, the issue still remains: who's actually responsible? With human drivers we always have someone to take the blame, but with robots? Who's at fault when a self-driving car kills someone? The passenger? Tesla? Someone has to be sued, and it'll be Tesla; so even if it's 1% of total accidents, the legal institutions will be overwhelmed, because the issue is 1000% harder to resolve.

            Once Tesla starts losing multiple 300M lawsuits, the floodgates will be open and the company is absolutely done.

            • bluGill@fedia.io · 1 day ago

              That is an issue.

              I just realized that I didn't finish the thought. Once self-driving is statistically safer, we will ban human drivers. In some places it will be by law, in some by the more subtle pressure of insurance costs, in some by something else.

              We need to figure out liability, of course. I have ideas, but nobody will listen, so no point in writing them down.