A representative for Tesla sent Ars the following statement: “Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility.”

So, you admit that the company’s marketing has continued to lie for the past six years?

  • crandlecan@mander.xyz · 2 days ago

    Yes. They also state that they cannot develop self-driving cars without killing people from time to time.

      • Thorry84@feddit.nl · 2 days ago

        I don’t know, most experimental technologies aren’t allowed to be tested in public until they’re good and ready. This whole “move fast and break things” approach seems like a REALLY bad idea for something like cars on public roads.

        • BreadstickNinja@lemmy.world · 2 days ago

          Well, the Obama administration published initial guidance on testing and safety for automated vehicles in September 2016. It was pre-regulatory, but a prelude to potential regulation. Trump trashed it as one of the first things he did after taking office for his first term. I was working in the AV industry at the time.

          That turned everything into the Wild West for a couple of years, up until an automated Uber killed a pedestrian in Arizona in 2018. After that, most AV companies scaled public testing way back and deployed extremely conservative versions of their software. If you look at news articles from that time, there’s a lot of criticism of how, e.g., Waymos would just grind to a halt in the middle of intersections, because companies would rather take flak for blocking traffic than for running over people.

          But not Tesla. While other companies dialed back their ambitions, Tesla refused to use lidar at all (and has since stripped radar from its cars) and kept sending its vehicles out on public roads in droves. They also continued to market the technology, first as “Autopilot” and later as “Full Self Driving”, in ways that vastly overstated its capabilities. To be clear, full self-driving in the sense of SAE Level 5 automation, meaning a computer system functionally indistinguishable from a capable human driver, is science fiction at this point. Other AV companies are still striving for Level 4 automation, which may include geographic restrictions or limits to certain types of road infrastructure.

          Part of the blame probably also lies with Biden, whose DOT had the opportunity to address this during his term and didn’t. But it was Trump who initially trashed the safety framework, and Tesla that concealed its technology’s limitations and marketed it misleadingly.

          • Barbarian@sh.itjust.works · 2 days ago

            You got me interested, so I searched around and found this (roughly paraphrased in the sketch at the end of this comment):

            [Chart: SAE levels of driving automation, 0-5]

            So, if I understand this correctly, the only fundamental difference between level 4 and 5 is that 4 works on specific known road types with reliable quality (highways, city roads), while level 5 works literally everywhere, including rural dirt paths?

            I’m trying to imagine what other type of geographic difference there might be between 4 and 5 and I’m drawing a blank.
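
            For anyone who doesn’t want to dig up the chart, here’s my own rough paraphrase of its levels as a minimal Python sketch (the wording and names are mine, not SAE’s):

            ```python
            from enum import IntEnum

            class SAELevel(IntEnum):
                """Rough paraphrase of the SAE J3016 driving-automation levels (unofficial wording)."""
                NO_AUTOMATION = 0   # warnings and momentary assistance only
                DRIVER_ASSIST = 1   # steering OR speed control; the driver does everything else
                PARTIAL = 2         # steering AND speed control; the driver must supervise constantly
                CONDITIONAL = 3     # system drives in limited conditions; driver must take over on request
                HIGH = 4            # system drives itself, but only inside its operational design domain (ODD)
                FULL = 5            # system drives itself anywhere a licensed human could; no ODD limit

            # The jump from 4 to 5 is exactly the question above: Level 4 is bounded by an ODD
            # (mapped areas, road types, weather, speed), while Level 5 has no such bound.
            ```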

            • BreadstickNinja@lemmy.world · 2 days ago

              Yes, that’s it. A lot of AV systems depend on high-resolution 3D maps of an area so they can precisely locate themselves in space. They may perform relatively well inside that defined space, but wouldn’t be able to operate outside it (there’s a toy sketch of that kind of gate at the end of this comment).

              Level 5 is functionally a human driver. You as a human could be driving off road, in an environment you’ve never been in before. Maybe it’s raining and muddy. Maybe there are unknown hazards within this novel geography: flooding, fallen trees, etc.

              A Level 5 AV system would be able to perform equivalently to a human in those conditions. Again, it’s science fiction at this point, but essentially the end goal of vehicle automation is a system that can respond to novel and unpredictable circumstances the same way as (or better than) a human driver would in that scenario. It’s really not defined much better than that end goal; because it isn’t possible with current technology, it doesn’t correspond to a specific set of sensors or a particular software architecture. It’s a performance-based, long-term goal.

              This is why it’s so irresponsible for Tesla to continue to market its system as “Full Self Driving.” It is nowhere near as adaptable or capable as a human driver. They pretend, or insinuate, that they have a system equivalent to SAE Level 5 when the entire industry is at minimum a decade away from such a system.
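
              To make the Level 4 vs Level 5 distinction concrete, here’s a toy sketch (invented names and values, not any real AV stack) of the kind of ODD gate a Level 4 system applies before it will drive itself. A Level 5 system, by definition, would need no such gate:

              ```python
              from dataclasses import dataclass

              @dataclass
              class OperationalDesignDomain:
                  """Toy model of a Level 4 system's ODD: where and when it may self-drive."""
                  mapped_regions: set        # areas with fresh high-resolution 3D maps
                  allowed_road_types: set    # e.g. {"divided_highway", "urban_street"}
                  allowed_weather: set       # e.g. {"clear", "light_rain"}
                  max_speed_kph: float

                  def permits(self, region, road_type, weather, speed_kph):
                      return (region in self.mapped_regions
                              and road_type in self.allowed_road_types
                              and weather in self.allowed_weather
                              and speed_kph <= self.max_speed_kph)

              def may_self_drive(odd, region, road_type, weather, speed_kph):
                  # Level 4: refuse (hand back control or pull over) outside the ODD.
                  # Level 5: this check would simply not exist.
                  return odd.permits(region, road_type, weather, speed_kph)
              ```

              The hard part, of course, is everything hidden behind `permits`: building and maintaining those maps, and proving the system actually works inside the box it draws for itself.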

            • slaacaa@lemmy.world · 2 days ago

              I think this chart overcomplicates it a bit. Almost a decade ago, I worked on a very short project that touched on this topic. One expert explained to me that the difference between Level 4 and 5 is that you don’t need a steering wheel or pedals anymore: L5 can drive anywhere, anytime, in all situations.

          • wewbull@feddit.uk · 2 days ago

            “I was working in the AV industry at the time.”

            How is you working in the audio/video industry relevant? …or maybe you mean adult videos?

        • BangCrash@lemmy.world · 2 days ago

          I’m pretty sure millions of people have been killed by cars over the last 100 years.

          • naeap@sopuli.xyz · 2 days ago

            And we’re seeing fewer and fewer road deaths in developed countries (excluding the USA, if the statistics I’ve read are correct).

            Tesla’s Autopilot seems like a step backwards, sold on a future promise of being better than human drivers.

            But they slimmed their sensor suite down to fucking simple 2D cameras. That’s cheaping out at the expense of Tesla owners, but also of completely uninvolved people around a self-driving Tesla who never chose to trust this tech, which lives more on PR than on actual results.
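
            To illustrate the redundancy point with a toy example (my own made-up sketch, nothing like how any real stack fuses sensors): with a second, independent modality you can at least cross-check what the cameras think they see; camera-only leaves every detection resting on a single sensor type:

            ```python
            def cross_checked_ranges(camera_ranges_m, radar_ranges_m, tolerance_m=1.0):
                """Toy cross-check: keep a camera detection only if an independent sensor
                (here, radar ranges in metres) reports an object at a similar distance."""
                return [c for c in camera_ranges_m
                        if any(abs(c - r) <= tolerance_m for r in radar_ranges_m)]

            # With radar stripped out of the car there is nothing left to cross-check
            # against: every detection, and every miss, is camera-only.
            ```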

            • BangCrash@lemmy.world · 2 days ago

              Can’t comment specifically about Tesla, but self-driving is going to have to go through the same decades of iterative improvement that car safety went through. That’s just expected.

              However, it’s not appropriate for this to be done at the risk of lives.

              But somehow it needs the time and money to run through a decade of improvement.

        • CmdrShepard49@sh.itjust.works · 2 days ago

          Not to defend Tesla here, but how does the technology become “good and ready” for road testing if you’re not allowed to test it on the road? There are a million different driving environments in the US, so it’d be impossible to cover all those scenarios without a real-world environment.

          • harrys_balzac@lemmy.dbzer0.com · 2 days ago

            You are defending Tesla and being disingenuous about it.

            The other car companies working on this are spending millions of dollars to test their vehicles in closed areas that simulate real-world conditions, in order not to kill people.

            You sound like a psychopath.

          • kameecoding@lemmy.world · 2 days ago

            How about not fucking claiming it’s FSD, and instead just shipping ACC and lane keeping, then collecting data and training on that? Also, test on a closed circuit.
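
            Something like the toy sketch below is what I mean (made-up names and thresholds, not Tesla’s actual pipeline): the human drives with plain ACC and lane keeping, the model never controls the car, and you log the frames where its proposal disagrees with what the driver actually did, then train and evaluate on those:

            ```python
            import json
            from dataclasses import dataclass, asdict

            @dataclass
            class Frame:
                timestamp: float
                human_steering_deg: float
                human_accel_mps2: float
                model_steering_deg: float   # what the non-acting model would have done
                model_accel_mps2: float

            def log_disagreements(frames, path, steer_tol_deg=2.0, accel_tol_mps2=0.5):
                """Toy 'shadow' logger: record only the frames where the model's proposal
                diverges from the human driver, as future training/evaluation data."""
                rows = [asdict(f) for f in frames
                        if abs(f.model_steering_deg - f.human_steering_deg) > steer_tol_deg
                        or abs(f.model_accel_mps2 - f.human_accel_mps2) > accel_tol_mps2]
                with open(path, "w") as fh:
                    json.dump(rows, fh)
                return len(rows)
            ```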

          • Auli@lemmy.ca · 1 day ago

            Cars with humans behind the wheel, paying attention and ready to correct the machine. Not this “let’s remove humans as quickly as possible” BS that we have now. I know they don’t like the cost.

      • OhVenus_Baby@lemmy.ml · 1 day ago

        There will always be accidents with tech, or with anything; no matter how much planning and foresight goes into a product or service, humans cannot account for every scenario. Death is inevitable to some degree. That being said:

        Tesla point blank launched a half-assed product that just did not fully operate as specified. I’m all for self-driving vehicles, even through the bad stuff; even if it happened to me, I’d still be for it. Given the early stage, though, they should have focused much more on their “rolling release” updates than they have.

        Of course things will need updates, and of course accidents will happen. But it’s how they respond to them that makes them look evil versus good, and their response has been lackluster. The market seems to think it’s not a major issue, though; there are more Teslas on the roads now than ever.

      • Auli@lemmy.ca · 1 day ago

        Which they have not done and won’t do. You have to do this in every condition. I wonder why they always test this shit in Texas and California?

        • uid0gid0@lemmy.world · 7 hours ago

          I guess they just didn’t want to admit that snow defeats both lidar and vision cameras. Plus, snow covers lane markers, street signs, and the car’s own sensors. People can adjust to these conditions, especially when driving locally. No self-driving system can function without usable input.
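
          As a toy illustration of that last point (invented names and thresholds, nothing from a real system): once snow degrades the inputs past a certain point, the only safe automated options are slowing way down or handing control back, because the planner has nothing trustworthy left to work from:

          ```python
          def behaviour_in_snow(lidar_return_rate, camera_visibility_m, lane_marking_confidence):
              """Toy degradation check; all thresholds are made up for illustration."""
              if lidar_return_rate < 0.4 or camera_visibility_m < 30.0:
                  # Perception is effectively blind: stop safely or hand control back.
                  return "minimal_risk_stop_or_human_takeover"
              if lane_marking_confidence < 0.5:
                  # Markings buried under snow: crawl along, the way a cautious local would.
                  return "follow_tracks_at_low_speed"
              return "continue_at_reduced_speed"
          ```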

      • bluGill@fedia.io · 2 days ago

        That is a low bar. However, I have yet to see independent data. I know it exists, but the only ones who talk have reason to lie with statistics, so I can’t trust them.