New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

  • HalcyonReverb@midwest.social · 1 year ago

    I drive a Ford Maverick equipped with adaptive cruise control, and if I get three “keep your hands on the wheel” notifications, it deactivates adaptive cruise until the vehicle is completely turned off and on again. It blew my mind to learn that Tesla doesn’t do something similar.
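    The lockout behavior described above amounts to a simple strike counter that persists until a power cycle. A minimal sketch (hypothetical illustration only, not Ford’s actual implementation):

```python
# Hypothetical "three strikes" hands-on-wheel lockout for adaptive cruise.
# Not Ford's actual logic; just the behavior described in the comment above.
class AdaptiveCruise:
    MAX_WARNINGS = 3

    def __init__(self):
        self.warnings = 0
        self.locked_out = False

    def hands_off_detected(self):
        """Register one 'keep your hands on the wheel' notification."""
        if self.locked_out:
            return
        self.warnings += 1
        if self.warnings >= self.MAX_WARNINGS:
            # Third strike: deactivate ACC until the vehicle is power-cycled.
            self.locked_out = True

    def can_engage(self):
        return not self.locked_out

    def power_cycle(self):
        """Turning the vehicle fully off and on again clears the lockout."""
        self.warnings = 0
        self.locked_out = False
```

    The key design point is that the lockout is not cleared by time or by the driver re-gripping the wheel; only a full power cycle resets it.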

    • tony@lemmy.hoyle.me.uk · 1 year ago

      It does and did… He kept driving anyway. Drink drivers FTW.

      I presume AEB kicked in, but all that can do is reduce the speed of impact… if you’re determined to kill yourself, there’s not much the car can do.

        • CmdrShepard@lemmy.one · 1 year ago

          The problem with this is what if the car thinks there’s a barrier in front of you but there isn’t? People are arguing that these systems are too intrusive while also arguing that they don’t go far enough to take control away from drivers.

          This situation happened because a drunk driver ran into police cars, something that has been happening for as long as cars have existed.

          • Obi@sopuli.xyz · 1 year ago

            That’s the issue with current “self driving” systems in a nutshell. We’re in a terrible middle ground right now where these features let careless drivers take their attention away, yet can’t actually control the vehicle safely. We should ban all that crap until actual self-driving is viable.

            • CmdrShepard@lemmy.one · 1 year ago

              How does it become viable if you ban the technology? What we have now is advanced cruise control that protects drivers in some circumstances while having zero effect in others. Drivers were equally dumb and careless long before this technology existed. This new tech doesn’t make that aspect any worse. Banning it now just means more people will crash and more people will be injured.

              • Obi@sopuli.xyz · edited · 1 year ago

                Here’s an article referencing a UK white paper that discusses the issues with Level 2 and 3 autonomous vehicles.

                https://www.tu-auto.com/adas-level-2-3-avs-are-hazards-experts-warn/

                *“With adaptive cruise control (ACC) for instance, it takes twice the amount of time to respond to a sudden braking event than it does when you are manually driving. Drivers may believe that ACC is safer but actually taking your foot off the accelerator pedal and letting the car make the decisions leads to lower workload and can mean drivers are unprepared for an unexpected event.”*

                *University of Sussex object recognition researcher Dr Graham Hole was also questioned for the study and dubs Levels 2 and 3 “the worst of all worlds”. He says: “Human beings are rubbish at being vigilant – vigilance declines after about 20 minutes. With semi-autonomous you are reducing the driver to monitoring the system on the off-chance something goes wrong. Most of the time nothing goes wrong, leading the driver to have massive faith in the system in all conditions, which of course isn’t always the case.”*

                • CmdrShepard@lemmy.one · 1 year ago

                  The paper features a defense of ADAS by Thatcham Research principal automated driving engineer Colin Grover, who claims much of the tech “operates in the background, like autonomous emergency braking … not all ADAS adds distraction … it is there to help when needed.”

                  Your first quote refers only to ACC, which maintains speed and distance between you and the car in front of you but doesn’t include automatic emergency braking, something currently included on all cars with these systems.

                  I’ll ask again: how do you achieve Level 4/5 autonomy if you ban these systems from the road and they never get real-world testing?

                    • Obi@sopuli.xyz · edited · 1 year ago

                    Well, to answer your question, I’d say it needs to be a coordinated national/international effort (e.g. led by the EU for Europe). That allows long-term, coordinated planning with predetermined cut-off dates, where not only the technology of the cars would change, but also the infrastructure.

                    To me it doesn’t make sense to adapt the vehicles to an infrastructure designed for humans. If we really want self-driving vehicles, we should adapt the infrastructure for them, and all the cars should talk to each other so they can work in unison: e.g. they would all start at exactly the same time after a “red light” (which wouldn’t even need to be one), and collisions would be eliminated because everything would be predicted by the AI. Whatever can’t be predicted would still rely on cameras and sensors, of course.
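                    The “start in unison” idea boils down to every vehicle acting on one shared broadcast schedule instead of reacting to the car ahead. A toy sketch of the difference (all function names and numbers are hypothetical, purely for illustration):

```python
# Toy comparison: human reaction chains vs. a broadcast "go" signal.
# With human drivers, each car waits for the one ahead to start moving;
# with networked coordination, every car acts on the same timestamp.
# The 1-second reaction delay is a hypothetical round number.
REACTION_DELAY = 1.0  # seconds each human driver waits on the car ahead

def human_start_times(n_cars):
    """Car i starts only after seeing car i-1 begin to move."""
    return [i * REACTION_DELAY for i in range(n_cars)]

def coordinated_start_times(n_cars, go_signal=0.0):
    """Every networked car starts at the same broadcast 'go' instant."""
    return [go_signal for _ in range(n_cars)]
```

                    In a queue of ten cars, the last human driver starts nine seconds after the light under this model, while the coordinated cars all start together, which is where the claimed throughput gain comes from.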

                    Meanwhile, car manufacturers could keep adding smart safety features, but nothing marketed as “autopilot” or “self-driving”.