New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired
This is stupid. Teslas can park themselves, they’re not just on rails. It should be pulling over and putting the flashers on if a driver is unresponsive.
That being said, the driver knew about this behavior and acted with wanton disregard for safe driving practices, so the incident is the driver’s fault and they should be held responsible for their actions. It’s not the court’s job to legislate.
It’s actually the NTSB’s job to regulate car safety, so if they don’t already have it, Congress needs to grant them the authority to regulate what AI behavior is acceptable and to define safeguards against misbehaving AI.
There’s no way the headline is true. Zero percent. The car will literally do exactly what you stated if it goes too long without driver engagement, and I’ve experienced it firsthand.
The headline doesn’t state that the warnings were consecutive.
Perhaps the driver was just aware enough to keep squelching warnings and prevent the car from stopping altogether?
I’ll grant you, though, 150 warnings is still a little tough to believe…
Evidently, he was aware enough to respond to the alerts, per the logs (as stated in the WSJ video that’s in the article). It shows a good bit of the footage, too.
Seems like they need something better for awareness checking than just gripping the wheel and checking where your eyes are pointed. And obviously better sensors for object recognition.
The driver is responsible for this accident, but Tesla should still be liable, imo, for all the shady and outright misleading advertising around their so-called “self driving”. Compare Tesla’s marketing to GM’s or Hyundai’s, both of which essentially have feature parity with Tesla’s system, and you’ll see a big difference.
I turned off the “lane assist” in our Mazda because it kept steering me back toward obstacles I was trying to avoid, like cyclists, oversized loads, potholes, etc. I don’t know why anyone thought that was a good idea.
But try buying a car without those features now…sigh.
Use your turn signal to indicate your direction change and it won’t do that.
If you’re swerving to avoid a sudden obstacle you reasonably may not have the foresight or reaction to flip on a signal. The car still needs to not force you back on collision course.
That’s a good point, and is probably why they designed it so that if you swerve hard, lane assist shuts off. It only nudges you back to the middle of the lane if you are gently drifting to a side, so it only works in situations where your turn signal can be used to avoid it. Or you can just disable it if you drive a BMW or otherwise can’t use turn signals.
Even moving over slightly in the lane to avoid a pothole triggers it; it doesn’t seem like a turn signal should be necessary in that situation. What happens instead is that I see the pothole, gently alter the car’s course to avoid it, and as soon as I get close to the line it freaks out.
I guess if I drove right up to the obstacle then swerved, it wouldn’t do it…but I was always taught swerving was a last-resort thing, best to drive as smoothly as possible. (This was my dad’s argument, and I said, “Uh, SOMEONE taught me not to swerve unless it was necessary…” meaning him. He laughed.)
I agree. More regulation may be necessary, but it isn’t a cure-all. It can’t be relied upon to prevent people from being stupid.
Sounds like the injured officers are suing. It’s a civil case, not criminal, so I’m not sure how much the court would actually be asked to legislate. I’d be interested to hear their arguments, though I’m sure part of their reasoning for suing Tesla rather than the driver is that Tesla has more money.
Yes. Actually, just stopping in the middle of the road with hazard lights would be sufficient.
You say that, yet a Tesla did exactly that, which caused some tailgaters to crash into the back of it, and everyone blamed the Tesla for causing an accident.
https://theintercept.com/2023/01/10/tesla-crash-footage-autopilot/