
Government investigation finds Tesla FSD was indeed engaged during the Bay Bridge pileup

Started by Redaktion, January 18, 2023, 10:15:42


Redaktion

The US Department of Transportation has released the NHTSA's special crash investigation findings on the Model S-caused pileup in San Francisco on Thanksgiving Day. The accident in the Bay Bridge tunnel did indeed occur with Tesla's Full Self-Driving Beta mode active, just as the driver claimed.

https://www.notebookcheck.net/Government-investigation-finds-Tesla-FSD-was-indeed-engaged-during-the-Bay-Bridge-pileup.682641.0.html

Willy

Last time I checked, if you're in a collision because a vehicle in front of you stopped suddenly, it's your fault for following too closely.

MarkF

Not always true.  There are several instances where the lead driver was found at fault, and this could fall under one of those exceptions.  The main exception to the following vehicle being at fault is when the lead vehicle brakes hard for no reason, and witnesses can be produced to back up the claim that the braking was unnecessary.

TeeKay

California law states that it is illegal to travel on the freeway at a speed so slow that it impedes the traffic behind you.

MarkF

Also, the article states that the Tesla changed lanes and slowed down.  That means it may have been impossible for the car behind to maintain a safe distance, because in all likelihood the Tesla cut them off without leaving enough room to stop.   Where I'm from there was a rash of people doing this intentionally in front of semis in order to sue them for whiplash, so it became important to determine who really caused the "accident".  In this case it was not intentional, but most accidents are not intentional, and that doesn't mean you get a free pass.

Sabrina Keyhani

First of all, who cares if FSD was on when it crashed.  Anyone with HALF A F*****G brain would monitor FSD VERY CLOSELY, especially in traffic.  Just because your car HAS a feature doesn't change the fact that YOU, THE HUMAN, are responsible for how your vehicle behaves.
The main premise of defensive driving is ALWAYS being in control of your vehicle: ALWAYS be alert and know your surroundings (front, back, side to side) and, MOST IMPORTANTLY, ALWAYS ANTICIPATE your next step should something occur that you don't expect.
Furthermore, it's called BETA for a reason!  Meaning that your vehicle could behave unexpectedly and be unpredictable, so CLOSER attention is required than when you are the pilot, DUH!  My dad always said that when you use cruise control you should keep your foot covering the brake.  Whenever I read about FSD incidents it's baffling!  I HAVE FSD and I can admit that it does act funky sometimes in BETA.  And where does it act funkiest?  Glad you asked: automatic lane changes, and stop-and-go in traffic and on streets (in fact TOTALLY unreliable).  It also reads the wrong lights, e.g. if you're in a turning lane at a red light while the straight-ahead lanes are green, it may try to go on the wrong light.  I would never rely on it for that.
I've had it slam on the brakes for no reason, but I paid close enough attention that I was able to hit the accelerator, so no one behind ever knew there was an issue other than a quick tap of my brakes before I put the pedal to the metal.
Bottom line: if you can't drive worth a s**t in normal circumstances, then don't try to do something beyond your control or expertise.  And don't download the BETA, for Christ's sake!

Joel

It seems that everyone just looks for any dumb excuse to go after Tesla.
As a Tesla owner who has FSD: first of all, we need to stop blaming FSD and look at the owner. The car will give you several prompts, and if you ignore those prompts, yes, the car will stop. The owner ignored the prompts, and that is most likely why the car stopped. The owner of the car caused the accident, not FSD...

Gus

I still think everyone needs to allow more space ahead and pay more attention. I never put myself in a position where I am unable to avoid the vehicle ahead if they suddenly stop.  This annoys other drivers who think we should all be driving up each other's behinds lol. Not my problem.

Bobo

Fact is, even if someone stops, everyone behind them should be able to stop as well.  If not, they are either not paying attention or following too closely.  Also, it's likely the moron behind the wheel fell asleep and ignored the warnings to pay attention, so the car just stopped.

The cause of this accident wasn't the car, it was a bunch of idiots who can't drive.

Bobo is retarded

All these mouth-breathers who should never have gotten their licenses, claiming the cars behind were following too close:

1. The Tesla cut off a whole row of drivers in the passing lane and slammed on the brakes. The Tesla instantly made the following distance too short where it was fine before.

2. Drivers with situational awareness see more than just the car ahead of them. Seeing an open road ahead and not expecting a toy car to slam on the brakes after cutting off the lead car, they maintained a distance adequate for responding to a reasonable rate of deceleration. After all, I'm willing to bet none of you come to a dead stop for a full 3 seconds at every stop sign.

3. On a crowded road like this, if you leave 3 seconds of "safe following distance," I guarantee 2-3 cars will merge in front of you, instantly reducing the space back to what it was. Unless you want to spend your entire commute decelerating and holding up the passing lane like a jackass, you do not drive like this.
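For perspective, here's a rough back-of-the-envelope sketch in Python of how much road a "3-second following distance" actually is at highway speed, and how fast merging cars eat it up. Every number here (speed, car length, merge clearance) is an assumption for illustration, not a figure from the article:

MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

speed_mph = 55.0                      # assumed highway speed
speed_mps = speed_mph * MPH_TO_MPS    # ~24.6 m/s

gap_seconds = 3.0
gap_meters = speed_mps * gap_seconds  # ~74 m of empty road

car_length = 4.8     # assumed average car length, meters
merge_buffer = 10.0  # assumed clearance a merging car leaves front and back, meters

print(f"3-second gap at {speed_mph:.0f} mph: {gap_meters:.0f} m")
for n_cars in (1, 2):
    remaining = gap_meters - n_cars * (car_length + 2 * merge_buffer)
    print(f"after {n_cars} car(s) merge in: ~{remaining:.0f} m (~{remaining / speed_mps:.1f} s)")

With those assumptions, two merging cars cut a 3-second gap down to roughly one second, which is exactly the point above.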

Clown above

The problem with the clown above's asinine spiel is that he's completely absolving all the cars behind of any responsibility.  The Tesla had enough space to merge, so the vehicle that was "cut off" should have compensated for the sudden lack of space.

You are responsible for your actions in your vehicle.  You are supposed to be paying attention, and cutting someone off isn't illegal.  If someone turns their signal on, you yield; you do not speed up and block them.

Bigger Clown Above

The problem with your spiel is that you assume that just because the Tesla didn't clip the car in the other lane, it left enough room for any car coming in that lane to stop safely.  A safe following distance assumes the car in front of you will decelerate at a certain rate.  In a line of cars that suddenly goes from highway speed to a dead stop because of a collision, the further back in that line you are, the less reaction time you have to the sudden stop of the crashed cars ahead.  All of this happens in seconds.  Anyone who has ever been in an accident at highway speeds will tell you that you don't even have time to think about what's happening; it is all just reflex reactions.  Reminder: the article says the Tesla changed lanes TO THE LEFT AND SUDDENLY DECELERATED TO 7 MPH.  This was on a busy highway; the car switched lanes from the slow lane to the passing lane and hard-braked.
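To put rough numbers on that, here is a sketch in Python. Only the 7 mph figure comes from the article; the speed, braking rate, reaction time, and gap are assumptions for illustration:

MPH_TO_MPS = 0.44704

v0 = 55.0 * MPH_TO_MPS       # assumed highway speed, ~24.6 m/s
v_final = 7.0 * MPH_TO_MPS   # the 7 mph from the article, ~3.1 m/s
decel = 7.0                  # assumed hard braking, m/s^2
reaction = 1.5               # assumed driver reaction time, seconds
gap = 30.0                   # assumed following gap, meters (~1.2 s at 55 mph)

# Distance the lead car covers while slowing from 55 to 7 mph:
d_lead = (v0**2 - v_final**2) / (2 * decel)

# The follower travels at full speed during the reaction time, then brakes
# at the same rate, so the gap shrinks by exactly v0 * reaction:
d_follow = v0 * reaction + (v0**2 - v_final**2) / (2 * decel)
gap_consumed = d_follow - d_lead

print(f"lead car braking distance: {d_lead:.0f} m")
print(f"follower travel distance:  {d_follow:.0f} m")
print(f"gap consumed by reaction:  {gap_consumed:.0f} m of a {gap:.0f} m gap")
print("collision likely" if gap_consumed > gap else "stops in time")

With equal braking, each driver's reaction time alone eats about 37 m of gap at 55 mph, more than a typical 1-second following distance, and those reaction delays stack up as the brake lights propagate down the line.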


To: Joel

I really don't see anyone saying the driver of the Tesla shouldn't have taken control from the car; of course the driver is ultimately responsible.  But facts are facts, and the simple fact here is that this accident doesn't happen at all if the car isn't equipped with a self-driving feature that is glitchy at best and outright dangerous at worst.

Leslie Tan

I do have Autopilot but not FSD. I enjoy driving and typically don't use the feature except for a quick sip of water or coffee, and only when the roads look clear.
Ultimately, these systems are labeled "driver assistance systems" ... meaning the driver is, and should be, situationally aware AND in control.
A multitude of factors can and will contribute to AP and FSD not responding to conditions on the road.
Net-net, the Driver is and should always be driving and in control.

Not a Tesla fan

FSD could be better than human drivers, or it could be worse. It doesn't matter, because in a situation like a tunnel, where cars are traveling at high speed, in close quarters, with no place to escape, all that matters is that it is different. Highways and driving in general work because drivers have a reasonable expectation of how other drivers will react. When random drivers do something unexpected, what was a reasonable following distance is no longer enough. Add to that the limited amount of training data that Teslas have in extreme situations like developing accidents, and you have a near guarantee that something like this accident will happen now and then.

To be generous, Tesla FSD is a novice driver with some savant-level skills. Sometimes the savant talents save lives, but sometimes the novice reveals itself and bad things happen. In a reasonable world, this is what beta testing is about, and everyone involved in the beta test knows the risks. Tesla has decided to break the rules of beta by forcing these risks onto other people - drivers, cyclists and pedestrians - who have not signed up and are unaware they are part of the experiment. Effectively all of us on the road are Elon Musk's lab rats, like it or not. Or maybe we're his pylons.

In any case, Tesla's experiments in self driving in live traffic are so far over the ethical line that they should have been shut down on day one. The driver who turned on FSD in the Bay Bridge tunnel is an idiot who shouldn't be trusted again with any vehicle more dangerous than a bicycle. And we should all be glad there were no fuel trucks or hazardous cargoes in range of his stupid actions.

Not a Tesla fan

In short, no competent, sane driver would consider it a good idea to pull into the fast lane and slam on the brakes in heavy traffic in a high speed tunnel.

The rest of us base our driving decisions on the assumption that the drivers around us are competent and sane. If we had to prepare for insane actions, traffic would simply not work.

Enter Tesla FSD. Its actions in this case show that it is either incompetent or insane by human standards. If we allow it on public roads in its current form, every assumption we make about other drivers becomes invalid. The actions that allow most driving to be safe become impossible.

Current FSD is not compatible with humans driving cars. Let Elon use his billions to build a private test city where he can experiment to his heart's content in a controlled environment. That's science, and when his product is ready to make the roads safer, I'll be first in line.

But what he's doing isn't science by any standard. He is a ten-year-old mixing random chemicals from the kitchen. He might invent a cancer cure, or he might generate chlorine gas and kill his whole family.

He rolls the dice. We pay the price.
