Welcome to the Notebookcheck.com forum! Here you can discuss all of our articles and notebook-related topics in general. Have fun!


Topic summary

Posted by MarkF
 - January 21, 2023, 17:39:02
BJ: you don't stand on the brakes for those things, though, because as a self-aware human being you know the possibility of injuring yourself and damaging your vehicle; you react to those situations as any reasonable person would. The self-driving software is not self-aware and has no self-preservation instinct. Bottom line, as others have said: we are all in this science experiment to see if it will work without killing too many people. They are fully aware this software is going to kill some, but unless you are famous or politically connected, your life means nothing. You are just a number on a stat sheet, and as soon as someone with political pull or Hollywood fame gets seriously injured or killed, we will see this system removed from the vehicles. Do any of you really think anyone would remember the Titanic 100 years later if it didn't have a first-class section? Can anyone remember the name of anyone killed in a helicopter other than Kobe? Common, everyday, hard-working people don't matter; we are just numbers on a piece of paper to the corporate heads. Just lab rats in their experiment, expendable.
Posted by Bj
 - January 21, 2023, 02:50:34
Wasn't the Tesla's fault. I can stand on the brakes for a retread or a piece of litter. Anyone following has a duty to stop in time. Following too close was the cause. However, if Tesla can stop faster than other cars, let's hear it.
Posted by Shelly
 - January 21, 2023, 02:15:02
You can't blame the auto industry for people driving unsafely. No one will go along with this the way you planned. Maybe the other drivers shouldn't have been so close, then. Whose fault? 100% the driver's. Next you'll be going after the car dealerships; you need to quit the hate. How many recalls are there on all the other automakers' vehicles? With the pollution automobiles are causing, it's the only smart solution. Don't hate it.
Posted by Samsungiphone
 - January 20, 2023, 19:04:03
The government finds that the car was in FULL SELF DRIVING mode. So what if the car was in FSD mode? The driver can take necessary actions even if the car is in FSD. Quickly pressing the brake to disable FSD is what I do when I feel the car is about to do something I don't think is safe. When the car is slowing down for a stoplight, I can press on the accelerator to speed back up. The government came to the conclusion that TESLA is at fault for a defective FSD. I would seriously investigate a little more into what the driver's actions were to prevent the car from slowing down. Anyone in any car can feel and see if their car is slowing down. So what did he do? If the driver was too blind and numb to feel and see the car slowing down, maybe he heard the alerts the car gives out when something like that happens. I know that when coming to a stoplight while the light is green, the car gives an alert telling me it will come to a stop unless I press on the accelerator to proceed. So did the driver get any alerts prior to the car slowing down? What did the driver do? Did he just allow the car to do what the car thought was right? Can't blame TESLA for drivers not being attentive and just being plain stupid.

Let me give all of you my story. I was driving 880 south going to 237 east. From the bridge on the eastbound lane I could see stoplights at the next exit on Calaveras. While I was on FSD on 880 south, exiting onto 237 east in the right lane (not the FasTrak lane), my car picked up the stoplight on Calaveras, alerted me, and started to slow down. I was like, WTF? I pressed on the accelerator, but I also checked the monitor at the same time, which was informing me that the car was approaching a stoplight and would stop, and that I needed to press on the accelerator to proceed. What I did was accelerate. While this happened, my hands were still on the steering wheel, like always, just in case I needed to take any extra measures if something went wrong.

So I believe the government can't blame TESLA even if the FSD was defective, because the driver could still take the necessary actions to disable FSD when he noticed the car was slowing down. The driver was not paying attention at the time. Yes, the driver can say FSD was enabled. The question any investigator should ask is what the driver's actions were. Did he try to press on the accelerator? Did he try to disable FSD? In my opinion, the driver is at fault. I believe there are ways to find out whether the driver tried to press the accelerator or tap the brakes to disable FSD. Investigators need to check any warning alerts, check the cabin cameras, and see whether the driver reacted to the alerts, or what the driver was doing, period. To put the blame on Tesla right away is just wrong. They need to look and investigate thoroughly from both sides.
Posted by Samsungiphone
 - January 20, 2023, 18:25:53
I think that even though the driver put the car in Autopilot, the driver was still at fault. First of all, the driver should have been aware that the car was slowing down, and if he was, why didn't he accelerate? Secondly, if the driver wasn't aware of the car slowing down, that would mean the driver wasn't paying attention and was busy doing something else, whereas Tesla always says all drivers must be attentive when using Autopilot. Last, the drivers behind the Tesla should have seen the brake lights and should have had enough time to hit their brakes. Which means that all the drivers in the pile-up were driving too close (tailgating), so they didn't have enough time to brake, or they were too busy themselves to brake on time.
The summary of this comment is that people who drive Tesla cars NEED to be attentive at all times while using Autopilot. It is impossible that the Tesla driver didn't notice or feel the car slowing down; if he did, he would have taken action by stepping on the accelerator. So although the driver put the car in Autopilot and the FSD malfunctioned by slowing down, it's still the driver who is at fault for NOT STOPPING the car from slowing down, which turned into a pile-up. Has anyone looked at the cabin camera footage from the seconds before the accident? That's the only real proof of what the driver's actions were when he realized the car was slowing to a stop. Because again, if the driver blamed it on FSD and knew the car was slowing down, he could have stepped on the brake to disengage Autopilot and continue driving normally. So I believe it's not TESLA's fault for a defective FSD. IT'S TOTALLY THE DRIVER'S FAULT FOR NOT PAYING ATTENTION. Tesla has warning alerts whenever the car will take any action. If the driver claims he's not at fault and puts the blame on Tesla for a defective FSD, it's BS; he just wants to get out of paying for the accident and the increase in his insurance, and wants to get money from Tesla. We all know that TESLA cars are not perfect, and that goes for all brands of EV cars. With that in mind, people who buy EV cars are not informed that they must stay attentive when driving even with autopilot features, except Tesla owners. Tesla informs its owners that they still need to be attentive when driving. Tesla owners also read it when setting the car up and when they get certain software updates. The blue flashing alert light on the screen can be seen while you're driving; people always want to know how fast they are going. So the driver can't say there was no warning for him to take action. THE DRIVER IS AT FAULT.
Posted by MarkF
 - January 20, 2023, 17:28:16
Annemouse: I take it you have never been involved in a chain-reaction crash. When the car in front of you goes from 50-60 miles an hour to 0 almost instantly because it ran into something (not because of braking, because braking is slower than crashing), it takes you a fraction of a second to react, and that fraction of a second is magnified by the number of cars in the line. Even when cars don't crash from it, if you see a sudden slowdown on the highway, it is always a couple of cars back in the line where someone has to swerve off onto the shoulder to avoid a crash, because they didn't have the same time to react to the problem. In this case, if it hadn't happened in a tunnel with no escape path or shoulder, maybe the accident doesn't happen, but in the real world tunnels exist, and so do regular roads with no escape path.

And to those who say cutting someone off is not illegal: actually it is. It is called either reckless driving or, in some states, aggressive driving, and both of them can get you a ticket and points on your license.
Posted by Annemouse
 - January 20, 2023, 17:13:30
So maybe the first car behind the Tesla has an excuse if the Tesla merged aggressively when there wasn't actually room to do so safely. The rest behind the first? No excuse.
Posted by Annemouse
 - January 20, 2023, 16:56:48
Tailgating causes wrecks. Leave enough room to stop. Play stupid games, win stupid prizes.
Posted by Not a Tesla fan
 - January 20, 2023, 16:22:28
In short, no competent, sane driver would consider it a good idea to pull into the fast lane and slam on the brakes in heavy traffic in a high speed tunnel.

The rest of us base our driving decisions on the assumption that the drivers around us are competent and sane. If we had to prepare for insane actions, traffic would simply not work.

Enter Tesla FSD. Its actions in this case show that it is either incompetent or insane by human standards. If we allow it on public roads in its current form, every assumption we make about other drivers becomes invalid. The actions that allow most driving to be safe become impossible.

Current FSD is not compatible with humans driving cars. Let Elon use his billions to build a private test city where he can experiment to his heart's content in a controlled environment. That's science, and when his product is ready to make the roads safer, I'll be first in line.

But what he's doing isn't science by any standard. He is a ten-year-old mixing random chemicals from the kitchen. He might invent a cancer cure, or he might generate chlorine gas and kill his whole family.

He rolls the dice. We pay the price.
Posted by Not a Tesla fan
 - January 20, 2023, 15:56:14
FSD could be better than human drivers, or it could be worse. It doesn't matter, because in a situation like a tunnel, where cars are traveling at high speed, in close quarters, with no place to escape, all that matters is that it is different. Highways and driving in general work because drivers have a reasonable expectation of how other drivers will react. When random drivers do something unexpected, what was a reasonable following distance is no longer enough. Add to that the limited amount of training data that Teslas have in extreme situations like developing accidents, and you have a near guarantee that something like this accident will happen now and then.

To be generous, Tesla FSD is a novice driver with some savant-level skills. Sometimes the savant talents save lives, but sometimes the novice reveals itself and bad things happen. In a reasonable world, this is what beta testing is about, and everyone involved in the beta test knows the risks. Tesla has decided to break the rules of beta by forcing these risks onto other people - drivers, cyclists and pedestrians - who have not signed up and are unaware they are part of the experiment. Effectively all of us on the road are Elon Musk's lab rats, like it or not. Or maybe we're his pylons.

In any case, Tesla's experiments in self driving in live traffic are so far over the ethical line that they should have been shut down on day one. The driver who turned on FSD in the Bay Bridge tunnel is an idiot who shouldn't be trusted again with any vehicle more dangerous than a bicycle. And we should all be glad there were no fuel trucks or hazardous cargoes in range of his stupid actions.
Posted by Leslie Tan
 - January 20, 2023, 15:30:19
I do have Autopilot but not FSD. I enjoy driving and typically don't use that feature except when taking a quick sip of water or coffee AND the roads look clear.
Ultimately, these systems are labeled "Driver's Assistance Systems" ... meaning the driver is and should be situationally aware AND in control.
A multitude of factors can and will contribute to AP and FSD not responding to conditions on the road.
Net-net, the Driver is and should always be driving and in control.
Posted by Bigger Clown Above
 - January 20, 2023, 14:58:06
The problem with your spiel is that you assume that just because the Tesla didn't clip the car in the other lane, it left enough room for any car coming in that lane to stop safely. A safe following distance assumes the car in front of you will slow at a certain rate; in a line of cars that suddenly goes from highway speed to a dead stop from a collision, the further back in that line you are, the less reaction time you have to the sudden stop of the crashed cars ahead. All this happens in seconds. Anyone who has ever been in an accident at highway speeds will tell you that you don't even have time to think about what's happening; it is all just reflex reactions. Reminder: the article says it changed lanes TO THE LEFT AND SUDDENLY DECELERATED TO 7 MPH. This was on a busy highway; the car switched lanes from the slow lane to the passing lane and hard-braked.


To: Joel

I really don't see anyone saying the driver of the Tesla shouldn't have taken control from the car; of course the driver is ultimately responsible. But facts are facts, and the simple fact here is that this accident doesn't happen at all if the car isn't equipped with a self-driving feature that is glitchy at best and outright dangerous at worst.
Posted by Clown above
 - January 20, 2023, 12:15:50
The problem with the clown above's asinine spiel is that he's completely absolving all the cars behind of any responsibility. The Tesla had enough space to merge, so the vehicle that was "cut off" should have compensated for the sudden lack of space in front of it.

You are responsible for your actions in your vehicle. You are supposed to be paying attention, and cutting someone off isn't illegal. If someone turns their signal on, you yield; you do not speed up and block them.
Posted by Bobo is retarded
 - January 20, 2023, 08:36:50
To all these mouth-breathers who should never have gotten their licenses, claiming the cars behind were following too close:

1. The Tesla cut off a whole row of drivers in the passing lane and floored the brakes. The Tesla immediately made the following distance too short where it was fine before.

2. Drivers with situational awareness see more than just the car ahead of them. Seeing an open road ahead and not expecting a toy car to slam on the brakes after cutting off the lead car, they maintained a distance adequate for responding to a reasonable rate of deceleration. After all, I'm willing to bet none of you comes to a dead stop for the full 3 seconds at every stop sign.

3. On a crowded road like this, if you leave 3 seconds of "safe following distance," I guarantee 2-3 cars will merge in front of you, instantly reducing the space back to what it was. Unless you want to spend your entire commute decelerating and holding up the passing lane like a jackass, you do not drive like this.
Posted by Bobo
 - January 20, 2023, 06:11:24
Fact is, even if someone stops, everyone behind them should be able to stop as well. If not, they are either not paying attention or following too closely. Also, it's likely the moron behind the wheel fell asleep and ignored the warnings to pay attention, so the car just stopped.

The cause of this accident wasn't the car, it was a bunch of idiots who can't drive.