
Caribbeans

(784 posts)
Wed Dec 21, 2022, 08:31 PM Dec 2022

Tesla 'full self-driving' triggered an eight-car crash, a driver tells police

Source: CNN

A driver told authorities that their Tesla's "full self-driving" software braked unexpectedly and triggered an eight-car pileup in the San Francisco Bay Area last month that led to nine people being treated for minor injuries, including one juvenile who was hospitalized, according to a California Highway Patrol traffic crash report.

CNN Business obtained the report detailing the crash through a public records request Wednesday. California Highway Patrol reviewed videos that show the Tesla vehicle changing lanes and slowing to a stop.

California Highway Patrol said in the Dec. 7 report that it could not confirm if “full self-driving” was active at the time of the crash. A highway patrol spokesperson told CNN Business on Wednesday that it would not determine if “full self-driving” was active, and Tesla would have that information.

The crash occurred about lunchtime on Thanksgiving, snarling traffic on Interstate 80 east of the Bay Bridge as two lanes of traffic were closed for about 90 minutes as many people traveled to holiday events. Four ambulances were called to the scene.

Read more: https://www.cnn.com/2022/12/21/business/tesla-fsd-8-car-crash/index.html



How has the public been allowed to act as guinea pigs for untested "BETA" software that does not do what is advertised? There are NO "Full Self Driving" cars in the US and there never have been. A class action lawsuit is called for. The NTSB and the NHTSA have either been compromised or they are ignorant on this issue. Which is more likely? They have shown that they are useless at doing their job regardless and should be held accountable.
41 replies
Tesla 'full self-driving' triggered an eight-car crash, a driver tells police (Original Post) Caribbeans Dec 2022 OP
Until accident and Tesla experts tell police... Alexander Of Assyria Dec 2022 #1
Is it...weird, that the CHP doesn't seem to care that dangerous cars are on the road? NullTuples Dec 2022 #2
Not at all FBaggins Dec 2022 #5
Self driving vehicles approved by government are probably safer than humans approved?! Alexander Of Assyria Dec 2022 #30
I'd believe that...if there were any consequences for them killing someone. There are not. NullTuples Dec 2022 #34
Who will ever trust Musk's products anymore, especially self-driving, after his... brush Dec 2022 #3
History shows this man must never be trusted with anything at all Caribbeans Dec 2022 #6
His performance is just like Trump. LiberalFighter Dec 2022 #22
We're a long way away from self driving cars being viable on public roads. patphil Dec 2022 #4
It will never work with individual cars controlling themselves. ALL the cars would need to ZonkerHarris Dec 2022 #8
I agree, there'd have to be an electronic conversation going on between all the vehicles on the road patphil Dec 2022 #9
Yup. And when those conversations aren't happening? Or are hacked? erronis Dec 2022 #11
Chaos! patphil Dec 2022 #16
2 sci-fi movies come to mind that explored the possibilities and problems. canuckledragger Dec 2022 #21
MAGA's would sue for being tracked. rickford66 Dec 2022 #36
But having a bunch of loose nuts behind the steering wheel is better? erronis Dec 2022 #10
Planes and Ferries are a completely different use case for automation oioioi Dec 2022 #24
Well, according to NOVA "Look Who's Driving," 100M cars are on roads each day. Hestia Dec 2022 #38
Agreed. Given the infinite number of situations that can occur... brush Dec 2022 #28
Human-driven vehicles cause how many accidents, minor and fatal? Alexander Of Assyria Dec 2022 #31
Computers aren't going to see incidents developing off the road that'll have an impact on the road. patphil Dec 2022 #33
From the NOVA program, self-driving will probably happen for your great-great grandchildren Hestia Dec 2022 #39
They Musked up. ZonkerHarris Dec 2022 #7
Anybody that bought into this rich boy's schemes has been thoroughly musked. erronis Dec 2022 #12
Would think auto insurance companies would be all over this. cbabe Dec 2022 #13
Let me just make moniss Dec 2022 #14
Wait, there is no way this pileup is the fault of the Tesla Random Boomer Dec 2022 #15
In reality it's impossible to maintain the recommended safe following distance on that road. NullTuples Dec 2022 #17
And on the metro California freeways, keeping 1 car length apart for each 10 mph of speed is iluvtennis Dec 2022 #18
It probably won't work if people keep cutting in, Dysfunctional Dec 2022 #19
But again, that's not the lead driver's responsibility Random Boomer Dec 2022 #29
"How many cars have this issue and how often?" Caribbeans Dec 2022 #37
One has to deliberately enable Tesla's autopilot Zorro Dec 2022 #40
Correct. cab67 Dec 2022 #35
If you are behind the wheel of a car YOU are responsible. MicaelS Dec 2022 #20
In June, NHTSA published a report on 10 months of Level 2 ADAS crash data. OnlinePoker Dec 2022 #23
More trains and trolleys, please. Yavin4 Dec 2022 #25
As if drivers. Old Crank Dec 2022 #26
ANYTHING E-Loon's selling......... wolfie001 Dec 2022 #27
The stock is taking a nosedive just like the cars. live love laugh Dec 2022 #32
These cars should never have been allowed on the road. ificandream Dec 2022 #41

FBaggins

(26,793 posts)
5. Not at all
Wed Dec 21, 2022, 08:57 PM
Dec 2022

The driver is responsible either way (he can sue Tesla if he thinks the car is responsible). That's what police care about.

There really isn't much "self driving" unique to Tesla here. Lots of cars have the ability to brake on their own these days. The state doesn't have a need to take much notice unless/until accidents caused by cars outnumber those avoided by the technology.

And, of course, it's entirely possible that the driver is lying. Lots of claims of "unexpected acceleration/deceleration" end up being cases of the driver hitting the wrong pedal.

NullTuples

(6,017 posts)
34. I'd believe that...if there were any consequences for them killing someone. There are not.
Thu Dec 22, 2022, 02:23 PM
Dec 2022

(outside of private lawsuits)

brush

(53,978 posts)
3. Who will ever trust Musk's products anymore, especially self-driving, after his...
Wed Dec 21, 2022, 08:49 PM
Dec 2022

disastrous tenure at twitter? The man is damaging both companies.

Caribbeans

(784 posts)
6. History shows this man must never be trusted with anything at all
Wed Dec 21, 2022, 09:06 PM
Dec 2022

From 2015, when fElon was regarded as the second coming of Christ by almost everyone in the friggin world

DALE VINCE V ELON MUSK: ELECTRIC CAR TSARS AT WAR OVER MOTORWAY CHARGING STATIONS

Driving.co.uk | 18 March 2015

IN A windswept storage yard at a service station off the M25 lies a forlorn reminder of Britain’s electric superhighway dream. Standing amid the discarded boxes from Burger King at the back of South Mimms services is a packing case festooned with faded shipping notices to say that it belongs to the electric car company Tesla...snip

...In the City they call them fat-fingered trades — when a trader inadvertently hits the wrong key on his computer and costs his bank millions. The price of the email — sent in error late on Sunday, May 18, last year from a Tesla employee to someone at Ecotricity with the same first name as its intended recipient — has yet to be determined, but it could easily run into the millions too.

The email allegedly made clear Tesla’s plans to break up the partnership it had built with Ecotricity to install chargers in service areas and instead to make a direct approach to their operators. Ecotricity says that the email — written in haste after an article last year in The Sunday Times that disclosed Musk’s concept of an electric superhighway in Britain — also revealed Tesla’s intention to blacken Ecotricity’s name with politicians and the media. Instead of being sent to Simon Sproule, then Tesla’s head of corporate communications, the email was sent by mistake to Simon Crowfoot of Ecotricity, alerting the company to Tesla’s plans.

Vince said the email was evidence not only of underhandedness but of a “brutal” corporate culture within the American firm.

“I have never seen anything approaching Tesla’s behaviour and we have been around 20 years this year and we’ve had some run-ins, you know. I do think it’s cultural — I just think that’s how they operate.” more
https://www.driving.co.uk/news/dale-vince-vs-elon-musk/


Of course this happened AFTER Tesla's bald-faced lies about "Battery Swapping" in June 2013 - can you find the lie here?

fElon MusQ is a liar, a cheat and a thief. If his lips move, he's lying. Shame on billions of people for not recognizing this years ago.

patphil

(6,258 posts)
4. We're a long way away from self driving cars being viable on public roads.
Wed Dec 21, 2022, 08:54 PM
Dec 2022

They need to react to a near-infinite number of variations on millions of possible situations... it's impossible to react reliably to them all. And that doesn't include mechanical problems with the vehicle itself.
A class action suit is the only way to stop this from becoming a major problem.
Imagine a self-driving 40 ton semi traveling at 70 miles an hour, and the havoc it could create.
It's bad enough when a human is behind the wheel, but take out the human and anything can happen.
Several decades ago, I had a Ford Tempo with a computerized ignition system. It had a problem such that when I stepped on the gas the vehicle slowed down, and when I took my foot off the gas it sped up. The electronic ignition/fuel injection messed up the air-fuel mixture, creating a potentially dangerous situation. It only corrected after a hard reset was performed by shutting off the engine.
I think we are over a decade away from this sort of thing being safe and reliable.

ZonkerHarris

(24,304 posts)
8. It will never work with individual cars controlling themselves. ALL the cars would need to
Wed Dec 21, 2022, 09:11 PM
Dec 2022

be controlled by one master controller to be truly effective.

canuckledragger

(1,671 posts)
21. 2 sci-fi movies come to mind that explored the possibilities and problems.
Wed Dec 21, 2022, 10:52 PM
Dec 2022

The Marvel movie 'Logan', set a few decades into the near future, shows automated freight trucks causing the kind of problems you'd expect by hitting unwary animals/pedestrians on the highway.

And the movie AI demonstrated a type of automated system where all the cars going onto the highway/etc. have a central controller... whose AI was murderous. (Think of potential glitches affecting 100s of vehicles at once.)

It's one of the reasons I like good sci-fi movies...someone with a good imagination can explore all sorts of possibilities and problems without putting anyone in danger.

erronis

(15,470 posts)
10. But having a bunch of loose nuts behind the steering wheel is better?
Wed Dec 21, 2022, 09:29 PM
Dec 2022

I know it won't happen in a controlled scientific experiment but we're very likely to get a lot of good results in real life.

What is the ratio of serious incidents when a human is fully in charge vs. when some electronics are in charge?

The airlines seem to feel that a huge percentage of flight decisions should be made by on-board (and some centralized) systems.

The ferries in Puget Sound have had far fewer major incidents since computers took over.

oioioi

(1,127 posts)
24. Planes and Ferries are a completely different use case for automation
Thu Dec 22, 2022, 12:11 AM
Dec 2022

Planes have a lot of people coordinating and maintaining them and their actual travel conditions and traffic control is relatively predictable and consistent - same with Ferries.

The idea of integrating self-driving vehicles with an existing infrastructure that is shared with human drivers is completely different.

Spatial simplicity such as being on a straight road / waterway or flying lends itself well to software control. The variables are relatively few and largely predictable. A kid is not going to chase a bouncing ball into a plane's glide path.

Driving a vehicle requires a great deal of visual interpretation and intuition. Software doesn't operate on intuition, no matter how impressive the user interface is. It is generally rule-based, even if we call it "Artificial Intelligence".

If you haven't explored ChatGPT https://chat.openai.com/chat it's worth taking a look. It's like Wikipedia on steroids. It's meant to impress you with its authoritative language and syntax formation. It is impressive. It's mind-blowing when you first consider the implications.

Scratch the surface a little deeper however, and it will be apparent that the AI is sometimes factually incorrect about things. It's very convincing and it doesn't make very many mistakes, but it definitely does make mistakes.

Driving requires a huge amount of cognition and attention to detail - at least good driving does - and there are of course the variables introduced by the "not-so-good" drivers, which are practically impossible to predict reliably. AI and machine learning rely on probability. They will get stuff right a lot of the time - often 99.999 percent or more of the time. But with so many uncontrollable variables, sensory inputs and moving parts involved in self-driving on today's public roads, there will always be the potential for a catastrophic mistake.


Hestia

(3,818 posts)
38. Well, according to NOVA "Look Who's Driving," 100M cars are on roads each day.
Sat Dec 24, 2022, 02:21 AM
Dec 2022

I think I've read that there are 17,000 +/- accidents each day; not bad averages
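As a back-of-envelope sanity check of those two figures (both are the approximations quoted above, not official statistics), the implied daily odds look like this:

```python
# Rough crash rate implied by the numbers quoted above:
# ~100M cars on the road each day, ~17,000 accidents each day.
cars_on_road_per_day = 100_000_000
accidents_per_day = 17_000

daily_rate = accidents_per_day / cars_on_road_per_day
print(f"Implied odds of a crash on a given day: about 1 in {round(1 / daily_rate):,}")
print(f"That is roughly {daily_rate:.3%} of cars on the road per day")
```

Low per-day odds, but they compound over years of daily driving, which is why the absolute accident counts are still large.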

brush

(53,978 posts)
28. Agreed. Given the infinite number of situations that can occur...
Thu Dec 22, 2022, 09:04 AM
Dec 2022

I doubt manufacturers will ever be able to program for the unimaginable number of situations that can come up on the road, especially in bad-weather climates.


Alexander Of Assyria

(7,839 posts)
31. Human-driven vehicles cause how many accidents, minor and fatal?
Thu Dec 22, 2022, 01:47 PM
Dec 2022

At least computers don’t need to eat a burger and fries while driving! Or look at cellphone message from mom to get milk!

Crash!!

patphil

(6,258 posts)
33. Computers aren't going to see incidents developing off the road that'll have an impact on the road.
Thu Dec 22, 2022, 02:02 PM
Dec 2022

Like an accident happening on an overpass that will rain debris down onto the road in front of the car.
Or a strong wind blowing across the road that could push the car into a different lane.
Or a load on a truck in front of you beginning to break free.
A human driver sees many things that current computer driven cars can't; dangerous situations that are just developing that the computer won't react to until it's too late.
There's an old saying in the computer programming world: "To err is human, but to really mess things up it takes a computer."

I think we are at least 10-15 years away from driverless cars being common on the road.


Hestia

(3,818 posts)
39. From the NOVA program, self-driving will probably happen for your great-great grandchildren
Sat Dec 24, 2022, 02:25 AM
Dec 2022

10-15 years is extremely rosy, and for the reasons you stated above; humans can see and react, whereas computers can't.

It's the reason humans still work at Amazon warehouses - humans can map out a warehouse much better than robots, which can only do one task at a time, whereas humans can see what is in their space and work with and around the information.

moniss

(4,274 posts)
14. Let me just make
Wed Dec 21, 2022, 09:47 PM
Dec 2022

a suggestion to everyone about this subject of "self-driving". Honestly if you can't be responsible enough to drive your car yourself then maybe you shouldn't be on the road. Make sure you maintain the tires and brakes. Check your fluids, lights, mirrors, horn, wipers etc. before you start driving for the day. It only takes a couple of minutes at most. See the other vehicles and be seen is the rule. Drive near the speed limit and slow down before making corners. This is not a race. Don't tailgate. Pay attention to the road and don't be distracted by your phone, GPS, kids etc. Don't blast your sound system so loud that you can't hear the siren of an emergency vehicle, a sudden odd noise from your vehicle or the horn from a fellow traveler. Obviously don't indulge and drive.

If you just heed the basics as I've described, you have the best chance to come back in one piece and not be a part of someone else not coming back. You don't need "self-driving" or idiot Musk. You need you. All he wants is your check to clear.

Random Boomer

(4,170 posts)
15. Wait, there is no way this pileup is the fault of the Tesla
Wed Dec 21, 2022, 09:48 PM
Dec 2022

It doesn't matter why you come to a sudden, full stop on a highway; the cars behind you are responsible for managing the distance that gives them time to stop safely.

So, even though I wouldn't drive a Tesla on a bet, much less own one, the crash itself is wholly the fault of the drivers in the cars behind the stopped Tesla.

NullTuples

(6,017 posts)
17. In reality it's impossible to maintain the recommended safe following distance on that road.
Wed Dec 21, 2022, 10:00 PM
Dec 2022

I've been on the section of I-80 where the accident occurred countless times over the last few decades. At many times of day if you were to try to fade back the recommended 16 car lengths (at that speed), 6-8 cars would immediately fill in the space. Even in the slow lane, which then requires constant merging back into the main roadway thanks to exit-only lanes, a following distance of 10 or even 8 car lengths is immediately and continually filled with other vehicles. Simply put, the rules of safe stopping distance are often rendered irrelevant. It's why I don't like that stretch of road and avoid it if I can. Not everyone has that luxury.

iluvtennis

(19,912 posts)
18. And on the metro California freeways, keeping 1 car length apart for each 10 mph of speed is
Wed Dec 21, 2022, 10:08 PM
Dec 2022

IMPOSSIBLE during commute hours (and in some areas of California, commute hours are 24/7).


Dysfunctional

(452 posts)
19. It probably won't work if people keep cutting in,
Wed Dec 21, 2022, 10:24 PM
Dec 2022

but 1 car length for each 10 mph is hard to judge. I use the 2-second rule. When the driver in front of you passes a spot you can see, count 2 seconds. You are too close if you reach the spot before the 2 seconds are up.
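The two rules of thumb in this subthread can be compared directly. This is just a sketch; the ~15 ft car length is my assumption, not a figure from the thread:

```python
# Compare "one car length per 10 mph" with the two-second rule.
# CAR_LENGTH_FT is an assumed typical car length, not an official value.
CAR_LENGTH_FT = 15
FT_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

def car_length_rule_ft(speed_mph: float) -> float:
    """Gap suggested by 'one car length per 10 mph'."""
    return (speed_mph / 10) * CAR_LENGTH_FT

def two_second_rule_ft(speed_mph: float) -> float:
    """Distance covered in 2 seconds at the given speed."""
    return speed_mph * FT_PER_MILE / SECONDS_PER_HOUR * 2

for mph in (30, 65):
    print(f"{mph} mph: {car_length_rule_ft(mph):.0f} ft by car lengths, "
          f"{two_second_rule_ft(mph):.0f} ft by the two-second rule")
```

At highway speeds the two-second rule asks for nearly double the gap of the car-length rule, and counting seconds is easier to judge than estimating car lengths, which matches the point above.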

Random Boomer

(4,170 posts)
29. But again, that's not the lead driver's responsibility
Thu Dec 22, 2022, 10:45 AM
Dec 2022

There are any number of reasons that a car could come to a dead-stop on a highway: debris, system failure, deer collision. Legally, it's the responsibility of the drivers behind that car to avoid a collision. The fact that our highways make safety practically impossible to apply is an indictment of our transportation system, rather than of this particular Tesla.

The crucial question is more "How many cars have this issue and how often?" Is Tesla unique in this or is it just getting the blare of publicity compared to other cars?

We bought a PT Cruiser with a faulty chip that also caused it to stop without warning, but fortunately that was on a city street and no collisions resulted. So I know from personal experience that this can happen, yet there aren't any blaring headlines about PT Cruisers (utter POS car, btw).

Caribbeans

(784 posts)
37. "How many cars have this issue and how often?"
Thu Dec 22, 2022, 06:51 PM
Dec 2022

This "Phantom Braking" (that's what Tesla owners call it) has been apparent for years, yet not a THING has been done about it.

Here are more examples than anyone would want to know about, along with detailed descriptions, from the most popular Tesla owners' forum, Tesla Motors Club - not a hate group.

Search results for query: Phantom Braking at teslamotorsclub.com

https://teslamotorsclub.com/tmc/search/4840524/?q=Phantom+Braking&o=relevance

Who consented to be a BETA test subject for Tesla?

More importantly - WHERE IS NHTSA AND OR NTSB?????????????

They have allowed this to take place for years and done NOTHING.


Zorro

(15,757 posts)
40. One has to deliberately enable Tesla's autopilot
Sat Dec 24, 2022, 09:27 AM
Dec 2022

Thus you are consenting to be a BETA test subject when you do so. And Tesla labels autopilot as beta.

cab67

(3,012 posts)
35. Correct.
Thu Dec 22, 2022, 02:40 PM
Dec 2022

I had an elderly aunt who lived in a fairly rural part of her state. She drove to the nearest big city - Boston, in her case - and ended up being pulled over for driving too slowly. The problem? She kept trying to keep a safe distance between her car and the car in front of her, but every time that space was achieved, more cars filled the gap. She ended up slowing down below the minimum speed limit.

(We were horrified to learn she'd been trying to drive anywhere, much less through Boston.)

MicaelS

(8,747 posts)
20. If you are behind the wheel of a car YOU are responsible.
Wed Dec 21, 2022, 10:37 PM
Dec 2022

No matter what cruise control the vehicle has.

OnlinePoker

(5,730 posts)
23. In June, NHTSA published a report on 10 months of Level 2 ADAS crash data.
Wed Dec 21, 2022, 11:26 PM
Dec 2022

273 of the 392 crashes involved Teslas, followed by 90 for Honda and 10 for Subaru in the top 3. On the face of it, this would make Teslas look particularly dangerous, but Tesla had the lion's share of vehicles with Level 2 ADAS on the road (over 800,000). What isn't said in any reports I can find is how these crashes compare to a similar number of vehicles not using ADAS, or how much mileage is being driven compared to non-ADAS vehicles. I do know that in crash ratings, Tesla rates 5 stars across the board and has for several years on all models (that were rated).

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf
https://www.nhtsa.gov/ratings
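A rough per-fleet normalization of the counts quoted above. Only Tesla's fleet size (over 800,000 ADAS-equipped vehicles) appears in the post, so rates for Honda and Subaru can't be computed from these figures alone:

```python
# Normalize the NHTSA Level 2 ADAS crash count quoted above by the one
# fleet size the post gives. Figures are from the post, not the full report.
tesla_crashes = 273
tesla_adas_fleet = 800_000

per_100k = tesla_crashes / tesla_adas_fleet * 100_000
print(f"Tesla: roughly {per_100k:.1f} reported crashes per 100,000 "
      "ADAS-equipped vehicles over the ~10-month window")
```

Without fleet sizes and mileage for the other makes, the raw counts can't show whether Teslas crash more often per mile driven, which is exactly the gap the post identifies.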

Old Crank

(3,679 posts)
26. As if drivers.
Thu Dec 22, 2022, 05:14 AM
Dec 2022

As if drivers need a new get out of jail free card.
To go with,
The sun was in my eyes
Came out of nowhere......

wolfie001

(2,321 posts)
27. ANYTHING E-Loon's selling.........
Thu Dec 22, 2022, 06:23 AM
Dec 2022

.....I ain't buying. He's forever linked to Donnie "Fats" in my book.
