GP: The owner of a Tesla told police the electric vehicle’s self-driving feature apparently malfunctioned on Thanksgiving Day near San Francisco’s Bay Bridge, causing an eight-car pileup on Interstate 80.
The car was going about 55 miles per hour when it suddenly moved over to the far left lane and abruptly braked, CNN Business reported. That maneuver caused a chain reaction of vehicles slamming into each other at freeway speeds, according to the report.
Four ambulances were called to the scene and nine people were treated for minor injuries, including a juvenile who was hospitalized, CNN reported.
The news outlet said it used a public records request to get the California Highway Patrol's crash report. That report indicated the law enforcement agency was unable to confirm whether full self-driving mode had been active at the time of the incident.
I’m pretty sure Teslas have a “black box” that will straighten this out.
Robots in manufacturing have all kinds of safety requirements: dedicated redundant safety relays, safety cages, light curtains, proximity sensors, laser sensors, load cells, independent redundant I/O, etc.
They are effectively putting robots on the roadways with realistically NONE of these features.
These cars rely on and require a lot of information and communication with external systems.
I find it incredible they are unleashing this shit on the public.
It’s easily 20 years too soon if it ever is truly viable.
SMDH!
I’ve worked with autonomous, self-guided, battery-powered vehicles in a controlled, enclosed industrial setting, run by trained operators, for 28 years now.
And the little bastards can still surprise me.
And by “little,” I mean these things are 6,500 pounds of stainless and battery fully loaded, 15 feet long, and never move faster than a couple MPH, and they can still slip on water, get diverted if they hit a floor failure, shove an empty pallet into an ankle because the rotating laser doesn’t look that low, and do odd things when their encoders, drives, output cards, input cards, or the cables to any of it fail. Mind you, these were built by some of the best companies in the industry, with long track records of making such vehicles, and with thousands spent annually on maintaining them, using best practices like TPM carried out by skilled labor.
So how much MORE likely are these failures in the wild, in vehicles driven by people who just get in and go, at MUCH higher speeds in varying climates, poor roads, uncertain markings, and around people who may not know the car is driving itself?
…One of the biggest hazards to industrial AGVs is interacting with operator-driven vehicles like forklifts, floor scrubbers, and trucks. The AGVs have a sensor package that usually stops them, but the guy looking at his phone on the forklift doesn’t.
It’s quite noisy when eight tons of solid-steel vehicles collide at perhaps a combined two miles an hour.
Now do this outside, in the snow, in hollow sheet metal vehicles at 70 MPH.
I myself have a Kia 4WD that has a ‘driver assist’ feature where it can and does drive itself for short stretches. I drive 90 miles round trip every day, so it’s a Godsend to me that it can do this, but I would NEVER fully trust it. I don’t know its algorithm or how its machine vision works exactly, but it tends to follow the line on the right, so it wants to exit the freeway as the line pulls away. It also sometimes decides to stop guiding with little warning, mostly a small green steering wheel icon that goes white. Rain and fog can wig it out, and it ain’t great at seeing deer (or pedestrians or cars) that leap in from the side suddenly. It also sometimes twitches left or right unexpectedly for no reason I can discern. And it has little control authority, so it’s easily overridden by relatively small driver inputs that may not even be intentional.
It has its place, but its place isn’t fully replacing the driver.
Planes with highly developed autopilots in clear skies operating under rigid rules by highly trained professionals sometimes get destroyed with a loss of all crew and passengers by a mud dauber’s nest in a pitot tube.
Until and unless automation is made by Jesus, it will NEVER be infallible.
Anyone who fully trusts it in a poorly controlled system like a highway is riding for a fall.
Good luck with that.
@LocoBlancoSaltine — Tesla’s Full Self-Driving system is in beta testing. Owners/drivers are the crash test dummies (in more ways than one), public roadways are the testing environment, and the rest of us using the roads are the test case load.
If you squint a little, the setup looks a bit like the current beta testing of Pharma’s hack of the immune system using mRNA.
My fervent hope is that transportation secretary choo-choo Pete gets run down and slaughtered to a bloody pulp by an autonomous vehicle.
That way they will be deemed homophobic and taken off the road.
It’s the only way…
Roomba is exotic enough for me.
Is it my imagination, or did we just not hear many stories like this before Musk bought Twitter?
Gawd, I wish for the day I won’t see his name in a headline. Soon, I hope.
I can’t believe sns never considered the natural-gas-powered Chevy Bolt. Charging at night would cut his transportation costs big time; natural gas is what powers most electric generation these days. From the reviews on the Bolt, most owners never even bring their cars to the shop in the first 100,000 miles. Tires only; the brakes never need service.
Anyone notice that there are NO SELF DRIVING MOTORCYCLES?
If your hands are not on the wheel, you should not be in the driver’s seat. Both my parents’ vehicles have “Eyesight” auto-braking assist, and the manual tells you very clearly that you are the driver, NOT THE CAR. Lane departure and position warnings are all fine, BUT it’s your fuckin’ car.
I recently had a Toyota rental car. At first I couldn’t figure out why the steering would suddenly get real stiff when I took a certain fork in the road. I finally figured out it was a lane departure feature. If I used the turn signal, it wouldn’t resist. I was happy to turn the car in.