The family of a North Carolina man who died after driving his car off a collapsed bridge while following Google Maps directions is suing the technology giant for negligence.
I’m going to get downvoted to hell here, but if you defend Google here, you should be defending Tesla when someone severely misuses Autopilot.
Play games on AP and don’t pay attention, causing a crash? Not Tesla’s fault. Drive off a bridge because the GPS tells you to? Not Google’s fault.
You’re responsible for driving your car at all times.
The biggest fault here would be whoever was in charge of that bridge. If it collapsed 9 years ago, why was it not blocked off?
My exact first thought. And why not a billion BIG Red SIGNS saying shit like: “Collapsed Bridge ahead”, “Warning: immediate death ahead”, “What the fuck are you doing?! Turn around”, etc.
That’s the biggest question for me.
A GPS is a tool that aids a person.
Tesla FSD is marketed as Self Driving.
User error caused this man to go over a cliff. User error does not excuse Tesla accidents when the user is supposed to be hands-off. One was an accident caused by a person; the other was an accident caused by a machine attempting to make human decisions. There’s a huge difference.
I wasn’t talking about FSD, I was talking about AP.
Although if you use FSD, you need to acknowledge this (among other things) to sign up:
“Full Self-Driving is in early limited access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. Do not become complacent.”
If it leaves Beta in V12 and that warning is gone, there will probably be problems =( It’s not ready to lose such an extreme warning. And it legit shouldn’t leave Beta until they take on liability and it’s legit FSD.