“Tesla FSD Beta Tried to Kill Me Last Night”
September 2nd, 2023
Via: Electrek:
I was testing Tesla’s latest Full Self-Driving (FSD) Beta update last night (v11.4.7), and a new, aggressive bug nearly made me crash at highway speed twice.
Please take this as a public service announcement.
I received the new FSD Beta v11.4.7 update on my Model 3 this week.
…
I was on FSD Beta with the speed set at 118 km/h (73 mph) on Highway 20 toward Montreal, and the system automatically moved to the left lane to pass a car.
As I was passing the car, I felt FSD Beta veer aggressively to the left toward the median strip. Fortunately, I use FSD Beta as recommended by Tesla, which means keeping my hands on the wheel and my eyes on the road.
I was able to steer back toward the road, which disengaged FSD Beta. It was super scary: I almost lost control when correcting FSD Beta, and again, I was passing a vehicle at the time. I could have crashed into it if I had overcorrected.
When you disengage Autopilot/FSD Beta, Tesla encourages you to send a message about why you disengaged the system.
I did that, but I wasn’t sure what happened, so my message was something like: “Autopilot just tried to kill me, so please fix it.”
All I can ask is: why would anyone ride in a car running on autopilot? I would never trust any of these computers to do anything that could potentially kill me.
I don’t know. Are these the same people who pay good money to deploy surveillance platforms from Apple, Amazon, and Google in their homes??? It’s nuts.