Tesla’s Full Self-Driving Beta performs so poorly in some situations that it endangers other road users, according to a new video.
Autonomous driving is one of the most heavily researched topics in the automobile industry, and Tesla appears to be at the forefront. Recently, Tesla’s new “Full Self-Driving” Beta software was made available to a small group of owners, and YouTuber AI Addict has uploaded a video showing the software in action. Don’t watch it expecting the next great invention, because the software clearly isn’t ready.
In the video, AI Addict takes his Model 3 running FSD Beta 8.2 onto Oakland’s streets, and the software seems confused and slow to react most of the time. Although the feature is advertised and referred to as full self-driving, the video makes clear that the driver needs to pay attention to the road at all times. The ride starts well, with the car correctly passing two parked vehicles in front of it, but quickly deteriorates when it tries to park itself on the centerline of a left-turn lane.
Small mistakes like bad parking and poor lane selection may be acceptable during a testing phase as part of development. However, the system also made decisions that could have caused an accident. Notably, the car nearly crashed when it rolled past a stop sign into oncoming cross-traffic. The maneuver risked a collision with two vehicles, and only the other driver’s reaction kept the situation from escalating.
Another near-collision occurred when the FSD system struggled to navigate an intersection and almost drove straight into a fence before the driver grabbed the wheel to prevent it. These incidents show how reckless it is to have such software running on public streets, posing a real risk of accidents and property damage.
Tesla has been using its customers as guinea pigs to probe the software’s limitations and improve it. But the beta is far from a finished product. It can detect basic objects like cars, stop signs, pedestrians, bikes, and traffic lights, yet it often fails to act on that information correctly.
What’s more, Tesla will not take responsibility if an owner has an accident while the vehicle is in FSD mode. That’s because the Full Self-Driving software comes with a disclaimer that it is not actually a full self-driving feature. Go figure. It’s like buying a pack of candy and finding baked beans inside. Tesla, and especially regulators, need to address this soon, before people are injured or, even worse, killed.