Tesla recently launched the beta version of its “Full Self-Driving” (FSD) technology, a more advanced version of the Autopilot system in Elon Musk’s cars. The add-on costs $10,000 and offers self-parking, traffic-light and stop-sign recognition, and the ability to take on-ramps and highway exits.
Tesla’s Autopilot, however, is not quite so automatic. In a video posted on March 12, YouTube user AI Addict drives a Tesla Model 3 running the FSD beta. The vehicle can be seen swerving around objects and other cars, trying to drive down railway tracks, and even hitting a row of bollards separating the road from a bike lane. The only thing that prevented accidents was the driver, who quickly took back control.
In another video from the same channel, the user drives through Oakland, in the United States, and watches the system steer toward the wrong side of the road and nearly collide with other cars several times.
“As is evident, FSD 8.2 doesn’t work as well in big cities like Oakland,” the creator says at the end of the video. “It seems to work very well, in fact excellently, in suburbs and rural areas, but it’s clear Tesla still needs further updates for these urban environments.”
The United States National Highway Traffic Safety Administration (NHTSA) recently confirmed that it is investigating 23 accidents that may have been caused by failures in Tesla’s Autopilot system.
“To make things safer, Tesla could start using in-vehicle cameras to monitor driver attention, as many other automakers do,” Jason Levine, executive director of the Center for Auto Safety, a consumer protection group, told the news site Insider. Systems like GM’s Super Cruise already perform this type of monitoring, tracking the driver’s eyes to make sure they stay on the road.
Levine also argues that renaming “Autopilot” and “FSD”, which he calls “misleading” since neither technology is truly autonomous, would be a good start. “The insistence on a totally hyperbolic description really undermines any kind of effort to present this technology in a way that does not present an irrational risk,” he said.
Tesla acknowledges that both Autopilot and FSD require constant driver attention and are not fully autonomous. However, it maintains that its cars are safe, citing internal data showing that Teslas have fewer accidents per kilometer traveled, “and even fewer when Autopilot is in use”. Finally, the company makes it clear that drivers should keep their attention on the road and follow Tesla’s recommendations to retake control of their cars when needed.