Rising Concerns Over Tesla’s Full Self-Driving System Safety Issues and Fatal Crashes


Posted


Questions about the safety and reliability of Tesla’s “Full Self-Driving” system are mounting, with multiple incidents and troubling experiences reported by users. William Stein, a technology analyst at Truist Securities, has personally tested Tesla’s self-driving technology three times in the past four months, responding to Elon Musk’s repeated invitations to try the latest versions of the system. Despite the company’s claims that a Tesla equipped with Full Self-Driving can travel from point to point with minimal human intervention, Stein’s experiences have consistently highlighted the system’s flaws: the car made unsafe and even illegal maneuvers during each test drive. In his most recent test earlier this month, Stein was accompanied by his 16-year-old son, who was left “terrified” by the car’s unpredictable behavior.

 

Tesla Self Driving Risks

 

Stein’s experiences, coupled with a deadly crash in the Seattle area in April involving a Tesla using Full Self-Driving that killed a motorcyclist, have drawn increased scrutiny from federal regulators. The National Highway Traffic Safety Administration (NHTSA) has been investigating Tesla’s automated driving systems for more than two years due to a series of accidents that raised concerns about safety. These incidents have intensified doubts among experts and industry observers about the viability of Tesla’s autonomous system and whether it can ever operate safely on a large scale. Stein expressed skepticism about Musk’s ambitious timeline, stating that Tesla is unlikely to deploy a fleet of autonomous robotaxis by next year, a goal Musk has repeatedly promised.

 

The timing of these safety concerns is critical for Tesla, as the company is nearing a key phase in its autonomous vehicle strategy. Musk has told investors that Full Self-Driving could surpass human driving safety standards by the end of this year or early next year. Furthermore, Tesla is set to unveil a vehicle specifically designed as a robotaxi in less than two months. Musk has maintained that for Tesla to deploy these robotaxis, the company must demonstrate to regulators that its system can drive more safely than human drivers. According to federal regulations, Tesla’s vehicles must meet national safety standards before they can operate autonomously on public roads.

 

Musk has presented data showing the number of miles driven per crash for Tesla’s less sophisticated Autopilot system. However, safety experts have criticized this data, arguing it is misleading because it only accounts for crashes severe enough to trigger airbag deployment, without considering how often human drivers had to intervene to avoid a collision. Currently, Tesla’s Full Self-Driving system is being used on public roads by approximately 500,000 Tesla owners, representing slightly more than one in five Teslas in use today. Many of these drivers paid over $8,000 for the optional system, despite the company’s warnings that cars equipped with Full Self-Driving cannot truly drive themselves and that motorists must remain vigilant and ready to intervene at any time.

 

Tesla also monitors driver behavior, suspending access to Full Self-Driving for those who fail to properly supervise the system. Recently, Tesla began referring to the system as “Full Self-Driving (Supervised),” acknowledging that human oversight is still crucial. Musk has admitted that his past predictions about autonomous driving have been overly optimistic. In 2019, he promised a fleet of autonomous vehicles by the end of 2020, a goal that remains unfulfilled five years later. Many experts who closely monitor the development of autonomous driving technology doubt that Tesla’s system can function reliably across the United States as promised. Michael Brooks, executive director of the Center for Auto Safety, stated bluntly, “It’s not even close, and it’s not going to be next year.”

 

Stein’s most recent test drive took place in a Tesla Model 3, which he picked up from a Tesla showroom in Westchester County, just north of New York City. The Model 3, Tesla’s most affordable vehicle, was equipped with the latest version of Full Self-Driving software, which Musk claims utilizes artificial intelligence to manage steering and pedal control. While the ride initially felt smooth and more human-like compared to previous versions, Stein reported that within a short drive of less than 10 miles, the car made a left turn from a through lane while running a red light, a maneuver that Stein described as “stunning.” Although there was little traffic at the time and the maneuver did not seem immediately dangerous, Stein later found himself intervening when the vehicle drove down the center of a parkway, straddling two lanes of traffic moving in the same direction.

 

In a report to investors, Stein emphasized that the latest version of Full Self-Driving does not fulfill Musk’s promise of achieving true autonomy, nor does it appear to be approaching the capabilities required for robotaxi deployment. Stein’s earlier test drives in April and July also revealed unexpected and unsafe behavior from Tesla vehicles. Despite these concerns, Tesla has not responded to requests for comment on these incidents. While Stein believes Tesla’s driving technology will eventually prove profitable, he does not foresee a future where fully autonomous robotaxis with no driver and a passenger in the back seat will be on the roads anytime soon. He predicts significant delays or restrictions on where these vehicles can operate.

 

Stein noted that there is often a substantial gap between Musk’s statements and the reality of what is achievable. Many Tesla enthusiasts have shared videos on social media of their cars driving themselves without human intervention, but these videos rarely provide a comprehensive view of how the system performs over extended periods. Other videos have captured alarming instances of dangerous behavior. Alain Kornhauser, who leads autonomous vehicle research at Princeton University, shared his own experiences after borrowing a Tesla from a friend for two weeks. While he found the car’s ability to detect pedestrians and other vehicles impressive, Kornhauser still had to intervene on several occasions when the car made unsettling maneuvers. He emphasized that Full Self-Driving is not yet ready to operate without human oversight in all settings, stating, “This thing is not at a point where it can go anywhere.”

 

Kornhauser does believe that Tesla’s autonomous system could work effectively in smaller, controlled areas within cities where detailed maps could guide the vehicles. He questioned why Musk hasn’t pursued this more incremental approach to deployment, suggesting that offering rides in limited areas could provide significant mobility benefits to the public. For years, experts have pointed out that Tesla’s reliance on cameras and computer vision for navigation has inherent limitations. Cameras are prone to failure in adverse weather conditions and low-light environments, making it challenging for the system to consistently identify and respond to objects in its path. In contrast, other companies developing autonomous robotaxi technology, such as Alphabet Inc.'s Waymo and General Motors’ Cruise, use a combination of cameras, radar, and lidar sensors to enhance detection accuracy.

 

Missy Cummings, a professor of engineering and computing at George Mason University, underscored the importance of combining multiple sensor types, arguing, “If you can’t see the world correctly, you can’t plan and move and actuate to the world correctly. Cars can’t do it with vision only.” Even vehicles equipped with radar and lidar face challenges, highlighting broader safety concerns not just for Tesla, but for the entire autonomous vehicle industry. Phil Koopman, a professor at Carnegie Mellon University specializing in autonomous vehicle safety, noted that it will take many more years before autonomous vehicles powered solely by artificial intelligence can handle the complexities of real-world driving conditions. “Machine learning has no common sense and learns narrowly from a huge number of examples,” Koopman said. “If the computer driver gets into a situation it has not been taught about, it is prone to crashing.”

 

The fatal crash in April near Seattle has further fueled concerns. Authorities reported that a Tesla using Full Self-Driving struck and killed a motorcyclist in Snohomish County. The Tesla driver, who has not been charged, admitted to using Full Self-Driving and looking at his phone when the car rear-ended the motorcyclist, who was pronounced dead at the scene. The NHTSA is currently reviewing information about the crash from Tesla and local law enforcement. The agency is also aware of Stein’s troubling experiences with Full Self-Driving.

 

The NHTSA is investigating whether a Tesla recall earlier this year, intended to enhance the vehicle’s driver monitoring system, was effective. The agency also pushed Tesla to recall Full Self-Driving in 2023 because the system could, under “certain rare circumstances,” violate traffic laws, increasing the risk of a crash. The NHTSA has not yet confirmed whether the recall has successfully addressed these issues. As Tesla’s electric vehicle sales have stagnated in recent months, despite significant price reductions, Musk has urged investors to view the company as a leader in robotics and artificial intelligence, rather than just a car manufacturer. Tesla has been working on its Full Self-Driving system since at least 2015, but progress has been slow and fraught with challenges. During a recent earnings call, Musk remarked, “I recommend anyone who doesn’t believe that Tesla will solve vehicle autonomy should not hold Tesla stock.”

 

Stein, however, advised investors to critically evaluate Full Self-Driving’s performance, noting that Tesla’s most established artificial intelligence project, which is already generating revenue and being used in real-world conditions, still has a long way to go before it can be considered truly autonomous and safe. The growing list of safety issues, fatal accidents, and missed deadlines underscores the significant hurdles Tesla faces as it strives to make Full Self-Driving a viable and widespread reality.

 

Credit: ABC News 2024-08-30

 



Posted
1 hour ago, Social Media said:

Questions about the safety and reliability of Tesla’s “Full Self-Driving” system are mounting, with multiple incidents and troubling experiences reported by users.

Surprise, surprise, surprise, NOT.

Posted

This is the exact reason why I won't buy any EV car or vehicle. Suppose I am driving home from work and a Tesla driven by a foetus smashes into me?????? Who am I supposed to sue if I am dead or injured. And who will pay for the new battery in my car???? Answer me that! Do foetuses have money? Do they have licenses and are they even insured?

Posted
19 minutes ago, retarius said:

This is the exact reason why I won't buy any EV car or vehicle. Suppose I am driving home from work and a Tesla driven by a foetus smashes into me?????? Who am I supposed to sue if I am dead or injured. And who will pay for the new battery in my car???? Answer me that! Do foetuses have money? Do they have licenses and are they even insured?

 

Absolutely. The roads are so much safer with humans behind the wheel, right? You can rely on humans not to drink and drive, read and send text messages, engage in road rage, smoke weed, and do the myriad of other things that cause 20,000 deaths per year in Thailand alone. Good luck suing the guy in a truck who didn't have insurance, may or may not have a licence, doesn't have much money, and was drunk whilst texting on his phone when he crashed into you.

 

I am guessing here, but I would put money on humans being the cause of 99% of accidents that result in someone's death, rather than technical failure. The sooner technology removes the weak link in the car (the human), the sooner the roads will be safer and less congested. If humans had to consistently meet the same safety standards as self-driving cars, hardly anyone would be allowed to drive.

Posted
4 hours ago, advancebooking said:

jesus. A self driving car in Thailand. Recipe for disaster

Definitely be miles better than locals behind the wheel. 😱

