US auto safety regulators looking into Tesla driver’s complaint over Full Self-Driving crash
US auto safety regulators are looking into a complaint from a Tesla driver that the company’s Full Self-Driving software caused a crash.
The driver was beta testing the Full Self-Driving software, and the Tesla SUV went into the wrong lane and was hit by another vehicle, according to a complaint filed by the driver with the National Highway Traffic Safety Administration.
“The car went into the wrong lane and I was hit by another driver in the lane next to my lane,” the driver wrote.
The vehicle, a 2021 Tesla Model Y small SUV, gave the driver an alert halfway through the turn, and the driver tried to turn the wheel to avoid other traffic, according to the complaint.
“But the car took control and forced itself into the incorrect lane, creating an unsafe maneuver putting everyone involved at risk,” the driver wrote.
No one was injured in the crash, but the Model Y was severely damaged on the driver’s side, according to the complaint filed with the agency online Monday and posted in its public complaint database.
The crash happened on November 3. The driver is located in Brea, California, but the location of the crash was not identified. NHTSA does not release the names of those who file complaints.
It is likely the first complaint filed with the agency alleging that Full Self-Driving software caused a crash.
A message was left Friday seeking comment from Tesla, which has disbanded its media relations department.
An NHTSA spokeswoman said Friday night that the agency is aware of the complaint and is communicating with Tesla to get more information. The spokeswoman said people should report safety concerns to the agency.
The inquiry is another sign that NHTSA is becoming more aggressive in watching autonomous and partially automated driving systems under President Joe Biden.
In the past the agency has been reluctant to regulate the systems, saying that it didn’t want to delay potentially life-saving technology.
Tesla says that Autopilot and Full Self-Driving are driver-assistance systems and cannot drive themselves, despite their names.
The automaker says drivers have to be ready to intervene at any time.
Selected Tesla drivers have been beta testing the software on public roads, a practice that critics say endangers others because the software has flaws and the drivers are untrained. Other companies that test on public roads have human safety drivers on board ready to intervene.
Beta testing is a field test of software done by users before the full commercial release is ready.
Critics have been calling on NHTSA to act after several videos were posted on the internet allegedly showing Tesla’s software making mistakes and drivers having to take action.
“Hopefully, this gives @NHTSAgov ammunition it needs to take action on FSD now rather than waiting for Tesla to take its time through partial data releases,” Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, wrote on Twitter.
In June, NHTSA ordered automakers to report any crashes involving fully autonomous vehicles or partially automated driver assist systems.
It wasn’t clear whether Tesla reported the crash involving the California driver. Two months after the June order, the agency opened a formal investigation into Tesla’s Autopilot partially automated driver-assist system after a series of collisions with parked emergency vehicles.
NHTSA already has asked Tesla for information about the beta testing, including a requirement that testers not disclose information.
The agency said that non-disclosure agreements could hamper its ability to investigate.