California has told Tesla it is considering stricter regulation of the electric vehicle (EV) maker’s driving assistance tools being tested on public roads, following videos posted online showing disturbing episodes.
Several clips on YouTube and Twitter show drivers testing Full Self-Driving (FSD) beta suddenly having to take back control to stop their Tesla from hitting a pole or veering into the oncoming lane.
Tesla has noted the tools require active driver supervision, but the California Department of Motor Vehicles (DMV) said in a letter to the firm on January 5 it is reviewing whether the features meet the definition of an autonomous vehicle.
Elon Musk’s EV company has recruited some motorists for real-world tests of FSD beta, which is supposed to be able to navigate city streets, stop automatically and make turns.
California’s DMV wrote in its letter that it is revisiting its “classification decision following recent software updates, videos showing dangerous use of that technology and open investigations” from US regulators.
If the DMV decides to classify Tesla’s driver assistance systems as an autonomous vehicle, the rules will be stricter. “DMV will be initiating further review of the latest releases, including any expansion of the programme and features,” the letter said.
Tesla would, for example, have to report any problems it encounters to the agency and would have to identify all drivers testing its new tools. The company did not respond to a request for comment.
- AFP with additional editing by George Russell