The U.S. auto safety regulator said on Friday it has opened an investigation into Tesla's Full Self-Driving software after reports of four collisions, including one fatal crash, involving its driver-assistance technology in low-visibility conditions.

The preliminary probe will cover 2.4 million Tesla vehicles, the National Highway Traffic Safety Administration said, accounting for a substantial share of the electric automaker's vehicles on U.S. roads.

It will assess FSD's ability to detect and suitably respond to reduced visibility conditions, among other issues, the regulator's Office of Defects Investigation said.

The company did not immediately respond to a Reuters request for comment. Its shares were marginally down before the bell.

The latest evaluation could be a hurdle for CEO Elon Musk's efforts to shift Tesla's focus to self-driving technology and robotaxis amid competition and weak demand in its auto business.

Last week, Musk unveiled Tesla's two-seater, two-door "Cybercab" robotaxi concept, which has no steering wheel or pedals and would use cameras and artificial intelligence to navigate roads.

In its present state, Tesla's approach to driver-assistance technology requires constant driver attention while keeping costs down. It has, however, faced legal scrutiny, with at least two fatal accidents involving the technology.

In December, the company recalled more than 2 million vehicles in the U.S. to install new safeguards in its Autopilot advanced driver-assistance system after the federal safety regulator cited safety concerns.

NHTSA said in April it was probing whether that recall and the new safeguards were adequate.