Cop Leaps Out of the Way Just as Self-Driving Tesla Wipes Out His Cruiser
Gangway!
Crash Bandi-cop
On Thursday, a Tesla slammed into a police cruiser that was parked at the scene of another collision, CBS News reports. And yep, you guessed it: the car was reportedly in one of its self-driving modes at the time.
The incident took place just minutes after midnight in Fullerton, California. An officer from the city’s police department was responding to another deadly crash, directing traffic while waiting for clean-up crews to arrive.
According to the Fullerton Police Department, emergency flares had been spread out on the road to alert drivers. But while the cop stood by his patrol vehicle — with emergency lights flashing — a blue Tesla came careening towards him out of nowhere.
The officer leaped out of the way, but the out-of-control EV slammed into his parked cruiser. It came perilously close to being a fatal accident, though no serious injuries were reported.
Repeat Offender
The police claim that the driver admitted to using his phone while behind the wheel of the Tesla as it was in self-driving mode. Which mode that was — either Autopilot or Full Self-Driving — wasn’t specified.
There’s the possibility that the driver could be lying, of course, blaming controversial tech to get off the hook, and we won’t know for certain until authorities view the automatic crash report generated by Tesla. But this would be far from the first time that the automaker’s self-driving cars have been involved in crashes — or even just crashes with other cop cars.
The US National Highway Traffic Safety Administration, in fact, opened an investigation into Tesla in 2021 after its cars in self-driving modes kept colliding with emergency vehicles. The probe ended last year with the NHTSA forcing Tesla to issue software updates to two million of its cars, but the regulator is now considering whether that action went far enough.
Distracted Drivers
Beyond its incidents with emergency vehicles, Tesla has repeatedly come under fire for accounts of its cars endangering occupants and others on the road. A driver last month, for example, said his car in Full Self-Driving mode failed to stop at a railroad crossing and nearly plowed into a moving train.
Debates over the safety of those systems rage on. But there is equal controversy — and concern among regulators — over not just the performance of the driving modes, but their names.
Though “Autopilot” and “Full Self-Driving” may give the impression that the cars are fully autonomous, they’re just driver-assistance systems, an arguably deceptive marketing strategy that has drawn scrutiny from federal agencies, including the NHTSA, and state regulators.
Critics fear that drivers, believing their Teslas to be more autonomous than they really are, may get lazy and inattentive behind the wheel. And from what we can tell, it sure sounds like that’s what happened here.
More on Tesla: Watch a Self-Driving Tesla Get Completely Baffled by a Road Worker Directing Traffic