Tesla is demanding that an advocacy group remove videos of its vehicles running over child-sized mannequins, escalating the debate over the safety of the company’s Full Self-Driving software.
Tesla sent a cease-and-desist letter to the Dawn Project, an anti-Tesla group fronted by software developer Dan O’Dowd. In it, the company alleges the videos are defamatory and misrepresent the capabilities of its driver-assist technology. (The letter was first reported by The Washington Post.)
New video of Master Scammer Musk’s Full Self-Driving @Tesla ruthlessly mowing down a child mannequin wearing a safety vest in a real school crosswalk. No cones. Room to swerve. Video of pedals.
Everything is real except the child, bc you know what would happen to a real child! pic.twitter.com/a3ut9bpSqG
— Dan O’Dowd (@RealDanODowd) August 15, 2022
The letter is in response to an advertisement from the Dawn Project that purports to show a Tesla vehicle with FSD software running over a child-sized mannequin while driving at 20 mph.
“It has come to our attention that you, personally, and The Dawn Project have been disparaging Tesla’s commercial interests and disseminating defamatory information to the public regarding the capabilities of Tesla’s Full Self Driving (FSD) (Beta) technology,” Dinna Eskin, senior director and deputy general counsel at Tesla, writes in the letter. “We demand that you immediately cease and desist further dissemination of all defamatory information, issue a formal public retraction within 24 hours and provide Tesla with the below demanded documentation.”
The letter comes a few days after YouTube removed several videos that show Tesla drivers carrying out their own safety tests to determine whether the company’s Full Self-Driving (FSD) software would enable a vehicle to stop automatically for children walking across or standing in the road. Those videos were posted in response to demonstrations by O’Dowd’s group that showed Tesla vehicles using FSD running over the child-sized dummies.
It does not appear that Tesla filed any objections to the videos from its fans using real children to demonstrate FSD’s driver-assist capabilities. But it claims that the Dawn Project’s videos “portray unsafe and improper use of FSD Beta and active safety features,” adding that the group’s actions “actually put consumers at risk.”
Eskin also demands that O’Dowd and the Dawn Project preserve records and communications related to its anti-Tesla advertising campaign, noting that the company was putting the group “on notice” for future legal action.
Tesla is finding itself under increasing scrutiny over its decision to allow nonprofessional drivers to test a beta version of its driver-assist software on public roads. Critics say the company is endangering the lives of its customers as well as pedestrians, cyclists, and other drivers on the road. Tesla supporters insist FSD is safer than human driving.
Earlier this month, auto safety advocate Ralph Nader called on federal regulators to recall FSD, calling its deployment “one of the most dangerous and irresponsible actions by a car company in decades.”
The National Highway Traffic Safety Administration is currently investigating 16 crashes in which Tesla owners using Autopilot crashed into stationary emergency vehicles, resulting in 15 injuries and one fatality. The probe was recently upgraded to an “Engineering Analysis,” which is the second and final phase of an investigation before a possible recall.
Tesla vehicles today come standard with a driver-assist feature called Autopilot. For an additional $15,000, owners can buy the FSD option, which Tesla CEO Elon Musk has repeatedly promised will one day deliver fully autonomous capabilities. But to date, FSD remains a “Level 2” advanced driver-assistance system, meaning the driver must stay fully engaged in the operation of the vehicle while it’s in motion.
In addition to the emergency vehicle crashes, NHTSA has also compiled a list of Special Crash Investigations (SCI) in which the agency collects data beyond what local authorities and insurance companies typically gather at the scene. The agency also examines crashes involving advanced driver-assist systems, like Tesla’s Autopilot, and automated driving systems.
O’Dowd’s group is not the first to highlight potential flaws in Tesla’s FSD. Earlier this year, lidar manufacturer Luminar published a video demonstrating how its laser sensors enabled vehicles to stop for child-sized dummies, while Tesla’s camera-only system did not.
Nonetheless, O’Dowd seems to relish irking Musk. The software company executive previously ran a full-page ad in The New York Times disparaging Tesla’s FSD, offering $10,000 to the first person who could name “another commercial product from a Fortune 500 company that has a critical malfunction every 8 minutes.”
In response to Tesla’s cease-and-desist letter, O’Dowd called Musk a “crybaby hiding behind his lawyer’s skirt.”
“He is obsessed with stopping me from exposing that his Full Self-Driving cars could mow down a child dressed in a safety vest in a school crosswalk,” O’Dowd said in a statement. “I guess because that wouldn’t be good for the brand.”