The North Carolina native wanted to disprove a widely circulated video of a Tesla running the company’s “full self-driving” beta software — which allows the car to steer, brake and accelerate, but requires an attentive human driver ready to take the wheel — plowing into child-sized mannequins.
Dan O’Dowd, the CEO of the software company that published the video earlier this month, thinks the National Highway Traffic Safety Administration should ban “full self-driving” until Tesla CEO Elon Musk “proves it won’t mow down kids.”
Then Cupani, who runs a car shop that focuses on imports and Teslas, got involved and recruited his son. Although he describes himself as a “BMW man,” Cupani says BMW’s software can’t compare to what Tesla offers. Nor was it the first time he enlisted his son, who Cupani said is 11 years old, in a potentially viral car effort: Earlier this year, he posted a video of his son driving his Model S Plaid — which can reach 0-60 mph in 1.99 seconds — in a private parking lot. It has been viewed more than 250,000 times.
“Some people look at it and say, ‘Oh this crazy dad, what is he doing?'” Cupani told CNN Business. “Well, I do a lot of stuff like that, but I’m going to make sure my kid doesn’t get hit.”
Cupani filmed the test of “full self-driving” in a parking lot. His son stood at the end of an aisle with a smartphone to film the test. Cupani accelerated the Tesla from the other side of the lot with “full self-driving” engaged, reaching up to 35 mph. The Tesla braked steadily and came to a stop — well ahead of his son.
Cupani did another test with his son on the street using Autopilot, the more rudimentary Tesla driver assistance software, and found that it stopped for his son too. “This Dan man says he’s an expert at this, an expert at that,” Cupani said. “Well, I’m an expert in cars, future technology, professional driving instructor.”
Cupani is one of many Tesla supporters who objected to O’Dowd’s video and began conducting their own tests. Some asked their children to help. Others built homemade mannequins or used blow-up dolls.
The impassioned defenses and criticisms of “full self-driving” show how the technology has become a focal point in the industry. The California DMV recently said the name “full self-driving” is deceptive and is grounds to suspend or revoke Tesla’s license to sell vehicles in the state. Ralph Nader, whose critique of the auto industry in the 1960s fueled the creation of the National Highway Traffic Safety Administration (NHTSA), joined a chorus of critics of “full self-driving” this month.
But it’s also yet another example of the unintended consequences of deploying an unfinished, disruptive technology into the wild — and shows how far some Tesla believers are willing to go to defend the technology and the company. There seemed to be enough people conducting their own experiments that a government agency took the extraordinary step of warning people not to use children to test a car’s technology.
“Consumers should never attempt to create their own test scenarios or use real people, especially children, to test the performance of vehicle technology,” NHTSA said in a statement Wednesday. The agency called this approach “very dangerous.”
Earlier this month, Tad Park, a California resident, saw that another Tesla enthusiast wanted to test “full self-driving” with a child, and volunteered two of his children. Park told CNN Business it was “a little difficult” to get his wife on board. She agreed when he promised that he would be the one driving the vehicle.
“I’m never going to push the boundaries because my kids are way more valuable to me than anything,” Park said. “I’m not going to risk their lives in any way.”
Park’s tests, unlike O’Dowd’s, started with the Tesla at 0 mph. In all of Park’s tests, the Tesla stopped for the two of his children involved in the video, including a 5-year-old. Park said he didn’t feel comfortable doing a higher-speed 40 mph test — like the ones O’Dowd ran using mannequins — with his kids.
Toronto native Franklin Cadamuro created a “box boy,” a childlike shape made from old Amazon cardboard boxes. “Don’t blame me for what the car does or doesn’t do,” he posted at the beginning of his video. “I’m a big Tesla fan.”
His Tesla slowed as it approached “box boy.” Then it accelerated again and hit the cardboard mannequin. Cadamuro speculated that this could be because the cameras couldn’t see the short boxes once they were directly in front of the bumper, and the car therefore forgot they were there.
Human babies learn that an object out of sight still exists at about eight months, many years before they qualify for a driver’s license. But the capability may still elude some artificial intelligence systems like Tesla’s “full self-driving.” Another Tesla fan found a similar result.
Cadamuro said his video started out as entertainment. But he wanted people to see that “full self-driving” isn’t perfect.
“I find that a lot of people have two extreme thoughts about the ‘full self-driving’ beta,” Cadamuro said. “People like Dan think it’s the worst thing in the world. I know some friends who think it’s almost perfect.”
Cadamuro said he had also conducted other tests where his Tesla, which was traveling at higher speeds, effectively steered around “box boy.”
For a computer vision system like the one Tesla vehicles rely on, quickly and accurately detecting smaller objects such as young children will generally be more difficult than detecting large objects and adults, according to Raj Rajkumar, a professor at Carnegie Mellon University who researches autonomous vehicles.
The more pixels an object occupies in a camera image, the more information the system has to detect features and identify the object. The system is also shaped by the data it is trained on, such as the number of images of small children it has been exposed to.
“Computer vision with machine learning is not 100% foolproof,” said Rajkumar. “Like the diagnosis of a disease, there are always false positives and negatives.”
Tesla, which generally does not engage with the professional news media, did not respond to a request for comment.
“Wild West Chaos Reigns”
After criticism from Tesla fans of his original tests, O’Dowd released another video.
Some Tesla supporters had criticized O’Dowd’s use of cones as lane markers in his original testing, which may have limited the sedan’s ability to steer around the mannequin. Others claimed that O’Dowd’s test driver forced the Tesla to hit the mannequin by pressing the accelerator, which was not visible in videos O’Dowd released. Some Tesla enthusiasts also pointed to blurry messages on the Tesla vehicle’s screen as an indication that O’Dowd’s test driver was pressing the accelerator pedal to manipulate the tests.
O’Dowd told CNN Business that the blurry messages referred to the unavailability of supercharging and uneven tire wear. CNN Business could not independently verify what the messages said, as O’Dowd did not provide a sharper video of what happened inside the car during the tests.
In his second video, O’Dowd tested without cones on a residential street and showed the interior of the Tesla, including the accelerator pedal. The Tesla, as in O’Dowd’s other tests, hit the child-sized mannequin.
O’Dowd complained in an interview with CNN Business earlier this year that no testing agency is investigating the code for “fully self-driving.” The US government does not have performance standards for automated driver assistance technology like Autopilot.
O’Dowd is the founder of the Dawn Project, an effort to make computers safe for humanity. He unsuccessfully ran for the US Senate this year in a campaign focused solely on his criticism of “full self-driving.”
NHTSA is currently investigating Tesla’s driver assistance technology, so changes may be on the way.
“The software that controls the lives of billions of people in self-driving cars should be the best software ever written,” O’Dowd said. “Instead, we’re operating under absolute Wild West chaos rules, and we’ve gotten something terrible.”