This is a story that could make you rethink your trust in autonomous driving technology. A Tesla fan set out to prove that the company's Full Self-Driving (FSD) system could handle a coast-to-coast journey without human intervention, only to end up with a repair bill topping $22,000. But here's where it gets controversial: the costliest damage traced back not only to the crash itself, but to a pre-existing battery issue the car had never flagged. Let's break this down.
Back in 2016, Tesla CEO Elon Musk promised that a Tesla would one day complete a 'driverless' coast-to-coast trip across the U.S., proving the system could navigate any road without human input. That goal still hasn't been realized, but a group of Tesla enthusiasts recently attempted the challenge themselves in a nearly brand-new Model Y. Their mission? To test whether FSD could handle the real-world chaos of long-distance travel.
Justin Demaree, known online as Bearded Tesla Guy, led the effort. He and a friend set off from San Diego with FSD Supervised engaged, hoping for a smooth ride. Within about 60 miles, disaster struck: at 75 mph, the car slammed into a metal ramp lying in the road, violently jolting its occupants and causing significant damage. The system never braked or swerved, despite the obstacle sitting directly in its path.
The aftermath was far worse than expected. A damaged battery, shattered components, and a car that refused to charge at Supercharger stations left the duo stranded. The repair bill ultimately exceeded $22,000, but here's the twist: Tesla covered the battery replacement under warranty. A service technician revealed that the battery had a cell imbalance before the crash; the impact likely only accelerated a failure that was already underway.
And this is the part most people miss: FSD isn't a fully autonomous system. Despite its name, it's classified as SAE Level 2 driver assistance, meaning a human must stay alert and ready to take over at all times. The crash highlights a critical limitation: the system can't anticipate every edge case. As one expert put it, 'FSD might handle a scenario perfectly one day, only to fail the next when the driver expects it to work.'
While no one was hurt, the incident raises hard questions. Is this a flaw in Tesla's implementation, or a fundamental limitation of today's AI? Demaree says he recently drove 1,000 miles on FSD without a single intervention, but independent testing shows that even experienced drivers can't count on the system to handle every situation. What do you think? Is this a sign of progress, or a warning that we're not ready for true autonomy? Share your thoughts in the comments below.