Computer Science & Engineering professor Sanjoy Baruah provides expert insight on the recent crashes involving Tesla's Autopilot. >> Read the full article on mashable.com
Consumer groups raised concerns about Autopilot's limitations in a letter to the Federal Trade Commission last week, calling Tesla's autonomous feature "dangerously misleading and deceptive." Rather than educating drivers about the system's partial capabilities, Tesla presents Autopilot as a fully autonomous tool through its marketing, advertising, company statements, and online content, the advocates say.
That's not the first time Tesla has been under fire for inflating expectations of Autopilot. Back in 2016, Germany called out the company for overstating how much the cars could drive themselves in Autopilot mode.
As with most new technology, Washington University in St. Louis engineering professor Sanjoy Baruah says our expectations are too high. "Users are still trying to get a feel for what it's supposed to be doing for us," he said of Autopilot and other self-driving tools. And while Autopilot may not yet handle a basic driving scenario such as emergency vehicles blocking the road, Baruah sees how automated tech can be a lifesaver for sleepy, distracted, or inebriated drivers. It's a balancing act that we'll eventually get the hang of, and the technology will improve, too. "It's new things we are learning to come to terms with," Baruah said.
However much Tesla tries to explain that Autopilot is only semi-autonomous and still requires a driver's full attention, the company has more explaining to do before the message sinks in. Until then, drivers will keep crashing into parked cars, or worse.