
The Problem with Hype(r) Cars

After much anticipation, Faraday Future finally revealed its production car, the FF 91. The presentation introduced the FF 91 as “the smartest car you’ll ever drive” and touted advanced sensors, machine learning, and autonomous driving — all great buzzwords. We saw a live demonstration of the FF 91’s ability to drive itself with the “driverless valet” feature. The car successfully parked itself in a parking lot outside the reveal, and we were told to “never worry about parking again.”

Except, I watched the rest of the reveal and I’m pretty worried.

Before you chalk this up to the dogpile, let me explain.

I’m not worried about the “driverless valet” feature that failed on stage — because it didn’t. The car presented on stage was running code specific to the demo, different from what ran on the fully functioning car we watched park itself earlier. I don’t know why, but that’s not relevant. The point is that the demo code failed, not FF’s self-driving system.

The reason I’m worried is that my Ph.D. is in Human Factors and I have a history of working with automation in aviation. Specifically, I’ve worked on how humans interact with automation in commercial airline cockpits for the FAA’s Next Generation Air Transportation System. This article is my first transition from peer-reviewed scientific publications to editorials. Why?

Because I watched an “autonomous car” reveal that contained more jargon than meaningful automation concepts.

Because I watched a senior VP of Research & Development use the farce of dimming the lights as a cover to sneak a technician into a supposedly self-driving car.

Because I was told a car’s self-driving feature didn’t work because of steel structure in a building’s roof.

Because a PR person had to clarify that a car was using real production technology that just needed to be recalibrated.

And most of all, because Faraday Future cut all of this out of the reveal video before they uploaded the event stream.

So what’s the problem?

Well, it’s been over 100 years since the first autopilot was developed in aviation, and 35 years since airplanes became capable of the kind of full automation we’d call SAE Level 5. Over those 35 years, we’ve learned that the decision to use advanced automation is guided by trust and self-confidence. In other words, my decision to use automation depends more on whether I feel like it will work than on how reliably it actually works. I’m pretty self-confident that I can park my car. I do it every day. I know the FF 91 can probably do it, too. But do I trust that it could park itself in a seven-story concrete and steel parking garage at my work? Do other people who watched the reveal or read the articles about it trust that it will? If they don’t, they’ll never buy the car to find out.

And that’s the problem.

Vlad Pop is a research scientist with a Ph.D. in Human Factors. He is a lifelong racing enthusiast and former Formula E developer using years of cockpit automation experience to ensure the future of automated driving is in good hands.
