mattdlynch

Tesla needs to understand that people aren’t going to read the manual, and design their systems for that. People do not receive training in the different ways semi-autonomous vehicles handle situations compared to regular vehicles, so Tesla needs to make a greater effort to show drivers how its system actually behaves.

Police are trained in how to properly use a taser. Current vehicle licensing programs do not cover semi-autonomous vehicles, so the training issue is very different.

I think a problem with that is the fact that autonomous systems are so new that there is no real government guidance on how to interact with them most effectively. When Tesla (and particularly Musk) shows off what people can do while not paying attention, people think that’s OK.

Someone might be shown how to activate it, but that doesn’t mean the original user is going to walk the new driver through every little step of the owner’s manual instructions. Humans have an unfortunate tendency to think “I know this, therefore other people must know this too.” Even if the original user did, as

Very well put, I think you said it better than I ever could.

I think Tesla needs more warnings, frankly. Require hands on the wheel more often, or use a driver-facing camera to make sure they are scanning the horizon every so often instead of just reading a book or watching a movie. Something to ensure that the user is actually paying attention and doesn’t get complacent, and

What if one person activates it, then someone else drives it? Say, a married couple, or a family? Displaying that only once, and expecting every future driver to know the instructions, isn’t really a good decision.

If that’s ONLY displayed the first time a user tries to activate Autopilot, that’s definitely not enough. It needs to do that every single time.

But how obvious is that message? Does it require the user to actually read it before accepting? Could the car’s audio system read it aloud, to make sure the user knows? I think it’s pretty obvious that the current system needs to more clearly convey “you as a user must do this, because the system can’t do it all

Excellent dissection of Tesla and Musk overall.

I think this is where we walk a fine line of what we call “dumb.” Were the people dumb to not read the fine print*, or is Tesla dumb for not making these instructions and requirements clearer**? I’m not saying I know the exact answer, but it’s a conversation we need to have. I tend to err on the side of “design for

“Read the instructions” also isn’t a reasonable answer there. If the system can be initiated and run without reading it, then people aren’t going to read it; the vast majority of consumer products can be operated without reading the manual first, so people get in the habit of not doing so. If the product were sold as

If people are “too stupid to understand” your design and word choice, then you have a bad design. “Blame The User” isn’t a reasonable reaction to product/use failures.

Finally, someone else who agrees on this. The idea that a company needs to tailor their word choice based on how the users are going to interpret it has seemed like blasphemy here, where people are all too ready to blame the user for poor systems design.

+1 for Hot Fuzz reference.

Looking at that picture like “Yup, I remember my roommate being like th- WHERE IS HIS LEFT HAND?”

“And it’s not a beta test, we’ve gone over this multiple times on here.”

Most marketing isn’t for something that can kill someone when used the way the marketing implies, which is how Autopilot gets abused.

Yes, but I would argue that such a situation could put people at serious risk. Say it stops right after the crest of a hill, in the leftmost lane of a highway.

I think we fundamentally disagree on the problem.