flintstone314

Seriously, there are areas of this country where a stunt like this is going to result in gunfire. 

Keep in mind, the engine only does two things.

I don’t get how they expect this thing to justify its price.

OH, exactly. It’s a virtuous cycle. Having a healthy after-market helps everybody.

Thanks for the shout-out to Rich! He is awesome and his crusade is a noble one. He is also the one who surfaced how easy it is to get into a “This VIN Code is no longer eligible for software support” Robo-Script from Tesla tech support. You play your cards wrong on a lightly-damaged, salvage-titled unit and Tesla can

There have literally been no injuries with the Model 3

It seems to be missing the thrusty bits... And I don’t mean Elvis’ hips.

Yes. It is completely unreasonable to ask consumers to relax and let the car drive itself while also being hyper-vigilant and ready to stop mistakes at any time. We live in a country where you can’t make poisonous paint and label it with “Don’t eat this paint”. How is it OK to make an “autopilot” car and then say

It’s not confirmation bias.

Yes, exactly. It’s insane that Tesla calls their system “Autopilot” then scolds people who get into accidents because they assumed their car was being automatically piloted. Their desire to market the car past its ability is killing people, and making the road more dangerous, and it needs to stop.

This is the right answer.

go to

What is autopilot for?

This is where human judgment is most important. A hard brake may or may not be useful in scenarios involving a sudden lane change by the car in front of you:

Even at its closest setting, my VW’s ACC still stays far enough back from the car ahead that cars easily jump in front of me (DeMuro did a video about the issue). I can understand the Tesla wanting to follow closer so the driver is less annoyed, but safety must come first.

It’s in EVERY video: the Tesla is following too closely. My adaptive cruise control at its nearest setting is still about 2.5 seconds back from the car ahead. Teslas sure like to tail/stalk their target.
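To put that 2.5-second gap in perspective, here is a quick sketch of the following distance it implies at a few speeds (the speeds are illustrative; only the 2.5 s figure comes from the comment):

```python
# Convert a time-based following gap into distance at a given speed.
MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

def following_distance_m(speed_mph: float, gap_s: float) -> float:
    """Distance in meters covered in gap_s seconds at speed_mph."""
    return speed_mph * MPH_TO_MPS * gap_s

for mph in (30, 55, 70):
    print(f"{mph} mph with a 2.5 s gap -> {following_distance_m(mph, 2.5):.0f} m")
```

At highway speed the gap works out to roughly 78 m, which is why a shorter-following system leaves so much less room to react.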

I don’t know what’s worse: the tweet or this post.

So let’s say TM is correct here and the motorist didn’t take the wheel as instructed. Well, guess what that should be? That should be part of the FMEA. Since the severity of such a thing would get the highest (worst) score, it should have been addressed. The probability wouldn’t have scored well, but the detectability
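The FMEA logic the comment is invoking can be sketched with the standard Risk Priority Number formula (RPN = severity × occurrence × detection); the 1–10 scores below are hypothetical, chosen only to illustrate the comment’s point that a worst-case severity drives a high RPN even when the event is unlikely:

```python
# Minimal sketch of FMEA risk scoring. All scores are illustrative
# assumptions, not figures from any actual Tesla FMEA.
def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number: higher values mean the failure mode
    needs more urgent mitigation."""
    return severity * occurrence * detection

# "Driver doesn't take the wheel": worst severity (10), low
# occurrence (3), poor detectability (8).
print(rpn(10, 3, 8))  # 240
```

On a typical 1–1000 RPN scale, a score like that is exactly the kind of failure mode a process is supposed to flag and address before release.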