fletch123

The number of times I had to thump a starter with a hammer when I drove a tow truck would boggle some of these young kids' minds.

Yeah, to a trained driver it’s extremely unsettling to watch, and it does it every single time in the video.

At my job, somebody actually died because of that.

That was something I noticed as well. I was always taught the same: Keep the wheel straight so that if you are rear ended, you will not be pushed into oncoming traffic.

Am I the only one bothered by the fact that the car is angled into traffic with the wheel turned while waiting for the opportunity to make the left turn?

Did you watch the video at all? She linked it at the timestamp to show all the turns. The sketchy one was the last aborted left turn. The system didn’t detect a truck AND a truck towing a boat as moving targets (yellow designation instead of the blue in the other attempts) and was literally about to pull out in front of them.

Big-3 cars from the 1970s and 1980s would often not start.

To be fair, lots of cars from the '70s and '80s would reliably (or should I say, unreliably) fail to start on occasion, or take a lot of effort to get going. Once we got into the late '80s and beyond, after the switch to electronic fuel injection and the general improvement in build quality, cars tended to start.

The difference is that we KNOW that there are inexperienced drivers on the road. That’s just part of driving, and a risk we acknowledge exists and has existed for as long as there have been drivers. The other is a corporation putting the public at risk for profit and/or avoiding the regulations/oversight that would otherwise apply.

Hope Dr. Seuss is still OK to post here!

Exactly right. Further, when you’re driving with a sixteen-year-old and they do something wrong, you can tell them, explain what happened, and give advice on what to do. You can’t do any of that with a self-driving system.

Most 16-year-olds learning to drive typically have something called “common sense.” Joke all you want, but they typically don’t drive down the wrong side of the road with total confidence. They drive like shit, sure, but the mistakes they make are human mistakes, and they typically hit the brakes when unsure.

See, the liability is the sticky part. To my knowledge, if you crash because of Autopilot/FSD when it’s active, it’s your fault because you weren’t being attentive. And if you take over when Autopilot/FSD warns you to, as intended, but crash anyway (say you didn’t have enough time to take over and perform evasive maneuvers), it’s still your fault.

More like an unknowing/unwilling victim. The number of close calls in that video is freaking terrifying, and NHTSA continues to allow the use of this stupid Full Self Driving (beta) label. I expect at some point insurance liability may catch up with all this reckless beta testing.

Should we be letting companies beta-test self-driving car software in public with no oversight?

Tesla owner here. FSD is a scam and a sham. Autopilot is barely good enough that I’m okay using it on roads, if and only if I don’t mind it randomly flipping out and giving passengers heart attacks because a road dips before a bridge...

This here’s [slaps roof] a real unicorn. A true collector’s item. The Jeep Beetle. Only one ever made.

Uhh, got any clean Cherokee XJ shells out there? Not-burned would be preferred, but a clean Cali shell would go real nice under my off-roader.

Put Jeep badges on them and David Tracy will come take them away for you.

What, old Soviet underwear trimmings too good for you?