samwichse

Your false equivalency is jaw-dropping. Cruise control will not slow you down if the road changes speed limits or if you approach a red light or sharp turn... nor was it intended to. Using "autopilot" through interchanges... which this obviously is... is asinine and not the intended purpose of Autopilot.


Is there something wrong in the video? I guess people don't understand how their systems work, and call it Tesla's error!

A co-pilot is a fully qualified pilot, ready to make all the command decisions to operate an aircraft. Humans aren't programmed such that a single word conveys all the detailed functions of the system the Tesla uses; drivers are supposed to be instructed as to when and where it is appropriate to use it. This type of highway

Airplane pilots don’t stop paying attention when they turn on autopilot. They’re also not constantly a few seconds away from hitting something or having to deal with hairless apes piloting multiton blunt objects at speed.

There’s additional focus on Tesla. Not because of autopilot, but because Teslas don’t use fuel, dealer networks, or need the kind of service or parts that conventional cars use. Don’t think for a minute that the Tesla deathwatch articles are anything other than wishful thinking by certain industries. The most

I drove 1500 miles on Autopilot this week. I did it with my hands resting on the steering wheel. Believe it or not, keeping my hands on the wheel is the most comfortable place for them, because it's right freaking there in front of me, within easy reach, and I've been conditioned to hold onto it over 30 years of driving.

Once you get in a Tesla, you are both alive and dead until someone opens the door.

Also, let’s not forget that if it wasn’t steering into the barrier at highway speeds, he wouldn’t need to intervene in the first place.

But this is really pretty standard spin for a Tesla incident. If the driver intervenes when it's doing something absurdly wrong, and doesn't recover, it's the driver's fault, because

It just maintains course and altitude! It doesn’t know how to find THE ONLY AIRSTRIP WITHIN A THOUSAND MILES SO IT CAN LAND ITSELF WHEN IT NEEDS GAS!

“Unless it can drive without me it doesn’t change much.” Just because something isn’t perfect doesn’t mean it isn’t useful. What about non-adaptive cruise control? It too can’t be used to drive the car on its own.

Co-pilot is a worse name because it implies the car can take full control (which a co-pilot is tasked to do if pilot becomes incapacitated).

Simple: no one cares about the systems of other brands. I don't doubt that there have probably been other accidents (even fatal ones) involving Level 2 systems in other brands, but they're practically never reported, certainly not at the national level.

A: Don’t Use Autopilot?

Hey, I fly a Piper Archer with an advanced autopilot that will even fly an instrument approach for you. That doesn't mean I trust it to take us all the way in without supervision. In fact, very rarely will I allow it to fly the full approach, as any thermals make it start to oscillate out of the glide slope and


Here is what bugs me. Tesla basically comes out and says: we have this thing that half drives, but it can still make mistakes, so you need to keep your hands on the wheel.

So your argument is that people are the problem?

Following the painted lane markers does seem like the intended behavior. It seems the autopilot system can only be blamed for not stopping before the barrier, not for departing from the road, since the clearest lane marker heads directly toward the obstacle.

— Excluding Waymo, Uber and other 3rd party accidents

How come we don't hear stories of other manufacturers' self-driving accidents? It's not like Tesla is the only one with this technology. I get that Tesla is under a microscope and any excuse to be in the headlines, but I'm curious if Mercedes, Volvo or BMW have an equal number of self-driving accidents that simply go