Laird Popkin

That’s not what autopilot in an airplane does. Autopilot in an airplane is similar to Autopilot in a Tesla, in that it automates the boring parts of flying and the pilot is responsible for maintaining awareness and control.

If Tesla drivers were all idiots who weren’t paying attention, making Autopilot a huge risk, the collision statistics would show it. They don’t: the collision rate with Autopilot engaged is about 1/10th the national average.

The driver is responsible for not ignoring the siren and flashing lights. Slowing down gives them more time to respond.

There are tens of thousands of collisions with emergency vehicles every year involving people driving manually. The reality is that emergency vehicles are parked in unusual positions, so even with flashing lights and so on, they’re about 5x as likely to be in a collision as a non-emergency vehicle.

Zero of the 11 emergency vehicle accidents in the last 4 years that they’re talking about involved cars using FSD Beta, so I’m not sure why they’re talking about FSD Beta. FSD Beta is in very limited testing (2,000 testers so far) with zero collisions, so it’s not relevant to this discussion.

The system that they’re talking about is Autopilot, the driver assist system, not FSD Beta.

Autopilot in airplanes is pretty close to what Tesla’s Autopilot does, which is why they use the name. That is, autopilot in an airplane automates the boring part of flying, but the pilot has to remain aware and is responsible for remaining in control of the airplane.

If people didn’t know what autopilot in an airplane does, I could see the name being confusing. But airplane autopilot does exactly what Tesla’s Autopilot does: it automates the boring parts while the pilot stays aware and in control.

Agreed, it means that the system learns from “good drivers”, which is better for the system’s behavior. And, of course, good drivers are more likely to continue FSD Beta’s clean track record (zero collisions so far).

Legally, Tesla is very clear that their current system is driver assist, and the driver is legally responsible for maintaining control of the car. And since Tesla drivers with Autopilot have 1/10th the collision rate of average drivers, and the tiny number of Tesla drivers using FSD Beta have zero collisions, it’s hard to argue that drivers are being put at particular risk.

Note that Tesla is very clear that they’re selling a system that is currently driver assist, that their goal is to develop full self driving in the future, and that they’re giving a discount because it’s not done yet. And given that Tesla drivers have much lower collision rates than non-Tesla drivers, due to the driver assist and active safety features, buyers are arguably getting a safety benefit even before full self driving is done.

There’s zero evidence that Tesla drivers think that Autopilot doesn’t require the driver to maintain awareness and control over the car. The collision rate when Autopilot is engaged is 1/10th the national average, which certainly implies that drivers aren’t particularly at risk when using Autopilot.

Teslas don’t have an “FSD Mode”, other than for the tiny number of beta testers running FSD Beta.

They have Autopilot, which displays a dialog explaining that it is driver assist and that the driver has to maintain control, which the driver has to acknowledge when they first turn it on.

Perhaps that’s why, in reality, drivers using Autopilot have a collision rate of about 1/10th the national average.

Autopilot in a Tesla means the same thing as Autopilot in an airplane, which is where it got the name. That is, it automates the boring parts of flying/driving, and the pilot/driver needs to remain aware and in control. And Tesla reminds drivers of this repeatedly, not just in the manual but in an alert that is displayed on the screen, which the driver has to acknowledge.

This feels like an intentional choice, not a flaw, because they engineered the car to engage the brakes when they could have not done so. So the question is: is there a good reason that Tesla (and several ICE and EV cars) made the decision not to leave a dead car free-rolling? Is it perhaps safer more broadly, even if it’s inconvenient in a case like this?

Tesla’s steering is a mechanical linkage, not drive-by-wire, so no matter what the electronics do, you can still steer.

Teslas burst into flames 1/10th as often as gas cars. They just get a lot of news coverage when they do, because they’re new and thus news-worthy. Sadly the reports tend to leave out the relevant context.

That’s not what happened in this case: the car locked up with no warning, which is of course concerning. It’s the first time I’ve seen such a report, so it’s likely pretty rare, given how widely Tesla issues get reported.

There’s actually a fair amount of range left in a Tesla when it hits 0% on the battery display, on top of which the car warns the driver well before it gets to that point.

This is all a wild overreaction to 11 collisions with emergency vehicles over 4 years. In reality, emergency vehicles drive and park unsafely quite often, since they’re doing unusual things in order to respond to emergencies, and as a result are about 5x more likely to be in accidents, and cause deaths, than normal vehicles.

Autopilot in a Tesla means exactly the same thing as it does in an airplane, which is where they got the name. That is, it’s a system that automates the boring parts of flying/driving, and the pilot/driver has to maintain awareness and be prepared to take control. Teslas with Autopilot engaged have 1/10th the collision rate of the national average.

Keep in mind that ‘chip company’ is different from ‘chip fab’. For example, Apple is now a chip company, making a huge volume of high performance custom chips, but they don’t fab the chips; that’s TSMC. It would make zero sense for Tesla to build their own chip fab when there are so many very good fabs available.

In terms of the Cybertruck alone, sure, a delay is a shame. But from Tesla’s perspective, would you rather put resources into scaling up Model Y production volumes, since they have effectively unlimited demand, or into starting to produce the Cybertruck, which will ramp up over the following year? The financial math is pretty obvious.
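
To make the “financial math” concrete, here’s a rough back-of-envelope sketch in Python. Every number below (margins, volumes, first-year ramp) is a hypothetical placeholder for illustration, not an actual Tesla figure:

# Back-of-envelope opportunity-cost sketch. All numbers are hypothetical
# placeholders, not actual Tesla figures.
model_y_margin = 10_000            # assumed gross profit per Model Y, USD
cybertruck_margin = 12_000         # assumed gross profit per Cybertruck, USD
extra_model_y_per_year = 100_000   # assumed extra Model Y volume if resources go there
cybertruck_first_year = 20_000     # assumed first-year volume during a slow ramp

model_y_option = model_y_margin * extra_model_y_per_year
cybertruck_option = cybertruck_margin * cybertruck_first_year
print(f"Scale Model Y first:  ${model_y_option:,} additional gross profit")
print(f"Start Cybertruck now: ${cybertruck_option:,} additional gross profit")

Under those assumptions, the demand-limited, already-high-volume line wins easily on first-year gross profit, which is the point: delaying a slow new-product ramp by a year changes the overall picture far less than slowing Model Y growth would.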

For a challenger, a ‘divisive’ design is a win, because they literally have nothing to lose, and if only 10% of truck buyers like the Cybertruck, that’s still a huge success for Tesla: the truck market is huge, and a 10% market share would keep them selling every unit they could make for years. And doing a radically different design is part of how a challenger stands out in the first place.
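
To put that 10% figure in perspective, here’s a similarly rough sketch. The market size and factory capacity below are approximate or hypothetical numbers used only to illustrate the argument:

# Rough market-sizing sketch. Market size and capacity are approximate or
# hypothetical, used only to illustrate the 10%-share argument.
us_pickup_market_per_year = 2_500_000   # roughly the annual US pickup market
divisive_share = 0.10                   # assume only 10% of buyers like the design
addressable_demand = us_pickup_market_per_year * divisive_share

cybertruck_capacity_per_year = 250_000  # hypothetical annual factory capacity
print(f"Demand at a 10% share: {addressable_demand:,.0f} trucks/year")
print(f"Hypothetical capacity: {cybertruck_capacity_per_year:,} trucks/year")
print("Sold out?", addressable_demand >= cybertruck_capacity_per_year)

Even a ‘divisive’ 10% share roughly matches plausible production capacity, i.e. every unit they can build finds a buyer.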