The real good stuff is amazing. Like you never knew how you lived without it. And then you go on living without it. So, coming full circle, I agree. Kentucky Deluxe for me.
That still baffles me, though it makes a little more sense. Two dudes want to drink some champagne after the team lost. What’s the big fuss?
Should or should not does not matter. They were not obligated to, as is being suggested.
Why is anyone mad at this?
But it’s only material if Tesla knew the autopilot system could not see the trailer AND that it caused the crash—on or before May 18th.
No, it isn’t. How it happened is how it’s handled, normally. You don’t have to halt the entire company’s plans just to investigate one accident. If you know the cause was something wrong with the car, then it’s a different story. We still don’t know what the ultimate cause of the crash was.
There was not really even a slight question here. It had been 11 days when the sale occurred. 59 days after the accident, the fault is still undetermined. This is not a close case.
Having all the data instantaneously would not tell them the cause. Almost two months later, we still don’t know whether the guy was watching a movie or what exactly happened. You don’t release a report saying that maybe, possibly, our car played some small undetermined role in a fatality. That’s just dumb.
Assuming the design hiccup would even be material (fairly large assumption), material suspicions =/= material information. 11 days. It takes longer than that to nail down a cause or causes. No chance this was material information.
Ok, but the more important point is that they may not have even known that was a potential cause when the stock sale took place. The crash happened May 7. The sale happened May 18. If they were still sorting it out and figuring out what went wrong, then I feel like there is no way this could be a failure to disclose material…
I’m not sure it’s that risky. A single fatal auto wreck. The only thing that brings it close to being material is that cars with self-driving features are relatively new. Enough? Maybe, but most likely not. It’s not even clear that the exact causes of the accident are understood. A single accident that is unlikely…
Can’t be.
Who? If anyone knows, hit me up at (281) 330-8004. Thanks.
There’s a reason the gun companies lobbied hard for federal legislation protecting them from such liability. They didn’t do so because there was little chance of being liable. They did so because there were multiple actions filed using that legal theory. The law evolves. Not too long ago “separate but equal” was…
No, but they sell products knowing that they will be used in a dangerous manner. Should that lead to liability? There are arguments both ways, and I don’t want to get into them. The point is that there are significant similarities between that and Tesla’s autopilot, should severe accidents continue to happen, because…
For sure. I think what he meant was “so they won’t get fucked.” It’s a pretty good system to limit liability, tbh. I don’t see this being a big problem for Tesla unless a few more dangerous accidents happen. At some point, they’re enabling people to be reckless with full knowledge of the potential risks. If they know…
tbh, I don’t foresee much of a problem for Tesla even in your hypothetical.
No. 3 was clearly the best.
This guy’s biggest crime was fucking up the law on 12(b)(6) motions in Federal Court.