Most likely scenario is a pretty killer Godzilla reboot.
I didn’t watch the truck ones, but if it’s like the third video, those are low-speed incidents, which I think use different logic than freeway speeds. Notice they’re towing the object for the higher-speed stuff (and it still hit it at 70 km/h), so it’s a moving object at that point. My guess is this is because it would…
Automatic Emergency Braking systems (all of them, from every manufacturer currently) work this way: they will not brake for a stationary object, as they’re only designed to avoid rear-ending other cars. They’re made to track moving objects rather than stationary ones; otherwise they would be a larger danger on the…
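The filtering behavior the comment describes can be sketched roughly as follows. This is an illustrative toy, not any manufacturer's actual logic; the function name and the thresholds are made up for the example. The point is just that a return with near-zero absolute speed gets classified as stationary clutter (signs, bridges, parked cars) and ignored above some ego-speed cutoff, while a towed object still reads as a moving target:

```python
def should_brake(ego_speed_kph: float, target_speed_kph: float) -> bool:
    """Toy decision: should an AEB-style system react to a radar return?

    Hypothetical thresholds for illustration only. A target whose
    absolute speed is near zero is treated as stationary clutter and
    ignored at highway speeds, because braking for every stationary
    return would cause constant false alarms.
    """
    STATIONARY_KPH = 5.0         # below this, the target counts as stationary
    LOW_SPEED_CUTOFF_KPH = 50.0  # below this ego speed, stationary targets are still considered

    target_is_stationary = abs(target_speed_kph) < STATIONARY_KPH
    if target_is_stationary and ego_speed_kph > LOW_SPEED_CUTOFF_KPH:
        return False  # filtered out as likely roadside clutter
    return True       # moving target, or a low-speed scenario: react


# A towed barrier is a *moving* target, so it is not filtered out:
print(should_brake(70.0, 30.0))  # moving object at 70 km/h -> True
print(should_brake(70.0, 0.0))   # stationary wall at 70 km/h -> False
print(should_brake(20.0, 0.0))   # stationary object at parking speed -> True
```

That last case is why the low-speed parking-lot demos behave differently from the freeway tests.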
When a pilot engages autopilot on an aircraft, whose responsibility is it to ensure that the aircraft remains in control, and isn’t heading towards anything that could kill crew and/or passengers?
Exactly this. People forget the responsibility of control is on the “driver” of the vehicle. Not the computer the driver turned on and didn’t pay attention to. That’s why the new Nissan X-trail ads worry me. They actively tell drivers they don’t need to pay as much attention because the brakes work even when the…
Human drivers aren’t safe. When human drivers steer toward a hazard instead of away from it, as they should, they are not safe.
Ugh, stop with the “they’re responsible because they called it autopilot” crap. Driver = moron. He even admitted to knowing this stretch of road was dangerous. Tesla calling it “advanced cruise control” wouldn’t change a damn thing about how people interact with this new technology, which some humans will never be bright…
Obviously you didn’t keep reading. Autopilot is not a perfect system, nor meant to be used without any human intervention. It’s incumbent upon the driver to pay attention to the road and to take over should the computer have trouble. The driver didn’t, Tesla is not to blame for his negligence and incompetence.
We never would have gotten past the horse and buggy in today’s litigious, no-personal-responsibility society. The guy drove into a concrete wall that was visible half a mile away.
IMO the driver was to blame. He knew that particular stretch of road had issues for autopilot. He should not have used autopilot for that part of the drive. Ultimately, you are the owner/driver of the vehicle. Your life is in your hands. That’s what I was taught in life - you are your own destiny & responsibility.…
I was just about to comment about this, and thank you for thinking like I do.
Do not fuck with a food item that appears to be flipping you off.
Some suburban neighborhoods have designated streets as “Local Traffic Only,” which is an official traffic designation that Waze will respect. Los Altos Hills did this. But you have to get past the local and possibly state traffic engineers.
I expect Waze has NO interest in de-prioritizing routing down a particular road just because the residents of that road think they should. Imagine the floodgates that would open. No, they are going to hold firm to routing down whatever road makes the most sense from a time-saved perspective unless local government…
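The "official designation vs. resident complaints" distinction maps cleanly onto how a router can be built. Here's a minimal sketch (not Waze's actual code; the graph shape and the access rule are assumptions for illustration): restricted edges are simply skipped during through-routing, unless the trip actually ends on the restricted street.

```python
import heapq

def shortest_time(graph, start, goal):
    """Dijkstra over graph: {node: [(neighbor, seconds, restricted), ...]}.

    Toy "Local Traffic Only" rule: a restricted edge may only be used
    when it leads directly to the destination, i.e. the trip ends on
    that street. Through-traffic never sees it.
    """
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue
        for nbr, secs, restricted in graph.get(node, []):
            if restricted and nbr != goal:
                continue  # honor the restriction for through-traffic
            nd = d + secs
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return float("inf")


# Restricted shortcut A -> L -> C (40 s) vs. main road A -> B -> C (120 s):
graph = {
    "A": [("B", 60, False), ("L", 20, True)],
    "L": [("C", 20, True)],
    "B": [("C", 60, False)],
    "C": [],
}
print(shortest_time(graph, "A", "C"))  # 120.0 — through-traffic keeps to the main road
print(shortest_time(graph, "A", "L"))  # 20.0  — locals can still reach their street
```

A fuller rule would also allow trips that *start* on the restricted street, but even this toy version shows why an official designation works where complaints don't: the restriction has to live in the map data the router consumes.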
Can’t people driving just look at it and say “oh, I don’t want to go up that”. :)
How dare those dirty outsiders use their public street.
It’s local residents that are complaining (because they are seeing an increase in traffic from outsiders), not Waze users.
You gotta admit Waze makes sense.
It’s NOT Waze’s fault that drivers can’t make it up a dangerously steep LA street without crashing. I second that. It is a legal street. It’s the driver’s choice to navigate it, or go elsewhere. There is too much litigiousness and litigation all around for life to be pleasant. STOP blaming others for your debilitating…
Agree. Holding Waze responsible would be like holding Ford responsible for the idiots who drive their Mustangs into people/objects.