Tesla knew Autopilot caused death, but didn't fix it
Software's alleged inability to handle cross traffic central to court battle after two road deaths
Didn't, or couldn't? Tesla runs its "automated" driving on vastly inferior sensing hardware (cameras only, no lidar). It's a hardware problem first and foremost.
It's like trying to drive by looking through a 720p camera mounted on the license plate holder versus a 4K camera on the roof. That's not a perfect analogy, but it's close enough for those not aware of how cheap these cars and their tech really are.
I remember reading about the trolley-problem-style ethical question of the hypothetical self-driving car that loses control and can choose to either turn left and kill a child, turn right and kill a crowd of old people, or do nothing and hit a wall, killing the driver. It's a question that doesn't have a right answer, but it has to be answered by anybody implementing a self-driving car.
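To make that concrete: whichever way the question gets answered, the answer ends up as an explicit weighting buried in code somewhere. A rough sketch of what that looks like (purely illustrative; the outcomes, weights, and function names are all made up, this is nobody's real code):

# Hypothetical illustration: someone has to assign "harm" weights to outcomes,
# whether or not they admit that's what they're doing.
OUTCOMES = {
    "swerve_left":  {"child": 1},
    "swerve_right": {"elderly_pedestrian": 5},
    "stay_course":  {"driver": 1},
}

def choose_maneuver(outcomes=OUTCOMES, weights=None):
    # Pick the option with the lowest total weighted harm under the given weights.
    weights = weights or {"child": 10.0, "elderly_pedestrian": 3.0, "driver": 4.0}
    def total_harm(option):
        return sum(weights[who] * count for who, count in outcomes[option].items())
    return min(outcomes, key=total_harm)

print(choose_maneuver())  # whatever this prints, the weights were a moral judgment

The point isn't the math, it's that the weights are a moral judgment dressed up as a constant.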
I non-sarcastically feel like Tesla would implement this system by trying to see which option kills the fewest paying Xitter subscribers.
Yet Phoney Stark keeps on whinging about the risks of AI, but at the same time slags off humans who actually know their stuff, especially regarding safety.
I am the last person to defend Elon and his company, but honestly this is user error. It's like blaming Microsoft when someone deliberately ignores common sense and downloads viruses. Autopilot should be called driver assist, and people still need to pay attention. These deaths were caused by user negligence.
It's time to give up on the Tesla FSD dream. I loved the idea when it came out, and believed it would get better over time. FSD simply hasn't. Worse, Musk has either fired or lost all the engineering talent Tesla had. FSD is only going to get worse from here, and it's time to put a stop to it.
Calling it Autopilot was always a marketing decision. It's a driver assistance feature, nothing more. When used "as intended", it works great. I drove for 14 hours during a road trip using AP and arrived still alert instead of dead tired. That's awesome, and would never have happened in a conventional car.
I have the "FSD" beta right now. It has potential, but I still always keep a hand on the wheel and am in control of my car.
At the end of the day, if the car makes a poor choice because of the automation, I'm still responsible as the driver, and I don't want an accident, injury, or death on my conscience.
There are like three comments in here talking about the technology; everyone else is arguing about names, as if people are magically absolved of personal responsibility when they believe advertising over common sense.
Since when has Autopilot, especially in 2019, ever had the ability to deal with “cross-traffic” situations? It has always been a glorified adaptive cruise control with lane keeping, and has always been advertised as such. Literally the same as any other car with LKAS. Tesla’s self-driving software wasn’t released to the public until 2021/2022.
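For anyone unclear on what "adaptive cruise with lane keeping" actually amounts to, here's a rough sketch (purely illustrative, not Tesla's or anyone's production controller; all names and numbers are made up). Notice what it looks at: the lead car in your own lane and your lane offset. Cross traffic at an intersection isn't even part of the model.

from dataclasses import dataclass

@dataclass
class Perception:
    lead_gap_m: float        # distance to the car ahead in the same lane
    lead_speed_mps: float    # that car's speed
    lane_offset_m: float     # how far we've drifted from lane center

def control_step(p: Perception, ego_speed_mps: float, set_speed_mps: float):
    # Adaptive cruise: hold the set speed, but slow down for a close lead vehicle.
    desired_gap = max(2.0 * ego_speed_mps, 10.0)     # roughly a 2-second following gap
    if p.lead_gap_m < desired_gap:
        accel = min(p.lead_speed_mps - ego_speed_mps, 0.0)
    else:
        accel = 0.5 * (set_speed_mps - ego_speed_mps)

    # Lane keeping: steer back toward lane center.
    steer = -0.1 * p.lane_offset_m

    # Nothing here represents a vehicle crossing perpendicular to the lane.
    return accel, steer

p = Perception(lead_gap_m=25.0, lead_speed_mps=20.0, lane_offset_m=0.3)
print(control_step(p, ego_speed_mps=22.0, set_speed_mps=29.0))

That's the whole behavior class: follow the car ahead, stay between the lines. Anything outside the lane model is invisible to it.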
Meanwhile, about 120 people died in traffic-related accidents in the US today.
The driver also wasn't even paying attention to the road, so the blame should be on him, not the car. People need to learn that Tesla's version of Autopilot has a specific use case, and regular streets aren't it.
Software's inability to handle cross traffic... You mean the driver's inability to handle cross traffic, while also trusting his life to a glorified Speak & Spell.