‘LiDAR is lame’: why Elon Musk’s vision for a self-driving Tesla taxi faltered
NotMyOldRedditName @lemmy.world · Posts 11 · Comments 3,482 · Joined 2 yr. ago
I mean the data IS sanitized, but not to the level that would require certain human behaviors to be filtered out.
Part of what's led to its improvement over the years is going through the data more carefully and removing bad examples or labeling them properly.
That left turn that was cut too short makes it into the first set of training data because a cursory look at it seemed okay, and then they see that cars are cutting turns too short. So they go through the data again, try to find examples of it, and label them properly so the model doesn't learn that it's okay.
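That second-pass relabeling step could look something like this minimal sketch. Everything here is hypothetical (the clip fields, the `corner_cut_m` metric, and the threshold are made up for illustration, not anything Tesla has published):

```python
# Hypothetical second-pass relabeling of driving clips.
# Assumption: each clip dict carries a label from the cursory first pass
# and a measured "corner cut" distance in meters (both invented fields).

CUT_THRESHOLD_M = 0.5  # hypothetical cutoff for "cut the turn too short"

def relabel_clips(clips):
    """Flag clips where the car cut the turn too short as bad behavior."""
    cleaned = []
    for clip in clips:
        if clip.get("corner_cut_m", 0.0) > CUT_THRESHOLD_M:
            # First pass called this "ok"; relabel so training doesn't
            # treat corner-cutting as acceptable.
            clip = {**clip, "label": "bad_turn"}
        cleaned.append(clip)
    return cleaned

clips = [
    {"id": 1, "label": "ok", "corner_cut_m": 0.1},
    {"id": 2, "label": "ok", "corner_cut_m": 0.9},  # cut the turn too short
]
print(relabel_clips(clips))
```

The hard part in practice is the detector itself: deciding what counts as "too short" is exactly the kind of judgment call the comment is describing.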
But that's not a simple process, and trying to train only on good behaviors gets really hard, because good examples are actually uncommon in normal driving: the bad behavior is socially acceptable.