I agree that LIDAR or radar is a better solution than image recognition. I mean, that's literally what those technologies are for.
But even then, that's not enough. LIDAR/radar can't help the car identify its lane in inclement weather, drive well on gravel, and so on. These are the kinds of problems whose difficulty automakers severely downplay, along with just how much a human driver actually does.
You're making it sound far simpler than it actually is. Recognizing what a thing is remains the essential first problem. Is that a child, a ball, a goose, a pothole, or a shadow that the cameras see? It would be absurd, and an absolute showstopper, if the car stopped for dark shadows.
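To make the dilemma concrete, here's a toy sketch in Python. None of this resembles a real perception stack; the classes, confidence scores, and threshold are all invented for illustration.

```python
# Toy illustration of the perception-ambiguity problem.
# All class names, scores, and thresholds are made up for the example;
# no real self-driving stack works off a hard-coded dict like this.

detection_scores = {
    "child": 0.34,
    "ball": 0.22,
    "goose": 0.11,
    "pothole": 0.08,
    "shadow": 0.25,
}

BRAKE_CLASSES = {"child", "ball", "goose"}  # things worth an emergency stop
CONFIDENCE_THRESHOLD = 0.5                  # how sure we demand to be

best_class = max(detection_scores, key=detection_scores.get)
best_score = detection_scores[best_class]

if best_class in BRAKE_CLASSES and best_score >= CONFIDENCE_THRESHOLD:
    print(f"BRAKE: {best_class} ({best_score:.0%} confident)")
else:
    # 34% "child" vs 25% "shadow": no class clears the bar. Lower the
    # threshold and the car stops for every dark patch on the asphalt;
    # raise it and the car risks rolling past an actual child.
    print(f"Ambiguous: top guess '{best_class}' at only {best_score:.0%}")
```

Whatever threshold you pick, one of the two failure modes gets worse. That's the part a human brain handles effortlessly.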
We take for granted the vast amount of work the human brain does in this problem space. The system has to identify and categorize what it's seeing; otherwise it's useless.
That leads to my actual opinion on the technology, which is that it's going to be nearly impossible to have fully autonomous cars on roads as we know them. They're fine when everything is normal, which is most of the time. But software can't recognize and correctly react to the thousands of novel situations that can happen.
They should be automating trains instead. (Oh wait, we pretty much did that already.)
That may be part of it, but Saudi Arabia also has a long track record of being incredibly abusive and generally just not giving a shit about workers' rights.
Absolutely terrifying... but thank you for the insight.
Yeah, the only way someone dies in a furnace before feeling pain is if the temperatures are in molten-metal territory. Not a bakery oven. I'm sure this poor woman experienced excruciating pain for far too long.
Yeah, 100%. This is the town's fault IMO: not maintaining the markings in the first place (it's not the contractor's fault that the old markings are nonexistent), and then probably refusing to pay the contractor "extra" to repaint the whole thing.
I grew up in a small town in Canada. We never had any kind of lockdown drills.
Even talking about it this way is misleading. An LLM doesn't "guess" or "catch" anything, because it is not capable of comprehending the meaning of words. It's a statistical sentence generator; no more, no less.
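If it helps to see what "statistical sentence generator" means, here's a crude bigram toy in Python. A real LLM is incomparably larger and works on tokens with learned weights rather than a count table, but the generation principle (pick the next item according to how often it followed the previous ones in the training data) is the same. All the words and counts below are invented.

```python
import random

# Crude bigram model: counts of which word followed which in some
# hypothetical training text. Real LLMs use learned weights over
# tokens, not a count table, but the sampling idea is the same.
follower_counts = {
    "the":    {"cat": 5, "dog": 3, "answer": 2},
    "cat":    {"sat": 4, "ran": 2},
    "dog":    {"barked": 3, "sat": 1},
    "answer": {"is": 6},
    "is":     {"wrong": 2, "right": 1},
}

def generate(start, max_words=5):
    word, sentence = start, [start]
    for _ in range(max_words):
        options = follower_counts.get(word)
        if not options:
            break
        words, counts = zip(*options.items())
        word = random.choices(words, weights=counts)[0]
        sentence.append(word)
    return " ".join(sentence)

# e.g. "the answer is right" -- fluent-looking output, yet nothing here
# "knows" what an answer is. It only knows what tends to follow what.
print(generate("the"))
```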
Nobody going to mention "The Cask of Amontillado"? Maybe not the most mind-bending example, but the tale of leading a supposed friend to their own horrific murder was not a thing I expected to be reading in school.
I had blocked that one from my memory; I remember now. Thanks. ಠ_ಠ
I'm surprised to see The Return of the King on there, tbh. Much as I like LotR, the lack of female characters is pretty severe.
He can give himself whatever titles he likes; that doesn't mean he makes any positive technical contribution.
Did you tell him you guess you have to stop doing non-web development then? Clearly you're not qualified if you can't have the corresponding title.
"Strategy" implies he actually thinks about it. I think it's just a reflex; fault belongs elsewhere, always. The man is incapable of critical thought, especially inward.
Machine learning has many valid applications, and some fields are genuinely using ML tools to advance by leaps and bounds.
LLMs, aka bullshit generators, are one of the poorest, and that's where the huge majority of corporate AI investment has gone in this latest craze. Not to mention the steaming pile of ethical issues with the training data.
Very nice writeup. My only critique is the claimed need to "lay off workers to stop inflation." I have no doubt that some (many?) managers believed that to be the case, but there's ample evidence that the spike in inflation we've seen over this period was largely due to corporate greed hiking prices, not increased costs from hiring too many workers.
You can very safely remove the "probably" from your first sentence.
I mean, there is a hard limit on how much info your brain can take in: time. Every hour spent learning one thing is an hour not spent learning anything else.
Agreed. The solution to this is to stop using LLMs to present info authoritatively, especially when it's aimed directly at the general public. The average person has no idea how an LLM works, and therefore no idea why they shouldn't trust it.
My guess is that your name is so poorly represented in the training data that the model just fell back on the most common kind of job history it has seen.
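As a toy illustration of that fallback (made-up names, jobs, and counts; a real model stores nothing like explicit pairs, but the statistical effect is similar):

```python
from collections import Counter

# Invented "training data" of (name, job) pairs. A real LLM doesn't
# store pairs like this, but conditioning on a rare name behaves
# similarly: the overall average dominates.
training_pairs = [
    ("Alice", "software engineer"), ("Alice", "software engineer"),
    ("Bob", "teacher"), ("Bob", "teacher"),
    ("Carol", "software engineer"), ("Dave", "nurse"),
]

def guess_job(name):
    seen = Counter(job for n, job in training_pairs if n == name)
    if seen:
        return seen.most_common(1)[0][0]
    # Name never (or barely) seen: fall back to the global mode,
    # i.e. whatever job history is most common overall.
    overall = Counter(job for _, job in training_pairs)
    return overall.most_common(1)[0][0]

print(guess_job("Bob"))        # "teacher" -- the name carries signal
print(guess_job("Zephyrine"))  # "software engineer" -- the global default
```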