Posts: 0 · Comments: 141 · Joined: 2 yr. ago

  • Yeah, 100%. This is the town's fault IMO - not maintaining the markings in the first place (it's not the contractor's fault that the old marking is non-existent), and then probably refusing to pay the contractor "extra" to repaint the whole thing.

  • Even talking about it this way is misleading. An LLM doesn't "guess" or "catch" anything, because it is not capable of comprehending the meaning of words. It's a statistical sentence generator; no more, no less.

  • Nobody going to mention "The Cask of Amontillado"? Maybe not the most mind-bending example, but the tale of leading a supposed friend to his own horrific murder was not a thing I expected to be reading in school.

  • Machine learning has many valid applications, and there are some fields genuinely utilizing ML tools to make leaps and bounds in advancements.

    LLMs, aka bullshit generators, are one of the poorest of those applications, and that's where the vast majority of corporate AI investment has gone in this latest craze. Not to mention the steaming pile of ethical issues with the training data.

  • Very nice writeup. My only critique is of the claimed need to "lay off workers to stop inflation." I have no doubt that some (many?) managers believed that to be the case, but there's rampant evidence that the spike in inflation we've seen over this period was largely driven by corporate greed hiking prices, not by increased costs from hiring too many workers.

  • Agreed. The solution to this is to stop using LLMs to present info authoritatively, especially when it's aimed directly at the general public. The average person has no idea how an LLM works, and therefore no idea why they shouldn't trust it.

  • Yeah, exactly. The issue is precisely that it's NOT just showing search results. MS's software is generating libelous material and presenting it as fact.

    Air Canada was forced to give a customer the compensation its chatbot made up. Germany, and Europe in general, is a bit stronger on consumer protections than Canada, so I'd expect MS would be held liable if this journalist decides to sue.