It's funny. If you asked any of these AI-hyping effective altruists whether a calculator 'understands' what numbers mean, or whether a video game's graphics engine 'understands' what a tree is, they'd obviously say no. But since this chunky, excessive calculator's outputs are words instead of numbers or textured polygons, suddenly it's sapient.
It's the British government's fault for killing Alan Turing before he could scream from the rooftops that the Turing test isn't a measure of intelligence
Then why have you implemented it before it's safe to do so? Shit like this would have gotten most products recalled, or their makers sued, back in the day for endangering people with false information.
Isn't that what food photographers do to make pizza cheese look stretchier? The bot should recommend nailing the pizza slice down to the table next lol
I can only assume Americans are using so much oil on pizza that all structural integrity is compromised and it's more like greasy tomato soup served on flatbread
I don't know about elsewhere on the planet, but in the USA pre-shredded cheese sold at the grocery store is usually dusted with an anti-caking agent to prevent the shreds from clumping back together. Consequently, this shredded cheese always takes longer and higher temperatures to melt and reincorporate unless it's rinsed off first. Most Americans aren't aware of this, so shredded cheese topping on shit often just comes out badly
Ok I understand how the AI got this one wrong. You can make a "glue" by mixing flour and water together. You can also thicken a sauce by adding flour. So the AI just jumbled it all up into this. In its dataset, it's got "flour + water = non-toxic glue". However, adding flour to a sauce, which contains water, also thickens the sauce. So in the AI's world, this makes perfect sense: adding the "non-toxic glue" to the sauce will make it thicker.
This just shows how unintelligent so-called "Artificial Intelligence" actually is. It reasons like a toddler. It can't actually think for itself; all it can do is try to link things that it thinks are relevant to the query from its dataset.
You're actually giving it too much credit here. It seems to have lifted the text from a reddit joke comment that got shared/archived/reposted a lot (enough?) and was therefore one of the most frequent text strings returned on the subject
This is pure speculation. You can't see into its mind. Commercially implemented AIs have recommended recipes that involve poison in the past, including one for mustard gas, so to give it the benefit of the doubt and assume it was even tangentially correct is giving it more slack than it has earned.
So best case, it regularly employs one of the most basic and widely known logical fallacies, affirming the consequent (flour + water -> non-toxic glue, "therefore" non-toxic glue -> flour + water). Sorry, but if your attempt to make a computer use inductive reasoning tosses out the window the deductive reasoning that computers have always been good at due to its simplicity, then I think you've failed not only at "artificial intelligence", but at life.