In case Copilot was too much work
cross-posted from: https://lemmy.zip/post/27030131
The full repo: https://github.com/vongaisberg/gpt3_macro
Is this the freaking antithesis of reproducible builds‽ Sheesh, just thinking of the implications in the build pipeline/supply chain makes me shudder
Just hash the binary and include it with the build. When somebody else compiles, they can check the hash and just recompile until it is the same. Deterministic outcome in presumably finite time. Until the weights of the model change, then all bets are off.
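The recompile-until-the-hash-matches idea could be sketched roughly like this; a minimal sketch, with std's `DefaultHasher` standing in for a real cryptographic checksum like SHA-256, and all names illustrative rather than anything the repo actually does:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Hash the bytes of a compiled artifact. DefaultHasher is only a
/// stand-in here; a real pipeline would use SHA-256 or similar.
fn artifact_hash(bytes: &[u8]) -> u64 {
    let mut h = DefaultHasher::new();
    bytes.hash(&mut h);
    h.finish()
}

fn main() {
    // Pretend these are two compiler runs. With a nondeterministic
    // macro they could differ, so you would loop until they match.
    let expected = artifact_hash(b"bytes of the original binary");
    let rebuilt = artifact_hash(b"bytes of the original binary");
    assert_eq!(expected, rebuilt, "build reproduced");
}
```

Of course "finite time" is doing a lot of work in that comment: nothing bounds how many recompiles it takes before the model emits the same output again.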
this is how we end up with lost tech a few decades later
You'd have to consider it somewhat of a black box, which is what people already do.
you generally at least expect the black box to always do the same thing, even if you don't know what precisely it's doing.
someone post this to the guix mailinglist 😄
A little nondeterminism during compilation is fun!
So is drinking bleach, or so I've heard.
The top issue from this similar joke repo I feel sums up the entire industry right now: https://github.com/rhettlunn/is-odd-ai
I think it's a symptom of the age-old issue of missing QA: without solid QA you have no figures on how often your human-written solutions get things wrong, how often your AI does, and how they stack up.
One step left - read JIRA description and generate the code
Congratulations. You've invented the software engineer.
lol, that example function returns is_prime(1) == true
if i'm reading that right
"hey AI, please write a program that checks if a number is prime"
Brave new world, in a few years some bank or the like will be totally compromised because of some AI generated vulnerability.
Well it's only divisible by itself and one
Even this hand-picked example is wrong, as it returns true if num is 1.
Jesus fuck
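For comparison, a correct `is_prime` needs the guard for 0 and 1 that the generated example apparently misses; a minimal hand-written trial-division sketch:

```rust
/// Trial-division primality test. Note the explicit guard for 0 and 1,
/// which is exactly the case the AI-generated version gets wrong.
fn is_prime(num: u64) -> bool {
    if num < 2 {
        return false; // 0 and 1 are not prime
    }
    let mut d = 2;
    while d * d <= num {
        if num % d == 0 {
            return false; // found a divisor other than 1 and num
        }
        d += 1;
    }
    true
}

fn main() {
    assert!(!is_prime(1)); // the generated version returns true here
    assert!(is_prime(2));
    assert!(is_prime(13));
    assert!(!is_prime(15));
}
```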
That reminds me of Illiad's User Friendly, where the non-tech guy Stef creates a do_what_i_mean() function, and that goes poorly.
I would say this AI function generator is a new version of: https://en.m.wikipedia.org/wiki/DWIM
Create a function that goes into an infinite loop. Then test that function.
I cracked at "usually".
Does that random 'true' at the end of the function have any purpose? Idk that weird ass language well
It's the default return. In Rust, the final expression of a block, written without a trailing ;, is the value that block (and here the function) evaluates to.
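A tiny illustration of that rule (names are just for the example):

```rust
// `if` is an expression in Rust, and a tail expression without a
// semicolon is the value the function returns. No `return` keyword needed.
fn parity(n: u32) -> &'static str {
    if n % 2 == 0 { "even" } else { "odd" }
}

fn main() {
    assert_eq!(parity(4), "even");
    assert_eq!(parity(7), "odd");
}
```

Adding a semicolon to the tail expression would turn it into a statement and make the function return `()`, which is why that trailing `true` in the screenshot is the return value.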
That honestly feels like a random, implicit thing a very shallow-thought-through esolang would do ...
Every time I see Rust snippets, I dislike the language more, and I hope I can keep getting through C/C++ without any security flaws (the only thing Rust mostly fixes, imho), because I could not, for the life of me, enjoy Rust. I'd rather go collect bottles in real life instead.
Is this not what we are eventually striving for? To speak to computers in a natural human language and be able to build things that way, Star Trek style?
I mean sure I wouldn't trust/use it now.