The Perfect Response
Tools have always been used to replace humans. Is anyone who uses a calculator a shitty person? What about storing my milk in the fridge instead of getting it from the milkman?
I don't have an issue with the argument, but unless they're claiming that any tool which replaced human jobs was unethical, their argument is not self-consistent and thus lacks any merit.
Edit: notice how no one has tried to argue against this
People have begun discussing it, although I suppose it was an unfair expectation to have this discussion here. Regardless, after I originally edited this, you guys did have tons of discussions with me. I do appreciate it, and it seems that most of us support the same things. It kinda just seems like an issue of framing and of looking at things in the now vs. the mid-term future.
Yes, I also think the kitchen knife and the atom bomb are flatly equivalent. Consistency, people!
Edit: 🤓 erm, notice how no one has tried to argue against this
Such a strong point, can't believe I didn't think of that.
I can't believe I never thought about calculators. You and I really are the brothers Dunce, aren't we?
I like how he made an edit to say no one is arguing his point, the only response he got is arguing his point, and then he replies to that with no argument.
They virtually always do this. People are, very often, not actually motivated by logic and reason; logic and reason are a costume they don to appear more authoritative.
He made a completely irrelevant observation. There was no argument. He didn't try to refute anything I said, he tried to belittle the argument. No response was necessary. If anyone else has responded, I haven't had a chance to look.
I read it as sarcasm. 🤷
Your "argument" is called false equivalence.
You're a tool.
The people who made calculators didn't steal anything from mathematicians in order to make them work.
"If you facilitate AI art, you are a shitty person"
There are ethical means to build models using consensually gathered data. He says those artists are shitty.
Now you're moving the goalposts. You said tools always replace humans and made the analogy to calculators and refrigerators. The fact is that the vast majority of generative AI in use today didn't get its content ethically.
Lmao, I'm moving the goalposts? That was literally my argument from the get-go. Dude says you're shitty for using/facilitating AI art. Dude's dead wrong. End of story.
Didn't they? Did they get consent from the mathematicians to use their work?
Math is discovery, not creation.
What a completely arbitrary distinction
I think the fact that AI sucks ass at even the most basic math proves that the difference between discovery and creation is, indeed, not arbitrary.
Unless you are the kind of person to use AI to do math, then yeah I can see how it can look that way.
I think the fact that AI sucks ass at even the most basic math proves that the difference between discovery and creation is, indeed, not arbitrary.
I don't follow your reasoning at all.
Hahah love it
The issue isn't automation itself. The issue is the theft, the fact that art cannot be automated, and the use of it to further enshittification.
First, the models are based on theft of OUR data and then sold back to us for profit.
Secondly, most AI art is utter crap and doesn't contribute anything to human society. It's shallow slop.
Thirdly, having it literally everywhere while also being completely energy inefficient is absolutely dumb. Why are we building nuclear reactors and coal plants to replace what humans can do for cheap??
Edit: further, the sole purpose of AI is to hoard wealth for a small number of people. Calculators, hammers, etc. do not have this function and do not also require lots of energy to use.
I've responded to a lot of that elsewhere, but in short: I agree theft is bad. Capitalism is also bad. Neither of those is inherent to AI or LLMs though, although theft is definitely the easy way. Art can be automated, nature does it all the time. We can't do it to a high degree now, I will concede.
Quality is of course low, it's new. The progress in the last year has been astounding, and it will continue to improve. Soon this will no longer be a valid argument.
I agree, modern AI is horribly inefficient. It's a prototype, but it's also a hardware issue. Soon there will be much more efficient designs and I suspect a rather significant alteration to the architecture of the network that may allow for massively improved efficiency. Disclaimer: I am not professionally in the field, but this topic in particular is right up multiple fields of study I have been following for a while.
Edit: somehow missed your edit when writing. To some extent every tool of productivity exists to exploit the worker. A calculator serves this function as much as anything else. By allowing you to perform calculations more quickly, your productivity massively increases in certain fields, sometimes in excess of thousands of times. Do you see thousands of times the profits of your job prior to the advent of calculators, excluding inflation? Unlikely. Or the equivalent pay of the same number of "calculators" required for your work? Equally unlikely. It's inherent to capitalism.
Art can be automated
Under what definition of art can that be possible? Is art to you nothing more than an image? Why automate art and not other tasks? What is the point of automating art? Why would you not want to make art yourself and instead delegate it to a machine?
Art is whatever people put into it. If it's a passion for you, there's nothing stopping you from making it. For me, art is mostly music designed to evoke an emotional response and game assets for hobby programming, not for commercial use. I'm not profiting off either of those, but I don't have the funds to pay someone to make custom assets, nor do I have the talent.
There has been art before that was not done by humans. There is a selfie of a monkey, paintings by elephants, semi-self generated art via fungus, etc. In those examples, the story is as much a piece of the art as the image itself. That subset of art cannot be replaced without simply lying.
nor do I have the talent
And why do you think you do not have "talent"? What is that "talent" you speak of? Is it something people are born with? What is the problem with what you make, if all you care about is what people put into art?
Art is whatever people put into it
"It" what? The pronoun "it" is referring to what? Art? Without this clarification I cannot accurately make sense of anything else in your response.
Keep in mind that, while defining a term, you cannot use that term in its own definition.
I keep trying to tear away, but you are so goddamn funny.
Art can be automated, nature does it all the time.
Okay.
There is a selfie of a monkey,
Right.
And it was nature that did this.
Nature took a picture of itself as a monkey.
Nature, in its monkey form, started a wild rube goldberg machine, including a monkey and the monkey's finger, to automate the picture taking process: a picture of nature.
Tree falls in forest. That's art.
Nature created humans. Humans create art.
Take a paintbrush, hook it up to a treadmill. That's art.
So many easy ways to automate art, it's almost like you're poorly flailing to get a gotcha that's never going to happen. I was trying to be civil and not simply treat you like a dumbass, but seeing as you're intentionally trying to be one, I no longer see the harm.
And now you're going to start spouting esoteric crap like humans not being a part of nature. Tf are we then, some other-dimensional being? You think you've got a soul and that makes you different from nature, but all that really means is that you're bad at analysing things from perspectives other than your own.
Edit: lmfao never mind, reading clearly isn't your strong suit. From the get-go you've been spouting irrelevant bullshit like a child, then when I ignore your tantrum you double down. You're clearly not here for a discussion, so hopefully you won't be too terribly surprised when (surprise surprise) in 20 years everything I said turns out to be true. To those that actually follow the tech and have for decades, none of what is currently happening is surprising, and we've been trying to warn people for years. But you always end up with people like you, too dumb to listen and too convinced of their own ego despite literally not making a single good point this entire conversation, hell, not even a single coherent thought. Have a great day 😄
Art can be automated, nature does it all the time.
Ok, first off: what is your definition of automation? This is what I mean when I say automation.
Nature does not automate art. Are you saying that because, say, almost all bowerbirds build bowers, that counts as automation? Then you have a poor understanding of what automation, art, and therefore AI/LLMs, are meant to achieve.
With art, you need to think about the state of mind required to create that piece in the first place. Before it was created, it didn't exist in any capacity. Why the art piece exists in the first place is the reason AI cannot automate it: human emotions are very complex.
If an AI/LLM can experience human emotions, we've essentially created another type of human. This is deeply profound and, with the technology and materials we have now (that is, the processing chips and hardware), it is simply not possible. We're at the point where we're making small, tiny leaps in gains.
Which leads me to...
It's a prototype, but it's also a hardware issue. Soon there will be much more efficient designs and I suspect a rather significant alteration to the architecture of the network that may allow for massively improved efficiency.
It is not a software/coding issue that limits an LLM's capability to emulate the human psyche. Again, it is not tweaks in code structure that will send us rocketing up the graph of progress. It is the limitation of the actual materials that we use and their maximum efficacy, hence why we need nuclear reactors and so on to power thousands of processors. We will never get to the point of replicating human ability and energy efficiency with the materials that exist on Earth. And are we going to spend more energy and resources to look to the stars for a material that may or may not exist to create a machine that has the capability to think as a human?
How long did humans take to evolve to the capacity we have? That took hundreds of thousands of years of trial and error. But I digress...
It's inherent to capitalism.
Absolutely agree. The whole purpose of this 'AI boom' is to make more money for the <1%, steal from us and hoard it for themselves. On this basis, I completely reject the use of LLMs. Fuck AI.
we will never get to the point of replicating human ability and energy efficiency with the materials that exist on Earth
That is flatly incorrect. There is a type of AI that is literally just replicating the human mind, hardware and all. That is well within our current technology, although the connections would not be the same, as it would merely be a clone. But a cloned human is an artificial intelligence.
I know that form of AI is not what you are referring to, but why not? What is it about AI that makes it impossible to replicate in a metallic substrate? And even assuming a metallic substrate is flatly impossible, that still doesn't stop progress. There are YouTubers currently working on making an artificial rat brain in a jar play Doom. This is not a piece of a living rat; these are rat neurons grown from stem cells that were converted from skin cells. So we could just as easily start progress down physical AIs.
As for evolution, that was millions of years of random chance; the difference between that and guided evolution is too great to even compare. And the materials came from Earth in the first place. The entire idea of AI is based around replicating what nature did in the first place, that's how all our technologies are made. People said it was impossible for people to fly merely a decade before the Wright brothers. The only difference now is that there is no materials scientist on Earth who claims we'd have to go to outer space to replicate the hardware necessary for AI.
Edit: forgot about the first part of your comment. This should largely cover that.
But a cloned human is an artificial intelligence.
No, it's not. Artificial intelligence is something that is artificially created, like a machine, that can think like a human. A human clone is human, literally.
I think we're both standing from extremely different points of view here on what AI, that is artificial intelligence, actually is. But I concede that my statement about it being impossible to create was hyperbolic. We can't say for certain that it's impossible.
I know that form of ai is not what you are referring to, but why not?
... Because... It's wrong.
I wouldn't call investing power and resources to replicate human capability progress. It's literally going backwards and rebuilding from scratch. Is this line of research honestly worth pursuing at the cost of our climate and environment? It's the same with the Wright brothers; their technology paved the way for increased consumption of resources and a faster spread of disease.
Yeah we get brand new shiny things but at what cost? Is it worth it in the long run? Is it worth automating human capability when we've messed up every single step of our planetary ecosystem?
I would much rather live in a world where all the effort and resources currently put into 'AI' were redirected into sustainable systems. That, to me, is progress worth pursuing.
Would you replace a loved-one (a child, spouse, parent etc.) with an artificial "tool"? Would it matter to you if they're not real even when you couldn't tell the difference? And if your answer is yes, you had no trouble replacing a loved-one with an artificial copy, then our views/morals are fundamentally so different that I can't see us ever agreeing.
It's like trying to convince me that having sex with animals is awesome and great and they like it too, and I'm just no thanks, that's gross and wrong, please never talk to me again. I know I don't necessarily have the strongest logic in the AI (and especially "AI art") discussion but that's how I feel.
That's a lot of different questions in a lot of different contexts. If my parent decided to upload their consciousness near the end of their life into a mech suit covered in latex (basically) that was physically indistinguishable from a human (or even not, who am I to judge), and the process of uploading a consciousness was well understood and practiced, then yes, I would respect their decision. If you wouldn't, you either have difficulty placing yourself in hypothetical situations designed to test the limits of societal norms, or you abjectly do not care about the autonomy of your parent.
Child: I have no issue adopting. If they happen to be an artificial human, I don't see why that should preclude them from being allowed to have parents.
Spouse: I'm not going to create one to my liking. But if we lived in a world with AI creating other AI that are all sentient, some of which presumably choose to take a physical form in an aforementioned mech, why shouldn't I date them? Your immediate response is sex, but let's ignore that. Is an asexual relationship with a sentient robot ok? What about a friendship with said robot? Are you even allowed to treat a sentient robot as a human? What's the distinction? I'm not attempting a slippery slope; I genuinely would like to hear where your distinctions between what is and isn't acceptable lie. Because I think this miscommunication either stems from a misunderstanding about the possible sentience of AI in the future, or from a lack of perspective on what it might be like from their point of view.
Edit: just for the record, I don't downvote comments like yours, but someone did, so I had to upvote you.
Are you even allowed to treat a sentient robot as a human?
Oh, boy, this one's really hard. I'll give it my best shot, though. Phoo. Okay, here goes.
Yes.
Ohhhh fuck. Oh god. Oh please. Scubus, how did I do? Did I win?
Now please argue to me that ChatGPT is sentient.
Ah, sorry. I misunderstood your argument. No, I would never replace a loved one with a "tool". But replacing loved ones with tools was never something I was arguing for. ChatGPT is a very crude prototype of the type of AI I am referring to. And he didn't say ChatGPT, he said "degenerative AI" but also stated "AI art".
The entire argument is centered around those who use or make AI art being "shitty people", no exceptions. But that falls apart the moment you remotely analyze it. There are ethical ways to do the entire process.
But replacing loved ones with tools was never something I was arguing for.
You are arguing in favor of replacing people, flesh and blood, with machines.
Manual labor? Sure. I love post-scarcity.
Art? Culture? My mom? Obscene. Profane, even. Morally reprehensible. We're holding you back from recess until you learn to appreciate your classmates.
Nah, that is a false equivalence. Replacing "people" with machines is very different from replacing "bonds" with machines. You are not literally killing the people and replacing them with a robot. It's a job.
You are conflating replacing the job with literally replacing the person. And personal bonds are not jobs, nor are hobbies. You are not going to have someone or a robot play golf for you. Nor would you replace your mom.
Art and culture are two different things; they are not replacing bonds. But I think the disconnect there comes solely from the current state of AI. Once it improves to the point of being indistinguishable, as all technologies do, I think those will be seen as much less problematic outside of the lens of capitalism.
You are not literally killing the people and replacing them with a robot.
... You think my position is that I think stable diffusion will kill people.
Like Body Snatchers?
To anyone still reading: This is ultimately why I didn't go for the point-by-point essay post so many others did. How am I supposed to respond to this? Genuinely.
You asked me if I'd replace my mom with a robot despite knowing the answer would be no. How tf am I supposed to respond to that? How tf am I even supposed to do that? Ask bad questions, get bad answers.
Thanks for the reply (and the upvote, although I've hidden all lemmy scores from my account so I really don't care about voting for that matter).
My thought experiment is a lot more complicated if the "AI tool" is sentient, i.e. it can be proven without a hint of a doubt that the robot is essentially no different from a human. If we ever get that far, it's a whole other can of questions.
What I tried to (perhaps unsuccessfully) argue is that, yes we have and are replacing humans with tools all the time, but there's also a line (I think) most wouldn't cross, like replacing a loved-one with a tool. In my original argument that tool would just be an imitation, not a sentient machine. Maybe even a perfect imitation, but nothing more than that - a machine that has learned how to behave, speak etc. I don't think many of us would be happy with a replacement like that.
For me it's same with AI art. I can't appreciate art made by AI because it's just imitation made by a tool. It has no meaning, no "soul".
This may come as a shock to you, but nobody was working as a refrigerator. Refrigerators didn't replace the milkman, the stores did. Which was fine at first, since those stores were supposed to buy the milk from the milkman and just make it more readily accessible. Then human greed took over: the stores and big-name brands started to fuck over the milkman and conspired with other big-name stores to artificially increase the price of bread while blaming COVID and inflation, and now some, although few, people are trying to buy it back from the milkman if they can afford / access it.
Those tools that did replace humans did not steal human work and effort in order to train themselves. Those tools did not try to replace human creativity with some soulless steaming pile of slop.
You see, I believe open-source, ethically trained AI models can exist and they can accomplish some amazing things, especially when it comes to making things accessible to people with disabilities, for example. But Edd Coates is specifically talking about art, design and generative AI. So maybe don't come to a community called "Fuck AI", change the original argument, and then expect people to argue against you in good faith.
The "milkman" is a delivery person who works for milk producers. The company that produces milk still exists, the role of the milkman was just made unnecessary due to advances in commercial refrigeration - milk did not have to be delivered fresh, it could be stored and then bought on-demand.
https://en.wikipedia.org/wiki/Milk_delivery
"Human greed" didn't take over to fuck over the milkman, they just didn't need a delivery person any more because milk could be stored on site safely between shipments.
I would argue it wasn't just refrigeration, but also the suburbanization of living and the cost-effectiveness of delivering the milk from the farm to the store, which (in theory) made milk cheaper. You would still need milkmen if stores and supermarkets didn't exist. In an alternative world where we didn't invent commercial / household refrigerators, you could still buy milk from stores daily without the need for a milkman, because ultra-pasteurization exists.
I guess that's the problem with analogies, and I don't think either of us will get anywhere by further arguing about this one specific example.
Sure. What I mean to say is that the milkman didn't disappear as a result of corporate greed conspiring to artificially increase the price of bread or whatever. Like you said, suburbanization and the supermarkets just made it so milk delivery was no longer necessary. The alternative is to continue paying milk deliverers... because that's what they've always done, regardless of the fact that people can just pick up milk with the rest of their groceries.
Tons of people do! I browse /all and don't want to block /fuck_ai because a ton of you do have great discussions with me. I'm not brigading, I have never once sought out this community, but I've always tried to be respectful and I haven't gotten banned. So I'd say all is well.
As far as the crappy stuff, that really seems like just another extension of consumerism. Modern art has irked people for a while because some of it is absurdly simplistic. But if people are willing to buy into it, that's on them. LLMs have very limited use cases, and ethically sourcing your data is critically necessary for both ethical and legal reasons. But the world needs to be prepared for the onset of the next generation of AI. It's not going to be sentient quite yet, but general intelligence isn't too far away. Soon one AI will be able to outperform humans on most daily tasks as well as some more specialized tasks. LLMs seemingly took the world by surprise, but if you've been following the tech, the progression has been somewhat obvious. And it is continuing to progress.
Honestly, the biggest concern I have with modern AI, outside of how it's being implemented, is that it is environmentally very bad, but I'm hoping that the growth of the AI bubble will lead to more specialized, energy-efficient designs. I don't remember what the paper was, but they were using AI to generatively design more efficient chips and it was showing promising results. On a couple of the designs they weren't entirely sure how they functioned (they have several strong theories, but they're not certain; not trying to misrepresent this), but when they fucked with them they stopped behaving as predicted/expected (relative to them being fucked with; of course a broken circuit isn't going to function correctly).
Sorry, I made the comment about being on Fuck AI because of your edit to the original message. I wasn't trying to accuse you of anything.
Back to the AI stuff. I am sorry if I am a little sceptical about your claims about the "next generation of AI" and how "soon" it will outperform humans when, even after all these years, money and energy poured into them, they still manage to fuck up a simple division question. Good luck making any model that needs to be trained on data perfect at this point, because the AI slop that has already been generated and released onto the internet has already taken care of that. Maybe we will have AGI at some point, but I will believe that when I actually see it.
Finally, I don't know about modern art being absurdly simplistic. How can you look at modern animation or music and call it absurdly simplistic? How can you look at thousands of game UI designs on Edd Coates' website and call them absurdly simplistic? All AI will ever create when it comes to art is some soulless amalgamation of what it has seen before; it will kill all creativity, originality and personality in art, but businessmen in suits will gladly let it take over from human artists because it is cheaper than hiring human artists and designers.
Yeah, I definitely get that. I suspect there will soon be techniques for sanitizing training data, although that just makes unethical capture easier. And assuming the final goal is sentience, I'm not entirely sure it is unethical to train on other people's data as long as you control for overfitting. The reasoning being that humans do the exact same thing: we train on every piece of media we've ever seen and use that to inspire "new" forms of media. Humans don't tend to have original thoughts, we just rehash what we've heard. So every time you see a piece of media, you quite literally steal it mentally. It's clearly a different argument with modern AI, and I'm not claiming it does the same thing. But its main issue when it comes to that seems to be overfitting: too much of its inspiration can be directly seen. Sometimes it comes off as simply copying an image that was in its training data. That's not inspiration, that's plagiarism.
And yeah, I tend to assume we're going to kill off capitalism, because if we don't, this discussion isn't going to matter anyway.