Once again, the "use case" for AI art is "makes it easier for scam artists to scam old people."
Every single time this stuff comes up with any sort of "practical" use, it's always something that actively makes society worse in some way. I've yet to hear an AI art defender actually justify why this is ok. It's always "Ok, it might be used by scam artists, cheapskates who don't want to pay artists, and fascists spreading their toxic ideas and manipulating people, but it could have some hypothetical positive use case in the future, so we shouldn't discount it just yet!"
It's an extremely powerful piece of productive capital that runs locally on a fairly cheap piece of hardware most people can acquire. We can say "so powerful a device should never have been created!" but it was, and it now exists as a piece of productive capital no less disruptive than countless other machines.
We aren't going to be rid of it now that it exists, so the only move that remains is to take hold of it, learn to use it, and exploit it - any artist picking it up immediately has an advantage over every gormless techbro dipshit that's just churning out nonsense without looking at it. Like every bit of aesthetic taste and ability to draw and edit images massively improves what one can do with a machine that cleans up sketchy linework and handles shading in seconds, while the vast majority of people using it are just hitting "generate" as a treat button and barking and clapping at the gibberish it spews out.
Like if you look at what techbros are doing, they love the dogshit pixar-style "I made a machine to generate shitty 3d blob art because I can't even be bothered to use generic assets in blender and do the most basic and braindead work ever" shit, or the "photorealism, but with oily brushstrokes and nightmare fuel JPEG error looking shit" style, both of which look awful and are almost impossible to fix up. Meanwhile the AI is actually fairly competent at traditional art styles, which are also trivial to clean up and edit since they're (comparatively) low-detail and abstract.
Seriously, if one looks at the people interacting with AI art right now, most are just babbling at magic prompt machines someone else runs. Of the people involved enough to run it locally, most are using simple prompt UIs, and the most complex thing anyone uses is ComfyUI, a braindead basic flowchart interface that's absurdly simple and easy to use - and most of the community still cries about how it's too complicated and hard to use. Techbros are all talentless dipshits and anyone with a brain and art skills could take their toys and eat their lunch.
That is true, and ideally this would be a tool for artists (I would love to save time on backgrounds and things for example, using AI to fill in the parts of work I find tedious and time consuming and just fix it up as needed) but unfortunately it also doesn't generate wholly new art, it creates a collage of existing work, but doesn't attribute any of the art of the other artists used to make it. So even if it were a tool used by artists, it would be effectively stealing art from other artists in the process.
And the problem ultimately is that any art space that allows AI art gets flooded with it. Look at the front page of deviantart for an example. It used to be an actually interesting art website with unique stuff; now it is just hundreds of pieces of the same generic AI art because it takes zero effort to make. The market gets flooded and actual artists can't be seen by potential supporters, because those supporters would have to wade through a mountain of shit to find their work. So it actively becomes detrimental to artists in any space where it is allowed, which is why the only people who can get any sort of use out of it have nothing but contempt for art and artists.
That is true, and ideally this would be a tool for artists (I would love to save time on backgrounds and things for example, using AI to fill in the parts of work I find tedious and time consuming and just fix it up as needed)
Yeah, under a socialist system this wouldn't even be a question; this would be a uniformly wonderful tool that would enable a level and scale of arts production never before imagined, provided some selection was put into place as to what could actually get published. Like I've been poking at setting up a rotoscoping pipeline to see if it's at all possible to bridge quick and janky CGI and traditional cel animation: basic hand rotoscoping, then an AI cleanup and detailing pass, followed by interpolation over multiple frames with EbSynth or something (though I need a new stylus first, because the battery in mine died and it isn't one where that's replaceable).
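For what it's worth, the "AI cleanup and detailing pass" I have in mind is roughly a low-strength img2img step over each hand-traced keyframe. A minimal sketch, assuming the diffusers library and a generic Stable Diffusion checkpoint - the model name, file paths, prompt, and strength here are placeholders, not a tested pipeline:

```python
# Rough sketch of an AI cleanup/detailing pass over a hand-rotoscoped keyframe.
# Assumes the diffusers library and a generic Stable Diffusion checkpoint;
# model name, paths, prompt, and strength are placeholders, not a tested setup.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # any local SD checkpoint would do
    torch_dtype=torch.float16,
).to("cuda")

rough = Image.open("keyframe_0001_roto.png").convert("RGB")  # hand-traced frame

cleaned = pipe(
    prompt="clean cel animation, flat colors, crisp linework",
    image=rough,
    strength=0.35,        # low strength: keep the hand-drawn structure, only redraw detail
    guidance_scale=7.0,
).images[0]

cleaned.save("keyframe_0001_clean.png")  # cleaned keyframes then go to EbSynth
```

The low strength is the whole point: it keeps the hand-drawn structure intact and only lets the model redraw surface detail, and the cleaned keyframes then go to EbSynth for the in-betweens.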
unfortunately it also doesn't generate wholly new art, it creates a collage of existing work,
This is inaccurate: it's much, much fuzzier than that, and is more about picking up and recombining concepts and aesthetics. It's weird and repetitive, but it tends to be repetitive in the same way artists can be when they work out an approach to a pose and then just keep doing slight variations on it even when that doesn't make sense (old comics did this a lot), or when they're sticking too close to a reference image. Toss in ControlNet and guide it more and it breaks away from that.
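To put the ControlNet point concretely: feed it an edge map of your own sketch and the pose and composition are pinned down, so it can't fall back on its stock poses. A minimal sketch, assuming the diffusers library and the publicly released canny-edge ControlNet weights - the model names, image path, and prompt are placeholders:

```python
# Minimal sketch of guiding generation with your own linework via ControlNet.
# Assumes the diffusers library and the public canny-edge ControlNet weights;
# model names, image path, and prompt are placeholders.
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
from diffusers.utils import load_image

controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

# An edge map of your own sketch (e.g. produced with cv2.Canny) pins the pose
# and composition, so the model can't fall back on its stock poses.
edges = load_image("my_sketch_canny.png")

out = pipe(
    prompt="ink and watercolor illustration, dynamic pose",
    image=edges,
    num_inference_steps=30,
).images[0]
out.save("guided.png")
```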
but doesn't attribute any of the art of the other artists used to make it. So even if it were a tool used by artists, it would be effectively stealing art from other artists in the process.
I think the clearest refutation of the property angle is to look at two things: who has the power to claim ownership over the training data (hosting sites, major corporations, and social media sites), and whether the training data being properly licensed ahead of time would make any difference to the harm the technology causes. Like, who profits if we end up saying "AI trainers must pay royalties to the proper institutions"? Reddit, imgur, meta, deviantart, tumblr, etc., all of whom claim ownership over their users' posts and are already selling that access, because as far as they're concerned it's not the artists being infringed upon but a misuse of their hosting services. Similarly, if Disney or the like came out with an AI trained on its own private library of works, began replacing animators with it, and rented it out to selected studios, would that make it ethical? Of course not: it is unethical because of who uses it (techbros and corporations) and its consequences (devaluing skilled labor), not because it violates property rights.
And the problem ultimately is that any art space that allows AI art gets flooded with it. Look at the front page of deviantart for an example. It used to be an actually interesting art website with unique stuff; now it is just hundreds of pieces of the same generic AI art because it takes zero effort to make. The market gets flooded and actual artists can't be seen by potential supporters, because those supporters would have to wade through a mountain of shit to find their work. So it actively becomes detrimental to artists in any space where it is allowed, which is why the only people who can get any sort of use out of it have nothing but contempt for art and artists.
Yep, and it's only going to get worse. Some solution to the AI art spam on social media will have to be found, but even worse is what the use of generative AI in professional environments is going to do. Animators are already overworked and underpaid, and that's only going to get worse when these tools get integrated into their workflows and one worker ends up expected to do what is now the work of an entire team.
That's why I'm focusing on what this is: an extremely powerful and destructive piece of capital that already exists. We can't stop it from existing or stop capitalists from making things worse with it; all we can do is seize upon it and find ways to use it ourselves - that is, try to predict how it's going to be put to work professionally and use it to enable and empower smaller independent teams of artists to do with consumer-grade hardware what would previously have required a full studio with many millions of dollars' worth of invested capital.
In practice it's gonna be like a bigger version of what happened with the advent of easily accessible 3d rendering tools: that shit was truly awful and it infested everything, but gradually the low-grade stuff has become mostly filtered out and some professionals have emerged who actually use the medium well. Since there's no putting it back in the bag, the only thing left is to try to exploit it and springboard off it to new heights however possible.
I think the clearest refutation of the property angle is to look at two things:
I should've clarified there; I was using very poor language. I meant it more in the sense that artists will adopt ideas and styles from other artists and use them to create their own, but AI art creates a sort of amalgam that looks generic, and if a prompt produces a style that an artist would like to learn from, they can't really easily repeat it, while if they find an artist whose work they admire, they can examine it and learn from it. No idea why I talked about "stealing" for that one. You're completely right in your response to what I said; I should've chosen my words more carefully to say what I actually wanted to get across.
Other than that, I agree completely, sorry there isn't much to add. I'm more lamenting that this is our circumstance than actually saying anything really useful here. It sucks to be in such a precarious position in my career because of this, and to be struggling harder than I was several years ago because of this new technology that people say is "great and wonderful" when it's been nothing but all-around awful for both myself and society as a whole. It isn't the fault of AI art itself though, just the nature of capitalism to make everything worse in the name of profit.
if a prompt produces a style that an artist would like to learn from, they can't really easily repeat it,
A little tangentially, I've ironically found that it can be a good learning and practice tool in one particular way: spotting and fixing its mistakes. Like I originally learned how to edit images well about a decade ago when I went through and digitized and cleaned up hundreds of my grandfather's old slides, and trying to clean up an AI's mistakes has a similar feel to trying to clean up lines and mildew spots on an old photo. You have to think about why it's wrong and what technique you can use to fix it without making it more wrong, basically.
It sucks to be in such a precarious position in my career because of this, and to be struggling harder than I was several years ago because of this new technology that people say is "great and wonderful" when it's been nothing but all-around awful for both myself and society as a whole. It isn't the fault of AI art itself though, just the nature of capitalism to make everything worse in the name of profit.
Yep. Everything capitalists will use it for is bad, and the response I've settled on is to encourage leftists and artists to try to take that ball and run with it, to exploit it the same way capitalists will but for the sake of independent works or agitprop instead. The only positive I see is that it's as if past skilled tradesmen being made obsolete by new tech could summon up that capital for themselves - as if a simple hammer could have been made into an industrial press just by telling it what to be, because that's more or less what we can do with open-source AI. That wouldn't make the factories any less bad, but it would have meant the factory owners lacked a monopoly on industrial capital.
but unfortunately it also doesn't generate wholly new art, it creates a collage of existing work, but doesn't attribute any of the art of the other artists used to make it. So even if it were a tool used by artists, it would be effectively stealing art from other artists in the process.
dear fucking god please stop upholding capitalist ideas about IP rights.
I agree, but unfortunately, as an artist under capitalism, this is how my fellow artists act. They get incredibly protective of "their style," and it can completely ruin someone's career if they are accused of "stealing" art. And it's just kind of rude in general to draw inspiration from something and not attribute the original artist, which was more my original point: that AI art deprives people of the ability to even know who the original artist was. Though I phrased it incredibly poorly.
Hot take: Artists should be able to not have their life's work automatically fed into the plagiarism machine without their compensation or consent. Like I'm not going to pretend that Mickey Mouse being copyrighted for a century is a normal thing, but people having their labor exploited for the profit of the wealthy is kinda the thing we're supposed to be against, no?
i think you shouldn't get to opt out of the remix machine, but the corporations shouldn't be able to exploit it for profit either. Having your work not become part of the commons is the same shit as a century of copyright. Anything we do about these generative models that allows corpos to continue to use them is a bandaid at best.
Well I disagree. You should have a fundamental right to opt out of these things. Even in a perfect world where everything is just and every artist can support themselves, I see no reason it shouldn't require the creator's consent. Surely, with no financial pressures to corrupt things, many creatives would willingly contribute to these models, and we wouldn't need to resort to this ugly, non-consensual scraping.
I just think, fundamentally, there should be some level of control the artist has over these things. You asked me earlier if art should "belong to everyone", and I guess I don't think it should, at least not fully or without restriction. I'm not against stuff like fanart and fanfiction and things like that, not in the slightest, but the idea of having my work taken in that way, mechanistically, even in a non-artistic context, like the conversation we're having right now, feels so thoroughly violating that I just can't support it. It feels like in the minds of a lot of people, the only option an artist should have to avoid these things, to avoid being scraped, is to seclude themselves, or at least their work, and to completely shut people off from experiencing it. I don't want that, but I don't want to be scraped either. Is it so strange? Am I really the weird one for wanting a middle ground, where the humans are allowed to see me and the AI isn't?
i think that feeling is probably rooted in capitalism and precarity? whatever fan works you're imagining and fan works "with an advanced computer" are the same.
i do think we should have some protection against e.g. political candidates we don't endorse using our art, or corporations profiting from our work, but something automatic like how covers work in music seems pretty sane.
Since the Copyright Act of 1909, United States musicians have had the right to record a version of someone else's previously recorded and released tune, whether it is music alone or music with lyrics. A license can be negotiated between representatives of the interpreting artist and the copyright holder, or recording published tunes can fall under a mechanical license whereby the recording artist pays a standard royalty to the original author/copyright holder [...] even if they do not have any permission from the original author.
rare copyright law w.
if somebody wants to make art and not actually share it for metaphysical reasons i really don't respect that and don't think shit like city or asinine stunts should be validated, but that's a huge tangent.
Again, I'm not against any kind of voluntary arrangement, but the first part of this comment, the first two sentences, just doesn't feel right to me. I'm writing an effortpost as we speak; maybe I'll put that up later. Still gotta organize my thoughts on that.
This doesn't sound far off from a Marxist understanding of it, honestly. It's built off the uncompensated labor of millions of artists. Once in its mature form I'm sure it could be fairly accurately modeled as an enclosure of commons at the very least.
how is it an enclosure? the "original" works are all exactly where they were when they were scraped. (and the "original" of a digital thing is a real fucken weird concept too, the first time that image existed was in the computer's RAM, or maybe the pixels in a particular state on the artist's monitor, the one that you see on the internet is like 5 generations of copy already)
like fuck these companies and the peter thiel types behind them to death, but superman 3-ing fractions of pennies from the take-a-penny tray isn't theft just because you do it a billion times, and copying isn't theft at all.
I think we're talking past each other. Your argument applies if we're talking about a liberal concept of ownership, or maybe judging the morality of ai, but that's unrelated to a material analysis of it. Generative technology requires massive datasets, which are the result of millions of hours of labor. This isn't a moral claim at all, it's simply trying to describe the mechanism at play.
Enclosure in the digital space isn't an exact parallel to the enclosure acts in England. I usually see it applied to the Open Source ecosystem: the products of volunteer labor are enclosed or harvested or adopted by private corporations. Google and Microsoft and Apple all built empires on this mechanism.
I mentioned a "mature" stage because I think the next step is more forceful enclosure and hoarding of datasets. The usability of the internet is quickly decreasing in lockstep with the development of AI, a dialectical evolution. It's eating away at the foundation upon which it builds itself.
100%. AI art is only useful for replacing stock images and clip art - all the low-effort stuff. MS paint memes might get replaced by AI stuff, but I doubt that.
Using AI in conjunction with human-made artwork, as another tool at the artist's disposal, is where it can be useful. There's all sorts of AI-powered postprocessing that's been around for a while. But you could also draw a spaceship and have AI generate a star field background for you.
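If it helps to picture it, the spaceship/star field case is basically inpainting: mask off the background, leave your drawing untouched, and let the model fill only the masked region. A minimal sketch, assuming the diffusers library and a stock inpainting checkpoint - the model name, file paths, and prompt are placeholders, not a recommendation:

```python
# Rough sketch of "draw the spaceship yourself, let AI fill in the star field."
# Assumes the diffusers library and a stock inpainting checkpoint; the model
# name, file paths, and prompt are placeholders.
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",
    torch_dtype=torch.float16,
).to("cuda")

drawing = Image.open("spaceship.png").convert("RGB")    # your hand-drawn ship
mask = Image.open("background_mask.png").convert("L")   # white = area for AI to fill

result = pipe(
    prompt="deep space star field, distant nebula",
    image=drawing,
    mask_image=mask,
).images[0]

result.save("spaceship_with_background.png")  # the ship itself is left untouched
```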
Obviously, Capital will take a while to figure this out. It will attempt to use AI to solve its highest priorities first. Replacing all artists and scamming old people are more important than making good art.
Ehh, I generally don't like the idea of making AI do something in a piece of art unless the thing it's doing is so utterly inconsequential to that thing's artistic merit that it isn't worth a real person's time. Like, I wouldn't be mad if it got used to make 500 sand textures for the new Ubisoft game, for example, because no human being should be made to do that much work for such a minute impact. Even then, of course, there's still the issue that all the current models are founded on theft and are being used explicitly as a tool to extort the very same people they victimize. So this fun hypothetical question of "how much of our art should we let the computer do" is sadly tainted by the fact that it isn't actually "just the computer" doing it; in reality it is the mimicked expertise of hundreds of thousands if not millions of hours of artistic creation being ground into grey paste and sold back to us for the benefit of big tech.
I’m no programmer, but I think limiting AI in any fashion would require limiting the general usage of code. Again, could be wrong, but isn’t “AI” just machine learning?
It's an image created by an algorithm that smashes together pieces of existing art in its database to produce a "close enough" image for whatever the prompt is. It isn't really "learning" at all, and the "AI" part is just a marketing buzzword. I see it like NFTs: a cool fancy new technology with no real practical applications right now, so people are being convinced of some hypothetical future benefit while the only people actually benefiting at present are grifters.