Pirating Adobe software is exactly what they want you to do. Their business model relies on businesses paying for licenses because people already know how to use their software, in large part because people pirate it, and also because Adobe has deals with schools to teach it.
What Adobe actually doesn't want you to do is learn their competitors' software, since that's how they lose money in the long term.
That's the main reason we need to push for open-source alternatives: the more people learn how to use them, the more content gets created around them, and the more people take an interest in using them, helping develop them, and donating to them.
When you use the AI services in Photoshop, it tries to connect to Adobe's servers, but to crack Photoshop you need to blacklist all of those servers. So if you try to use, say, the automatic background removal tool, Photoshop shows a message saying it will run the (worse) version locally because it can't connect.
Not that I'd know or anything. A friend of a friend told me. Basically a stranger. Don't even know his name.
Nice! I got it right after the latest version came out but that's been a while. They do sales pretty regularly though. It's definitely not as massive as Adobe wrt features, but they cover the essentials well.
When I see stuff like that I want to buy it. And then I remember I'm not a graphic designer, and unfortunately I'm terrible at any kind of layout or drawing, be it web design or otherwise.
It looks like a great tool though. And very fairly priced.
Exactly my thoughts. Adobe is not the police and they should not be the ones trying to deter crime by any definition. How many horrible things have governments done to "protect the children"?
I'm betting the reason they want access to "moderate" your projects is to train their AI. Literally looking to steal artists' work before it's out the door.
A fun way to combat this would be to get every artist to add giant, throbbing dicks to everything they create in Photoshop with the hope that it creates the thirstiest, nastiest AI model out there.
Not just dicks, but dicks mixed with other art so it just completely pollutes the training data and the AI has no idea how to draw anything without it kind of looking like a dick. Dicks with human and animal faces, boats shaped like dicks, dick buildings and landscapes etc.
It would take an immense amount of bad data to actually work, but it would be funny.
Riiiight. And, pray tell, Adobe, why in the everloving fuck would you ever need to "review" private content that's not posted anywhere? Stop acting like you're the goddamned pre-crime agency from Minority Report and keep your dirty paws off stuff people are creating privately.
You are providing tools, and that's it. I can do horrible, illegal shit with my drill, but it doesn't give Black&Decker any right to break into my house to do random checks and see if I'm drilling through kneecaps instead of wooden planks...
Because when someone presents you with a lengthy document, one that describes in detail all the ways they claim ownership of your work (and work in progress), does it really only matter how much they mean what's written down? Let me spare you the sarcasm and just say this doesn't communicate the professionalism professionals are demanding. Quite the opposite.
Interesting, we get to either hate them for going full big brother, or hate them for going full adobe in the first place. It's nice to have a choice sometimes.
“Adobe does not train Firefly Gen AI models on customer content. Firefly generative AI models are trained on a dataset of licensed content, such as Adobe Stock, and public domain content where copyright has expired.”
This references a single particular product. lol. If they're training a model by a different name with customer data, it would still be a true statement.
The points about lawyers and NDAs hit the nail on the head. I thought something similar during the Windows Recall debacle. That's a juicy set of data for anyone looking to find a journalist's sources or scrape a hospital's network. In every case it relies on the end user (business or individual) knowing how to disable those features with GPOs/registry options... There's no way 100% of them realize the issue and have the knowledge to fix it.
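For what it's worth, here's a rough sketch of the kind of fix most users will never know they need. Around the Recall rollout, the per-user registry policy was reportedly the WindowsAI key with a DisableAIDataAnalysis value; that path and value name are from memory, so treat them as an assumption and check current Microsoft docs before relying on it:

```python
# Sketch (Windows only): disable Recall snapshot collection for the current user.
# Assumption: the policy lives at Software\Policies\Microsoft\Windows\WindowsAI
# with a DWORD named DisableAIDataAnalysis (1 = snapshots off). Verify before use.
import winreg

KEY_PATH = r"Software\Policies\Microsoft\Windows\WindowsAI"

with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    winreg.SetValueEx(key, "DisableAIDataAnalysis", 0, winreg.REG_DWORD, 1)

print("Recall snapshot policy written; sign out and back in for it to take effect.")
```

Point being: that's a script an admin has to know to run, or a GPO they have to know to push. Nobody stumbles into it by accident.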
Where did you read that? I can bet it wasn't the TOS, because that's not in there. The TOS allows Adobe to review anything you create with its products using manual or automated means, and maybe that's restricted to normal screening for CSAM and such (although it's really ambiguous about what they'll actually do with it).
They just wanna review your work 😀. What if you're trying to put a penis on Trump's face and it's too big or it's pointing the wrong way or something? You know. Wouldn't you want to be told stuff like "the police are coming unless you erase this now!"? You know, things like that? It would definitely come in handy to catch kids doing nudes of others. Or adults doing nudes of other adults who didn't know. I wouldn't want to end up in a collage of nudes that is 20 MB, 1080p or 4K.