It's the new "promotion machine." The first manager: "I saved this company x dollars using AI, promote me." The new manager: "I increased productivity by x percent getting rid of AI, promote me." Repeat.
I remember 10 years or so ago working with a guy who was trying to sell me on the wonders of Eclipse. "It writes all this boilerplate for you!" I was more interested in writing in languages that were less shit and required less boilerplate.
Based. Too bad it's not as easy to find jobs that feed the family (mine, anyway) with better languages, usually simply by virtue of them being newer and having less adoption.
I find DRY often turns into an antipattern because decoupling is far more important than code reuse in practice. Having a function that supports many different code paths can make it very brittle because any changes to it need to consider all the possible use cases. However, if you have two separate functions that are similar, but can evolve independently then you don't have that problem. As a rule, it's better to duplicate code first, and then extract parts that turn out to be general purpose functionality once that's identified through usage, and to put that code into libraries.
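To make that concrete, here's a minimal sketch (hypothetical report formatters, not from any real codebase) of the premature-DRY version versus the duplicate-first version:

```python
# Premature DRY: one shared function serving every use case via flags.
# Any change has to consider all the flag combinations and all callers.
def format_report(rows, csv=False, escape_html=False, include_header=True):
    ...  # brittle: every new requirement adds another branch here

# Duplicate-first: two similar functions that can evolve independently.
# If a genuinely general-purpose piece emerges, extract it later.
def format_csv_report(rows):
    return "\n".join(",".join(str(v) for v in row) for row in rows)

def format_html_report(rows):
    body = "".join(
        "<tr>" + "".join(f"<td>{v}</td>" for v in row) + "</tr>"
        for row in rows
    )
    return f"<table>{body}</table>"
```

The CSV version can grow quoting rules and the HTML version can grow escaping without either change rippling into the other.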
DRY is usually helpful as long as you don't apply it in situations where you have, like, 2 semi-different things. If they're actually the same and you have 3 or more instances, then the abstraction is almost always worth it.
Writes condensed configurations and properties files in 3 different languages instead. Cloud deployment uses yet another source of configurations and properties.
Doesn't write documentation for configuration and properties.
Boilerplate is bad because it's fundamentally just noise. When you read code you want to be able to tell what its purpose is and what problem it solves. Ideally, the code should express that as clearly as possible. Having a lot of boilerplate is typically an indication that the language semantics don't allow you to express the solution in a clear way, so you have to write a lot of incidental code. The more code you have to read, the more cognitive overhead there is in understanding it and keeping it all in your head.
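A small illustration of the same semantics with and without the incidental code (a made-up Point class, just for contrast):

```python
from dataclasses import dataclass

# Boilerplate version: the intent ("a point with x and y") is buried
# under hand-written init/equality/repr plumbing.
class PointVerbose:
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __eq__(self, other):
        return (isinstance(other, PointVerbose)
                and (self.x, self.y) == (other.x, other.y))

    def __repr__(self):
        return f"PointVerbose(x={self.x}, y={self.y})"

# Same semantics, but the language feature lets you state the intent
# directly; __init__, __eq__ and __repr__ are generated for you.
@dataclass
class Point:
    x: int
    y: int
```

Both classes behave the same; only one of them makes you read (and maintain) the plumbing.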
well why is it good? why not just assume the boilerplate as the default and require the user to override it if they want to do something fancy?
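That's basically convention over configuration. A tiny sketch of the idea (hypothetical Server classes, not any real framework):

```python
class Server:
    # Sensible default baked in; most users never write this code at all.
    def handle_error(self, exc):
        return f"logged: {exc}"

class StrictServer(Server):
    # Only the user who wants something fancy overrides the default.
    def handle_error(self, exc):
        raise exc
```

The common case costs zero lines; only the unusual case costs anything.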
it's just busywork to always need to write the same stuff, and it also makes the code less readable and many people look at all that boilerplate and nope the fuck out.
This is why Python is so good for getting people to realize that programming isn't magic: you just write the equivalent of one short sentence and BAM, text in the terminal. No need to import the basic ability to print text, which is so incredibly inane.
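The entire beginner program really is one line:

```python
# The whole program. No imports, no class, no main method.
message = "Hello, world!"
print(message)
```

Compare that with the class-plus-static-main ceremony some other languages require before a single character hits the terminal.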
It's the most boring thing on the technical side of the job, especially at the more senior levels, because it's so mind-numbingly simple, it eats a significant proportion of development time, and it's usually what ends up having to be redone when there are small changes to input or output interfaces (i.e. adding, removing or changing data fields). That's probably one of the main reasons maintaining and updating code already in Production is a far less pleasant side of the job than the actual creation of the application/system.
It's not really that hard to implement AI as far as I can tell, even if it does produce garbage results. Any CEO that thinks otherwise is getting bamboozled.
Not that I'm defending AI; boilerplate is still boilerplate and a crappier product is a crappier product. But they'll take that trade-off anyway, which is why heads need to roll, lol