If you're making software for actual end-users, you HAVE to give it a goddamn GUI, or else you suck, your software sucks, and nobody is going to use your damn software.
You see this shit SO much more often than you would think. And the infuriating thing is, it seems to be most common among programs that are INCREDIBLY complex and sophisticated.
It'll be like this:
"What does my program do? Glad you asked. It simulates stress patterns in glass and ceramics, after they come out of a kiln. You can specify any melting temperature, adjust the composition of elements in the glass, and the ambient temperature of the cooling and tempering stages."
"Wow, can you show me how it works?"
"Sure! <opens a command line and starts typing commands>"
"O-oh. Do you have any plans to add a graphical user interface?"
"HAHAHAHAHHA, no. That's never happening. And here I thought you were serious about using advanced software, and being an intelligent person."
Obviously, that last part is just implied. But sometimes, when users request a GUI, the goddamn developer really will get in their face like that.
They always fall back on the position of "well, I developed this shit for free, for your ungrateful ass. So you can build your own fucking GUI."
But the thing about that is...no. And fuck you. I shouldn't have to be two-thirds of a fucking developer, in order to use the fucking software.
If you can figure out how to simulate molecules, or draw 3D stereograms, or translate hieroglyphics, or any other RIDICULOUSLY COMPLICATED SHIT, making a graphical user interface should be nothing to you. You should be able to do it in a fucking afternoon.
IT DEFINITELY SHOULD BE THE EASY PART, FOR YOU.
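For a sense of scale: a dead-simple front end in Python's built-in Tkinter, wrapping a hypothetical CLI simulator (call it glass_sim; the name and flags are invented for this sketch), is about 25 lines. Something like:

# Toy sketch: a bare-bones Tkinter front end for a hypothetical CLI
# tool called "glass_sim" (the name and flags are made up here).
import subprocess
import tkinter as tk

def run_sim():
    # Shell out to the existing command-line tool with the form values.
    result = subprocess.run(
        ["glass_sim", "--melt-temp", melt.get(), "--ambient-temp", ambient.get()],
        capture_output=True, text=True,
    )
    output.delete("1.0", tk.END)  # clear the output pane
    output.insert(tk.END, result.stdout or result.stderr)

root = tk.Tk()
root.title("Glass Stress Simulator")

tk.Label(root, text="Melting temp (C)").grid(row=0, column=0)
melt = tk.Entry(root)
melt.grid(row=0, column=1)

tk.Label(root, text="Ambient temp (C)").grid(row=1, column=0)
ambient = tk.Entry(root)
ambient.grid(row=1, column=1)

tk.Button(root, text="Run simulation", command=run_sim).grid(row=2, column=0, columnspan=2)

output = tk.Text(root, height=12, width=60)
output.grid(row=3, column=0, columnspan=2)

root.mainloop()

That's it. Two text boxes, a button, an output pane. Nobody is asking for ribbons and dark mode.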
All the rest of us, who aren't programmers? We envy programmers, and their ability to really connect with computers, on that deep logic level.
If we could do that shit, we would. But a lot of us have tried, and we realize it's not a good use of our time. We can do cool stuff with software, but it's just not ever going to be worthwhile for us to struggle through the act of creating software.
Also, I hasten to add that I have put in my time with command-line interfaces. I used DOS, I used BBSes, I've used modern command-line-only programs. I know how to do it, but I DON'T WANT TO.
I don't want to have to memorize commands. I don't consider a GUI workflow to be some kind of weird luxury. It has been a basic part of modern software, for around 40 years at this point. Literally get with the program, guys.
If you're serious about making software, get your shit together and implement a fucking GUI from the very first release. Nobody ought to be taking you seriously, if you refuse.
People who do things they like for free are called "hobbyists". These hobbyists, unsurprisingly, only do what they want to do. And sometimes they don't want to make a GUI.
The fact that they share the results of their hobby for free at all is just a bonus.
I honestly do get that. And I realize I am overstating my case, in a way that basically makes me an asshole.
I think the problem is often rooted in how projects advertise themselves. The small dev team is like "WE'RE PASSIONATE ABOUT GETTING ________ INTO THE HANDS OF USERS, BECAUSE WE KNOW THE ONLY OTHER SOFTWARE THAT DOES _______ IS A 27-YEAR-OLD APP THAT ONLY WORKS ON A SPECIFIC REVISION OF WINDOWS 95."
But then the damned app is command-line only, and it just feels like it was all a tease.
But if you're this passionate about the stress levels of ceramic whatever, presumably you've trudged through countless dense works of academia. After that, how big a step is it to type a few commands into a command line? You're not required to learn a whole programming language.
I've dealt with that kind of software as a paid-for product.
Part of the issue is that the software is very specialized, so there isn't much demand for it. At that point, it's usually cheaper to train the users than to make the program easier to run.
You also run into the problem that a GUI can create a false sense of confidence in the work. People end up trusting the program's output completely, instead of treating it as a tool that can still be wrong.
That's a hard argument to rebut. The only thing I can really refer to is VisiCalc. The OG spreadsheet program. The original "killer app."
The whole thing about a spreadsheet is that it's a realtime pseudo-GUI. It's not a command-line-interface thing, and that's VITAL TO ITS FUNCTION, AT THE END-USER LEVEL.
Of course, the computer can do any of those spreadsheet calculations through a CLI. But with a spreadsheet, you just change one element on the screen, and all the other elements auto-update in response to the altered field.
That functionality was a major-ass deal for doing some summing and dividing, so you don't fuck up your payroll and your expense reports. It increased productivity. It reduced clerical errors.
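If the auto-update thing sounds abstract, here's a toy sketch of the idea in Python (nothing like how VisiCalc was actually implemented, just the end-user concept): formulas are functions over other cells, and poking one input recomputes everything downstream.

# Toy model of spreadsheet recalculation (not how VisiCalc actually
# worked internally). A cell is either a raw value or a formula over
# other cells.
cells = {
    "hours": 40,
    "rate":  18.50,
    "gross": lambda get: get("hours") * get("rate"),
    "tax":   lambda get: get("gross") * 0.20,
    "net":   lambda get: get("gross") - get("tax"),
}

def get(name):
    # Formulas are evaluated on demand, pulling in whatever they depend on.
    cell = cells[name]
    return cell(get) if callable(cell) else cell

def set_cell(name, value):
    # Change one field and re-show the whole sheet: every dependent
    # figure updates without any further commands from the user.
    cells[name] = value
    for n in cells:
        print(f"{n:>6}: {get(n):8.2f}")

set_cell("hours", 45)   # gross, tax, and net all follow automatically

Change "hours" and gross, tax, and net all update on the spot. That immediate feedback loop is the whole point, and a CLI doesn't give it to you.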
But you're over here, like "well, can't you just use the command line to enter the new parameters for your molecular glass simulation?"
My argument: isn't that shit just a little more complex than a spreadsheet with some basic monetary figures on it? If a GUI is a game-changer for juggling a few dollar-figures in a monthly budget document, it's EVEN MORE CRITICAL in any application where both the data and the calculations are more complicated.
The GUI paradigm isn't some frivolous luxury made only for those of us too lazy to get savvy with the CLI game. GUIs have real, exclusive benefits. The more complex the data entry, the sillier it becomes to say "you don't really NEED that GUI, you just want it for convenience."
This raises a point, though. A command-line tool from '95 will likely recompile and run just fine, with maybe a warning or two, while the GUI app is no longer supported, because GUI frameworks are notoriously fickle things that go obsolete all the time.