Honestly, it's still ridiculous to me how slow Python, Java, JS, Ruby etc. continue to feel, even after decades of hardware improvements. You'd think their slowness would stop being relevant at some point, because processors and whatnot have become orders of magnitude faster, but you can still feel it quite clearly when something was implemented in one of them.
Many of these languages have C bindings for their libraries, which means much of the slowness is caused by bad code (such as a for loop that makes a C call on every iteration instead of one call for the whole loop).
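A quick Python sketch of that difference (toy example, not from the thread): the hand-written loop crosses the interpreter/C boundary on every element, while `sum()` runs the whole loop inside one C call.

```python
# Toy illustration: per-iteration work in the interpreter vs. one
# call into C for the entire loop. Names are made up for the example.
data = list(range(1_000_000))

def slow_sum(xs):
    total = 0
    for x in xs:      # interpreted loop: bytecode dispatch per element
        total += x
    return total

fast = sum(data)      # a single C-level call over the whole iterable

assert slow_sum(data) == fast
```

Both produce the same number; the second form just hands the loop to C once instead of a million times.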
I am no coder, but in my experience bad code can be slow regardless of the language used.
Bad code can certainly be part of it. The average skill level of those coding C/C++/Rust tends to be higher. And modern programs typically use hundreds of libraries, so even if your own code is immaculate, not all of your dependencies will be.
But there's other reasons, too:
Python, Java etc. run their interpreter or JIT compiler while the program is running.
CLI tools start orders of magnitude slower, because these languages have to launch a runtime before any of the CLI logic executes.
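You can get a rough feel for that runtime launch cost by timing a do-nothing child interpreter. This is just a measurement sketch; the number varies a lot by machine and disk cache state.

```python
# Time how long it takes just to start a Python process that does
# nothing. This measures the runtime launch overhead, not any logic.
import subprocess
import sys
import time

start = time.perf_counter()
proc = subprocess.run([sys.executable, "-c", "pass"])
elapsed = time.perf_counter() - start

assert proc.returncode == 0
print(f"interpreter startup: {elapsed * 1000:.1f} ms")
```

A C binary doing the same (nothing) typically starts in a fraction of that time, which is the gap you feel on every CLI invocation.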
GUIs and simulations stutter, because these languages use garbage collection for memory management, and a collection pause can land right in the middle of a frame.
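For what it's worth, one common mitigation in Python is to keep the cyclic collector from firing inside a latency-sensitive section. A sketch (this doesn't remove reference-counting work, only the collector's unpredictable pauses):

```python
import gc

# Keep the cyclic garbage collector from pausing us mid-section,
# then re-enable it afterwards no matter what happened.
def critical_section():
    gc.disable()
    try:
        frames = [object() for _ in range(10_000)]  # stand-in for real work
        return len(frames)
    finally:
        gc.enable()

assert critical_section() == 10_000
assert gc.isenabled()
```

Game loops in GC'd languages often do something similar: allocate up front, then collect between frames rather than during them.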
And then just death by a thousand paper cuts. For example, when iterating over text, you can't tell it to just give you a view/pointer into the existing memory of the text. Instead, it copies each snippet of text you want to process into new memory.
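In Python, for example, slicing a bytes-like object copies, while a `memoryview` slice is a zero-copy view into the same buffer (plain `str` doesn't even offer a view type). A small sketch:

```python
# Slicing copies into new memory; a memoryview slice references the
# original buffer instead of copying it.
data = bytearray(b"hello world")

copy_slice = bytes(data[0:5])        # new memory, independent of `data`
view_slice = memoryview(data)[0:5]   # zero-copy view into `data`

data[0:5] = b"HELLO"                 # mutate the underlying buffer

assert copy_slice == b"hello"        # the copy never changed
assert bytes(view_slice) == b"HELLO" # the view sees the mutation
```

For `str` there is no such escape hatch, so text processing pays the copy on every snippet.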
And when working with multiple threads in Java, it is considered best practice to always clone the memory of basically anything you touch. Like, that's good code, and its performance will still be mediocre. Also, you'd better not think about using multiple threads in Python or JS. For those two, parallelism was an afterthought.
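A rough sketch of that defensive-copy practice, written in Python for brevity (the names are made up for illustration): the worker thread gets a deep copy, so the original can be mutated without racing the thread. Safe, but you pay for the clone on every hand-off.

```python
import copy
import threading

# Defensive copying: the worker only ever sees its own snapshot,
# so mutating `original` afterwards cannot race with it.
def worker(snapshot, results):
    results.append(sum(snapshot))

original = [1, 2, 3]
results = []
t = threading.Thread(target=worker, args=(copy.deepcopy(original), results))
t.start()
original.append(100)   # safe: the thread never sees this change
t.join()

assert results == [6]
```

In Java the same pattern shows up as copying collections or using immutable snapshots before publishing them to another thread.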
Well, and then all of the above feeds back into all the libraries not being performant. There's no chance to use the languages for performance-critical stuff, so no one bothers optimizing the libraries.
They do have optimizations, but they are interpreted at runtime, so they can only get so fast.
Frankly, you won't notice much unless the program is doing something computation-heavy, which shouldn't be done in languages such as JavaScript and Python anyway.
That's because it's not relevant. Speed can be compensated for either by caching or by outsourcing your load, if there's such a huge need to process large amounts of data quickly. In day-to-day work I can't say I have ever run into issues because code was executing slowly. For normal operation, Python is more than capable of keeping up.
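For example, memoizing a pure function with `functools.lru_cache` is about as cheap as caching gets in Python (toy example, obviously not a real workload):

```python
from functools import lru_cache

# Memoize an expensive pure function: repeat calls with the same
# argument are answered from the cache instead of being recomputed.
@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

assert fib(30) == 832040
assert fib.cache_info().hits > 0   # most recursive calls hit the cache
```

Without the cache this recursion is exponential; with it, each value is computed exactly once.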
On the other side of the coin you have memory management, buffer and stack overflows, and general issues almost exclusive to C, which you don't have to worry about as much with higher-level languages. Development with Python is simply faster and safer. We as developers have different tools, and we should use them for their appropriate purpose. You can drive nails with a rock as well, but you generally don't see carpenters doing this all day.
You can sometimes deal with performance issues by caching, if you want to trade one hard problem for another (cache invalidation). There are plenty of cases where that's not a solution, though. I recently had a 1ns time budget on a change. That kind of optimization is fun/impossible to do in Python and straightforward to accomplish in Rust or C/C++ once you've set up your measurements.
Especially since languages such as Python and JavaScript are really good at event programming, where an event triggers a function. Most of the CPU time is spent idling anyway.
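A minimal asyncio sketch of that event style (the event names are hypothetical): the coroutines spend nearly all their wall-clock time awaiting, i.e. idle, and the handler body runs only when its event completes.

```python
import asyncio

# Event-style programming: handlers await (idle) until their event
# fires, so the interpreter's raw speed barely matters here.
async def on_event(name):
    await asyncio.sleep(0.01)   # stand-in for waiting on I/O
    return f"handled {name}"

async def main():
    # Run both handlers concurrently; results come back in order.
    return await asyncio.gather(on_event("click"), on_event("request"))

results = asyncio.run(main())
assert results == ["handled click", "handled request"]
```

The two handlers overlap their waits, so the total time is one sleep, not two; that overlap is where these languages earn their keep.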
True, plus the bloated websites I see are shipping hundreds of thousands of lines of JavaScript. Why would you possibly need that much code? My full-fledged web games use under 10,000.
They aren't as fast as a native language, but they aren't all that slow if you aren't trying to use them for performance-sensitive applications. Modern machines run all of those very quickly, as CPUs are crazy fast.
Also, it seems weird to put Java/OpenJDK in that list, as it is in its own category in my experience.
Java is certainly the fastest of the bunch, but I still find it rather noticeable how long the startup of applications takes and how it always feels a bit laggy when used for graphical stuff.
Certainly possible to ignore that on a rational level, but that's why I'm talking about how it feels.
I'm guessing this has to do with the basic UX principle of giving the user feedback. If I click a button, I want feedback that my click was registered and when the triggered action completes. The sooner both happen, the more confident I feel about my input and the better everything feels.
It is always a question of choosing the right tool for the right task. My core code is in C (but probably better structured than most C++ programs), and it needs to be this way. But I also do a lot of stuff in Perl. When I have to generate source code or smart-edit a file, it is faster and easier to do it in Perl, especially if the execution time is so short that one would not notice a difference anyway.
Or the code that generates files for production: yes, a single run may take a minute (in the background), but it produces the files necessary for producing goods worth over 100k. And the run is still faster than the surrounding process: getting the request from production, calculating the necessary parameters, then wrapping all the necessary files with the results of the run into a reply to the production department.
As someone who only codes solutions for himself, I can't relate. All the extra time I would spend writing a C solution would never be made up for by the runtime savings over doing it in, say, Python.
I used to write extensively with C++, but it has been a long time since speed mattered that much to one of my applications. I still marvel at the cache-level optimizations some people come up with, but I'm in the same mindset as you now.
My workload split of Data Movement vs Data Transformation is like 95:5 these days, which means almost all the optimizations I do are changing batch/cache/page/filter settings. I can do that in any language with http bindings, so I choose those that are faster to write.
I used C++ in college, and I think it's useful to know C because so much relies on it. That said, if I'm going to do something that needs performance, I'll look to Go first, then Rust if Go isn't a good fit, but that's mostly because I know Go better. Both are excellent languages.
If I just need something functional quickly and easily, I'll turn to Python. If I need a network service quickly, Node.js is great.
I use C++ whenever possible because I like classes and objects and having more versatility to make more dynamic programs. I made an entire kernel that way one time, because fuck the police.
Haha, I love it. C++ is definitely super useful. I never got that deep into it, but I've certainly benefited from many things written in C++. I wrote small things and had to debug it on occasion just to get something working; it usually ended up being a compiler flag I had to set. I went into web and network stuff after college. Perl was my go-to back then, but I'm loving these newer languages and the thought put into some of them. For example, the structs, interfaces, and type system in Go could probably replicate a lot of what you would use classes and objects for.
I was a huge C++ fan back when I was doing a bunch of competitive programming. If I need a performant project nowadays, I look to Go first. It gives me the speed of a compiled language with the usability of a high-level language. I still solve the occasional Advent of Code in C++, though :)
I decided to run it on a system without bash as a neofetch replacement. It feels unreasonably complex; I spent an unreasonable amount of time fighting with CMake. On a normal system I have pfetch in my bashrc, since it's basically instant, and neofetch for screenshots. That one's not instant, but I don't run it every second or anything.
And btw, I do love the speed and simplicity of C, but for a fetch tool, POSIX shell is the best choice, I think.
Why does speed even matter this much for a program most people only run once to show off their new builds? Or do these programs have some other purpose than printing system specs?