All three Linux distros managed to beat Windows 11 while using Valve's Proton compatibility layer.
Arch and Other Linux Operating Systems Beat Windows 11 in Gaming Benchmarks: ComputerBase benchmarked three different Linux operating systems and found that all three can achieve better gaming performance than Windows 11.
I don't understand why new Linux folks immediately go for Arch-based distros and insist on using Nvidia GPUs. Like, are you guys into suffering or something?
I used EndeavourOS (basically pure Arch with a GUI installer) and I've had zero issues with Nvidia GPUs; in fact, it was a smoother experience than anything else.
Yeah that's a bummer, but in Windows that's still a good card for gaming/CUDA. Nvidia, unfortunately, is a lot like Apple. They do have some neat tech, but they lock it behind both price and exclusivity. That's great for C-Suite pockets, but very anti-consumer at its core.
When I started using Linux circa 2008, Nvidia was the way. ATI/AMD would never work as well. Fast forward and I still use Nvidia because of CUDA cores and DaVinci Resolve for video editing. I've just been in the Nvidia card game for a long time. I still have no problems with it.
As for the Arch base, I started with that in 2015. Just found it more flexible, and the AUR is awesome. So much more software that I could not get on a Debian-based system.
I never understood what you people do with your machines; through the years I must have used at least half a dozen Nvidia GPUs and never had any real issues.
Of course, early on you had to compile your drivers into the kernel yourself, but then I'm not even sure ATI had drivers at the time. And that's how you configured the kernel anyway.
I think a lot of users kinda jump in the deep end, which is fine, but expect their experience to be flawless. Then when it inevitably isn't they get upset and disheartened. I get that.
Why I eventually settled on Ubuntu: I did Red Hat 5 in the 90s, built a Linux From Scratch system, and daily drove Gentoo for a number of years. Got sick of solving NP-complete problems in Gentoo package management. Combine that with lots of documentation saying "this is how it works in Ubuntu, and everywhere else you'll have to figure it out for yourself". I don't have time for that shit.
Hell, it's more straightforward to get TensorFlow working with Nvidia GPUs on Ubuntu than it is on Windows. Nobody uses TensorFlow on pure Windows; you want to use WSL. To do that, you have to set up a passthrough layer to give WSL direct access to the GPU. There have been like three ways to do that over the years, and if you hit the wrong instructions on Google, you'll have to back out everything you did and make sure you start again clean. Which might mean a full reinstall. On Linux, you install the Nvidia drivers, install TensorFlow with the GPU flags, and you're done.
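For what it's worth, here's a minimal sanity check I run after that last step, assuming a CUDA-enabled TensorFlow build is installed (e.g. the `tensorflow[and-cuda]` pip extra on newer releases; the exact package name depends on your version):

```python
# Quick check that TensorFlow actually sees the Nvidia GPU after
# installing the driver and a CUDA-enabled TensorFlow build.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
if gpus:
    print(f"TensorFlow sees {len(gpus)} GPU(s): {gpus}")
else:
    print("No GPU visible -- check nvidia-smi and your TensorFlow install.")
```

If that prints an empty list on bare-metal Linux, it's usually just a driver or CUDA library problem, not the WSL passthrough mess.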
The reason is very simple: Arch has tons of software, all available in its repositories (and if you need more, you can check the AUR).
The other reason is flexing on other users.
As for NVIDIA, it's not that hard, especially if you know what you're doing. If you're a newbie, you can use Garuda Linux and it will detect and install the NVIDIA driver for you.
Nvidia has a pretty strong hold on the gaming market, and especially did during the 980/1080 generations. I've also seen a ton of non-media people insist on Nvidia cuz of ShadowPlay.
This is speculation, but I don't think most new Linux users are building computers specifically for Linux. They're letting their computers age, then considering Linux when they see the cost of the new generation of Nvidia hardware.
Let's not overestimate the general public's knowledge of hardware compatibility and operating systems in general. I think they conceptualize it like replacing a brick in a Lego wall. They'd have no reason to suspect it wouldn't work.
I moved from a GTX 1060 to an RX 580 and it has been terrible: recording in OBS is horrible to the point that the CPU yields better results, and now a recent kernel version broke the power meter on all the Polaris GPUs.
Damn, sorry to hear that; my experience with the 480 was really good. Admittedly, AMD hadn't quite caught up on hardware video encoding back when that card was designed (the 580 is basically a reskinned 480). Hardware video encoding has gotten drastically better on AMD cards since then.