People did care, which is why people who played games competitively continued to use CRT monitors well into the crappy LCD days.
Heck, some people still use CRTs. There's not much wrong with them other than being big, heavy, unable to display 4k, and typically only 4:3.
I remember CRTs being washed out, heavy, power hungry, loud, hot, susceptible to burn-in and magnetic fields... The screen has to have a curve, so over ~16" you get weird distortions. You needed a real heavy and sturdy desk to keep them from wobbling. Someone is romanticizing an era that no one liked. I remember the LCD adoption being very quick and near universal as far as tech advancements go.
Can someone please explain why CRT is 0 blur and 0 latency when it literally draws each pixel one by one using an electron beam scanning across the screen line by line?
Hell, modern displays are just now starting to catch up to CRTs in the input lag and motion blur department.
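To answer the question above: latency and blur come from different mechanisms. The beam does scan line by line, but each phosphor dot lights up within microseconds of the signal arriving and then decays almost immediately (an impulse display), while an LCD holds every pixel lit for the entire frame (sample-and-hold), smearing moving objects across your retina as your eyes track them. A back-of-envelope sketch, with illustrative numbers for phosphor persistence:

```python
# Rough comparison of perceived motion blur on an impulse display (CRT)
# vs a sample-and-hold display (LCD). The ~1 ms phosphor persistence
# and the object speed are illustrative numbers, not measurements.

def blur_px(speed_px_per_s: float, persistence_s: float) -> float:
    """Pixels an object smears across while the image stays lit on screen."""
    return speed_px_per_s * persistence_s

speed = 960.0  # object moving 960 px/s across the screen

# CRT: each refresh, the phosphor glows for roughly a millisecond or less
crt_blur = blur_px(speed, 0.001)

# 60 Hz LCD: each frame is held lit for the full 1/60 s
lcd_blur = blur_px(speed, 1 / 60)

print(f"CRT smear: ~{crt_blur:.1f} px")  # ~1.0 px
print(f"LCD smear: ~{lcd_blur:.1f} px")  # ~16.0 px
```

That's why a 60Hz CRT looks sharper in motion than a 60Hz LCD despite drawing the exact same number of frames.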
It was brutal putting up with these shitty LCDs for two whole decades, especially being stuck at 60Hz and sub-1080p resolutions when my CRT was displaying a 1600x1200 picture at 85Hz in the 90s! It wasn't until I got a 4K 120Hz OLED with VRR and HDR a couple of years ago that I finally stopped missing CRTs, because I finally felt like I had something superior.
Twenty fucking years of waiting for something to surpass the good old CRT. Unbelievable.
CRTs perfectly demonstrate engineering versus design. All of their technical features are nearly ideal - but they're heavy as shit, turn a kilowatt straight into heat, and take an enormous footprint for a tiny window. I am typing this on a 55" display that's probably too close. My first PC had a 15" monitor that was about 19" across, and I thought the square-ass 24" TV in the living room was enormous. They only felt big because they stuck out three feet from the nearest wall!
I think most people who were gaming held onto their CRTs as long as possible. The main reason being, the first generation of LCD panels took an analogue RGB input and had to sample that signal onto a fixed digital pixel grid. They were generally ONLY 60hz, and you often had to reset their settings when you changed resolution. Even then, the picture was generally worse than a comparable, good-quality CRT.
People upgraded mainly because of the reduced space usage and that they looked aesthetically better. Where I worked, we only had an LCD panel on the reception desk, for example. Everyone else kept using CRTs for some years.
CRTs on the other hand often had much better refresh rates available, especially at lower resolutions. This is why it was very common for competitive FPS players to use resolutions like 800x600 when their monitor supported up to 1280x960 or similar. The 800x600 resolution would often allow 120 or 150hz refresh.
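The refresh ceiling falls straight out of the monitor's horizontal scan rate: the tube can only draw a fixed number of scanlines per second, so max refresh ≈ horizontal scan rate / total lines per frame (visible lines plus vertical blanking). A rough sketch, assuming a hypothetical ~96 kHz monitor and ~4% blanking overhead:

```python
# Why dropping the resolution unlocked higher refresh rates on a CRT:
# fewer scanlines per frame means more frames per second at the same
# horizontal scan rate. The 96 kHz scan rate and 4% vertical blanking
# overhead below are illustrative assumptions.

def max_refresh_hz(h_scan_khz: float, visible_lines: int,
                   blanking_frac: float = 0.04) -> float:
    total_lines = visible_lines * (1 + blanking_frac)
    return h_scan_khz * 1000 / total_lines

for width, height in [(800, 600), (1024, 768), (1280, 960)]:
    print(f"{width}x{height}: ~{max_refresh_hz(96, height):.0f} Hz max")
# 800x600: ~154 Hz max
# 1024x768: ~120 Hz max
# 1280x960: ~96 Hz max
```

Which lines up with why the 800x600 crowd could push 120-150hz while the same monitor topped out near 60-100hz at its native-ish resolutions.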
When LCD screens with a fully digital interface became common, even though they were pretty much all 60hz locked, they started to offer higher resolutions and in general comparable or better picture quality in a smaller form factor. So people moved over to the LCD screens.
Fast-forward to today, and now we have LCD (LED/OLED/Whatever) screens that are capable of 120/144/240/360/Whatever refresh rates. And all the age-old discussions about our eyes/brain not being able to use more than x refresh rate have resurfaced.
I had a 20-odd inch CRT with the flat tube. Best CRT I ever had, last one I had before going to LCD. Still miss that thing, the picture was great! Weighed a ton, though.