Nvidia became the world's most valuable company on Tuesday, with its market cap now at $3.34 trillion, surpassing Microsoft and Apple. The US chipmaker...
To illustrate your point: my old GPU, a GTX 1080 from 2016 (basically ancient history; Obama was still president back then), remains very useful for ML applications today, and it isn't even Nvidia's oldest card that's still relevant for AI. The card was never meant for this, but because Nvidia invested in CUDA, and CUDA turned out to be useful for all sorts of non-gaming applications, the API became the natural first choice when ML tools that run on consumer hardware started being developed.
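That first-choice status shows up even at the user level: most ML stacks simply probe for a CUDA device and fall back to the CPU otherwise. A minimal sketch of such a probe, assuming the `nvidia-smi` tool that ships with Nvidia's driver is on the PATH (the `cuda_gpus` helper name is my own):

```python
import shutil
import subprocess

def cuda_gpus():
    """Return the GPU names reported by nvidia-smi, or [] if no driver is found."""
    if shutil.which("nvidia-smi") is None:
        return []  # no Nvidia driver installed; an ML tool would fall back to CPU here
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    return [line.strip() for line in out.stdout.splitlines() if line.strip()]

# On a machine with a GTX 1080 this would print something like ['GeForce GTX 1080'].
print(cuda_gpus())
```

Frameworks do the equivalent through the CUDA runtime API rather than shelling out, but the fallback logic is the same.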
My current GPU, an RTX 2080, is just two years younger and yet it's so powerful (for everything I throw at it, including ML) that I won't have to upgrade it for years to come.
The same hardware that makes RTX work (the tensor cores) is what accelerates a lot of AI tasks. I'd argue the 1080 would be bordering on irrelevant if its 8 GB of VRAM weren't saving it. The 2060 should be much faster for ML despite being roughly on par for gaming.
I mean, one of the core ideas behind these things is that they're highly capable devices that keep receiving updates for several times as long as typical tech, so you can just keep using them for ages.
Apart from the very latest codecs, what else should they do that they aren't already doing?
I believe it's missing H.265 and AV1 hardware decoding, and while it probably has enough performance to handle those codecs in software, I wasn't willing to drop more than 100 euros on a five-year-old device without hardware decoding for them.
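For what it's worth, you can check which decoders a given machine's ffmpeg build exposes from the command line. A hedged Python sketch (the `has_decoder` helper is hypothetical; it only reports whether a decoder is compiled in, and the presence of the Nvidia-backed `hevc_cuvid`/`av1_cuvid` variants is a rough proxy for NVDEC hardware decode):

```python
import shutil
import subprocess

def has_decoder(name):
    """True/False if ffmpeg lists the named decoder; None if ffmpeg isn't installed."""
    if shutil.which("ffmpeg") is None:
        return None  # can't tell without ffmpeg on the PATH
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-decoders"],
        capture_output=True, text=True,
    )
    # Each decoder line looks like " V....D hevc   HEVC ...": flags, name, description.
    return any(line.split()[1:2] == [name] for line in out.stdout.splitlines())

for codec in ("hevc", "av1", "hevc_cuvid", "av1_cuvid"):
    print(codec, has_decoder(codec))
```

A software `hevc` decoder being listed doesn't mean playback is smooth, of course; on a weak box the hardware variants are the ones that matter.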