I've wanted to buy an upgrade to my RX580 for years now, but I'd really like AV1 encoding support. With OBS finally supporting AV1 on all platforms (?), this actually makes sense. But I'm once again reminded how bad the used market for GPUs is in my country atm, so I'll wait for a while longer.
Got a 6700XT second hand about a year ago when the price finally came down from astronomically ridiculous crypto-bubble crazy to almost reasonable. Just looked, and they're still going for the same price. Thought this would have dropped a bit by now, but I guess not.
Yes, I've also had an eye on the 6700XT, but I made the bad decision to wait for the new gen and, hopefully, a price drop on older GPUs. The stable used prices are probably down to people who bought at exorbitant prices and don't want to sell their GPU for nothing, combined with the new gen having the same price-to-performance ratio.
Now that the 7600XT has 16GB of VRAM, I thought about buying one, until I noticed it only has a PCIe 4.0 x8 interface. In my PCIe 3.0 x16 slot it would fall back to 3.0 x8, half the slot's bandwidth. It's a B350 board I want to upgrade with a 5800X3D and use for years to come, so I'm basically forced to go with either a 7700XT or an older 6700XT.
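As a quick sanity check on the "half the bandwidth" claim, here's a back-of-the-envelope sketch; the per-lane rates are the usual published approximations for 128b/130b-encoded links, not measured numbers:

```python
# Approximate usable PCIe bandwidth per lane, per direction, in GB/s.
# PCIe 3.0: 8 GT/s, PCIe 4.0: 16 GT/s, both with 128b/130b encoding.
GBPS_PER_LANE = {"3.0": 8 * 128 / 130 / 8, "4.0": 16 * 128 / 130 / 8}

def bandwidth(gen: str, lanes: int) -> float:
    """Total one-direction bandwidth for a link of the given generation and width."""
    return GBPS_PER_LANE[gen] * lanes

# A PCIe 4.0 x8 card in a PCIe 3.0 slot negotiates a 3.0 x8 link:
print(f"PCIe 3.0 x16: {bandwidth('3.0', 16):.1f} GB/s")  # ~15.8 GB/s
print(f"PCIe 3.0 x8:  {bandwidth('3.0', 8):.1f} GB/s")   # ~7.9 GB/s (half)
print(f"PCIe 4.0 x8:  {bandwidth('4.0', 8):.1f} GB/s")   # ~15.8 GB/s, but only on a 4.0 board
```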
Anyway, waiting years for a new gen isn't an option either, so I'll stay frustrated for a while longer.
The cross-platform OBS software that is popular with game streamers and others live-recording their desktops has finally landed support for AV1 video encoding using Linux's Video Acceleration API (VA-API) interface.
A merge request to add AV1 support for VA-API to the OBS FFmpeg code was opened last May.
As of Tuesday evening, that code has been merged.
The code has been successfully tested for VA-API AV1 encoding using the Mesa drivers.
With open-source Mesa driver support, VA-API AV1 encoding is available on AMD Radeon RX 7000 series graphics and Intel Arc Graphics.
It's unfortunate that it has taken until 2024 to get this code merged, but it's nevertheless exciting for the next OBS feature release.
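If you want to check whether your own stack already exposes AV1 encode before the OBS release lands, here's a minimal sketch, assuming the `vainfo` tool from libva-utils is installed; the render node path and the entrypoint string matching are assumptions that depend on your GPU and Mesa version:

```python
# Sketch: shell out to vainfo (libva-utils) and look for an AV1 encode
# entrypoint. Encode entrypoints print as e.g. "VAEntrypointEncSlice" or
# "VAEntrypointEncSliceLP"; availability depends on GPU and Mesa version.
import subprocess

def has_av1_encode(device: str = "/dev/dri/renderD128") -> bool:
    out = subprocess.run(
        ["vainfo", "--display", "drm", "--device", device],
        capture_output=True, text=True, check=False,
    ).stdout
    return any(
        "VAProfileAV1" in line and "Enc" in line
        for line in out.splitlines()
    )

if __name__ == "__main__":
    print("AV1 encode available:", has_av1_encode())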
Your GPU has a dedicated ASIC that can do the encoding in parallel with rendering. On NVIDIA (not relevant in this case) that would be your NVENC encoder.
AMD and Intel have their own ASIC IP blocks for encode/decode that are part of the GPU "SoC" but don't consume GPU compute resources (e.g. CUs). That's how people already use GPU encode in OBS (with non-AV1 codecs) while gaming, and it's how people like me use Sunshine/Parsec on the host PC for "remote" gaming (mostly for remoting into a Windows machine for the one game that cannot be run on Linux or in a VM due to anti-cheat). The only GPU resources you're using are PCIe bandwidth and perhaps some VRAM. So I wouldn't call it just dumping the work from the CPU onto the GPU: there's an ASIC that takes the brunt of the workload. AV1 with Sunshine has been amazing, and using it for recording my gameplay vids will hopefully be better than H.264 too, thanks to lower bitrates and hence smaller file sizes.
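To make the offload concrete, here's a hedged sketch of handing an AV1 encode to that fixed-function block via FFmpeg's av1_vaapi encoder; the file names and render node path are illustrative placeholders, and it needs an FFmpeg build with VA-API support:

```python
# Sketch: offload AV1 encoding to the GPU's media ASIC via FFmpeg's
# av1_vaapi encoder. Paths and file names are placeholders for illustration.
import subprocess

cmd = [
    "ffmpeg",
    "-vaapi_device", "/dev/dri/renderD128",  # the GPU's DRM render node
    "-i", "gameplay.mkv",                    # hypothetical input recording
    "-vf", "format=nv12,hwupload",           # convert and upload frames to GPU memory
    "-c:v", "av1_vaapi",                     # AV1 encode on the fixed-function block
    "-b:v", "8M",                            # AV1 holds up well at modest bitrates
    "gameplay_av1.mkv",
]
subprocess.run(cmd, check=True)
```

The shader cores stay free for the game; the main extra traffic is frames crossing PCIe into the encoder, which matches the "PCIe bandwidth and a bit of VRAM" point above.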