Yeah punk. Living that 5700G life. Rocking a 3700U as a daily driver, what? Yeah, I have a fast graphics card, it's called a 780M, you might have heard of it; it can barely play Baldur's Gate 3, but it counts as playable. Gaming on integrated graphics is cool because it's a lower-TDP option for the environment. Fight me. I'm over here playing 2017's greatest offerings at nearly 60fps 720p and I'm enjoying myself. Oh, Cult of the Lamb? Yes, hahaha, I get over 40 fps in Cult of the Lamb. The only other gamers who can @ me are Switch and Xbox Series S gangsters. Mortal Kombat 9? Ah, "the good one" of course.
My setup (R5 3600 and GTX 1660 Super) can play Cities: Skylines 2 just fine on Windows, but it runs like shit under Linux. I guess this is because of the shared GPU memory.
We really don't talk enough about how the worst-rated game of the Tomb Raider reboot, from the series' B studio, ended up being the default gaming benchmark for the better part of a decade.
Good for Eidos Montréal. Guardians of the Galaxy deserved better, too.
I think this tracks. Last time I checked, it had eerily similar performance at 1080p to a GTX 1080 at 1440p (same settings otherwise), at least in games that don't need more than 4GB of VRAM, like Assassin's Creed Origins.
Yeah, I'm at 1080p and have usually had no issues with the games I've wanted to play, from Might and Magic Book One (1986) to Monster Hunter World/Iceborne. But I'm very selective with the games I play—I usually don't tolerate bugs, or unnecessarily resource-intensive games that would have needed far less hardware for the same result if more care had been taken.