A prominent open-source dev publishes their findings as to what's going on with Starfield's performance, and it's pretty darn strange.
According to Hans-Kristian Arntzen, a prominent open-source developer working on Vkd3d, a DirectX 12 to Vulkan translation layer, Starfield is not interacting properly with graphics card drivers.
I'm inclined to believe this, and it's likely not even the whole story. I've been playing on a Series X, but decided to check it out on my ROG Ally. On Low at 720p with FSR 2 on, I'd get 25-30 fps somewhere like New Atlantis. I downloaded a tweaked .ini for the Ultra preset, and now not only does the game look much better, but the city is up closer to 40 fps, with most other areas at 45-60+. It makes me wonder what they thought justified the massive performance cost of the default settings, given there's no real visual improvement.
Another odd thing: if I'm playing Cyberpunk or something, this thing sits in the 90%+ CPU and GPU utilization range, with temps at 90 °C and up. Starfield? The GPU reads around 99%, the CPU sits around 30%, and the temp stays at or below 70 °C, which basically never happens in any other "AAA" game. I could buy Todd's comments if the hardware were genuinely maxed out, but a GPU reporting 99% while running that cool, and a CPU barely touched, on a handheld with an APU, suggests something less simple is going on.
I'm hoping the work from Hans finds its way to all platforms (in one way or another), because I'd love to use the Series X, but 30 fps with weird HDR on a 120 Hz OLED TV actually makes me a little nauseous after playing for a while, which isn't something I commonly have a problem with.
In my experience on the Steam Deck, it doesn't matter whether I run low or medium graphics (with some settings on high); the performance is almost the same.
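For anyone wanting to check these utilization and temperature claims themselves on a Linux handheld like the Steam Deck: MangoHud is the usual overlay, but if you want raw numbers you can sample them from /proc and /sys. A rough sketch (the hwmon paths and sensor labels vary per device, so treat those as assumptions to adjust):

```shell
# Sample CPU busy % over a 1-second window using the aggregate "cpu" line
# of /proc/stat (fields: user, nice, system, idle, ...).
read -r _ u n s idle rest < /proc/stat
sleep 1
read -r _ u2 n2 s2 idle2 rest2 < /proc/stat
busy=$(( (u2 + n2 + s2) - (u + n + s) ))
total=$(( busy + (idle2 - idle) ))
echo "CPU busy: $(( 100 * busy / total ))%"

# Dump whatever temperature sensors the kernel exposes (paths differ
# per device; entries that don't exist are simply skipped).
for t in /sys/class/hwmon/hwmon*/temp1_input; do
  [ -r "$t" ] && echo "$t: $(( $(cat "$t") / 1000 )) C"
done
```

Running something like this while the game is in the foreground makes it easy to see whether a "99% GPU" reading lines up with actual heat output, or whether the hardware is mostly idling.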