Anon fixes their games
Step 1. Turn on ray tracing
Step 2. Check some forum or protondb and discover that the ray tracing/DX12 is garbage and gets like 10 frames
Step 3. Switch back to DX11, disable ray tracing
Step 4. Play the game
I don't even check anymore lol.
True, I've had very few games worth the fps hit
Best use of ray tracing I've seen is to make old games look good, like Quake II or Portal or Minecraft. Newer games are "I see the reflection in the puddle just under the car when I put them side by side" and I just can't bring myself to care.
If I know a game I'm about to play runs on Unreal Engine, I'm passing a -dx11 flag immediately. It removes a lot of useless Unreal features like Nanite
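For reference, on Steam that flag goes in the game's Launch Options (right-click the game → Properties → Launch Options). Exact flag support varies by title; some UE5 games only ship a DX12 path or ignore the flag entirely, so treat this as a sketch rather than a guarantee:

```
-dx11
```

Many Unreal titles also accept `-d3d11` as a synonym, and the equivalent `-dx12`/`-d3d12` flags force the other direction.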
Then you get to enjoy the worst LODs known to man, because they were only made as a fallback
Nanite doesn't affect any of the post-processing stuff nor the smeary look. I don't like that games rely on it, but modern UE5 games author their assets for Nanite. All it affects is model quality and LODs.
Lumen and other real time GI stuff is what forces them to use temporal anti aliasing and other blurring effects, that's where the slop is.
what's wrong with nanite?
The slideshow Control experience does look stellar for a bit
Control and Doom Eternal are the only exceptions to this rule I've played, but they are very much the exception.
Out of all of these, motion blur is the worst, but second to that is Temporal Anti Aliasing. No, I don't need my game to look blurry with every trailing edge leaving a smear.
TAA is kind of the foundation that almost all real time frame upscaling and frame generation (EDIT: I originally wrote "raytracing" here) are built on, and built off of.
This is why it is increasingly difficult to find a newer, high fidelity game that even allows you to actually turn it off.
If you could, all the subsequent magic bullshit stops working, all the hardware in your GPU designed to do that stuff is now basically useless.
EDIT: I goofed, but the conversation thus far seems to have proceeded assuming I meant what I actually meant.
Realtime raytracing is not per se foundationally reliant on TAA; DLSS and FSR frame upscaling and later framegen tech, however, basically are. They evolved out of TAA.
However, without the framegen frame rate gains enabled by modern frame upscaling... realtime raytracing would be too 'expensive' to implement on all but fairly high end cards / your average console, without serious frame rate drops.
Before realtime raytracing, the paradigm was that all scenes would have static light maps and light environments baked into the map, with a fairly small number of dynamic light sources and shadows.
With Realtime raytracing... basically everything is now dynamic lights.
That tanks your frame rate, so Nvidia barrelled ahead with frame upscaling and later frame generation to compensate for the framerate loss they introduced with realtime raytracing. And because they're an effective monopoly, AMD followed along, as did basically all major game developers and many major game engines (UE5, to name a really big one).
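To make the TAA lineage concrete, here's a minimal, hypothetical sketch of temporal accumulation, the mechanism TAA and its upscaler descendants share: each pixel blends the current frame with a history buffer. On static content this averages away noise; when a pixel's content suddenly changes, stale history lingers as a visible trail.

```python
def taa_accumulate(history, current, alpha=0.1):
    # Exponential blend: keep 90% of the history buffer each frame.
    # Real TAA also reprojects history with motion vectors and clamps
    # it against the current frame's neighborhood; omitting both is
    # exactly what produces ghosting and smearing.
    return alpha * current + (1 - alpha) * history

# A pixel that was bright (1.0) suddenly goes dark (0.0):
h = 1.0
for _ in range(5):
    h = taa_accumulate(h, 0.0)
print(round(h, 3))  # 0.59 -- still over half-bright after 5 frames
```

That leftover 0.59 is the "trailing edge smear" people complain about; upscalers and framegen reuse the same history machinery, which is why they inherit the artifacts.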
Honestly motion blur done well works really well. Cyberpunk for example does it really well on the low setting.
Most games just don't do it well tho 💀
Has the person who invented the depth of field effect for a video game ever even PLAYED a game before?
I mean, it works in... hmmm... RPGs, maybe?
When I was a kid there was an effect in FF8 where the background blurred out in Balamb Garden and it made the place feel bigger. A 2D painted background blur, haha.
Then someone was like, let's do that in the twenty-first century and ruined everything. When you've got draw distance, why blur?
Yes, it makes sense in a game where the designer already knows where the important action is and controls the camera to focus on it. It however does not work in a game where the action could be anywhere and camera doesn't necessarily focus on it.
It works for the WiiU games where Nintendo used it for tilt shifts. That's pretty much it
Well, not exactly, but they were described to him once by an elderly man with severe cataracts and that was deemed more than sufficient by corporate.
it works great for games that have little to no combat, or combat that's mostly melee and up to like 3v1. or if it's a very slight DOF that just gently blurs things far away
idk what deranged individual plays FPS games with heavy DOF though
Yeah, especially games with any amount of sniping. Instantly crippling yourself.
the problem with dilf is that you need to put the subject of your life in the middle
What is the depth of field option? When it's on what happens vs when it's off?
Side question, why the fuck does everything in IT reuse fucking names? Depth of field means how far from character it'll render the environment, right? So if the above option only has an on or off option then it is affecting something other than the actual depth of field, right? So why the fuck would the name of it be depth of fucking field??? I see this shit all the time as I learn more and more about software related shit.
No.
Depth of field is when background/foreground objects get blurred depending on where you're looking, to simulate eyes focusing on something.
You're thinking of draw distance, which is where objects far away aren't rendered. Or possibly level of detail (LoD) where distant objects will be changed to a lower detailed model as they get further away.
https://en.wikipedia.org/wiki/Depth_of_field
It's not "IT" naming, it's physics, probably a century or few old. That's what they're trying to emulate to make things look more photographic/cinematic.
Same with almost all the other options listed.
In this context it just refers to a post processing effect that blurs certain objects based on their distance to the camera. Honestly it is one of the less bad ones imo, as it can be well done and is sometimes necessary to pull off a certain look.
When it's on, whatever the playable character looks at will be in focus and everything else that is at different distances will be blurry, as it would be the case in real life if your eyes were the playable character's eyes. The problem is that the player's eyes are NOT the playable character's eyes. Players have the ability to look around elsewhere on the screen and the vast majority of them use it all the time in order to play the game. But with that stupid feature on everything is blurry and the only way to get them in focus is to move the playable character's view around along with it to get the game to focus on it. It just constantly feels like something is wrong with your eyes and you can't see shit.
Depth of field is basically how your characters eyes are unfocused on everything they aren't directly looking at.
If there are two boxes, 20 meters apart, one of them will be blurry, while aiming at the other.
Put your finger in front of your face. Focus on it. Background blurry? That's depth of field. Now look at the background and notice your finger get blurry.
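The finger experiment above is exactly what the effect models. A toy version of the thin-lens circle-of-confusion formula that DoF post passes approximate per pixel (function and parameter names are mine, distances in millimetres):

```python
def circle_of_confusion(focus_dist, subject_dist, focal_len=50.0, f_number=1.4):
    """Blur-spot diameter (mm) for a subject at subject_dist when the
    lens is focused at focus_dist. Zero means perfectly sharp."""
    aperture = focal_len / f_number  # physical aperture diameter
    return (aperture
            * (focal_len / (focus_dist - focal_len))
            * abs(subject_dist - focus_dist) / subject_dist)

# Focused on your finger at 0.3 m; a wall 3 m away blurs heavily:
print(round(circle_of_confusion(300, 3000), 2))  # 6.43 -- big blur spot
print(circle_of_confusion(300, 300))             # 0.0 -- in focus
```

Stopping down (raising the f-number) shrinks the blur, which is why the photography replies below about fast f/1.4 glass are talking about deliberately large blur.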
motion blur is essential for a proper feeling of speed.
most games don't need a proper feeling of speed.
Motion blur is guaranteed to give me motion sickness every time. Sometimes I forget to turn it off in a new game... About 30 minutes in I'll break into cold sweats and feel like I'm going to puke. I fucking hate that it's on by default in so many games.
Motion blur + low FOV is an instant headache.
It really should be a prompt at first start that asks a few questions.
The answers to those would automatically disable certain settings and features, or drop you into the settings.
It would be extra nice for a platform like PlayStation or Steam to remember those preferences and the game could read them (and display a message so you know it's doing it).
... What?
I mean... the alternative is to get hardware (including a monitor) capable of just running the game at an fps/hz above roughly 120 (ymmv), such that your actual eyes and brain do real motion blur.
Motion blur is a crutch to be able to simulate that from back when hardware was much less powerful and max resolutions and frame rates were much lower.
At higher resolutions, most motion blur algorithms are quite inefficient and eat your overall fps... so it would make more sense to just remove it, have higher fps, and experience actual motion blur from your eyes+brain.
My basis for the statement is BeamNG.drive: at 100 Hz, the feeling of speed is markedly different depending on whether motion blur is on. 120 may make a difference.
You still see doubled images instead of a smooth blur in your peripheral vision I think when you're focused on the car for example in a racing game.
yeah the only time I liked it was in need for speed when they added nitro boost. the rest of the options have their uses imo I don't hate them.
Now... in fairness...
Chromatic aberration and lens flares, whether you do or don't appreciate how they look (imo they arguably make sense in say CP77, as you have robot eyes)...
... they at least usually don't nuke your performance.
Motion blur, DoF and ray tracing almost always do.
Hairworks? Seems to be a complete roll of the dice between the specific game and your hardware.
I love it when the hair bugs out and covers the whole distance from 0 0 0 to 23944 39393 39
Motion blur and depth of field have almost no impact on performance. Same with anisotropic filtering, and I cannot understand why AF isn't always just defaulted to max, since even back in the golden age of gaming it had no real performance impact on any system.
You either haven't been playing PC games very long, or aren't that old, or have only ever played on fairly high end hardware.
Anisotropic filtering?
Yes, that... hasn't been challenging for the affordable PC an average person has to run at 8x or 16x for about a decade. It doesn't cause much framerate drop-off at all now, and it didn't cost too much until you go all the way back to the mid 90s to early 2000s, when 'GPUs' were fairly uncommon.
But that just isn't true for motion blur and DoF, especially going back further than 10 years.
Even right now, running CP77 on my Steam Deck, AF level has basically no impact on my framerate, whereas motion blur and DoF do have a noticeable impact.
Go back even further, and a whole lot of motion blur/DoF algorithms were very poorly implemented by a lot of games. Nowadays we pretty much get the versions of those that were not ruinously inefficient.
Try running something like Arma 2 on a mid or low range PC with motion blur on vs off. You could get maybe 5 to 10 more fps with it off... and that's a big deal when you're maxing out at 30 to 40ish fps.
(Of course now we also get ghosting and smearing from framegen algos that ironically somewhat resemble some forms of motion blur.)
If only I could just turn off the chromatic aberration in my eyeglasses.
You can get ones with less chromatic aberration, but it'll cost you.
What Anti Aliasing does your glasses use?
I'm on the -4.25 setting but I may be due for a new prescription as newer reality is getting blurry again.
You want to ask your optician for spectacles with glass lenses and they have to be made by Carl Zeiss AG, Jena, Germany.
I’d add Denuvo to that list. Easily a 10-20% impact.
Unfortunately that's not a setting most of us can just disable.
/c/crackwatch@lemmy.dbzer0.com sure you can
Depth of field and chromatic aberration are pretty cool if done right.
Depth of field is a really important framing tool for photography and film. The same applies to games in that sense. If you have cinematics/cutscenes in your games, they prob utilize depth of field in some sense. Action and dialogue scenes usually emphasize the characters, in which a narrow depth of field can be used to put focus towards just the characters. Meanwhile things like discovering a new region puts emphasis on the landscape, meaning they can use a large depth of field (no background blur essentially)
Chromatic aberration is cool if done right. It gives things a slightly out-of-place feel, which makes sense in certain games and not so much in others. Signalis and Dredge are a few games where chromatic aberration adds to the art style imo. Though obviously if it hurts your eyes, they play just as fine without it on.
I feel like depth of field and motion blur have their place, yeah. I worked on a horror game one time, and we used a dynamic depth of field- anything you were looking at was in focus, but things nearer/farther than that were slightly blurred out, and when you moved where you were looking, it would take a moment (less than half a second) to 'refocus' if it was a different distance from the previous thing. Combined with light motion blur, it created a very subtle effect that ratcheted up anxiety when poking around. When combined with objects in the game being capable of casting non-euclidean shadows for things you aren't looking at, it created a very pervasive unsettling feeling.
Chromatic aberration is also one of the few effects that actually happens with our eyes instead of being an effect designed to replicate a camera sensor.
Except I hate not being able to see my entire field of view clearly, why did we fight so hard for graphics only to blur that shit out past 50 feet?
And film grain. Get that fake static out of here
Most "film grain" is just additive noise akin to digital camera noise. I've modded a bunch of games for HDR (RenoDX creator) and I strip it from almost every game because it's unbearable. I have a custom film grain that mimics real film; at low levels it's imperceptible and acts as a dithering tool to improve gradients (removing banding). For some games that emulate a film look, the (proper) film grain lends to the look.
Agreed. It fits very well in very specific places, but when not there, it’s just noise
These settings can be good, but are often overdone. See bloom in the late 2000s/early 2010s.
Also the ubiquitous "realistic" brown filter a la Far Cry 2 and GTA IV. Which was often combined with excessive bloom to absolutely destroy the player's eyes.
At least in Far Cry 2 you are suffering from malaria.
Yeah, chromatic aberration when done properly is great for emulating certain cameras and art styles. Bloom is designed to make things look even brighter and it's great if you don't go nuts with it. Lens flares are mid but can also be used for some camera stuff. Motion blur is generally not great but that's mainly because almost every implementation of it for games is bad.
I always hated bloom, probably because it was overused. As a light touch it can work, but that is rarely how devs used it.
It's usually better in modern games. In the 2005-2015 era it was often extremely overdone, actually often reducing the perceived dynamic range instead of increasing it IMO.
All those features sucked when they first came out, not just bloom.
I don't mind a bit of lens flare, and I like depth of field in dialog interactions. But motion blur and chromatic aberration can fuck right off.
I mind lens flare a lot because I am not playing as a camera and real eyes don't get lens flares.
That's fair. I usually turn it off for FPS games. But if it's mild, I leave it on for third person games where I am playing as a camera.
I mean, lens flare does happen in the eye, just much less dramatically because there's only the one lens and everything is round. But "glare" like how the rest of your sight gets washed out because the sun is in your field of view is a manifestation of lens flare. The eyelashes can also produce some weird light artifacts that resemble camera lens flares but it's a different phenomenon.
Same same
Don't forget TAA!
Worst fucking AA ever created and it blows my mind when it's the default in a game.
Shadows: Off
Polygons: Low
Idle Animation: Off
Draw distance: Low
Billboards instead of models for scenery items: On
Alt: F4
Launch: Balatro
I think my PC can run the C64 demake of Balatro in an emulator
Does your PC even have a dedicated GPU? At this point you might as well give up on PC gaming and buy a console.
674fps
Hating on hair quality is a new one for me. I can understand turning off ray tracing if you have a low-end GPU, but hair quality? It's been at least a decade since I've last heard people complaining that their GPU couldn't handle Hairworks. Does any game even still use it?
It could be a twelve year old capture.
Says 24 at the top
PS3-> everything is sepia filtered and bloomed until nearly unplayable.
I will say that a well executed motion blur is just a chef's kiss type deal, but it's hard to get right and easy to fuck up
Personally I use motion blur in every racing game I can but nothing else. It helps with the sense of speed and smoothness.
> PS3-> everything is sepia filtered and bloomed until nearly unplayable.
That's just games from that period. It's not exclusive to PS3.
Early HDR games were rough. I look back at Zelda Twilight Princess screenshots, and while I really like that game, I almost squint looking at it because it's so bloomed out.
But what about Bloom?
I feel like bloom depends on how intense it is, and if it makes sense to reasonably play the game.
Like, if it's the sun, yeah, bloom is OK.
If it's anything else? Pass.
The preference against DOF is fine. However, I’m looking at my f/0.95 and f/1.4 lenses and wondering why it’s kind of prized in photography for some genres and hated in games?
It is unnatural. Focus should follow where you are looking. Having it fixed to the mouse/center of the screen instead of what my eyes are doing feels so wrong to me.
I bet with good eye tracking it would feel different.
That makes sense, if you can’t dynamically control what is in focus then it’s taking a lot of control away from the player.
I can also see why a dev would want to use it for a fixed angle cutscene to create subject separation and pull attention in the scene though.
Different mediums. Different perception. Games are a different kind of immersion.
I always turn that shit off. Especially bad when it's a first-person game, as if your eyes were a camera.
Chromatic aberration and Motion blur are the absolute most important to turn off right away for me, but DoF is a close second. I don't mind the other stuff.
i like lens flare, it's pretty
I like lens flare for a bit if I'm just enjoying the scenery or whatever. If I'm actually playing the game though, turn that shit off so I can actually see
You are supposed to not see
The main problem with these is giving people control of these properties without them knowing how the cameras work in real life.
The problem is that I am not playing as a camera, so why the hell would I want my in-game vision to emulate one?
Sometimes it does look better, but I would argue it's on the developer to pick the right moments to use them, just like a photographer would. Handing it to the players is the wrong way to go about it; their control over it isn't nearly as good, even without considering their knowledge about it.
I like DoF as it actually has a purpose in framing a subject. The rest are just lazy attempts at making the game "look better" by just slopping on more and more effects.
Current ray tracing sucks because it's all fake AI bullshit.
The only game I've seen actually look better with raytracing on is Cyberpunk 2077. It's the only game I've seen that has fully raytraced reflections on surfaces. Everything else just does shadows, and there's basically no visual difference with it on or off; it just makes the game run slower when on.
But those reflections in CP are amazing as fuck. Seeing things reflect in real time off a rained on road is sick.
It's also connected to a performance feature. They can load lower resolution textures for faraway objects. You can do this without the blurring effect of DoF, but it's less jarring if you can blur it.
The cost of DoF rendering far outweighs the memory savings of using reduced texture sizes, especially on older hardware where memory would be at a premium
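For what it's worth, the distance-based texture swap being described is the mip-mapping/LOD side of things, which exists with or without DoF. A toy illustration (real renderers pick mip levels from screen-space UV derivatives, not raw distance; the numbers here are invented):

```python
import math

def mip_level(distance, full_res_dist=10.0, max_mip=10):
    # Each doubling of distance roughly halves the texture resolution
    # needed, so the mip level grows logarithmically with distance.
    if distance <= full_res_dist:
        return 0
    return min(max_mip, int(math.log2(distance / full_res_dist)))

print(mip_level(5.0))   # 0 -- close: full-resolution texture
print(mip_level(80.0))  # 3 -- 8x farther: 1/8 resolution per axis
```

The mip transition happens either way; DoF blur just hides the pop when the level changes.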
Ray tracing is not related to AI. Why do you think it's fake AI bullshit? It's tracing rays in the same fashion that blender or Maya would. I think you may be confusing this with DLSS?
"real time raytracing" as is advertised by hardware vendors and implemented in games today is primarily faked by AI de-noising. Even the most powerful cards can't fire anywhere near enough rays to fully raytrace a scene in realtime, so instead they just fire a very low number of rays, and use denoising to clean up the noisy result. That's why, if you look closely, you'll notice that reflections can look weird, and blurry/smeary (especially on weaker cards). It's because the majority of those pixels are predicted by machine learning, not actually sampled from the real scene data.
Blender/Maya's and other film raytracers have always used some form of denoising (before machine-learning denoising, there were other algorithms), but in films it's applied after casting thousands of rays per pixel. In a game today, scenes are rendered at around 1 ray per pixel, and with DLSS it's probably even less since the internal render resolution is 2-4x smaller than the final image.
As a technologist, I'll readily admit these are cool applications of machine learning, but as a #gamer4lyfe, I hate how they look in actual games. Until gpus can hit thousands (or maybe just hundreds) of rays per pixel in real time, I'll continue to call it "fake AI bullshit" rather than "real time raytracing"
also, here's an informative video for anyone curious: https://youtu.be/6O2B9BZiZjQ
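The samples-per-pixel problem is easy to demonstrate with a toy Monte Carlo estimator (all numbers invented): say a pixel's true brightness is 0.2 because 20% of random rays reach a light. At 1 sample per pixel every answer is pure noise (0 or 1), and error only shrinks like 1/sqrt(samples), which is why realtime raytracing leans on denoisers instead of more rays.

```python
import random

def render_pixel(samples, light_prob=0.2, rng=None):
    # Each "ray" either hits the light (contributes 1) or misses (0).
    rng = rng or random.Random(42)
    hits = sum(1 for _ in range(samples) if rng.random() < light_prob)
    return hits / samples

rng = random.Random(0)
print([render_pixel(1, rng=rng) for _ in range(8)])  # only 0.0s and 1.0s: pure noise
print(render_pixel(4096, rng=rng))                   # close to the true 0.2
```

A denoiser's job is to guess the 0.2 from a screen full of those 0/1 pixels plus neighboring frames, which is where the blurry, predicted-looking reflections come from.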
I think Halo Infinite has a good example of a limited ray traced effect (the shadows) and an example of a terrible DoF effect (it does not look realistic at all or visually appealing)
I need some motion blur on, otherwise I get motion sickness.
Wait, I've been turning it off to prevent motion sickness. 🤔
My friend is the same way as you haha.
raytracing's the cool kid, keep him in
Bad effects are bad.
I used to hate film grain and then did the research for implementing myself, digging up old research papers on how It works at a scientific level. I ended up implementing a custom film grain in Starfield Luma and RenoDX. I actually like it and it has a level of "je ne sais quoi" that clicks in my brain that feels like film.
The gist is that everyone just does additive random noise, which raises the black floor and dirties the image. Film grain is perceptual and acts like cracks in the "dots" that compose an image. It's not something to be "scanned" or overlaid (which gives a dirty-screen effect).
Related, motion blur is how we see things in real life. Our eyes have a certain level of blur/shutter speed and games can have a soap opera effect. I've only seen per-object motion blur look decent, but fullscreen is just weird, IMO.
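The additive-vs-perceptual grain distinction above can be sketched numerically (this is not RenoDX's actual code, just an illustration of the "raised black floor" point): additive grain pushes pure black upward once clamped, while grain that scales with the signal leaves black alone.

```python
import random

def additive_grain(x, strength=0.05, rng=None):
    rng = rng or random.Random(1)
    # Noise is added regardless of signal level, then clamped to [0, 1]:
    # at black, only the positive half survives, lifting the floor.
    return min(1.0, max(0.0, x + rng.uniform(-strength, strength)))

def signal_scaled_grain(x, strength=0.25, rng=None):
    rng = rng or random.Random(1)
    # Grain magnitude is proportional to the signal: black stays black.
    return min(1.0, max(0.0, x * (1.0 + rng.uniform(-strength, strength))))

rng = random.Random(7)
black_additive = sum(additive_grain(0.0, rng=rng) for _ in range(1000)) / 1000
print(black_additive)              # > 0: the black floor is raised
print(signal_scaled_grain(0.0))    # 0.0: perceptual-style grain keeps true black
```

Real film grain modeling is more involved than a multiply, but the asymmetry at black is the part that makes additive noise read as a dirty screen.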
On motion blur: our eyes' motion blur and a camera's shutter-speed motion blur are not the same. Eyes don't have a shutter speed. Whatever smearing we see is the result of relaxed processing on the brain side. Under adrenaline, with heavy focus, our motion blur disappears as our brain goes full power trying to keep us alive. If you are sleep deprived and physically tired, then everything is blurred, even with little motion from head or eyes.
Over 99% of eye movement (e.g. saccadic eye movement) is ignored by the brain and won't produce a blurred impression. It's more common to notice vehicular fast movement, like when sitting in a car, as having some blur. But it can be easily overcome by focused attention and compensatory eye tracking or ocular stabilization. In the end, most of these graphical effects emulate camera behavior rather than natural experience, and thus are perceived as more artificial than the same games without the effects. When our brain sees motion blur it thinks movie theater, not natural everyday vision.
Add DLSS to the list. I've never had an experience where DLSS didn't make my game run better. It always makes the textures worse and the game run worse than just setting it to native resolution and a specific texture quality.
Edit: I reread your message, and I missed the double negative in your sentence. Did you mean games never run better with DLSS?
That is odd. DLSS should definitely net you a handful of frames. Games often run better with ray tracing on and DLSS on quality vs native without ray tracing, sometimes doubling it. Some newer titles I find are only playable (at the very least 60 fps) because of DLSS (which is a whole problem in and of itself). I absolutely prefer running without any sort of temporal AA because of smudges and ghosting.
Rereading my comment, I think I left out the double negative, so you were right to be confused.
If I had to try and diagnose the issue, I think it comes down to the fact that I have an early 2060, which means not just an old card, but an old card with less VRAM. Consistently, I find that DLSS drops textures down to the lowest possible setting or constantly cycles between texture resolutions every few seconds when I can get a consistent 60 fps on medium settings in most games at native 1080p. It may net me a few extra fps, but the hit to quality simply isn't worth it if I can't make out what's what with the texture popping.
Another possible culprit would be shader caching, since games more and more demand an SSD so they can stream assets directly from storage, but I'm not knowledgeable enough to get that deep into it.
Wym? I love DLSS. If I can't get a solid framerate at native resolution, DLSS really brings a lot to the table with a minor loss of quality imo.
Depends on how the game is optimized. For example, I get minimal performance enhancement in GTA 5 Enhanced, but in Witcher 3 it's a solid 10 to 20 fps improvement on my hardware.
The title should be "anon can't afford rtx5090".