Anon turns on raytracing
Maximise your RTX performance with this one crazy hack!
Ray traced reflections: on
Ray traced everything else: off
I'd argue reflections are nowhere near as nice looking as RTGI. If anything, switch reflections off.
But muh puddles! Night City is nothing without those gorgeous, mirror-like puddles.
Also caustics and volumetrics, if your game has those.
Baked lighting looks almost as good as ray tracing because, for games that use baked lighting, devs intentionally avoid scenes where it would look bad.
Half the stuff in this trailer (the dynamically lit animated hands, the beautiful lighting on the moving enemies) would be impossible without ray tracing. Or at the least it would look way way worse:
Practically impossible for this developer? Maybe. Technically impossible? No.
We do have realtime GI solutions which don't require raytracing (voxel cone tracing, SDFGI, screenspace, etc). None of which require any 'special' hardware.
Raytracing is just simpler and doesn't need as much manual work to handle cases where traditional rasterisation might fail (e.g. light leaking). But there aren't many things it can do which we can't already achieve with rasterisation tricks.
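To make the "tracing without RT hardware" point concrete, here's a minimal sketch of sphere-tracing a signed distance field to answer a shadow query, the basic idea behind SDF-based techniques like SDFGI. The toy scene, light position and step constants are all invented for illustration; real engines march much coarser proxy fields on the GPU.

```python
import math

def scene_sdf(p):
    """Signed distance to a toy scene: a unit sphere hovering over a ground plane.
    (Both shapes are made-up stand-ins for a real level's distance-field proxies.)"""
    sphere = math.dist(p, (0.0, 1.0, 0.0)) - 1.0
    ground = p[1]                       # distance to the plane y = 0
    return min(sphere, ground)

def shadow_ray(origin, direction, max_dist, eps=1e-3):
    """Sphere-trace from 'origin' toward a light. Returns False if something blocks
    the ray. No RT hardware involved, just repeated distance-field lookups."""
    t = 10 * eps                        # start slightly off the surface
    while t < max_dist:
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = scene_sdf(p)
        if d < eps:                     # close enough to a surface: occluded
            return False
        t += d                          # safe step: nothing can be closer than d
    return True                         # reached the light unoccluded

# A point on the ground beside the sphere, and a light up and off to the side.
light = (3.0, 5.0, 0.0)
point = (1.2, 0.0, 0.0)
to_light = tuple(l - p for l, p in zip(light, point))
length = math.sqrt(sum(c * c for c in to_light))
direction = tuple(c / length for c in to_light)
print("lit" if shadow_ray(point, direction, max_dist=length) else "in shadow")
```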
Raytracing is mostly useful for developers who don't have the time/budget/skillset to get the same visual quality with traditional rasterisation.
However, in an industry which seems to prioritise getting things released as cheaply and quickly as possible, we're starting to see developers rely heavily on raytracing, and not allocating many resources into making their non-rt pipeline look nice.
Some are even starting to release games which require raytracing to work at all, because they completely cut the non-rt pipeline out of their budget.
So I'd argue that you're incorrect in theory, but very correct in practice (and getting even more correct with time).
That's kinda the thing with ray tracing. You can save a lot of work, but since you want your game available to gamers who don't have the hardware, you still have to do that work...
I'm expecting the next PlayStation to focus on ray tracing to set it apart in the market. They have the volume and it would be good for their exclusive titles.
Edit: Okay, maybe I'm just hoping, rather than expecting. Sony can absolutely screw this up.
Raytracing is cool. Personally I feel like the state consumers first got it in was atrocious, but it is cool. What I worry about is the AI upscaling, fake-frame bullshit. It's cool that the technology exists: sweet, my GPU can render this game at a lower resolution, then upscale it back at a far better frame rate than without upscaling, ideally stretching out my GPU purchase. But I feel like games (in the AAA scene at least) are so unoptimized now that you NEED all of these upscaling and fake-frame tricks. I'm not a dev, I don't know shit about making games, just my 2 cents.
Raytracing will be cool if hardware can catch up to it. It's pretty pointless if you have to play upscaled to turn the graphics up. And as you say, upscaling has its uses and is great tech, but when a game needs it to not look like dogshit (looking at you, Stalker 2) it worries me a lot.
I feel like if you have something at the level of a 3070 or above at 1080p, pathtracing, even with the upscaling you need, can be an option. At least based on my experience with Portal RTX.
Personally I have a 3060, but (in the one other game I actually have played on it with raytracing support) I still turned on raytraced shadows in Halo Infinite because I couldn't really notice a difference in responsiveness. There definitely was one (I have a 144hz monitor) but I just couldn't notice it.
No you've pretty much hit it on the head there. The higher ups want it shipped yesterday, if you can ship it without fixing those performance issues they're likely going to make you do that.
The joke is, LCDs smear at low framerates anyway.
Optimization is usually possible, but it is easier said than done. Often sacrifices have to be made, but maybe it is still a better value per frame time. Sometimes there's more that can be done, sometimes it really is just that hard to light and render that scene.
It's hard to make any sweeping statements, but I will say that none of that potential optimization is going to happen without actually hiring graphics devs. Which costs money. And you know what corporations like to do when anything they don't consider important costs money. So that's probably a factor a lot of the time.
I think raytracing is fine for games that want a lot of realism. But I'm playing games with monsters and fantasy. My suspension of disbelief isn't going to break because reflections aren't quite right.
But I'm pretty much in the camp of, I want my games to look and feel like games. I like visual cues like highlighting items I can interact with or pick up. So lighting is always non-realistic.
Look at Tiny Glade, it's a great example of what raytracing can bring to a stylized game. (They did use their own raytracing pipeline, different from the usual - in their own words, ReSTIR was overkill for what their game needed.) Or like 95% of animated films. Including Arcane but excluding Stray.
It's not a trick, it's just lighting done the way it should be done, without all the tricks we need now like subsurface scattering or screen space reflections.
The added benefit is that materials show more of their natural reflectance, making everything look more true to life.
Its main drawback is that it's GPU costly, but more and more AAA games are now moving toward RT as standard by being more clever about how the calculations are handled.
Yes, I'm sure every player spends the majority of their game time admiring the realistic material properties of Spider-Man's suit. So far I've never seen a game that was made better by forcing RT into it. A little prettier if you really focus on the details where it works, but overall it's a costly (in terms of power, computation, and price) gimmick.
The one benefit I see is that it simplifies lighting for the developer by a whole lot.
Which isn't a benefit at all, because as of now they basically have to have a non-raytraced version so 90% of players can play the game.
But in a decade, maybe, raytracing will make sense as the default
RT also makes level-design simpler for the development team as they can design levels by what-you-see-is-what-you-get method rather than having to bake the light sources.
Where is RTX being forced in? I haven't seen a game where it's not an option you have to toggle on first, and it's not like RTX is a lot of additional work for the developer, seeing how it in fact reduces the work necessary to make a scene look the way it should.
Yes, it's stupidly expensive and not every game manages to benefit massively from it, but it can lead to some very pretty environments in games and it seems perfectly valid in those cases.
Also, some people do quite enjoy admiring the way the materials of various things end up looking. Maybe it's not the majority of players, but some people quite like looking at details in the games they play.
Subsurface scattering is not one of the things you get automatically with ray tracing. If you just bounce the rays off objects as would be the usual first step in implementing ray tracing you don't get any light penetration into the object, so none of that depth.
Maybe you meant ambient occlusion?
This. Personally I think you can't really expect gamers to know all of that. The only reason I know this particular fact is because I'm using Blender. It's a bit of a paradox, but it's really just pointless to talk about the technical details of games with gamers.
Raytracing still needs to do subsurface scattering. It can actually do it for real though. It also "wastes" a lot of bounces, so it's usually approximated anyway.
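Rough sketch of the distinction being made here: a surface-only ray just bounces at the hit point, while "real" subsurface scattering lets the ray random-walk inside the medium and exit somewhere else. The medium parameters (mean free path, absorption) are invented for illustration.

```python
import random, math

def random_direction():
    """Uniform random direction on the unit sphere."""
    z = random.uniform(-1.0, 1.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    return (r * math.cos(phi), r * math.sin(phi), z)

def subsurface_walk(entry, mean_free_path=0.3, absorption=0.2, max_steps=64):
    """Random-walk SSS: the ray scatters inside the medium until it is absorbed
    or wanders back above the surface (y > 0). The exit point is generally NOT
    the entry point, which is exactly what a surface-only bounce can't reproduce."""
    p = list(entry)
    energy = 1.0
    for _ in range(max_steps):
        step = -mean_free_path * math.log(1.0 - random.random())  # exponential free path
        d = random_direction()
        p = [c + step * dc for c, dc in zip(p, d)]
        energy *= math.exp(-absorption * step)                     # Beer-Lambert absorption
        if p[1] > 0.0:                                             # popped back out of the surface
            return energy, tuple(p)
    return 0.0, tuple(p)                                           # absorbed / lost

random.seed(1)
energy, exit_point = subsurface_walk(entry=(0.0, -0.001, 0.0))
print(f"exited with {energy:.2f} energy at "
      f"({exit_point[0]:.2f}, {exit_point[1]:.2f}, {exit_point[2]:.2f})")
```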
Game visuals are riddled with shortcuts and simplification.
You don't think the way the water moves when your character steps in a puddle, the way smoke rises from fires, or the damage on the walls are physics simulations, do you?!
It's all variations of procedural noise such as Perlin noise, particle effects, or at best (for example, ocean simulation) some formulas that turn out to look good enough.
(If you want to see physics simulations in 3D-generated worlds, look at special effects in films.)
Improving one element of game space visual fidelity - reflections - is nice but it's unclear that it's worth its downsides (more expensive hardware, slower performance) given how everything else is still one big pile of "good enough" shortcuts.
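To illustrate the "procedural noise" shortcut mentioned above, here's a minimal 1D value-noise function plus a fractal sum of octaves. Proper Perlin noise interpolates gradients rather than values, but the idea is the same; the hashing constants are arbitrary.

```python
import math

def hash01(i):
    """Cheap deterministic pseudo-random value in [0, 1) for an integer index.
    The constants are arbitrary, just something that scrambles the bits."""
    i = (i * 374761393 + 668265263) & 0xFFFFFFFF
    i = ((i ^ (i >> 13)) * 1274126177) & 0xFFFFFFFF
    return (i ^ (i >> 16)) / 2 ** 32

def value_noise(x):
    """1D value noise: random values at integer positions, smoothly interpolated
    in between. Layer a few octaves of this and you get the 'good enough' wobble
    used for water ripples, smoke, damage masks, etc."""
    i, f = math.floor(x), x - math.floor(x)
    t = f * f * (3.0 - 2.0 * f)          # smoothstep fade curve
    return hash01(i) * (1.0 - t) + hash01(i + 1) * t

def fbm(x, octaves=4):
    """Fractal sum: each octave has twice the frequency and half the amplitude."""
    total, amplitude, frequency = 0.0, 0.5, 1.0
    for _ in range(octaves):
        total += amplitude * value_noise(x * frequency)
        amplitude *= 0.5
        frequency *= 2.0
    return total

print([round(fbm(x * 0.1), 3) for x in range(8)])
```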
Even with raytracing there are still a lot of shortcuts and trickery under the hood. Ray tracing is the "cheating" form of path tracing.
Soooo, there's a missing part here. The point (and drive) behind raytracing isn't making games beautiful, it's making them cheaper and less man-hour intensive to make/maintain.
The engine guys spend man-years every year working on that non-raytraced engine so it can do 150 fps. They've done every cheat, every side step, and spent every minute possible making it look like they haven't done anything at all.
The idea is that they stop making/updating/supporting non-raytracing engines and let the GPUs pick up the slack, then use AI to artificially 'upgrade' the frame rate with interpolation.
Don't forget that temporal smear. I like to apply vaseline directly onto my monitor instead.
Don't forget the 10 shadow copies of my car/weapon following me around. It's like someone really liked having a trailing mouse cursor and thought everything should have it
It's not just a time limitation either tho, it also opens up a lot of room for artistic direction and game design
I don't think you could possibly make something like Control's shiny black blocks world look decent without raytraced reflections.
Also, anything with significantly large dynamic geometry usually either needs something like half the level's file size duplicated for every possible state, or some form of raytracing, to work at all. (There are also things like voxel cone tracing that do their own optimized tracing, but they don't really work in 100% of situations and come with their own visual downsides.)
To see how far rasterization has been stretched, and how that holds back development - Path of Exile 2 has a tech talk about their bare minimum settings. Artists weren't allowed to rely on anything that could be turned off. They begged the programmers for specific gimmicks, and turned that cheap nonsense into a million blades of grass, raymarched cracks in translucent ice, and soft shadows with no Peter Panning.
Or, picking one specific trick: ambient occlusion was half of why Crysis humbled $5,000 PCs. There's a slide deck for how a superior version of the same effect was achieved in Toy Story 3 on the Wii.
Real-time raytracing was unobtanium for decades because we kept moving the goalposts. The entire 3D games industry is built on cheating around simple parallel techniques being too expensive. By the time hardware catches up to where doing something the simple way is feasible, complex software has faked a wild variety of other effects. Meanwhile: games are designed to rely on what's available. All of the tells for proper path-traced lighting have either been faked or avoided. Games don't even do mirrors, anymore.
There's a reason RTX shows off games from the late 1900s.
It's like when the Unity game engine came out: somehow, IMO, instead of having to program the whole thing yourself for your specific game, now everyone could make a 3D platformer.
It does, again IMO, take the soul out of games.
Heh, I can appreciate that. That was also said when people stopped using ASM, and again when games started running in Windows. Not running games in DOS felt really icky.
The earliest publicly available engines were id Software engines. Whenever id developed a new one, they released the old one for free. That's why we got a lot of Doom clones, and those Doom clones became whole new genres of games. Thief, Half-Life, Counter-Strike, Duke Nukem, Serious Sam, Wolfenstein, Call of Duty and many many many more games are direct descendants of developers playing with open source engines.
If your argument is that games are worse because developers don't need to build their own engines anymore, you are dead wrong.
The first F.E.A.R. had excellent dynamic lighting, I'd argue it had the epitome of relevant dynamic lighting. It didn't need to set your GPU on fire for it, it didn't have to sacrifice two thirds of its framerate for it, it had it all figured out. It did need work on textures, but even those looked at least believable due to the lighting system. We really didn't need more than that.
RT is nothing but eye candy and a pointless resource hog meant to sell us GPUs with redundant compute capacities, which don't even guarantee that the game'll run any better! And it's not just RT, it's 4k textures, it's upscaling, it's Ambient Occlusion, all of these things hog resources without any major visual improvement.
Upgraded from a 3060 to a 4080 Super to play STALKER 2 at more than 25 frames per second. Got the GPU, same basic settings, increased the resolution a bit, +10 FPS... Totes worth the money...
Edit: not blaming GSC for it, they're just victims of the AAA disease.
Edit 2: to be clear, my CPU's an i7, so I doubt it had much to do with the STALKER bottleneck, considering it barely reached 60% usage, while my GPU was panting...
Edit 3: while re-reading this, it hit me that I sound like the Luddite Boss, so I need to clarify this for myself more than anyone else: I am not against technological advancement, I want tech in my eyeballs (literally), I am against "advancements" which exist solely as marketing accolades.
I remember reading the real sell to developers is fewer calculations: currently textures have to be designed for different lighting, which would require pre-rendering the same textures across multiple lighting setups. And that is time- and resource-intensive for developers.
Ray tracing is a simpler solution. I'm not an expert, but that seemed sensible to me.
Honestly, this wouldn't have been an issue, ever, if we hadn't switched to "release fast, fuck quality, crunch ya' plebs!" It's yet another solution for a self-generated problem.
I heard the Source 2 editor has (relatively offline, think Blender viewport style) ray tracing as an option, even though no games with it support any sort of real-time RT. Just so artists can estimate what the light bake will look like without actually having to wait for it.
So what people are talking about there is lightmaps: essentially a whole other texture on top of everything else that holds diffuse lighting information. It's 'baked' in a lengthy process of ray tracing that can take seconds to hours to days, depending on how fast the baking system is and how hard the level is to light. This puts that raytraced lighting information directly into a texture so it can be read in fractions of a millisecond like any other texture. It's great for performance, but it can't be quickly previewed, can't show the influence of moving objects, and technically can't be applied to any surface with a roughness other than full (so most diffuse objects, but basically no metallic objects; those usually use light probes and bent normals, and sometimes take lightmap information, although that isn't technically correct and can produce weird results in some cases).
The solution to lighting dynamic objects in a scene with lightmaps is a grid of pre-baked light probes. These give lighting to dynamic objects but don't receive it from them.
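A minimal sketch of that probe-grid idea: irradiance gets pre-baked at grid points, and a dynamic object just trilinearly blends the eight probes surrounding it. The baked values and grid spacing here are placeholders (a real probe stores directional data such as spherical harmonics, not a single scalar).

```python
import math

def baked_probe(ix, iy, iz):
    """Stand-in for a lookup into the pre-baked probe volume (invented values)."""
    return 0.2 + 0.8 * abs(math.sin(ix * 1.3 + iy * 2.1 + iz * 0.7))

def sample_probes(x, y, z):
    """Trilinear blend of the 8 probes surrounding a dynamic object's position.
    This is why dynamic objects receive baked lighting but never contribute to it."""
    x0, y0, z0 = math.floor(x), math.floor(y), math.floor(z)
    fx, fy, fz = x - x0, y - y0, z - z0
    result = 0.0
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                weight = ((fx if dx else 1 - fx) *
                          (fy if dy else 1 - fy) *
                          (fz if dz else 1 - fz))
                result += weight * baked_probe(x0 + dx, y0 + dy, z0 + dz)
    return result

print(f"irradiance at the player's position: {sample_probes(3.4, 1.0, 7.8):.3f}")
```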
CPU % usage is not a great stat. If, on a 10-core CPU, the main thread is maxed and the others are at 20%, it would read 28% overall, but you're still CPU limited.
Even the 7800X3D is CPU limited in Stalker 2 in any NPC area.
Sorry, yeah, forgot the deets. 9700k, none of the cores were overworked, 60% seemed to be average usage across them.
And, yeah, checked in NPC-heavy areas, where the stuttering, lag, and frame times were the worst, and I didn't have it set to "Ridiculous" - using a combination of High for textures and Med for effects (like shadows and lighting), running it at 1080p on the 3060 and 1440p on the 4080 Super (bumped it up to native, basically). Exclusively on SSD, 32 Gigs of RAM.
Edit: no upscaling because the input lag was horrid.
Really? Ambient occlusion used to be the first thing I would turn on. Anyways, 4K textures barely add any cost to the GPU. That's because they don't use any compute, just VRAM, and VRAM is very cheap ($3.36/GB of GDDR6). The only reason consumer cards are limited in VRAM is to prevent them from being used for professional and AI applications. If they had a comparable ratio of VRAM to compute, they would be an insanely better value compared to workstation cards, and manufacturers don't want to draw away sales from that very profitable market.
Early 3D graphic rendering was all ray-tracing, but when video games started doing textured surfaces the developers quickly realised they could just fake it with alpha as long as the light sources were static.
Unless you consider wireframe graphics. Idk when triangle rasterization first started being used, but it's conceptually more similar to wireframe graphics than to ray tracing. Also, I don't really know what you mean by 'fake it with alpha'.
I never turn it on, the visual difference is too unimportant to warrant such a huge cost in hardware resources (and temperature). It looks different if you have side-by-side screenshots, or if you turn it off and on in-game, but the difference is several orders of magnitude too slight to be worth it. Higher frames-per-second is more important than realistically-simulated light beams. You can't really have both in large AAA games.
As someone who has worked in visual FX for 20 years now, including on over 15 films and 8 games, I can tell you raytracing is most definitely not simply a marketing tool.
I have seen FEW games that really benefit from RT. RT is a subtle effect because we've gotten pretty good at baking and faking how light should look.
But even if it's just a subtle effect, it adds so much. The feeling of the lighting is (for me) better with RT: light properly propagates and bounces, and dynamic geometry is properly lit. It's that accumulation of tiny upgrades that makes the lighting look a lot better on the bigger scale.
It just sucks that the performance is utter shit right now. I hope in a few years this will be optimised and we won't need to sacrifice half the framerate just to get lighting that feels right.
But you can bake additive environment lighting as well.
You can even bake additive lighting in layers, at least for things like street lamps, light coming out of a window onto a street, mostly static objects that can be turned on/off or broken...
And then just only use truly dynamic lighting for... people with lamps, flashlights, cars, truly dynamic stuff.
But that takes time, attention to detail, good map/level design, a bit of extra logic to handle everything... and the AAA paradigm is crank out flashy bullshit that runs like ass... unless you check out our marketing partner's newest GPU!
Not everything, but most advanced dynamic lighting stuff that people associate with RT... can be done in an optimized way, leaving only a few elements to be truly fully, dynamically, brute force rendered every scene.
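A trivial sketch of that layered-bake idea: each toggleable light gets its own baked contribution, and shading just adds the enabled layers on top of the base bake. Names and values are placeholders.

```python
# Pretend baked lighting for one lightmap texel (placeholder numbers).
base_bake = 0.15                       # sun/sky bounce, always on
light_layers = {
    "street_lamp_07": 0.40,            # baked contribution of one toggleable lamp
    "window_glow_03": 0.25,
    "neon_sign_12":   0.10,
}

def shade_texel(enabled):
    """Light is additive, so turning a baked light on/off is just adding or
    dropping its pre-computed layer - no tracing at runtime."""
    return base_bake + sum(v for name, v in light_layers.items() if name in enabled)

print(round(shade_texel({"street_lamp_07", "neon_sign_12"}), 3))  # lamp and sign on
print(round(shade_texel(set()), 3))                               # everything switched off
```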
But it's about 95% easier for a game dev (or for management to tell them) to just let the game engine they paid a license for (almost always UE) handle it, by assuming the end user has a GPU that costs as much as an entire PC did two years ago.
Long, long gone are the days where game studios were largely defined by having their own engine, tailored to work optimally with the kinds of games they make.
Nearly no AAA game studios bother to make engines these days, and nearly none of them have competent enough coders to actually make one... that's all subcontracted out now.
What games do you know that really benefit from RT? So far I'm only aware of Metro Exodus Enhanced Edition and probably Cyberpunk (haven't played it yet though). Witcher 3 has some noticeable changes sometimes, but eh. In every other game it feels completely useless.
It was quite nice in Elden Ring with the glow of the Erdtree
The game from the screenshot, Alan Wake 2.
Also Control by the same company, but to a lesser degree.
The change is generally more subtle than people expect but it adds to the overall atmosphere, which is important for these games.
The best examples of raytracing are in applying it to old games, like Quake II or Portal or Minecraft.
Newer games were already hitting diminishing returns on photo realism. Adding ray tracing takes them from 95% photo realistic to 96%.
I disagree - adding RT to games that weren’t designed for it often (but not always) wrecks the original art direction.
Quake II is a great example; I think the raytraced version looks like absolute ass. Sure, it has fancy shadows and reflections, but all that does is highlight how old the assets are.
Portal with ray tracing is a really cool demo, and I've used it in the past to show off ray tracing. But man, it's just not as pretty as the old Portal because it lacks the charm; it's like those nature photos that are blown out with HDR.
Same with Minecraft. Minecraft looks like crap, and improving the lighting, shadows and so on just shows that off even more.
Minecraft is a game that's deliberately not about the looks.
often (but not always) wrecks the original art direction.
Which is sometimes a nice benefit. Not to mention the 'layer' in a specific color that suddenly goes away if you enable LevelsPlus in ReShade. The most extreme example I've seen was Elex 1.
Ray tracing is just a way for nvidia to proprietize a technology then force the industry to use it all to keep Jensen in leather jackets. Don't buy his cards; he has too many leather jackets!
When I had a PS5 and Cyberpunk, I would sometimes switch ray tracing on and off to see if it made a huge difference. Well, the frame rate would be capped at 30 with it on...and I suppose if I stopped and looked around for a bit, it was noticeable, but honestly, I preferred the higher framerate. I've yet to see a game that really benefits from RT.
It's mostly developers that benefit from RT long-term. Not now while it's optional, but once it becomes a requirement, they can cut a couple of time-intensive steps from the development pipeline.
Can't wait until my GPU needs 1000W to run :'(
But maybe finally games will get working mirrors again.
Skyrim has "ray traced" shadows in certain places and it works great. I was in a cave once, hiding behind a cliff. An enemy was wandering around the next room and I was able to use the shadow cast on him by a torch to observe his movements without having his actual body in my field of view.
All this modern RT nonsense does is make things look slightly better than screen space reflections and tank performance.
That's actually one specific torch!
I would expect that to be a normal rasterized shadow map unless you can find any sources explicitly saying otherwise. Because even 1 ray per pixel in complex triangulated geometry wasn't really practical in real time until probably at least 2018
You can achieve that effect with only a few rays traced, instead of the hundreds used for soft shadows. But honestly, the same effect could be achieved dynamically with maybe 10 rays and a blur filter.
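Rough sketch of that "10 rays and a blur" idea: jitter a handful of shadow rays toward an area light, average the hits, then run a cheap blur over the resulting shadow mask. The scene (one sphere occluder over a strip of ground) and all the numbers are made up.

```python
import random, math

LIGHT, LIGHT_RADIUS = (0.0, 6.0, 0.0), 1.0
OCCLUDER, OCCLUDER_R = (0.0, 3.0, 0.0), 1.0   # sphere hanging between ground and light

def ray_hits_sphere(origin, direction, center, radius):
    """Standard ray/sphere intersection test (True on any forward hit)."""
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    return disc > 0.0 and -b - math.sqrt(disc) > 0.0

def shadow(point, rays=10):
    """Fraction of jittered rays toward the area light that are NOT blocked."""
    lit = 0
    for _ in range(rays):
        target = [l + random.uniform(-LIGHT_RADIUS, LIGHT_RADIUS) for l in LIGHT]
        d = [t - p for t, p in zip(target, point)]
        n = math.sqrt(sum(c * c for c in d))
        d = [c / n for c in d]
        if not ray_hits_sphere(point, d, OCCLUDER, OCCLUDER_R):
            lit += 1
    return lit / rays

def box_blur(row, radius=1):
    """1D box blur: the cheap filter that hides how few rays were used."""
    out = []
    for i in range(len(row)):
        window = row[max(0, i - radius):i + radius + 1]
        out.append(sum(window) / len(window))
    return out

random.seed(0)
ground = [shadow((x * 0.5 - 4.0, 0.0, 0.0)) for x in range(17)]   # strip of ground texels
print([round(v, 2) for v in box_blur(ground)])
```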
But you NEED the green and expensive GPU, otherwise you are missing out!!!!
RT is a marketing trick; very few games are made in a way that makes it look better.
Ray tracing isn’t supposed to make things look better, it’s supposed to save development time
If you spend enough time on lighting you can make static lights look better but that’s just it, it takes longer so it costs more
Raytracing is being pushed so hard by the industry because it makes things easier for devs as opposed to making the games look better for the customer.
There is absolutely nothing about raytracing which makes it "easier" for devs compared to a traditional render pipeline.
The extra performance requirements alone mean you're going to be doing more work elsewhere to make up for it, and that's ignoring the current bugs/quirks with RT in whatever engine you're using.
No worries, we got upscaling and frame generation now!
Yes, you get to skip faking GI with many small lights. Not a game dev, I work in animation. Up until fifteen years ago this was still a regularly used approach to lighting scenes, before the advent of path tracers.
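A tiny sketch of that "many small lights" workflow: instead of simulating bounce light, dim fill lights are hand-placed where the bounce would land and the shader just sums inverse-square contributions. Light positions and intensities are invented for the example.

```python
# One real key light plus hand-placed dim fills standing in for bounce light.
key_light = {"pos": (0.0, 5.0, 0.0), "intensity": 100.0}
fill_lights = [
    {"pos": (2.0, 1.0, 1.0),  "intensity": 6.0},   # fake bounce off the floor
    {"pos": (-2.0, 1.5, 0.5), "intensity": 4.0},   # fake bounce off a wall
    {"pos": (0.0, 0.5, -2.0), "intensity": 3.0},   # fake bounce off the back wall
]

def lit(point):
    """Sum every light's inverse-square contribution; no bounces ever simulated."""
    total = 0.0
    for light in [key_light] + fill_lights:
        d2 = sum((a - b) ** 2 for a, b in zip(light["pos"], point))
        total += light["intensity"] / max(d2, 1e-4)
    return total

point = (1.0, 0.0, 0.0)
key_only = key_light["intensity"] / sum((a - b) ** 2
                                        for a, b in zip(key_light["pos"], point))
print(f"with fill lights: {lit(point):.2f}")
print(f"key light only:   {key_only:.2f}")
```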
I thought the ultimate goal was to encapsulate the lighting system entirely inside the engine to stop programmers/artists needing to micro manage light sources. Presumably if a game only supported ray tracing then they could interact with an environment at the object level and trust the lighting would work in a life-like manner without having to be confected as part of development.
I am waiting for GPUs to use the rotating kinetic power of the fans to feed back into the GPU and give them an ERS boost like in Formula 1 when scenes become too graphically demanding. If you steal my idea, that is intellectual theft and I will be sad!
I think RayTracing is pushed so hard by the industry because it gives manufacturers an excuse to force consumers to buy better cards to get "the very best". I have a 4070 and I never use RT.
I wish they would just make more GPUs so they aren't so supply limited, geez.
Why would they want to do what's bad for them and good for us? They're a corporation:)
That's why I will literally never support Nvidia. AMD isn't perfect but at least they play nice with open source.
I'm running a 4070s
CP2077 with RT is around 50 fps with dips. Without RT I sit at 90 fps with max settings at 1440p.
Ray tracing is cool; the problem is, it's still basically in beta. Once hardware catches up and you can still get good FPS, it won't be an annoyance.
Meme creator is clearly blind.
The Finals uses RTXGI, and the only difference between raytracing off and on is the speed at which the lighting changes within a scene.
I see little benefit from raytracing in many games, and would prefer the implementation be used in other games.
I haven't played The Finals myself, but as of the pre-release version, when I watched a video about it, lighting didn't update at all without raytracing enabled. It is pretty hard to get any sort of dynamic lighting without raytracing, if not impossible, depending on how you define raytracing. Basically, if they have a dynamic lighting feature that works without 'raytracing', they have to create a whole other GI system using world-space probes and maybe even dynamically voxelizing the entire scene. Neither of which is easy on performance, though usually not as bad as normal hardware RT and ReSTIR. Neither of those is good at reflections or fine detail, which is why games that want to look better than that usually switch to doing it the normal way.
I think they're using the SHaRC implementation in RTXGI for when you disable raytracing.
They're forthcoming about using RTXGI. That said, it does work very well even on non-raytracing-capable hardware.
Never got the point of raytracing. Screen space reflections look the same and save so much performance.
I always try it both on and off for a while and see if that specific game gets nicer enough to where it's worth having on. About a third of the games I've played with RTX looked better enough to keep it on. Some that really blew my mind all the way through with RTX on were Ghostwire: Tokyo, Cyberpunk 2077, Avatar: Frontiers of Pandora and Star Wars Outlaws.
“I have a shitty computer and can’t play my single player game at imperceptibly fast frame rates boo boo boo.”
We’ve gotten so good at faking most lighting effects that honestly RTX isn’t a huge win except in certain types of scenes.
The difference is pretty big when there are lots of reflective surfaces, and especially when light sources move (prebaked shadows rarely do, and even when, it's hardly realistic).
A big thing is that developers use less effort and the end result looks better. That's progress. You could argue it's kind of like when web developers finally were able to stop supporting IE9 - it wasn't big for end users, but holy hell did the job get more enjoyable, faster and also cheaper.
Cyberpunk and Control are both great examples - both games are full of reflective surfaces and it shows. Getting a glimpse of my own reflection in a dark office is awesome, as is tracking enemy positions from cover using such reflections.
But it takes a lot of work by designers to get the fake lighting to look natural. Raytracing would help avoid that toil if the game requires RT.
Gamers need expensive hardware so designers have less work. Game still not cheaper.
The issues come if you know how they're faking them. Sure, SSR can look good sometimes, but if you know what it is, it becomes really obvious. Meanwhile, raytraced reflections can look great always, usually at the cost of performance. It's sometimes worth it, especially when done intelligently.
Not true. Screen space reflections consistently fail to produce accurate reflections.
Screenspace isn't the only way to draw reflections without RT. It's simply the fastest one.
Most gamers aren't going to notice, and I can count on one hand the number of games that actually used reflections for anything gameplay related.
There are cases where screen space can resolve a scene perfectly. Rare cases. That also happen to break down if the user can interact with the scene in any way.
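For anyone wondering what "fails to produce accurate reflections" means concretely: screen space only has data for pixels that are on screen, so the moment the reflected ray marches off the edge (or behind closer geometry), there's nothing to sample and the reflection falls back to nothing. A toy 1D version of that march, with an invented depth buffer:

```python
# Toy 1D "screen": depth[x] is the scene depth visible in column x.
# Anything not in this buffer (off-screen, or hidden behind closer geometry)
# simply does not exist as far as SSR is concerned.
depth = [5.0, 5.0, 4.0, 3.0, 3.0, 2.5, 2.5, 6.0, 6.0, 6.0]   # made-up depth buffer

def ssr_march(start_x, start_depth, step_x, step_depth, max_steps=32):
    """March a reflected ray through screen space. Returns the column whose depth
    the ray dips behind (a 'hit'), or None if the ray leaves the screen first:
    the classic SSR failure case."""
    x, d = float(start_x), start_depth
    for _ in range(max_steps):
        x += step_x
        d += step_depth
        col = int(round(x))
        if col < 0 or col >= len(depth):
            return None                 # ray left the screen: no data, no reflection
        if d >= depth[col]:
            return col                  # ray passed behind visible geometry: reuse that pixel
    return None

# A ray marching toward geometry that's on screen finds something to reflect:
print(ssr_march(start_x=1, start_depth=1.0, step_x=+1.0, step_depth=+0.6))
# The same surface reflecting toward the left screen edge finds nothing at all:
print(ssr_march(start_x=1, start_depth=1.0, step_x=-1.0, step_depth=+0.6))
```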