Skyrim has “ray traced” shadows in certain places and they work great. I was in a cave once, hiding behind a cliff. An enemy was wandering around the next room and I was able to use the shadow cast on him by a torch to observe his movements without having his actual body in my field of view.
All this modern RT nonsense does is make things look slightly better than screen space reflections and tank performance.
That’s actually one specific torch!
It is unknown why it has this function, or why Bethesda left it in
Just Bethesda things
I’ve seen the effect in other places, though I guess technically they can stick that torch wherever they want as you explore.
If you mod, that’s likely why you found it in other places. The wiki isn’t kidding when it says it is found in only one place in the game (in vanilla at least.)
I would expect that to be a normal rasterized shadow map unless you can find any sources explicitly saying otherwise. Because even 1 ray per pixel in complex triangulated geometry wasn’t really practical in real time until probably at least 2018
I’m not sure how it worked, all I know is that it was real time and would react to player models, enemies or other things that would move in unpredictable ways, but only for specific light sources.
Yeah, that’s just rasterized shadow mapping. It’s very common; a lot of old games use it, as do modern ones. It’s basically used in any non-raytraced game with dynamic shadows (I think the only other way to do it is directly projecting the geometry, which only a few very old games did, and that can only cast shadows onto single flat surfaces).
The idea is that you render the depth of the scene from the perspective of the light source. Then, for each pixel on the screen, to check if it’s in shadow, you find its position on that depth texture. If something else is closer to the light than it is, it’s in shadow; otherwise it isn’t. This is filtered to make it smoother. The downsides: it can’t support shadows of variable width without extra hacks that don’t work in all cases (and real shadows are literally all variable-width), to get sharp shadows you need to render that depth map at a very high resolution, rendering a whole depth map is expensive, it renders unseen pixels, it doesn’t scale down well to low resolutions (like if you wanted 100 very distant shadow-casting lights), etc.
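For anyone curious, here’s a minimal toy sketch of that depth comparison (not real shader code; the tiny 2x2 depth map and all the names are made up purely to show the test):

```python
# Toy illustration of the shadow-map depth test described above.
def in_shadow(pixel_in_light_space, depth_map, bias=0.005):
    """pixel_in_light_space: (u, v, depth) of a screen pixel as seen from the
    light; depth_map[v][u] holds the closest depth the light sees at that texel."""
    u, v, depth = pixel_in_light_space
    closest = depth_map[v][u]          # what the light "sees" in that direction
    return depth > closest + bias      # something nearer is blocking the light

# Hypothetical 2x2 depth map: the light sees a blocker at depth 0.3 in texel (0, 0).
depth_map = [[0.3, 1.0],
             [1.0, 1.0]]

print(in_shadow((0, 0, 0.8), depth_map))  # True: the pixel is behind the blocker
print(in_shadow((1, 0, 0.8), depth_map))  # False: nothing closer to the light there
```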
Raytraced shadows are actually very elegant, since they operate on every screen pixel (allowing quality to naturally increase as you get closer to any area of interest in the shadow) and naturally support varying shadow widths, at the cost of noise and maybe some more rays (toy sketch below). Although they still scale expensively with many light sources, some modified stochastic methods still look very good and allow far more shadow-casting lights than would ever have been possible with pure raster.
You don’t notice the lack of shadow casting lights much in games because the artists had to put in a lot of effort and modifications to make sure you wouldn’t.
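To make the “few rays toward an area light” idea concrete (the toy sketch mentioned above): this is a deliberately simplified 2D example where the occluder and every number are invented; a real tracer would of course intersect actual scene geometry.

```python
import random

def occluded(origin, target):
    # Hypothetical blocker: a wall segment at x == 0.5 covering y in [0.0, 0.5].
    x0, y0 = origin
    x1, y1 = target
    if (x0 - 0.5) * (x1 - 0.5) >= 0:
        return False                      # the shadow ray never crosses x = 0.5
    t = (0.5 - x0) / (x1 - x0)
    y_at_wall = y0 + t * (y1 - y0)
    return 0.0 <= y_at_wall <= 0.5

def soft_shadow(point, light_center, light_radius, samples=10):
    """Fraction of the area light visible from `point` (1.0 = fully lit)."""
    visible = 0
    for _ in range(samples):
        # Jitter each shadow ray across the light's extent -> penumbra "for free".
        target = (light_center[0] + random.uniform(-light_radius, light_radius),
                  light_center[1] + random.uniform(-light_radius, light_radius))
        if not occluded(point, target):
            visible += 1
    return visible / samples              # noisy with few samples; blur/denoise after

print(soft_shadow((0.0, 0.5), (1.0, 0.5), 0.3))   # ~0.5: the point sits in the penumbra
print(soft_shadow((0.0, 2.0), (1.0, 2.0), 0.3))   # 1.0: well clear of the blocker
```

More samples (or a denoiser) trade noise for cost, which is exactly the knob those stochastic methods are turning.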
You can achieve that effect with only a few rays traced, instead of the hundreds used for soft shadows. But honestly, the same effect could be achieved dynamically with maybe 10 rays and a blur filter.
Meme creator is clearly blind.
As someone who has worked in visual FX for 20 years now, including on over 15 films and 8 games: raytracing is most definitely not simply a marketing tool.
Ray tracing is just a way for nvidia to proprietize a technology then force the industry to use it all to keep Jensen in leather jackets. Don’t buy his cards; he has too many leather jackets!
AMD cards can handle raytracing too though… soooooo.
As I’m sure you already know the proprietary part comes from the implementation and built in hardware support for said implementation, which AMD is not compatible with (not in any usable way at least)
AMD also has hardware support for raytracing and both are using the same API for raytracing. Nvidia just has a head start and deeper pockets.
This isn’t Cuda or Gameworks where the features depend on Nvidia hardware, it’s more like Tessellation where they can both do it but Nvidia cards did it better so they pushed developers into adding it into games.
I have seen FEW games that really benefit from RT. RT is a subtle effect because we’ve got pretty good at baking and faking how light should look.
But even if it’s just a subtle effect, it adds so much. The feel of the lighting is (for me) better with RT: light properly propagates and bounces, and dynamic geometry is properly lit. It’s all of these tiny upgrades that, on the bigger scale, make the lighting look a lot better.
It just sucks that the performance is utter shit right now. I hope in a few years this will be optimised and we won’t need to sacrifice half of the framerate just to get lighting that feels right.
But you can bake additive environment lighting as well.
You can even bake additive lighting in layers, at least for things like street lamps, like coming out of a window onto a street, mostly static objects that can be turned on/off or broken…
And then only use truly dynamic lighting for… people with lamps, flashlights, cars, the truly dynamic stuff (sketched below).
But that takes time, attention to detail, good map/level design, a bit of extra logic to handle everything… and the AAA paradigm is crank out flashy bullshit that runs like ass… unless you check out our marketing partner’s newest GPU!
Not everything, but most advanced dynamic lighting stuff that people associate with RT… can be done in an optimized way, leaving only a few elements to be truly fully, dynamically, brute force rendered every scene.
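If it helps, the layered-baking idea above could look something like this conceptually (a toy sketch: the layer names and numbers are invented, and real engines store per-light lightmap textures rather than single RGB tuples):

```python
# Pre-baked contribution of each mostly-static light for one surface patch (RGB).
baked_layers = {
    "sun":         (0.60, 0.55, 0.50),
    "street_lamp": (0.20, 0.18, 0.10),
    "window_glow": (0.10, 0.08, 0.05),
}

def lit_color(albedo, layer_state, dynamic_light=(0.0, 0.0, 0.0)):
    """Sum the baked layers that are currently switched on, then add whatever the
    truly dynamic lights (flashlights, car headlights, ...) contribute this frame."""
    total = list(dynamic_light)
    for name, rgb in baked_layers.items():
        if layer_state.get(name, False):
            total = [t + c for t, c in zip(total, rgb)]
    return tuple(a * t for a, t in zip(albedo, total))

albedo = (0.8, 0.7, 0.6)
print(lit_color(albedo, {"sun": True, "street_lamp": True}))    # lamp switched on
print(lit_color(albedo, {"sun": True, "street_lamp": False}))   # lamp shot out
print(lit_color(albedo, {}, dynamic_light=(0.3, 0.3, 0.3)))     # flashlight only
```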
But it’s about 95% easier for a game dev (or for management to tell them) to just let the game engine they paid a license for (almost always UE) handle it, by assuming the end user has a GPU that costs as much as an entire PC did 2 years ago.
Long, long gone are the days where game studios were largely defined by having their own engine, tailored to work optimally with the kinds of games they make.
Nearly no AAA game studios bother to make engines these days, and nearly none of them have competent enough coders to actually make one… that’s all subcontracted out now.
What games do you know that really benefit from RT? So far I’m only aware of Metro Exodus Enhanced Edition and probably Cyberpunk (haven’t played it yet though). Witcher 3 has some noticeable changes sometimes, but eh. In every other game it feels completely useless.
It was quite nice in Elden Ring with the glow of the Erdtree
The game from the screenshot, Alan Wake 2.
Also Control by the same company, but to a lesser degree.
The change is generally more subtle than people expect but it adds to the overall atmosphere, which is important for these games.
The first F.E.A.R. had excellent dynamic lighting, I’d argue it had the epitome of relevant dynamic lighting. It didn’t need to set your GPU on fire for it, it didn’t have to sacrifice two thirds of its framerate for it, it had it all figured out. It did need work on textures, but even those looked at least believable due to the lighting system. We really didn’t need more than that.
RT is nothing but eye candy and a pointless resource hog meant to sell us GPUs with redundant compute capacities, which don’t even guarantee that the game’ll run any better!
Upgraded from a 3060 to a 4080 Super to play STALKER 2 at more than 25 frames per second. Got the GPU, same basic settings, increased the resolution a bit, +10 FPS… Totes worth the money…
F.E.A.R. and Riddick EFBB were beautiful games. I remember GTA SA coming out a few months later and thinking, WTF are these graphics?
Seriously, even GTA III had a MUCH better defined atmosphere and feel than SA!
And the Riddick games were, indeed, gorgeous! In a grim as hell way, but gorgeous!
It was fun though, despite the shitty GFX, was definitely my favourite of the series.
Agreed, it was a major step forward in terms of mechanical complexity, but I couldn’t shake the feeling that everything I saw was made out of plastic, y’know? Cars included.
To be fair, the “sunny California” setting and atmosphere is probably as bland and generic as it gets. It didn’t give designers a lot to work with.
It’s not just that, Vice City was set in Miami, I’d argue it had similar vibes/aesthetics (accounting for the difference in time and setting), and it felt significantly more cohesive and well-designed in terms of aesthetics.
I remember reading that the real sell to developers is fewer calculations: currently textures have to be designed for different lighting, which can mean pre-rendering the same textures across multiple lighting setups. And that is time and resource intensive for developers.
Ray tracing is a simpler solution. I’m not an expert, but that seemed sensible to me.
Honestly, this wouldn’t have been an issue, ever, if we hadn’t switched to “release fast, fuck quality, crunch ya’ plebs!” It’s yet another solution for a self-generated problem.
I don’t know who is downvoting you, but release fast at cost of quality definitely makes the problem worse. Because people keep buying half baked unfinished games.
Oh, I always remove the default upvote:))
And thoroughly agreed, currently playing Dune Awakening. It’s been about two weeks since launch. I fell through the map twice, and keep getting ganked by solo mobs who pin me to walls, because I’m unable to move or dodge. A single enemy. Blocking my movement completely. Not to mention NPCs being displayed as blank templates which load in when you’re barely 2m away from them, twitchy directionals, and many other points of minor annoyance, which just add up…
And I’m not dumping on Dune, it’s a good game! But it’s VERY rough around the edges…
CPU % usage is not a great stat. If, on a 10-core CPU, the main thread is maxed and the others are at 20%, it would read 28% overall, but you’re still CPU limited.
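Quick sanity check of that arithmetic, using just the example numbers from the comment above:

```python
core_loads = [100] + [20] * 9                 # main thread pegged, 9 helper cores at 20%
overall = sum(core_loads) / len(core_loads)
print(f"overall CPU usage: {overall:.0f}%")   # 28% - looks fine on a graph
print(f"busiest core: {max(core_loads)}%")    # 100% - the actual bottleneck
```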
Even the 7800X3D is CPU limited in Stalker 2 in any NPC area.
Sorry, yeah, forgot the deets. 9700k, none of the cores were overworked, 60% seemed to be average usage across them.
And, yeah, checked in NPC-heavy areas, where the stuttering, lag, and frame times were the worst, and I didn’t have it set to “Ridiculous” - using a combination of High for textures and Med for effects (like shadows and lighting), running it at 1080p on the 3060 and 1440p on the 4080 Super (bumped it up to native, basically).
I think that’s part of it… The 9700K is like 7 years old at this point, and I’m all for holding off upgrades if it does what you want, but eventually you’ll have to upgrade if you want to actually take advantage of that 4080.
Well aware of that, but no game has ever had issues with it so far, so…
And I even run it without any OC, because it handles everything I throw at it juust fine.
It’s not a trick, it’s just lighting done the way it should be done without all the tricks we need now like Subsurface scattering or Screen space reflections.
The added benefit is that materials reflect light closer to how they would naturally, making them look more true to life.
Its main drawback is that it’s GPU costly, but more and more AAA games are now moving toward RT as standard by being more clever in how they handle the calculations.
Even with raytracing there are still a lot of shortcuts and trickery under the hood. Ray tracing is the “cheating” form of path tracing.
Raytracing still needs to do subsurface scattering. It can actually do it for real, though it also “wastes” a lot of bounces, so it’s usually approximated anyway.
Yes, I’m sure every player spends the majority of their game time admiring the realistic material properties of Spider-Man’s suit. So far I’ve never seen a game that was made better by forcing RT into it. A little prettier if you really focus on the details where it works, but overall it’s a costly (in terms of power, computation, and price) gimmick.
RT also makes level design simpler for the development team, as they can design levels in a what-you-see-is-what-you-get way rather than having to bake the light sources.
Development and design can use RT all day long, that’s not the issue. They have the benefit of not having to run ray tracing in real time on consumer hardware. At the end of the day, unless they want to offload all of that computation load onto the customer forever (and I really mean all RT all the time), they’ll eventually have to bake most or all of that information into a format that a rasterizer can use.
Where is RTX being forced in? I haven’t seen a game where it’s not an option you have to toggle on first, and it’s not like RTX is a lot of additional work for the developer, seeing how it in fact reduces the work necessary to make a scene look the way it should.
Yes, it’s stupidly expensive and not every game manages to benefit massively from it, but it can lead to some very pretty environments in games and it seems perfectly valid in those cases.
Also, some people do quite enjoy admiring the way the materials of various things end up looking. Maybe it’s not the majority of players, but some people quite like looking at details in the games they play.
There aren’t many but the new Indiana Jones and Doom games require ray tracing
To be fair… At least those 2 actually perform well.
Indiana Jones can run at high settings, 1080p NATIVE, at like 80 fps on a 3060, and Doom ran at like 80 FPS, medium settings, quality-upscaled 1440p, on my RX 6800 XT, which is like bad at RT lol
The one benefit I see is that it simplifies lighting for the developer by a whole lot.
Which isn’t a benefit at all, because as of now they basically have to have a non-raytraced version so 90% of players can play the game.
But in a decade, maybe, raytracing will make sense as the default
I’ve always said that, because the baseline GPUs are the RTX 3060 and the RX 6700 (the consoles’ equivalent)… And those GPUs aren’t doing amazing RT, so what’s the point in pushing it so hard NOW for the 1% of users with a 4090 or whatever?
Subsurface scattering is not one of the things you get automatically with ray tracing. If you just bounce the rays off objects as would be the usual first step in implementing ray tracing you don’t get any light penetration into the object, so none of that depth.
Maybe you meant ambient occlusion?
This. Personally I think you can’t really expect gamers to know all of that. The only reason I know this particular fact is because I’m using Blender. It’s a bit of a paradox, but it’s really just pointless to talk about the technical details of games with gamers.
Games visuals are riddled with shortcuts and simplification.
You don’t think the way the water moves when your characters steps on a puddle, the smoke rises from fires or the damage on the walls are Physics Simulations, do you?!
It’s all a variation of procedural noise such as Perlin noise, particle effects, or at best (for example, ocean simulation) some formulas that turn out to look good enough.
(Want to see Physics Simulations in 3D generated worlds, look at Special Effects in Films).
Improving one element of game space visual fidelity - reflections - is nice but it’s unclear that it’s worth its downsides (more expensive hardware, slower performance) given how everything else is still one big pile of “good enough” shortcuts.
RT is of course a shortcut too, it’s not an exact representation of how light actually behaves…
That’s the thing: Ray Tracing as implemented on Graphics Cards (which is a subset of what’s done in offline rendering for things like Film) only makes 3D rendering environments a bit more realistic in the domain of lighting, not even the same as reality, and this domain is only a small part of the big fucking pile of shortcuts used for realtime 3D rendering, so this improvement leaves all other ways a game space diverges from reality the same.
Mind you, this partial Ray Tracing thing tinting shadows next to brightly lit colored objects and doing proper realtime reflections for all reflective surfaces would be great if one didn’t have to actually upgrade one’s hardware and the performance loss was small, but that’s not the case yet.
We’ve gotten so good at faking most lighting effects that honestly RTX isn’t a huge win except in certain types of scenes.
The issues come if you know how they’re faking them. Sure, SSR can look good sometimes, but if you know what it is it becomes really obvious. Meanwhile raytraced reflections can always look great, usually at the cost of performance. It’s sometimes worth it, especially when done intelligently.
Not true. Screen space reflections consistently fail to produce accurate reflections.
Screenspace isn’t the only way to draw reflections without RT. It’s simply the fastest one.
Most gamers aren’t going to notice, and I can count on one hand the number of games that actually used reflections for anything gameplay related.
What I’m talking about is drawing accurate reflections, and I don’t know any other technique that produces the same accuracy as RT.
Reflection probes are one way. Basically a camera drawing a simpler version of the scene from a point into a cubemap. Decent for oddly shaped objects, although if you want a lot of them then you’d bake them and lose any real time changes. A common optimisation is to update them less than once a frame.
If you have one big flat plane like the sea, you can draw the world from underneath and just use that (rough sketch of the math below). GTA V does that (like ten years ago, without RT), along with the mirrors inside. You could make that look better by rendering them at a higher resolution.
https://www.adriancourreges.com/blog/2015/11/02/gta-v-graphics-study-part-2/
Where RT is visibly better is with large odd shaped objects, or enormous amounts of them. I can’t say it’s worth the framerate hit if it takes you below 60fps though.
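For the curious, the mirrored-camera trick behind those GTA V-style planar reflections (the rough sketch mentioned above) boils down to reflecting the camera across the water plane and rendering the scene again from there. A toy version of just the math, with made-up numbers and no actual rendering:

```python
def reflect_point(p, plane_point, plane_normal):
    """Mirror a point across the plane given by a point on it and a unit normal."""
    d = sum((pi - qi) * ni for pi, qi, ni in zip(p, plane_point, plane_normal))
    return tuple(pi - 2 * d * ni for pi, ni in zip(p, plane_normal))

def reflect_direction(v, plane_normal):
    d = sum(vi * ni for vi, ni in zip(v, plane_normal))
    return tuple(vi - 2 * d * ni for vi, ni in zip(v, plane_normal))

# Sea surface at height y = 0, camera 5 units above it, looking slightly down.
water_point, water_normal = (0, 0, 0), (0, 1, 0)
cam_pos, cam_forward = (0, 5, -10), (0, -0.2, 1)

mirror_pos = reflect_point(cam_pos, water_point, water_normal)   # (0, -5, -10)
mirror_dir = reflect_direction(cam_forward, water_normal)        # (0, 0.2, 1)
print(mirror_pos, mirror_dir)
# Render the scene from (mirror_pos, mirror_dir) into a texture and sample it on
# the water surface: planar reflections with no rays involved.
```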
I haven’t personally played a game that uses more than one dynamic reflection probe at a time. They are pretty expensive, especially if you want them to look high resolution and want the shading in them to look accurate.
That’s like saying that “physics simulation is the only technique that produces accurately shaped water streams” - technically true but generally not a sufficient improvement over the shortcuts currently in use to make up for the downside that the technically most precise method is slow as fuck.
Game making is, at all levels, finding shortcuts and simplifications (even games about the real world are riddled with simplifications, if only because the gameplay rules are a simplified version of real-world interactions, since otherwise it would be boring as shit). And on the visual side of things those shortcuts are all over the place even with RT: the damage on the walls, the clouds in the sky, the smoke rising from fires or the running water in the streams aren’t the product of physics simulations but, most likely, something like Perlin noise or good old particle effects, faking it well enough to deceive human perception.
Yeah, sure, RT is, technically speaking in terms of visual fidelity alone, better than the usual tricks (say, using an extra rendering step for the viewpoint of the main reflective surfaces such as mirrors). Is the higher fidelity (in, remember, a game space which is in many other ways riddled with shortcuts and simplifications) sufficient to overcome its downsides for most people? So far the market seems to be saying that it’s not.
CD Projekt Red just showcased The Witcher 4 running RT at 60 fps on a PS5. Bullshit that it’s too slow to be available for most people.
If you think that video is representative of the release game’s actual performance and fidelity, I have several bridges to sell you.
I don’t see them lying but that’s on you I guess
From an article about it:
Now, it should be stressed that this is a build of The Witcher 4 specifically designed to show off Unreal Engine’s features. Yes, it’s running on a standard PS5, but it’s not necessarily indicative of the finished product.
So that’s like saying “under laboratory conditions it has been demonstrated to work”.
If you know what to look for you can notice it (mainly light bouncing off objects and tinting shadows with the color of those objects, such as the shadow above the green canvas here), but the difference from the non-RT version when one doesn’t know what to look for is minimal and IMHO not enough to justify upgrading one’s hardware, especially considering that so much of the rest (the water in the streams, the snow in the mountains, the shape of the mountains themselves, the mud splash when a guy is thrown into the mud, the foliage of the plants and so on) has those visual shortcuts I mentioned.
Yeah, sure, it’s nice that shadows next to strongly lit colored surfaces get tinted with the color of that surface, but is that by itself worth upgrading one’s hardware?!
When most games with RT in them deliver that performance on one-generation-old hardware and across all environments, then you’ll have proven the point that for most gamers it has no significant negative impact on performance.
Depends on how you define “accurate”. Even full ray tracing is just an approximation based on relatively few light rays (on an order of magnitude that doesn’t even begin to approach reality) that is deemed to be close enough where increasing the simulation complexity doesn’t meaningfully improve visual fidelity, interpolated and passed through a denoising algorithm. You can do close enough with a clever application of light probes, screenspace effects, or using a second camera to render the scene onto a surface (at an appropriate resolution).
But it takes a lot of work by designers to get the fake lighting to look natural. Raytracing would help avoid that toil if the game is RT-only.
Gamers needs expensive hardware so designer has less work. Game still not cheaper.
I took pickles and tomatoes off my burger, where’s my $0.23 discount damn it?!
Let’s assume cutting out tomatoes and pickles saved $0.23 per hamburger.
McDonald’s serves 6.5 million hamburgers a day.
That’s roughly $500 million in extra yearly profit for their shareholders ($0.23 × 6.5M burgers × 365 days ≈ $546M).

There’s actually a decent analogy there, I think. The hamburger won’t cost less, because the service of customization is itself less efficient: serving customers with their preference of with/without is more expensive than just pickles for all. Likewise, I imagine making a game that looks OK with and without RT is more work than just with.
There really isn’t.
The OP comment was that gamers need to buy expensive hardware so that developers can cut down on features/optimization.
The follow-up reply likened it to customizing your burger, but the better analogy (and the one I assumed) would be McDonald’s removing all tomatoes and pickles (saving money), with the customer having to buy them themselves to add to the burger.
There is no analogy. It’s comparing a recurring cost per product (you need a new tomato per 5 burgers) to a one-time cost that can be cut during development. And additional copies of a game don’t generate more costs.
That’s the same logic behind the really high hardware requirements nowadays.
Studios just want to save time and cut corners, and you have to offset that with really expensive cards.
The difference is pretty big when there are lots of reflective surfaces, and especially when light sources move (prebaked shadows rarely do, and even when, it’s hardly realistic).
A big thing is that developers use less effort and the end result looks better. That’s progress. You could argue it’s kind of like when web developers finally were able to stop supporting IE9 - it wasn’t big for end users, but holy hell did the job get more enjoyable, faster and also cheaper.
Cyberpunk and Control are both great examples - both games are full of reflective surfaces and it shows. Getting a glimpse of my own reflection in a dark office is awesome, as is tracking enemy positions from cover using such reflections.
I have only ever seen Cyberpunk in 2k res, ultra graphics, ultra widescreen, ray-tracing and good fps at a friend’s house and it does indeed look nice. But in my opinion there are too many reflective surfaces. It’s like they are overdoing the reflectiveness on every object just because they can. They could have done a better job at making it look realistic.
Oh, they are definitely intentionally overdoing it since 90% of said reflective surfaces are ads, often reflecting other ads in there. The game is such an assault of advertising that I’ve found myself minding the advertisements in RL public spaces a lot more.
For any other game, I’d agree, but cyberpunk being full of chrome is an aesthetic that predates the video games by a fair margin, haha.
My problem is more with wet surfaces and the like. Walking around the city, it feels like every little water puddle is a mirror, and a spoon can also reflect way too much. I don’t mind shining chrome body parts.
Ray tracing isn’t supposed to make things look better, it’s supposed to save development time
If you spend enough time on lighting you can make static lights look better, but that’s just it: it takes longer, so it costs more.
Raytracing is cool. Personally I feel like the state consumers first got it in was atrocious, but it is cool. What I worry about is the AI upscaling, fake-frame bullshit. It’s cool that the technology exists: like, sweet, my GPU can render this game at a lower resolution, then upscale it back at a far better frame rate than without upscaling, ideally stretching out my GPU purchase. But I feel like games (in the AAA scene at least) are so unoptimized now that you NEED all of these upscaling, fake-frame tricks. I’m not a dev, I don’t know shit about making games, just my 2 cents.
Raytracing will be cool if hardware can catch up to it. It’s pretty pointless if you have to play upscaled to turn the graphics up. And as you say, upscaling has its uses and is great tech, but when a game needs it to not look like dogshit (looking at you, Stalker 2) it worries me a lot.
I feel like if you have the level of a 3070 or above at 1080p, pathtracing, even with the upscaling you need, can be an option. At least based on my experience with portal rtx.
Personally I have a 3060, but (in the one other game I actually have played on it with raytracing support) I still turned on raytraced shadows in Halo Infinite because I couldn’t really notice a difference in responsiveness. There definitely was one (I have a 144hz monitor) but I just couldn’t notice it.
The joke is, LCDs smear anyway at low framerates.
No you’ve pretty much hit it on the head there. The higher ups want it shipped yesterday, if you can ship it without fixing those performance issues they’re likely going to make you do that.
Optimization is usually possible, but it is easier said than done. Often sacrifices have to be made, but maybe it is still a better value per frame time. Sometimes there’s more that can be done, sometimes it really is just that hard to light and render that scene.
It’s hard to make any sweeping statements, but I will say that none of that potential optimization is going to happen without actually hiring graphics devs. Which costs money. And you know what corporations like to do when anything they don’t consider important costs money. So that’s probably a factor a lot of the time.
I don’t know, but path tracing makes CBP2077 and Alan Wake 2 look like real next-gen games.
Cy Ber Punk? interesting
There is a real reason to not use the “C + P” initialism in online chat these days… on some platforms it’s likely to be flagged & reported by automods/bots/Eye of Sauron.
But I love CP!! It’s so next gen!
That’s him, officer, right there ☝️🤓
Early 3D graphics rendering was all ray tracing, but when video games started doing textured surfaces the developers quickly realised they could just fake it with alpha as long as the light sources were static.
Unless you consider wireframe graphics. Idk when triangle rasterization first started being used, but it’s more conceptually similar to wireframe graphics than ray tracing. Also, I don’t really know what you mean by “fake it with alpha”.
Baked lighting looks almost as good as ray tracing because, for games that use baked lighting, devs intentionally avoid scenes where it would look bad.
Half the stuff in this trailer (the dynamically lit animated hands, the beautiful lighting on the moving enemies) would be impossible without ray tracing. Or at the least it would look way way worse:
Practically impossible for this developer? Maybe. Technically impossible? No.
We do have realtime GI solutions which don’t require raytracing (voxel cone tracing, sdfgi, screenspace, etc). None of which require any ‘special’ hardware.
Raytracing is just simpler and doesn’t need as much manual work to handle cases where traditional rasterisation might fail (eg; light leaking). But there’s not many things it can do which we can’t already achieve with rasterisation tricks.
Raytracing is mostly useful for developers who don’t have the time/budget/skillset to get the same visual quality with traditional rasterisation. However, in an industry which seems to prioritise getting things released as cheaply and quickly as possible, we’re starting to see developers rely heavily on raytracing and not allocate many resources to making their non-RT pipeline look nice.
Some are even starting to release games which require raytracing to work at all, because they completely cut the non-RT pipeline out of their budget. So I’d argue that you’re incorrect in theory, but very correct in practice (and getting even more correct with time).
That’s kinda the thing with ray tracing. You can save a lot of work but since you want your game available for gamers that don’t have the hardware you still have to do that work…
I’m expecting the next PlayStation to focus on ray tracing to set it apart in the market. They have the volume and it would be good for their exclusive titles.
Edit: Okay, maybe I’m just hoping rather than expecting. Sony can absolutely screw this up.
Maximise your RTX performance with this one crazy hack!
Ray traced reflections: on
Ray traced everything else: off

Also caustics and volumetrics, if your game has those.
I’d argue reflections are nowhere near as nice looking as RTGI. If anything, switch reflections off.
But muh puddles! Night City is nothing without those gorgeous, mirror-like puddles.
When I had a PS5 and Cyberpunk, I would sometimes switch ray tracing on and off to see if it made a huge difference. Well, the frame rate would be capped at 30 with it on…and I suppose if I stopped and looked around for a bit, it was noticeable, but honestly, I preferred the higher framerate. I’ve yet to see a game that really benefits from RT.
It’s mostly developers that benefit from RT long-term. Not now while it’s optional, but once it becomes a requirement, they can cut a couple of time-intensive steps from the development pipeline.
Can’t wait until my GPU needs 1000W to run :'(