Exactly… games are, quite often, in motion. I think we should abolish TAA and go back to forward rendering techniques with MSAA; the hardware can do it no problem. But a lot of games now are made with UE5's out-of-the-box settings, which, while they might be great for movies and VFX (I don't know), are absolutely terrible for games.
RTRT is probably the future, yeah; once it's matured to the point that it runs as well as traditional games, it'll be great. Right now, it can't, and GPUs are more expensive with their "RT" cores.
For certain games it can work well, like slower story games where you can immerse yourself in the world, but for a lot of other games it's a pointless loss of performance for details nobody will really notice.
But the devs get to slap "ray tracing" on the marketing and can take cash off Nvidia for RTX advertising. The "hype" around it has kind of died down now anyway.
Come on, bruh. They could stop selling gaming GPUs this second and barely notice. It's one of the selling features, but far from the only reason.
It was hyperbole, they sell them for AI now… aha. My point is that the latest, even last gen's, GPUs are so, so powerful, yet games seem to keep getting worse in both performance and quality (I'm mostly talking big games here, the ones you'd expect to be built well, the ones chasing graphics). UE5 is normally the culprit.
But NVIDIA has clearly switched to selling software updates with their GPUs and marketing that instead: DLSS, etc. I'm obviously talking about their gaming division here in isolation; it's clear they don't exactly need gaming sales to survive as a business now.
I've nothing against emerging tech if it's actually going to give us more performance. The fact that new games still struggle to hit 240fps at 1080p in pure raster is insane. I honestly think 4K 120fps should be the baseline we're hitting, in raster, without any upscaling or frame-gen bullshittery.