I did take a look at them before posting. If I am the only one, obviously there is something my vision does not pick up. Such disparity in appreciation of graphics has been happening more frequently in recent years. I will be glad if you guys can pinpoint what you like about this new thing.
Biggest thing IMO is the indirect lighting. Even the newest Tomb Raider game is far behind on that.
I noticed that after playing games with RTGI and getting used to it, I'm way more put off by games with bad or no GI. Maybe this explains the disparity in graphics appreciation you're talking about.
Yep this is why I try really hard to clear my backlog of older games first. The newer stuff completely ruins the old stuff for me. It’s ok if I switch genres but games in the same genre I try to follow release order.
I'm also waiting to play these new games with fancy RTX effects, because in my experience RTGI can massively improve the visuals, but it's borderline usable on my 4070. I'd rather wait a couple of years and play them once I can run them well on a midrange ($600 midrange GPU lol) PC. Still don't plan on doing the $1K GPU thing.
A 4070 is no joke, and I don't think it's the GPU's fault; the hardware just isn't ready for full ray tracing (or whatever Nvidia calls path tracing nowadays). We're getting there, and RT hardware has certainly outpaced Moore's law many times over, 'cos in the past a fully ray-traced image took several days to render, as seen in the old SIGGRAPH demos from the late 90s.
I just hope Intel does something interesting so that in a few years I can retire my A770 16GB, which is way less powerful than a 4070. So yeah, you're covered for a couple of years.
Here I'll note a few observations on Intel's raytracing implementation, based on both their official documentation and profiling tools. I'll also compare it with AMD's raytracing approach. Nvidia would be interesting too, but unfortunately I don't have enough information on their hardware raytracing to include them in the comparison.
from the link @pharma shared in a different thread.
Manor Lords doesn't have any DirectX Raytracing support, though there is an embree.2.14.0.dll file in the game folders. Embree is an Intel ray tracing library with support for CPUs and GPUs, but it's not clear what exactly it's being used for in this particular game right now. There's also an OpenImageDenoise.dll file, presumably for future support of ray tracing and denoising (maybe).