Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Maybe, but what if it never happens? What if the modern rendering pipeline is fundamentally broken and no amount of talent can fix it? It may be time for a complete paradigm shift. The current situation on PC is appalling.
Then we will be in for 5-6 more years of meh visuals. Hopefully the final console gen can deliver a breakthrough.
 
But it would be a strange comparison to make anyway, given that DESORDRE is a really simple game with really simple geometric shapes, and that all the stuff that gives Starfield its character (detailed models, detailed texturing, atmospherics, animated meshes, NPCs, dense vegetation) is missing from every screenshot of it I've seen.

It's so different from Starfield that I can't see anything meaningful coming from comparing the two.
Cyberpunk would make a better comparison. The graphical features DESORDRE lacks are all present there, plus some.
 
PureDark is implementing Reflex, too.

So therein lies the issue. When would you be satisfied with the performance of UE5 and Starfield? When a 4090 is running 2x-3x faster than a 6900 XT? Are you trying to argue that in a properly optimized Starfield a 4090 would be hitting anywhere between 140-210 fps at 1080p? Because that is what you're suggesting. Is there any data today that suggests a 4090 can go 50% faster?
In 4K the 4090 is only 50% faster than a 6900XT in Starfield: https://www.dsogaming.com/pc-performance-analyses/starfield-pc-performance-analysis/
Do you think this represents the true performance of a 90 TFLOPs GPU? A GPU which has 9x the compute performance of a PS5?
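For reference, here is the back-of-the-envelope arithmetic behind those numbers. This is only a minimal sketch assuming the usual peak-FP32 formula (2 FLOPs per ALU per clock, i.e. one FMA) and approximate boost clocks that I've plugged in myself; these are theoretical peaks, not sustained throughput:

```python
# Theoretical peak FP32 throughput: 2 FLOPs (one FMA) per ALU per clock.
# ALU counts are the published shader counts; clocks are approximate boosts.

def peak_tflops(alus: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPs = 2 * ALUs * clock (GHz) / 1000."""
    return 2 * alus * clock_ghz / 1000.0

rtx_4090  = peak_tflops(16384, 2.75)  # ~90 TFLOPs at a ~2.75 GHz in-game boost
rx_6900xt = peak_tflops(5120, 2.25)   # ~23 TFLOPs
ps5       = peak_tflops(2304, 2.23)   # ~10.3 TFLOPs (36 CUs x 64 ALUs)

print(f"4090:   {rtx_4090:5.1f} TFLOPs")
print(f"6900XT: {rx_6900xt:5.1f} TFLOPs")
print(f"PS5:    {ps5:5.1f} TFLOPs")
print(f"4090 vs 6900XT: {rtx_4090 / rx_6900xt:.1f}x on paper")  # ~3.9x theoretical
print(f"4090 vs PS5:    {rtx_4090 / ps5:.1f}x")                 # ~8.8x, the '9x' above
```

On paper the gap to the 6900 XT is close to 4x, which is why a measured 50% lead in Starfield looks so underwhelming.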

This is a scene from Starfield. You can see there is nothing to render here:
What takes so long to render a frame?

For example, Shadow of the Tomb Raider runs at over 100 fps in 4K with RT shadows:
Control runs at over 60 fps in 4K with ray tracing: https://abload.de/image.php?img=controlscreenshot2022x7i8o.png

Here is another example from Immortals of Aveum: https://abload.de/image.php?img=immortalsofaveumscree0ifh7.png
Same question. This scene just looks wrong: the lighting is PS3-level, and it still needs 16 ms to render at native 3440x1440.
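For anyone converting between the frame-time and frame-rate figures in this thread, the relation is simply 1000 divided by the frame time in milliseconds; a trivial sketch:

```python
# Frame time (ms) <-> frame rate (fps): fps = 1000 / frame_time_ms.

def fps_from_ms(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

def ms_from_fps(fps: float) -> float:
    return 1000.0 / fps

print(fps_from_ms(16.0))   # ~62.5 fps: the 16 ms Immortals of Aveum example above
print(ms_from_fps(60.0))   # ~16.7 ms per-frame budget at 60 fps
print(ms_from_fps(120.0))  # ~8.3 ms per-frame budget at 120 fps
```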

It takes so many resources and man-years to produce a chip like AD102. And yet game engine developers still do not care how well their products run on modern GPUs.
 
It takes so many resources and man-years to produce a chip like AD102. And yet game engine developers still do not care how well their products run on modern GPUs.

That seems like a silly comment; it's not like they can add every bell and whistle cost-free as the features/techniques are "released". Especially since they are limited to what tech they can develop in-house and/or what is available in their 3rd-party engine within the allotted time and money they have.

Also, their main goal, I hope, is to make a game that people want to play, not to implement all the latest graphics tech. Take Starfield, for instance: have they ever claimed that it would be the best-looking game with the newest tech, or that it would run at a gazillion fps? If not, why expect it? I can understand hoping for it, but it's the budget, time allocation and initial plans that decide the scope and features that go into the game, and then how well they are able to execute according to the schedule decides what gets cut and what is kept.

The devs have their own goals; judge them by those, not by arbitrary goals forums come up with. And don't buy their games if they don't make what you want to play.

Total digression, I read this
https://www.amazon.com/Making-8-bit...sprefix=write+8+bit+games+in+c,aps,177&sr=8-1
the other day. Good fun if you are into gaming and programming, especially if you think about the cost and power of today's CPUs/GPUs.

This was a fun insight from that book:
The Midway 8080 architecture generates video from a black-and-white frame buffer, using a whopping 7 KB of RAM. (In 1975, the production run of Gun Fight used $3 million worth of RAM, estimated to be 60 percent of the world's supply.)
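That 7 KB figure checks out if you assume the commonly cited 256x224, 1-bit monochrome frame buffer for that board (the resolution is my assumption, not from the book excerpt); a quick sanity check:

```python
# Sanity check on the Midway 8080 frame buffer size, assuming a 256 x 224
# resolution at 1 bit per pixel (monochrome). Resolution assumed, not quoted.
width, height, bits_per_pixel = 256, 224, 1

framebuffer_bytes = width * height * bits_per_pixel // 8
print(framebuffer_bytes)          # 7168 bytes
print(framebuffer_bytes / 1024)   # exactly 7.0 KB
```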
 
It takes so many resources and man-years to produce a chip like AD102. And yet game engine developers still do not care how well their products run on modern GPUs.
Modern developers are not as talented as the elite developers from the 2000s. Those guys could create beautiful games with way less power, like 1.8 TFLOPs, or ~500 GFLOPs like the PS3, lol. All these modern games should be running at 4K/120 fps on a 4090.
 
This is a scene from Starfield. You can see there is nothing to render here. What takes so long to render a frame?

Yeah there’s no way that scene is pushing 99% usage on a 4090 for only 60fps. Would love to know what’s really going on under the hood.
 
Modern developers are not as talented as the elite developers from the 2000s.
Or they're just as talented but the tech has become more complex?
And yet game engine developers still do not care how well their products run on modern GPUs.
Can we please move off the "engine devs don't care" comments? It can be hugely demoralising. The people working on these engines likely want to do the best they can out of personal pride like most of us in our jobs. Reasons for poor engine performance are multifarious, from system complexity to economic pressures, and I doubt mostly due to well-paid software engineers who can't be bothered to put in the effort. Really, no-one who hasn't created a better engine themselves should be throwing stones.

Can we please just stick to technical analysis and considerations without leaning into 'lazy arse devs don't give a shit' narratives? That sort of talk is guaranteed to discourage the most valuable members of this forum from hanging around. If you just want to bitch, hit Reddit.
 
I've now played through The Last of Us Remake twice on PC at very high settings, with 46 hours of playtime.

The technology is very good on PC and you can immediately tell that the production costs were enormous. Excluding Dark Souls, I haven't seen a game without ray tracing that looks nearly as good as the PC version of The Last of Us. It even outperforms many titles with ray tracing on PC; see Dead Space, Resident Evil, RoTR, SoTR, Spider-Man, etc. The PC version also looks clearly better than the console version.

Even on PC I recognize a few legacy graphics bugs from Uncharted 4, for example the flickering, low-resolution shadows of deciduous trees. It would be even better with ray tracing.

The same goes for the GI: with the pre-baked lighting, many moving objects stand out, even though most of the time the lighting is very homogeneous.

The assets in The Last of Us are very fine and high-resolution, especially when it comes to the amount of geometry. In addition, the world is incredibly dense; each wall has been designed with care. The textures were highly praised in advance. They are good, but I still see many blurry surfaces. There is a mod that lets you play in a first-person perspective, and you'll spot such weaknesses more quickly that way. Cyberpunk 2077's textures look sharper with mods.

The optimization should be better; the game is slower than Control at the same display resolution. DLSS also helps a lot in this game, both with image quality and performance.

The game had an over-sharpened image that you couldn't turn off in the menu. I had to go into the game data with a hex editor and set the sharpness to zero. Since DLSS blurs the image much less, you could see the over-sharpening more clearly there than with TAA.
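For anyone curious what that kind of binary patch looks like in practice, here is a generic, heavily hedged sketch; the file name and byte offset below are hypothetical placeholders, since the real location of the sharpness value depends on the game build:

```python
# Generic sketch of patching a float in a game data file (here: forcing a
# sharpness value to 0.0). SETTINGS_FILE and SHARPNESS_OFFSET are HYPOTHETICAL
# placeholders -- the actual file and offset depend on the specific game build.
import struct

SETTINGS_FILE = "render_settings.bin"   # hypothetical file name
SHARPNESS_OFFSET = 0x40                 # hypothetical byte offset of the float

with open(SETTINGS_FILE, "r+b") as f:
    f.seek(SHARPNESS_OFFSET)
    (current,) = struct.unpack("<f", f.read(4))  # read the existing value
    print(f"current sharpness: {current}")
    f.seek(SHARPNESS_OFFSET)
    f.write(struct.pack("<f", 0.0))              # overwrite with zero
```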

Overall, The Last of Us is technically very good. I'm not as quick to praise Sony games on tech as other people but what I see here on PC deserves the praise. Visually it's one of the best games I know.
 
Or they're just as talented but the tech has become more complex?
The proof is in the pudding. Many of the elite devs from the early 2000s and late 90s played a huge role in authoring many of the techniques that are built upon today. They worked under serious hardware resource scarcity, which made innovation a necessity for advancement, something newer devs don't contend with to the same degree. Has tech become more complex? Perhaps, but the manpower has scaled along with it. Furthermore, many games today show serious regression in the technology stack when compared to older counterparts. Interactivity and physics have certainly declined in comparison to older games, with the exception of a few developers/studios (Nintendo with Zelda, for example).
Can we please move off the "engine devs don't care" comments? It can be hugely demoralising. The people working on these engines likely want to do the best they can out of personal pride like most of us in our jobs. Reasons for poor engine performance are multifarious, from system complexity to economic pressures, and I doubt mostly due to well-paid software engineers who can't be bothered to put in the effort. Really, no-one who hasn't created a better engine themselves should be throwing stones.
Everyone is subject to criticism for their work. Should we also refrain from praising developers who do a good job, since we haven't created our own game engines? It's an interesting suggestion, but if I spend my hard-earned dollars/pounds/euros on a game, I feel I've earned the right to criticize the product. Criticism does not exist in a vacuum; it stems from a comparison with the work put out by their peers. I agree that it's not likely that they don't care. As you said, there are other constraints, but at the end of the day people are paying for their work. It's not surprising that frustration starts to seep out as a general trend of poor performance emerges. Insulating yourself from criticism only leads to worse outcomes, both for developers and consumers. You only need to look at EA's Madden team to understand that...

It is often better and cheaper to take criticism early in the cycle. That way you can take corrective action while the cost of that action is comparatively low, rather than waiting until later in the cycle. The Saints Row devs found this out the hard way.
 
The proof is in the pudding. Many of the elite devs from the early 2000s and late 90s played a huge role in authoring many of the techniques that are built upon today. They worked under serious hardware resource scarcity, which made innovation a necessity for advancement, something newer devs don't contend with to the same degree.

What are some examples of these "elite developers" who shipped games up to your arbitrary standards? I can't think of anything in 2004 launching at native 4K/120fps/ultra and looking like a generational leap over the previous gen, but I do recall a lot of 720p/~50fps games on high-end PCs. Sometimes I seriously wonder if you've ever played the games you're holding up on period hardware.
 
What are some examples of these "elite developers" who shipped games up to your arbitrary standards? I can't think of anything in 2004 launching at native 4K/120fps/ultra and looking like a generational leap over the previous gen, but I do recall a lot of 720p/~50fps games on high-end PCs. Sometimes I seriously wonder if you've ever played the games you're holding up on period hardware.
Doing my best to be polite, I think you should do your best to familiarize yourself with the products put out in that era: the rendering techniques that were used, the hardware they were accomplished on, and how those techniques have shaped the trajectory of rendering today. Talking about 4K/120fps as if that were an accomplishment in itself is frankly misguided.
 
Metroid Prime Remastered runs at 900p/60 fps on a ~300 GFLOPs machine. Does Immortals of Aveum look 300x better than this? Because my 4090 has 90 TFLOPs and can't hold 60 fps in 4K...
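Using the post's own figures (300 GFLOPs for the Switch, 90 TFLOPs for the 4090), the per-pixel, per-frame FLOP budgets work out roughly as below. This is only a crude sketch comparing theoretical peaks; it says nothing about how efficiently either machine is actually used:

```python
# Rough per-pixel FLOP budget, using the peak figures quoted in the post above.

def flops_per_pixel_per_frame(tflops: float, width: int, height: int, fps: float) -> float:
    return tflops * 1e12 / (width * height * fps)

switch_900p60 = flops_per_pixel_per_frame(0.3, 1600, 900, 60)    # ~3,500 FLOPs/pixel/frame
rtx4090_4k60  = flops_per_pixel_per_frame(90.0, 3840, 2160, 60)  # ~181,000 FLOPs/pixel/frame

print(f"Switch, 900p/60: {switch_900p60:10,.0f} FLOPs per pixel per frame")
print(f"4090, 4K/60:     {rtx4090_4k60:10,.0f} FLOPs per pixel per frame")
print(f"Ratio: ~{rtx4090_4k60 / switch_900p60:.0f}x the per-pixel budget")
```

So on paper the 4090 has roughly 50x the per-pixel budget at 4K/60, not 300x, because the higher target resolution eats a large chunk of the raw compute advantage.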
 
Far Cry 1, Half-Life 2, Doom 3, and many, many others.
Only Half-Life 2, the one of those with the least sophisticated rendering, ran consistently over 60 fps at 1024x768 on high-end GPUs of the time. All of those games are phenomenal accomplishments, just like the very best games of today.

Doing my best to be polite, I think you should do your best to familiarize yourself with the products put out in that era: the rendering techniques that were used, the hardware they were accomplished on, and how those techniques have shaped the trajectory of rendering today.
Oh yeah, the lazy devs of today would never achieve anything like "stencil shadows" or "release a demo after the fact that uses HDR intermediate buffers". Virtual texturing, virtual geometry, real-time GI with both screen-space and world-space caches, GPU-driven rendering, fine-grained per-triangle real-time culling… all kids' stuff by comparison, I guess. :rolleyes:

If you want to talk about familiarizing yourself with the techniques, I would suggest implementing an HL2-era forward renderer in DX9 and then implementing literally any modern technique in DX12 or Vulkan.
 