2023 IQ is unacceptable [spawn]

The issue is when you have extremely high-quality assets butted up against a jarringly low-quality asset... which can still happen quite frequently even in the highest-end games.

I think this is where tech like Nanite is really going to shine, allowing much more of that fine-grained micro detail to be properly lit and shadowed. The more actual geometry we get in games, the better. I find that a lot of the time, when you zoom the camera in on some asset with really beautifully modelled and textured/lit objects... it often exposes the flaws and drawbacks of the surrounding objects, as they may not be viewed from the intended angle or closeness that was considered when the assets were authored. I think Nanite and technologies like it will really allow that surrounding detail to hold up from all angles and zoom levels.

Of course, developers also have to be cognizant of not creating too much noise and detail, as they very carefully have to guide the player through the environment, and too much unnecessary detail can make that really difficult.

It's really crazy how much thought and work goes into design that us gamers really just don't even consider... but it's absolutely essential lol.
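
For what it's worth, here's a rough idea of why traditional discrete LODs break down when you zoom in close: a minimal C++ sketch of picking an LOD level from projected screen-space error. This is a generic illustration, not how Nanite actually works; the error formula setup, the LOD error values and the pixel threshold are all made-up assumptions.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Generic discrete-LOD selection by projected screen-space error.
// NOT Nanite: just an illustration of why fixed LOD steps run out of
// detail when the camera gets closer than the asset was authored for.
struct Lod {
    int index;
    double geometricErrorMeters; // max deviation from the full-res mesh
};

// Project a world-space error to pixels for a pinhole camera:
// errorPx = error * screenHeightPx / (2 * distance * tan(fovY / 2))
double ScreenSpaceErrorPx(double geometricErrorMeters, double distanceMeters,
                          double fovYRadians, double screenHeightPx) {
    return geometricErrorMeters * screenHeightPx /
           (2.0 * distanceMeters * std::tan(fovYRadians / 2.0));
}

// Pick the coarsest LOD whose projected error stays under a pixel threshold.
int SelectLod(const std::vector<Lod>& lods, double distanceMeters,
              double fovYRadians, double screenHeightPx, double maxErrorPx) {
    for (const Lod& lod : lods) { // ordered coarse -> fine
        if (ScreenSpaceErrorPx(lod.geometricErrorMeters, distanceMeters,
                               fovYRadians, screenHeightPx) <= maxErrorPx) {
            return lod.index;
        }
    }
    return lods.back().index; // nothing is fine enough: use the finest LOD we have
}

int main() {
    // Hypothetical 4-level LOD chain, coarse to fine.
    std::vector<Lod> lods = {{3, 0.20}, {2, 0.05}, {1, 0.01}, {0, 0.002}};
    const double fovY = 60.0 * 3.14159265 / 180.0;
    const double distances[] = {50.0, 10.0, 2.0, 0.5};
    for (double dist : distances) {
        printf("distance %5.1fm -> LOD %d\n", dist,
               SelectLod(lods, dist, fovY, 1080.0, 1.0));
    }
}
```

At very close range even the finest authored LOD blows past the error threshold, which is exactly the "zoom in and the surroundings fall apart" problem described above.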
 
The issue is when you have extremely high-quality assets butted up against a jarringly low-quality asset... which can still happen quite frequently even in the highest-end games.

I suspect in some cases low-fidelity assets are bought off the shelf: random clutter like soda cans, chairs and fire hydrants that may not be worth the time or effort to author from scratch. But yeah, it definitely jumps out more for objects that are significantly lower fidelity than the rest of the scene.

I think this is where tech like Nanite is really going to shine, allowing much more of that fine-grained micro detail to be properly lit and shadowed. The more actual geometry we get in games, the better. I find that a lot of the time, when you zoom the camera in on some asset with really beautifully modelled and textured/lit objects... it often exposes the flaws and drawbacks of the surrounding objects, as they may not be viewed from the intended angle or closeness that was considered when the assets were authored. I think Nanite and technologies like it will really allow that surrounding detail to hold up from all angles and zoom levels.

It can't come soon enough.
 
I'd say the biggest bottleneck is the 8 GB VRAM baseline, or DX12 Ultimate features not being utilized, such as DirectStorage for better texture streaming (?) or maybe sampler feedback for smarter texture management.

But I'd still say the 8 GB VRAM baseline on PC, ranging from the 2060 Super to the 3070 Ti and now the upcoming 4060 Ti, will pose a huge barrier for developers going forward. Part of me thinks the reason they continue with PS4 ports is that targeting the PS4 allows 6-8 GB cards to function properly in these games.
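
As a rough illustration of where those 8 GB go, and why streaming only the mips you actually need (which is what sampler feedback is meant to help identify) matters so much, here's a small C++ sketch of the mip-chain memory math. The texture count and format are made-up assumptions, not numbers from any particular game.

```cpp
#include <cstdio>

// Bytes for a full mip chain of a square BC7-compressed texture
// (BC7 = 1 byte per texel). Each mip is 1/4 the previous one,
// so the whole chain is only ~1.33x the size of the top mip.
unsigned long long MipChainBytes(unsigned topMipResolution, unsigned droppedTopMips) {
    unsigned long long total = 0;
    for (unsigned res = topMipResolution >> droppedTopMips; res >= 4; res >>= 1) {
        total += (unsigned long long)res * res; // 1 byte/texel for BC7
    }
    return total;
}

int main() {
    // Hypothetical: 2000 unique 2K materials resident at once.
    const unsigned texturesInView = 2000;
    for (unsigned dropped = 0; dropped <= 2; ++dropped) {
        double gib = (double)MipChainBytes(2048, dropped) * texturesInView
                     / (1024.0 * 1024.0 * 1024.0);
        printf("drop %u top mip(s): ~%.2f GiB for %u textures\n",
               dropped, gib, texturesInView);
    }
}
```

The point of the arithmetic: keeping every full chain resident blows well past an 8 GB card once the framebuffer and geometry are accounted for, while dropping a single top mip cuts the texture footprint by roughly 4x, which is why crude "halve the texture setting" fallbacks hit image quality so hard compared to proper streaming.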
 
The issue is when you have extremely high-quality assets butted up against a jarringly low-quality asset... which can still happen quite frequently even in the highest-end games.

Not saying that the game that spawned this thread is doing a good job of it, however...

This is going to become increasingly prevalent, and more of a glaring issue, with the increased focus on correct light behavior as developers improve their use of RT and as hardware becomes more capable of hardware-assisted RT.

The better the lighting, the more glaring the deficiencies in world asset quality will become, especially when it comes to geometry and polygon density. Assuming world asset density and quality doesn't increase commensurately (or more) with the increased use of correctly behaving RT lighting, then ... at least for me ... games will look worse and worse as those deficiencies become more unavoidably obvious and in my face.

It's currently not a huge problem for me, as RT is still generally not performant enough, so I'm not constantly looking at better lighting paired with still-bad world asset quality. But someday the hardware will be good enough that I'd be willing to enable RT in more and more titles, and if world asset quality hasn't improved massively by then, it's going to be a huge eyesore (for me).

Hence why something like Nanite is so needed; any game currently shipping without it just looks incredibly bad in comparison to something as simple as Fortnite, IMO, especially if that currently shipping game has more correct RT lighting.

Regards,
SB
 
I'd say the biggest bottleneck is the 8 GB VRAM baseline, or DX12 Ultimate features not being utilized, such as DirectStorage for better texture streaming (?) or maybe sampler feedback for smarter texture management.

But I'd still say the 8 GB VRAM baseline on PC, ranging from the 2060 Super to the 3070 Ti and now the upcoming 4060 Ti, will pose a huge barrier for developers going forward. Part of me thinks the reason they continue with PS4 ports is that targeting the PS4 allows 6-8 GB cards to function properly in these games.
Developers making ambitious AAA titles will typically target consoles and then 'make it work' on PC. That's not to say PC is necessarily neglected; it's just not how developers making games with top-tier presentation values typically look at things in terms of baselines. Scaling memory demands and all that is usually quite doable, though doing so without significant hits to fidelity is obviously more difficult.

8GB of VRAM will likely remain well within a sort of proper 'minimum spec' setup that will still be capable of playing the latest games for a good while yet. It's just gonna require varying levels of compromise, depending on the game and effort put into memory optimizations of assets and rendering.
 
Developers making ambitious AAA titles will typically target consoles and then 'make it work' on PC. That's not to say PC is necessarily neglected; it's just not how developers making games with top-tier presentation values typically look at things in terms of baselines. Scaling memory demands and all that is usually quite doable, though doing so without significant hits to fidelity is obviously more difficult.

8GB of VRAM will likely remain well within a sort of proper 'minimum spec' setup that will still be capable of playing the latest games for a good while yet. It's just gonna require varying levels of compromise, depending on the game and effort put into memory optimizations of assets and rendering.
Well, N64-like textures are a huge compromise, or rather an unacceptable compromise. The Last of Us and Forspoken, at least, expect the huge majority of 6-8 GB users to play with N64 textures somehow.
 
Well, N64-like textures are a huge compromise, or rather an unacceptable compromise. The Last of Us and Forspoken, at least, expect the huge majority of 6-8 GB users to play with N64 textures somehow.
Yeah, I am very much thinking "if this is what you can produce with 8 GB of VRAM, then maybe you need some help". As a reviewer, I find it pretty unacceptable and will make a big deal about it any time it happens. I hope it happens less, though. At their height, consoles like the PS5 and XSX will perhaps be using around 10 GB of dedicated memory for GPU-related tasks, according to those I have asked; shaving off 2+ GB should not mean 1/8 texture res. Do a better streaming system, use virtual textures, use the benefits of the DX12U feature set. But I doubt bad porters or bad-faith releases from publishers will take the time to actually take advantage of PC hardware (aka TLOU). Still... I will make a big deal about it in video.

Of the titles that have made users turn down texture quality with 8 GB of VRAM, the select few where texture quality goes into complete "Quake 2" mode are also the ones with a score of other technical issues beyond the texture one. To be honest, 2023 has so far been an awful year for my reviewing games at DF... only one game has not had a serious endemic issue, and that was Hi-Fi Rush. Everything else I touched this year that was a multiplatform game has had serious, deep-seated issues. 2023 is so far worse than 2022.
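
To make "do a better streaming system, use virtual textures" a bit more concrete, here's a minimal C++ sketch of the core idea behind virtual texturing: only the texture pages (tiles) the renderer actually needs stay resident in a fixed-size physical cache, with least-recently-used pages evicted when the budget is hit. This is a generic illustration under made-up page sizes and budgets, not how any of the games discussed actually implement it.

```cpp
#include <cstdint>
#include <cstdio>
#include <list>
#include <unordered_map>

// A virtual-texture page: one 128x128 tile of some texture mip.
struct PageId {
    uint32_t textureId, mip, x, y;
    bool operator==(const PageId& o) const {
        return textureId == o.textureId && mip == o.mip && x == o.x && y == o.y;
    }
};
struct PageIdHash {
    size_t operator()(const PageId& p) const {
        return (size_t(p.textureId) * 73856093u) ^ (size_t(p.mip) * 19349663u) ^
               (size_t(p.x) * 83492791u) ^ size_t(p.y);
    }
};

// Fixed-budget physical page cache with least-recently-used eviction.
class PageCache {
public:
    explicit PageCache(size_t maxResidentPages) : capacity_(maxResidentPages) {}

    // Called for every page the renderer reports needing this frame
    // (in a real engine that feedback comes from a readback or sampler feedback).
    void Request(const PageId& page) {
        auto it = lookup_.find(page);
        if (it != lookup_.end()) {          // already resident: mark as recently used
            lru_.splice(lru_.begin(), lru_, it->second);
            return;
        }
        if (lru_.size() == capacity_) {     // over budget: evict the coldest page
            lookup_.erase(lru_.back());
            lru_.pop_back();
            ++evictions_;
        }
        lru_.push_front(page);              // "upload" the new page (actual copy omitted)
        lookup_[page] = lru_.begin();
        ++uploads_;
    }

    void PrintStats() const {
        printf("resident=%zu uploads=%zu evictions=%zu\n",
               lru_.size(), uploads_, evictions_);
    }

private:
    size_t capacity_;
    size_t uploads_ = 0, evictions_ = 0;
    std::list<PageId> lru_;                                  // front = hottest
    std::unordered_map<PageId, std::list<PageId>::iterator, PageIdHash> lookup_;
};

int main() {
    PageCache cache(2048);  // hypothetical budget: 2048 resident 128x128 pages
    // Fake feedback: the camera slowly pans across a few textures' mip-0 pages.
    for (uint32_t frame = 0; frame < 100; ++frame)
        for (uint32_t t = 0; t < 8; ++t)
            for (uint32_t y = 0; y < 16; ++y)
                for (uint32_t x = frame / 10; x < frame / 10 + 16; ++x)
                    cache.Request({t, 0, x, y});
    cache.PrintStats();
}
```

The whole point is that the VRAM cost is bounded by the cache budget rather than by how many 4K source textures exist, which is why a well-built streamer can serve the same assets on an 8 GB card without dropping to blurry mips everywhere.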
 
Wild stab in the dark.

More and more veteran programmers retiring and more and more new programmers entering the gaming development pool.

Basically, those who may have started out programming on PC prior to the shift toward console-first game development are likely now getting to the age where they're retiring. So there are fewer and fewer programmers with good PC game development skills (for example, a simple skill like being able to scale a higher-quality asset down to lower quality levels because your game needs to run on multiple hardware targets).

Hell, just look at the sorry state of Windows 11. It's like the programmers working on it grew up using Macs in college or something and have no freaking idea why the PC was as popular and well used as it was, or what made it such a powerful, albeit more complex to use, OS for many professionals compared to MacOS.

Regards,
SB
 
More and more veteran programmers retiring and more and more new programmers entering the gaming development pool.
I wholeheartedly agree; I even expressed this very same opinion a while back.

In several cases we've seen developers being unaware of basic things about PCs: sometimes they don't know about compilation stutters, or how to implement a system for PSO gathering/collection, or how Windows VRAM management works, or even how to implement basic mouse controls, among several other things that were perfectly outlined in Alex's excellent video about PC port guidelines. And those are just the tip of the iceberg.

So I would say it's a combination of both low priority and not enough knowledge about the PC platform, a situation that especially affects the new generation of developers who grew up playing console games, as opposed to the older generation (the experts and the old geezers) who grew up fiddling with all sorts of finicky and rapidly evolving PC hardware.
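
Since PSO gathering/collection came up: the basic idea is just to record which pipeline state combinations the game actually creates during playtesting, ship that list, and compile them up front (or on a background thread) instead of on first use mid-gameplay. Here's a minimal, API-agnostic C++ sketch of that idea; `CompilePipeline`, the manifest format and the state key are hypothetical stand-ins, not any engine's or D3D12's actual interface.

```cpp
#include <cstdio>
#include <fstream>
#include <string>
#include <unordered_set>

// Hypothetical key describing one pipeline state combination
// (in a real engine: shader hashes, vertex layout, blend/depth state, ...).
using PsoKey = std::string;

// Hypothetical stand-in for the expensive driver-side compile that causes
// hitches when it happens on first use during gameplay.
void CompilePipeline(const PsoKey& key) {
    printf("compiling PSO: %s\n", key.c_str());
}

class PsoManifest {
public:
    // Record every PSO the game creates during playtest/QA runs.
    void Record(const PsoKey& key) { seen_.insert(key); }

    void Save(const char* path) const {
        std::ofstream out(path);
        for (const PsoKey& k : seen_) out << k << '\n';
    }

    // On the player's machine, at startup or on a loading screen:
    // warm every recorded PSO before gameplay needs it.
    static void Precompile(const char* path) {
        std::ifstream in(path);
        for (PsoKey key; std::getline(in, key);)
            if (!key.empty()) CompilePipeline(key);
    }

private:
    std::unordered_set<PsoKey> seen_;
};

int main() {
    // 1) During playtesting: gather.
    PsoManifest manifest;
    manifest.Record("gbuffer|skinned|alpha_test");
    manifest.Record("gbuffer|static|opaque");
    manifest.Record("transparent|particles|additive");
    manifest.Save("pso_manifest.txt");

    // 2) Shipped build: precompile instead of stuttering on first use.
    PsoManifest::Precompile("pso_manifest.txt");
}
```

Real engines layer driver/OS caches and background compilation on top of this, but the gather-then-prewarm loop is the part that's so often missing from the ports being complained about here.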
 
Wild stab in the dark.

More and more veteran programmers retiring and more and more new programmers entering the gaming development pool.

Basically, those who may have started out programming on PC prior to the shift toward console-first game development are likely now getting to the age where they're retiring. So there are fewer and fewer programmers with good PC game development skills (for example, a simple skill like being able to scale a higher-quality asset down to lower quality levels because your game needs to run on multiple hardware targets).

Hell, just look at the sorry state of Windows 11. It's like the programmers working on it grew up using Macs in college or something and have no freaking idea why the PC was as popular and well used as it was, or what made it such a powerful, albeit more complex to use, OS for many professionals compared to MacOS.

Regards,
SB
Windows 11 is largely fine. Everybody will find reasons to complain about a new Windows release.

I think your idea that 'old programmers = good, newer programmers = bad' is also quite silly. You act like PC teams in studios are full of fresh-faced 18-year-olds or something, and there's literally no middle ground between retiring seniors and these fresh-faced kids. As if PC gaming hasn't been a big thing for decades now, with studios hiring more people than ever onto ever-growing teams with wide variances in experience levels, like basically always, and how it's really supposed to be.

Not to mention that the truly old school folks aren't necessarily more fit for modern development environments, either.

Well, N64-like textures are a huge compromise, or rather an unacceptable compromise. The Last of Us and Forspoken, at least, expect the huge majority of 6-8 GB users to play with N64 textures somehow.
Not saying it's a great situation with those games, but the point was just that these games will remain playable and won't be below the baseline of what PC devs can handle. At some point, we just need to accept that 8 GB is like the new 2 GB, and that people with such setups should expect compromised visuals going forward. And not just an 'Ultra -> High' sort of compromise, but real compromises.
 
Not saying it's a great situation with those games, but the point was just that these games will remain playable and won't be below the baseline of what PC devs can handle. At some point, we just need to accept that 8 GB is like the new 2 GB, and that people with such setups should expect compromised visuals going forward. And not just an 'Ultra -> High' sort of compromise, but real compromises.

Which would be well and good if going from above 8 GB to below 8 GB meant things still looked not bad, as in past years when going from 3/4 GB down to 512 MB/1 GB. As it is, some games look absolutely atrocious once you dip below 8 GB, as referenced by the many people complaining about "N64" textures.

In the history of PC gaming we've almost never had a situation where texture quality degraded so badly going from the highest texture setting to the lowest. When it happens, it sometimes looks worse than the lowest texture quality setting in 2007's Crysis, and that only needed 256 MB of VRAM for its minimum spec.

Regards,
SB
 
I think we all agree that max texture quality in games such as Hogwarts Legacy, The Last of Us, Resident Evil 4, Forspoken and others doesn't look particularly special; they look the same as any other game that doesn't require copious amounts of VRAM (Atomic Heart, Returnal, Warhammer Darktide, Call of Duty, Battlefield, etc.). These troubled games require more VRAM just because their streaming systems are not properly optimized, so when you select lower-than-max texture settings, the texture quality takes a disproportionate hit.

I think Far Cry 6 and Gears 5 did this the proper way: their High texture quality is very, very good, but they offer an optional Ultra texture pack DLC for those with more VRAM that slightly increases texture quality. This way they don't impact visual quality for people with 8 GB of VRAM or less (and also save them some much-needed storage space), while simultaneously offering slightly more for those with more than 8 GB of VRAM. All games should do the same.
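
In code terms, that approach boils down to something like the following: ship good base textures, and only switch to the optional HD pack when it's installed and the detected VRAM budget can actually hold it. This is a minimal C++ sketch of that decision with made-up thresholds and names; it isn't either game's actual logic.

```cpp
#include <algorithm>
#include <cstdio>

// Made-up threshold: the base assets target ~8 GB cards,
// the optional HD pack assumes a comfortably larger budget.
constexpr unsigned kHdPackMinVramMiB = 11 * 1024;

struct TextureConfig {
    const char* assetSet;   // which texture package to mount
    int streamingPoolMiB;   // budget handed to the streamer
};

// Hypothetical decision: HD pack only if installed AND there's VRAM headroom;
// everyone else gets the (already good) base textures, not a degraded fallback.
TextureConfig ChooseTextureConfig(unsigned vramMiB, bool hdPackInstalled) {
    if (hdPackInstalled && vramMiB >= kHdPackMinVramMiB)
        return {"textures_hd", static_cast<int>(vramMiB * 3 / 4)};
    return {"textures_base",
            static_cast<int>(std::min<unsigned>(vramMiB * 3 / 4, 6144))};
}

int main() {
    struct { unsigned vramMiB; bool hd; } gpus[] = {
        {8192, false}, {8192, true}, {12288, true}, {24576, true}};
    for (auto g : gpus) {
        TextureConfig cfg = ChooseTextureConfig(g.vramMiB, g.hd);
        printf("%5u MiB VRAM, HD pack %s -> %s (pool %d MiB)\n",
               g.vramMiB, g.hd ? "installed" : "absent",
               cfg.assetSet, cfg.streamingPoolMiB);
    }
}
```

The nice property is that the 8 GB baseline never sees a worse-than-authored result; the HD pack is purely additive for those who can afford it, in both VRAM and disk space.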
 
I watched this video about BF2042 on a release PS4 and it reminds me of TLoU on a 6/8GB GPU:

But nobody would blame PS4 users for this technical disaster, yet on PC it's exactly the opposite.
 
Btw, Square Enix with Forspoken showed that, given enough time, developers can make 8 GB cards have enough VRAM for textures that don't look like blurry blobs.

Kinda weird, knowing Nvidia cards are the most popular and often have stingy VRAM sizes. Texture streaming optimization should be done pre-launch instead of post-launch.
 
Which would be well and good if going from above 8 GB to below 8 GB meant things still looked not bad, as in past years when going from 3/4 GB down to 512 MB/1 GB. As it is, some games look absolutely atrocious once you dip below 8 GB, as referenced by the many people complaining about "N64" textures.

In the history of PC gaming we've almost never had a situation where texture quality degraded so badly going from the highest texture setting to the lowest. When it happens, it sometimes looks worse than the lowest texture quality setting in 2007's Crysis, and that only needed 256 MB of VRAM for its minimum spec.

Regards,
SB
There's really only one game where it's ruining textures as badly as you're suggesting (TLOU). Other times there might be slow texture loading or something that looks bad, but that was absolutely a common issue in the past as well.
 
There's really only one game where it's ruining textures as badly as you're suggesting (TLOU). Other times there might be slow texture loading or something that looks bad, but that was absolutely a common issue in the past as well.
Haven't TLOU's textures looked fine on 8 GB cards since launch? At least when the weird bugs weren't triggered. Kinda like Horizon at launch.

It's Forspoken that consistently loaded N64 textures on 8 GB cards at launch. It has been fixed, though.
 