Upscaling Technology Has Become A Crutch

It’ll be interesting to see what kind of information starts coming out about the performance of Nanite, Lumen, and VSMs once the first batch of games is out. It’ll also be interesting to hear from studios with a good track record in UE, like The Coalition.
 
"Even Fortnite with UE5 features runs like rubbish."

This comment of yours clearly shows that you are talking complete nonsense about the console space, because Fortnite looks great and runs great on Xbox/PS5.

However, it is good to read other valuable opinions from developers on the subject in this topic, so it still made sense to create it.
 
@BitByte I can’t buy this one to play around with the performance because I have no interest in playing it, but I’m really curious to see how certain settings scale and whether I can get the game to be CPU limited. I’d be really curious to see DLSS ultra performance at 4K. DLSS looks weirdly good in Remnant 2. I’m playing DLSS performance at 1440p, when I usually never even touch balanced. I tried ultra performance, but it had too many artifacts; shockingly, though, it didn’t seem a long way from playable.

It’s not totally uncommon to see PC players wishing their hardware was pushed to the limit by something new. Well, here it is, I guess.
 
I am curious whether these reports from WCCFTech are with or without the developer day-0 patch or the reviewer manual hotfix. That fix dramatically changes the game's frame-times in DF's experience.
This is with the latest patch. On a 4090 and a 12900K, the game is doing 70fps at 4K DLSS Quality! Still not good when the total visual package isn't that impressive, to be honest. Yeah, the characters look good, but the environment not so much.

Oh, and there is tons of shader compilation stutter too.

 
Immortals of Aveum runs at 70fps on a 4090 using 4K DLSS Performance. This is not acceptable.

Worse yet, the performance evaluation tool bugs out and assigns 25 points from the CPU budget to each level of AF. So 16x AF costs about 100 points, which is half the CPU budget even on a powerful 7800X3D.
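If I'm reading those numbers right, the arithmetic works out like this (a throwaway sketch; the 25-points-per-level figure and the implied ~200-point budget come from the behaviour described above, not from anything documented):

```python
import math

POINTS_PER_AF_LEVEL = 25  # per the post's description of the evaluation tool's bug

def af_cost(af_factor: int) -> int:
    """CPU-budget points charged for anisotropic filtering at af_factor (2, 4, 8, 16)."""
    levels = int(math.log2(af_factor))  # 16x AF -> 4 levels (2x, 4x, 8x, 16x)
    return levels * POINTS_PER_AF_LEVEL

print(af_cost(16))  # 100 points, i.e. half of a ~200-point budget
```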

Ah, so is this thread just going to be a complaint thread for any game that isn't well optimized now? As if that's something completely new?

Anyways, I was always a little worried about this game. It's a "brand new AAA" (read: still not big) studio trying to swing for the fences with their first game, using the latest and most demanding engine technologies and releasing on all major platforms. And I hear the game is like 15 hours long, which is actually quite lengthy for a more linear-style single-player FPS campaign. It was always going to be a lot for the devs to handle and achieve.
 
When you rely on upscaling, make your game upscaling-friendly. FSR 2.2 doesn't really handle the transparency and particle effects in Immortals. Every time you switch magic modes, the whole animation turns into a ghosting shadow.
 
Or they keep having developers/publishers trying to make a product without ballooning the costs even further.
The costs are ballooning because the scope is ballooning. In the PS360 gen, games were like 12-16 hours, with long games being around 40 hours, and very few games went past that. Now, games that used to be 12-16 hours (e.g. TLOU) are almost 50 hours, with many going past that. It's obvious the scope is the problem...
 
"Even Fortnite with UE5 features runs like rubbish."

This comment of yours clearly shows that you are talking complete nonsense about the console space, because Fortnite looks great and runs great on Xbox/PS5.

However, it is good to read other valuable opinions from developers on the subject in this topic, so it still made sense to create it.
Please keep up with the context of this discussion. The comment you quoted was in reference to the 4090, which does not run the game great, not to consoles, which use lower settings than the 4090. I do not consider Fortnite struggling to stay over 60fps on a 4090 "great", but that's just me...

 
@BitByte I can’t buy this one to play around with the performance because I have no interest in playing it, but I’m really curious to see how certain settings scale and whether I can get the game to be CPU limited. I’d be really curious to see DLSS ultra performance at 4K. DLSS looks weirdly good in Remnant 2. I’m playing DLSS performance at 1440p, when I usually never even touch balanced. I tried ultra performance, but it had too many artifacts; shockingly, though, it didn’t seem a long way from playable.

It’s not totally uncommon to see PC players wishing their hardware was pushed to the limit by something new. Well, here it is, I guess.
I won't be purchasing it either. It doesn't look remotely interesting to me. I'll wait for DF and Daniel Owen to buy the game and cover it.
 
The costs are ballooning because the scope is ballooning. In the PS360 gen, games were like 12-16 hours, with long games being around 40 hours, and very few games went past that. Now, games that used to be 12-16 hours (e.g. TLOU) are almost 50 hours, with many going past that. It's obvious the scope is the problem...

Not that I have any info, but intuitively, to me at least, creating and promoting one 50-hour game is cheaper than making three 16-hour games.
I also seem to remember reviewers and gamers saying that games should be longer, but I might be remembering that wrong.
 
Not that I have any info, but intuitively, to me at least, creating and promoting one 50-hour game is cheaper than making three 16-hour games.
I also seem to remember reviewers and gamers saying that games should be longer, but I might be remembering that wrong.
Maybe, but as a result you're putting all your eggs in one basket: the cost of failure is much higher, and the effect a failure has on a studio is much greater. To date, there are very few games with a gameplay loop that justifies a full 50 hours. A lot of the time, it's copy-and-paste that outstays its welcome (glares intensely at Assassin's Creed). Marketing budgets have also drastically increased, which, along with the increased scope, is drastically increasing the cost of a game. I haven't even touched on bad project management, which wastes massive amounts of time over the duration of a project (glares intensely at Cyberpunk 2077).
 
Not that I have any info, but intuitively, to me at least, creating and promoting one 50-hour game is cheaper than making three 16-hour games.
Each game built on the previous one. Uncharted 1 required a new engine. Uncharted 2 refined the engine. Uncharted 3 took it further. Same for Gears, or BG: Dark Alliance, or whatever. When companies had their own in-house engines, they used them for multiple titles; costs were lower and were recouped sooner. Now the risk is higher for bigger rewards, mitigated by GaaS, so when your ten-years-in-the-making title is finally released and you're $400 million in deficit, you can monetise it for the following ten years.

A graph of game costs and publisher/studio profits would show whether relative costs are increasing or decreasing. But given the number of noteworthy studios and smaller publishers that have folded, I think it's fairly evident that costs have gone up.

Edit: Actually, recalling this graph on costs, they are exponential (~10x cost in 12 years), so three games at 1x cost was a lot less than one game at 10x cost, or even 100x cost.

[Attachment: 1692716311656.png, graph of game development costs over time]
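For what it's worth, a ~10x increase over 12 years implies roughly 21% compound growth per year (a back-of-the-envelope sketch; the 10x and 12-year figures are just my reading of the graph):

```python
# Implied compound annual growth rate if costs rose ~10x over 12 years.
annual_growth = 10 ** (1 / 12)            # growth factor per year
percent_per_year = (annual_growth - 1) * 100
print(round(percent_per_year, 1))         # ~21.2% per year
```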
 
Man, I really don’t know about these UE5 games. At least to my eyes, they don’t look very impressive, but they are incredibly demanding.
It’s not just you. As it stands, for me, UE5’s performance cost compared to its visuals is one of the most unjustifiable engine trade-offs I’ve seen in a long time.
 
None of the released games look anything like the demos. These games are using Nanite and Lumen, but it just isn’t translating well into perceived visual quality.

Looking back at UE4, we didn't get any top-tier visuals on that engine until 2019, so it may take a while here too.
 

Improved image clarity from DLSS 3.5 should make upscaling from lower internal resolutions more viable in ray-traced titles.

I'm going through this again. If it lives up to the demos, ray reconstruction will look vastly better than native when ray tracing. With ray tracing, you either live with instability/shimmering or you denoise. The demo images show DLSS ray reconstruction looking much better than DLSS off, especially in the reflections; it just does a far better job of not blurring away sharp details.

This is really the battle with "native" rendering: everything is massively under-sampled. Having a really good denoiser that's integrated with the upscaler could make it an easy image-quality win over native. Is it better to have a 4K frame that's massively under-sampled, or a 1080p frame with a higher sampling rate that's smartly upscaled, even if the total number of samples is equal in both cases?

As with anything else, we'll have to see if it lives up to it. I'm also really curious to see whether it can work with a system like Lumen. I don't see why it wouldn't, as long as you have a buffer that stores the GI and reflection samples separately.
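That "equal total samples" comparison is easy to make concrete (a toy sketch; real renderers obviously don't spend samples this uniformly, and the 1-vs-4 spp split is just an illustrative assumption):

```python
def total_samples(width: int, height: int, spp: int) -> int:
    """Total shading samples in a frame: resolution times samples per pixel."""
    return width * height * spp

# Native 4K at 1 sample per pixel vs. a 1080p internal frame at 4 spp:
native_4k = total_samples(3840, 2160, 1)
internal_1080p = total_samples(1920, 1080, 4)

print(native_4k, internal_1080p, native_4k == internal_1080p)
```

Same total sample budget either way; the question is whether spending it on fewer, better-sampled pixels plus smart upscaling beats spreading it thin at native resolution.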
 