Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Why argue? We can make a decent estimate of whether the infamous laptop could run it at similar performance:
View attachment 5547

To summarize:

1) Desktop 2060 has similar performance to the 2080 Max-Q;
2) Desktop 2060 runs Valley of the Ancients at 1080p / 37 FPS;
3) This is without considering possible engine performance improvements over the last year;
4) 2021's Valley of the Ancients demo is less demanding than 2020's Lumen in the Land of Nanite. I.e. the 2020 demo would show lower performance on any GPU.
5) The 2020 demo on the PS5 ran at 1440p30. I believe it averaged 1440p, plus reconstruction to 4K (which costs performance too).
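
To put rough numbers on it (a back-of-envelope sketch; it assumes GPU-bound cost scales linearly with pixel count, which is a simplification):

Code:
// What would the 2060's 37 FPS at 1080p become at 1440p,
// if cost scales with pixel count?
#include <cstdio>

int main() {
    const double fps1080 = 37.0;            // desktop 2060, Valley of the Ancients
    const double px1080  = 1920.0 * 1080.0; // 2,073,600 px
    const double px1440  = 2560.0 * 1440.0; // 3,686,400 px
    // ~20.8 FPS estimated at 1440p for the demo assumed *lighter* above,
    // so 40 FPS on a heavier demo with a 2080 Max-Q class GPU looks implausible.
    printf("est. 1440p FPS: %.1f\n", fps1080 * px1080 / px1440);
    return 0;
}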


Therefore, the claims (or perhaps a botched/hopeful translation) of a laptop with a 2080 Max-Q running the 2020 demo at 1440p / 40 FPS are false.
Tim Sweeney didn't lie, and the insults thrown at him in this thread are baseless.
 
4) 2021's Valley of the Ancients demo is less demanding than 2020's Lumen in the Land of Nanite. I.e. the 2020 demo would show lower performance on any GPU.

Why would you think that?

We know Nanite's cost doesn't scale with geometric complexity. We know this demo had less art time. We know performance was more critical for the previous demo (it was used to market a console, not to target devs on high-end PCs). We know this demo is in many ways a torture test (horrible overdraw - just look at the visualization mode; and they mentioned this kitbash workflow is quite bad). More volumetric fog, etc.
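
To illustrate the scaling point: Nanite-style cluster LOD selection targets roughly one pixel of projected error, so the triangle budget is bounded by screen resolution rather than by source geometry. A toy sketch of that test (the Cluster struct and fineEnough helper are hypothetical, not Epic's code):

Code:
#include <cmath>

// Hypothetical cluster: a chunk of the mesh at some LOD level.
struct Cluster {
    float geometricError; // world-space simplification error of this level
};

// Is this cluster's error under ~1 pixel once projected to screen?
// If yes, it can be rendered instead of its finer children, so the
// rendered triangle count tracks resolution, not source complexity.
bool fineEnough(const Cluster& c, float distToCamera,
                float screenHeightPx, float fovY) {
    float pxPerWorldUnit =
        screenHeightPx / (2.0f * distToCamera * std::tan(fovY * 0.5f));
    return c.geometricError * pxPerWorldUnit < 1.0f;
}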
 
Hard to determine which is more important between geometry and RT. Judging by current examples though, I find games with high-geometry/high-quality art to look better than last-gen geometry/art with RT added on. R&C and HFW look more pleasing to me than Metro Enhanced and Cyberpunk, for example.
 
Hard to determine which is more important between geometry and RT. Judging by current examples though, I find games with high-geometry/high-quality art to look better than last-gen geometry/art with RT added on. R&C and HFW look more pleasing to me than Metro Enhanced and Cyberpunk, for example.

Agreed! Though in part I find that this is because simply trying to brute-force raytracing, as devs are doing now, is just not a win at all. Good lighting is really important, but it needs to make use of the fancy PBR shader pipelines that cropped up this last generation, and it often needs to emulate multiple spatial bounces, proper area lighting, etc. To get RT to run in realtime, all that stuff is often just tossed out the window, and you get noisy, ghosting smooth reflections thrown in your face all too often, or some such. Even Metro EE only looks good at max settings at 4K, and it still has failure cases there. Some sort of change - a hybrid pipeline like UE5 is at least attempting, or other clever techniques that have come before - might be needed to get realtime lighting to look good in all cases.
 
To summarize:

1) Desktop 2060 has similar performance to the 2080 Max-Q;
2) Desktop 2060 runs Valley of the Ancients at 1080p / 37 FPS;
3) This is without considering possible engine performance improvements over the last year;
4) 2021's Valley of the Ancients demo is less demanding than 2020's Lumen in the Land of Nanite. I.e. the 2020 demo would show lower performance on any GPU.
5) The 2020 demo on the PS5 ran at 1440p30. I believe it averaged 1440p, plus reconstruction to 4K (which costs performance too).


Therefore, the claims (or perhaps a botched/hopeful translation) of a laptop with a 2080 Max-Q running the 2020 demo at 1440p / 40 FPS are false.
Tim Sweeney didn't lie, and the insults thrown at him in this thread are baseless.

Again, consoles are probably locked at 30 fps here, and the minimum framerate on the 2060 is 28 fps. If it runs locked on console, it runs better. That said, we don't know how the demo runs unlocked on consoles.
 
Though the demo was at 1440p, if I'm not wrong. Generally the RX 5700 XT is on par with the RTX 2070 and RTX 3060, so here it's a little worse than usual (2070 Super level), but nothing really surprising about the results.
It's not clear to me where between a 5xxx and a 6xxx the PS5's 'RDNA2' tends to land. (I want the closest architecture for development, to estimate console behavior, and I thought RDNA2 should do better, although the IC (Infinity Cache) then adds some unknowns.)
 
It's not clear to me where between a 5xxx and a 6xxx the PS5's 'RDNA2' tends to land. (I want the closest architecture for development, to estimate console behavior, and I thought RDNA2 should do better, although the IC (Infinity Cache) then adds some unknowns.)

Probably the 5700 XT, since HW raytracing is not used here. The PS5 will run a bit better in theory (11%), but UE5 is currently better optimized for consoles, and there is a little less overhead due to the lower-level API on consoles (10%?).
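
Stacking those two guesses multiplicatively (both numbers are rough, as said):

Code:
#include <cstdio>

int main() {
    const double computeEdge = 1.11; // PS5 vs 5700 XT compute, in theory
    const double apiEdge     = 1.10; // lower-level console API, the "10%?"
    // ~22% combined - still stacked estimates, not a measurement.
    printf("combined: +%.0f%%\n", (computeEdge * apiEdge - 1.0) * 100.0);
    return 0;
}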
 
OMG, so much to go through in this thread...
Two things come to mind though.

1. The comparisons being made between the move to RT in games and its use in movies are not entirely accurate. Movies are a single 2D representation of the image; games allow movement, interaction and so much more. I think the simplified workflows and engines enabled by the use of RT, especially for GI, mean that moving the industry to RT engines earlier vs. going further in terms of geometric detail is a valid comparison.

2. It seems like UE5 is built to address the limitations of last-gen hardware, i.e. a focus on compute, heavy CPU usage, and limited reliance on HW RT. As software gets more and more complex, I think we are seeing the point where it takes an entire generation (well, a console gen anyway) to see the absolute best possible performance from the games.
 
4) 2021's Valley of the Ancients demo is less demanding than 2020's Lumen in the Land of Nanite. I.e. the 2020 demo would show lower performance on any GPU.
Did you watch the Nanite stream with Brian Karis? Valley of the Ancients is HEAVIER than Lumen in the Land of Nanite: Nanite is twice the millisecond cost in Valley of the Ancients compared to Lumen in the Land of Nanite. Hence the performance target for Epic settings is 1080p30, not the sub-1440p / 30 FPS average that the Lumen demo ran at. This is all readily available information on their wiki or from the livestreams.

Also, I think GameGPU's average framerates there are not representative of the game experience; an RTX 2060 will actually have lower framerates throughout the demo than what they report. (If I recall, GameGPU benchmarks a number of cards and interpolates the results for the others based on those measured points.) Their RTX 2060 numbers look inaccurate, while their RTX 3090 and RX 6800 XT numbers look right.
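
If that recollection is right, their process might look something like this sketch (purely my assumption about their method: linear interpolation between measured reference cards on a throughput axis, which would drift for cards far from the references):

Code:
// Estimate an unmeasured card's FPS from two measured reference cards,
// placing it by relative theoretical throughput. The further the target
// sits from the references, the larger the error - which could explain
// an off RTX 2060 number while the 3090 / 6800 XT numbers look fine.
double estimateFps(double fpsLow, double perfLow,
                   double fpsHigh, double perfHigh,
                   double perfTarget) {
    double t = (perfTarget - perfLow) / (perfHigh - perfLow);
    return fpsLow + t * (fpsHigh - fpsLow);
}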
 
Probably the 5700 XT, since HW raytracing is not used here. The PS5 will run a bit better in theory (11%), but UE5 is currently better optimized for consoles, and there is a little less overhead due to the lower-level API on consoles (10%?).
Oh, I forgot missing RT was my primary reason to rule out the 5xxx :) So I'd better still wait to get some 6xxx...

I don't think (eventually upcoming) arguments of 'optimized for console > PC' or 'optimized for AMD > NV' add much weight here.
The Nanite code I've skimmed over looks more 'straightforward' than 'specifically optimized'. There are no mutations for different GPUs, no practices like 'buffer to LDS first to get linear writes and avoid VRAM atomics', no subgroup optimizations, no fp16.
(I mean, there likely is some of that stuff and I just didn't notice it, due to the custom shading language and my skimming.)
Besides primitive / mesh / vertex shader and API differences there is not much else, I guess. (I can't comment on Lumen or streaming.)
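
For anyone unfamiliar with the 'buffer to LDS first' practice: in CUDA terms (shared memory being the LDS equivalent) it is the classic staged-histogram pattern below - a generic sketch of the idea, nothing from the Nanite source:

Code:
// Accumulate per-block counts in fast on-chip shared memory, then flush
// once, instead of issuing a global-memory (VRAM) atomic from every thread.
__global__ void histogram256(const unsigned char* data, int n,
                             unsigned int* bins) {
    __shared__ unsigned int local[256]; // the "LDS" staging buffer

    for (int i = threadIdx.x; i < 256; i += blockDim.x)
        local[i] = 0;
    __syncthreads();

    for (int i = blockIdx.x * blockDim.x + threadIdx.x; i < n;
         i += gridDim.x * blockDim.x)
        atomicAdd(&local[data[i]], 1u); // cheap shared-memory atomic
    __syncthreads();

    for (int i = threadIdx.x; i < 256; i += blockDim.x)
        atomicAdd(&bins[i], local[i]);  // one VRAM atomic per bin per block
}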
 
Agreed! Though in part I find that this is because simply trying to brute force raytracing, as devs are trying to do now, is just not a win at all. Good lighting is really important, but it needs to make use of the fancy PBR shader pipelines that cropped up this last generation, often needs to emulate multiple spatial bounces and proper area lighting and etc. etc. In order to get RT to run in realtime, all that stuff is often just tossed out the window and there's some all too often noisy and ghosting smooth reflections thrown at your face or somesuch. Even Metro EE only looks good at max settings 4k, and still has failure cases there. Some sort of change, a hybrid pipeline like UE5 is at least trying to do or other clever techniques that have come before, might be needed to get more realtime lighting to look good in all occasions.
Yeah, I don't know what the future will bring for RT. Just my view on current software/footage. I've yet to see footage of HW RT in an actual game offering up something as visually pleasing as the increased geometry/art asset boosts.
 
4) 2021's Valley of the Ancients demo is less demanding than 2020's Lumen in the Land of Nanite. I.e. the 2020 demo would show lower performance on any GPU.
Nope, the opposite is true: the newer demo is much heavier.
5) The 2020 demo on the PS5 ran at 1440p30. I believe it averaged 1440p, plus reconstruction to 4K (which costs performance too).
Dynamic resolution 1440p, not fixed 1440p.
Therefore, the claims (or perhaps a botched/hopeful translation) of a laptop with a 2080 Max-Q running the 2020 demo at 1440p / 40 FPS are false.
Nope, the claim of the engineer is completely valid.
Also, I think GameGPU's average framerates there are not representative of the game experience,
They tested only the first scene of the demo; they didn't extend their test to the giant area.
And the framerate is probably locked on consoles. We don't know if the demo runs at a higher framerate.
Locked because they are using dynamic resolution.
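
That's the usual coupling: with dynamic resolution the engine holds a fixed frame-time budget and varies the render scale instead, so the counter reads a steady 30 fps. A minimal sketch of such a controller (generic logic, not UE5's actual heuristic):

Code:
// Nudge render scale to keep GPU frame time at the 33.3 ms (30 fps) budget.
struct DynResController {
    double targetMs = 1000.0 / 30.0; // frame-time budget
    double scale    = 1.0;           // fraction of native res per axis

    void update(double gpuMs) {
        if (gpuMs > targetMs)             scale -= 0.05; // over budget: drop res
        else if (gpuMs < targetMs * 0.85) scale += 0.02; // headroom: raise res
        if (scale < 0.5) scale = 0.5; // clamp: never below half res
        if (scale > 1.0) scale = 1.0; // never above native
    }
};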
 
They tested only the first scene of the demo; they didn't extend their test to the giant area.
Good choice for a GPU benchmark. Issues like particles causing drops seem a matter of ongoing work. So a scene without much action and movement is well suited to showing base GPU performance.

Well, I remain a bit doubtful this scales down well to mobile. Probably Lumen is just too expensive, and aggressive automated LOD reduction might end up looking too bad. I guess some people will still do manual content detail work per platform.
 
I've yet to see footage of HW RT in an actual game offering up something as visually pleasing as the increased geometry/art asset boosts.
Meh, that's just a matter of taste, so it's kind of useless to discuss.
Personally, I liked the early DXR demos, such as Reflections and SOL, way more than the uniformly lit scenery in the Ancients demo.
There are tons of games with great geometry; Metro Exodus, mentioned here many times, is one example. It uses tessellation with displacement maps extensively almost everywhere - on ground surfaces, on walls, on trees, on hair, on characters, on enemies; even the ropes are tessellated to fine levels. That's a super highly detailed game, yet nobody here paid attention to this while discussing it, and that speaks volumes :D
So better to wait for some real games before judging which is more visually pleasing. Personally, I would take both any time, and I don't see any real reason why there should be compromises.
 