Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Please guys... don't let this level of "analysis"/snark convince me to stop bothering to answer questions here.

I really want to believe people are here to discuss tech and learn in good faith. Can we leave the idiotic "please validate my platform/purchase/PC master race/console fanboy" stuff that always seems to be the undertone of these conversations to reddit?

If that is meant towards my post, you might have misunderstood it. I quoted Clukos's post because I actually didn't agree with his findings (that PCIe would somehow be the bottleneck), not to criticize the engine or the people behind it, as should be obvious from my other posts in this topic.
 
I really want to believe people are here to discuss tech and learn in good faith. Can we leave the idiotic "please validate my platform/purchase/PC master race/console fanboy" stuff that always seems to be the undertone of these conversations to reddit?

Hey man, UE is a fantastic resource, especially for people like myself who can only dabble as a hobby. What you're all putting out is amazing, and what is in the hands of mere mortals like myself is frankly unbelievable.

I know it's easy for the world to make you cynical, but you're working on something amazing and a lot of us really, really appreciate that.
 
If that is meant towards my post, you might have misunderstood it. I quoted Clukos's post because I actually didn't agree with his findings (that PCIe would somehow be the bottleneck), not to criticize the engine or the people behind it, as should be obvious from my other posts in this topic.
Apologies, I didn't mean it to be specifically directed at you, but I also probably misinterpreted the response. I'll edit my post appropriately.
 
Task Manager is unfortunately quite unreliable for such things because Windows moves threads around between cores all the time. The bottleneck will almost certainly be a single thread somewhere (and thus the CPU's single-threaded performance) - usually the game, render or RHI thread - but you'd have to use a tool that understands user-mode threads (like Unreal Insights) to see it.

Not sure about reliability, but I used MSI Afterburner as well, and the results are quite similar.
 
Man, having foliage with as many polygons as Nanite allows in games would be a game changer.

Foliage in modern games still looks so fake, like 2D bitmaps (which they probably are).

If Nanite is not doing well with foliage, maybe mesh shading could help. Mesh shading also allows for extremely efficient culling, which could come in very handy when rendering a ton of foliage on screen.
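To make that concrete: the big culling win of mesh shading is that each small cluster of triangles carries a bounding volume and can be rejected before any vertex work happens. Below is a minimal CPU-side sketch of that per-cluster test; on real hardware the same test would run per meshlet in the task/amplification shader, and the Plane/Cluster types here are invented purely for illustration.

#include <cstdio>

// Hypothetical types for illustration: a frustum plane (n·p + d >= 0 means
// "inside") and a foliage cluster reduced to its bounding sphere.
struct Plane   { float nx, ny, nz, d; };
struct Cluster { float cx, cy, cz, radius; };

// Reject a cluster if its bounding sphere lies fully behind any frustum plane.
static bool ClusterVisible(const Cluster& c, const Plane (&frustum)[6]) {
    for (const Plane& p : frustum) {
        float dist = p.nx * c.cx + p.ny * c.cy + p.nz * c.cz + p.d;
        if (dist < -c.radius)
            return false; // entirely outside this plane -> cull the whole cluster
    }
    return true;
}

int main() {
    // Toy data: a single "near" plane facing +z (the other five planes are
    // zeroed, so they always pass), one cluster in front of it, one behind.
    Plane frustum[6] = { { 0.0f, 0.0f, 1.0f, 0.0f } };
    Cluster clusters[] = { { 0, 0, 5, 1 }, { 0, 0, -5, 1 } };
    for (const Cluster& c : clusters)
        printf("cluster at z=%+.0f: %s\n", c.cz,
               ClusterVisible(c, frustum) ? "draw" : "culled");
    return 0;
}

With thousands of foliage clusters, this one cheap sphere test per cluster is what lets the GPU skip the vast majority of off-screen leaves before shading a single vertex.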
 
Test of Nanite foliage in 5.1: https://www.artstation.com/artwork/5BbdlO


No idea how it works; it seems like the disjointed cluster LODs would cause a lot of cracks. But one could just disregard that and use standard LODs with clusters, and the cracks disappear. Or maybe some crack-filling strategy. Either way, a visibility buffer with alpha test is doable, as Forbidden West shows.

Apparently Nanite got some form of support for World Position Offset.
 
I have found a way to improve performance in the Matrix demo further.

Disabling any upscaling and running at native 1080p seems to run best on my machine.

Strangely, upscaling, be it TSR or DLSS, sometimes seems to tank the framerate, or not improve it by much.

I wonder what is going on! TSR and DLSS in Performance mode at 1440p also run much slower than native 1080p, despite rendering internally at a much lower resolution (Performance mode at 1440p is 1280x720, well under half the pixels of 1920x1080). Strange things going on indeed.

Never mind: in the shipped build I used, DLSS Quality is enabled by default. I thought it was defaulting to TSR.
 
@Dictator I am sure you are hard at work on your Matrix video as we speak. I really wonder to what extent the City Sample uses HW ray tracing.

Maybe you could stack up the 5700 against a 2060 Super again and see how they perform.

I wonder if it really is like you said in the Matrix console demo video: HW-RT offering better speed and better quality at the same time. The 2070 should perform a lot faster if the Matrix demo is using RT acceleration properly. I do not have access to a card without HW-RT, so comparing is difficult, but from what I have been seeing in videos of cards with and without HW-RT, the quality is pretty similar and both can use the Epic setting. So cards without HW-RT are running these effects in software, which likely means cards with HW acceleration get much better performance.

Or the second possible outcome is that the difference in visual quality is just pretty small, and cards with RT acceleration actually perform slower because they automatically use more advanced effects. That would be very, very bad for RTX users like me.

I am very sure though this comment is entirely irrelevant as you are testing that out right now lol.
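For anyone wanting to test this themselves: UE5 exposes Lumen's hardware path through a console variable, so assuming the shipped City Sample build still honors standard cvars (not guaranteed in a Shipping configuration), adding something like this to the build's Engine.ini and comparing framerates would show what the HW path is actually worth:

[SystemSettings]
; 1 = use hardware ray tracing for Lumen, 0 = fall back to software tracing
r.Lumen.HardwareRayTracing=1

Flipping it to 0 on an RTX card (everything else unchanged) would answer the speed question directly, and an A/B screenshot would answer the quality one.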
 
There are two passes there that last for a grand total of... what... 250µs or so that have high PCIe throughput. This is both completely normal (various passes will hit different bottlenecks on a given GPU/SKU) and completely unrelated to the three-orders-of-magnitude-longer stutters (i.e., hundreds of milliseconds), which occur on the CPU, not the GPU.

[Edit] I misinterpreted PSman's response.

I was mostly curious to understand why my GPU is never fully saturated in this demo, not necessarily the stutters.

The most likely candidate is of course the CPU; I just thought it was interesting that there are some passes hammering the PCIe bandwidth. Like I said, though, I've only spent a small amount of time looking at this demo :p
 
Hey, I managed to download from the Mega links; I had to reboot my PC after every file downloaded to get around the per-IP download limits.

Is it just me, or are the graphics not as impressive as the hype suggested? There are plenty of triangles on display, granted, but I am still seeing muddy textures, lighting and reflections, plus plenty of jaggies in the distance, and most of the structures are repeated and dull. The MetaHumans don't look much different from today's open-world NPCs, just with slightly more detailed skin textures.

There are also some weird artifacting/screen glitches when I run sideways or swing the camera about. Is anybody else seeing this strange visual glitch? It is something like the déjà vu effect in the original Matrix! Clippy, glitchy!
 
Not sure about reliability, but I used MSI Afterburner as well, and the results are quite similar.
Anything that gives you system-level "core usage" is not going to tell you anything useful about single-threaded CPU bottlenecks. You need something that understands user-mode threads (Unreal Insights, VTune, and that sort of thing).

Honestly, in a game engine like this, if the GPU isn't near 100% loaded it's safe to assume it's likely a CPU bottleneck.
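To illustrate the difference, here is a minimal plain-Win32 sketch (emphatically not Unreal Insights, just the underlying idea): it walks a process's threads and prints each one's accumulated per-thread CPU time. A game/render/RHI thread burning a whole core stands out immediately in this view, even while system-level "core usage" graphs look half idle because the scheduler keeps migrating that thread between cores.

#include <windows.h>
#include <tlhelp32.h>
#include <cstdio>
#include <cstdlib>

// FILETIME counts 100ns ticks; convert to seconds.
static double FiletimeToSeconds(const FILETIME& ft) {
    ULARGE_INTEGER u;
    u.LowPart  = ft.dwLowDateTime;
    u.HighPart = ft.dwHighDateTime;
    return u.QuadPart / 1e7;
}

int main(int argc, char** argv) {
    // Target PID on the command line; defaults to ourselves for a smoke test.
    DWORD pid = (argc > 1) ? (DWORD)atoi(argv[1]) : GetCurrentProcessId();

    // Snapshot every thread in the system, then filter to the target process.
    HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPTHREAD, 0);
    if (snap == INVALID_HANDLE_VALUE) return 1;

    THREADENTRY32 te = { sizeof(te) };
    for (BOOL ok = Thread32First(snap, &te); ok; ok = Thread32Next(snap, &te)) {
        if (te.th32OwnerProcessID != pid) continue;
        HANDLE h = OpenThread(THREAD_QUERY_LIMITED_INFORMATION, FALSE, te.th32ThreadID);
        if (!h) continue;
        FILETIME ctime, etime, ktime, utime;
        if (GetThreadTimes(h, &ctime, &etime, &ktime, &utime)) {
            // Accumulated CPU time per thread; sample twice and diff for a rate.
            printf("thread %6lu: user %8.2fs  kernel %8.2fs\n", te.th32ThreadID,
                   FiletimeToSeconds(utime), FiletimeToSeconds(ktime));
        }
        CloseHandle(h);
    }
    CloseHandle(snap);
    return 0;
}

Sampling this twice, a second apart, and diffing the times gives per-thread utilization; a thread gaining ~1.0s of CPU time per wall-clock second is your single-threaded bottleneck.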
 