Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

Fortnite now runs on UE5.1 with significant graphics improvements!

[UE 5.1 screenshots]

Wow. So Fortnite is the first UE5 game to actually ship, and it already benefits from UE5.1.

@Andrew Lauritzen you guys should really update the recommended GPU specifications for Nanite. Epic says it's RTX 2080 / RX 5700, but the RX 5700 is a lot slower than the 2080 and performs more like a 2060 Super. And that was before UE5.1. Since Nanite uses mesh shaders in UE5.1 for geometry bigger than a pixel, chances are the 2060 Super will pull even further ahead of the RX 5700.
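For anyone who wants to poke at this on PC, the rasterization path is exposed through UE console variables, so the claim is testable directly. A minimal sketch of the kind of toggles involved, assuming stock UE 5.1 cvar names (I'm going from the public release notes, so treat the exact names, especially the mesh shader one, as assumptions):

; in the UE console or DefaultEngine.ini under [ConsoleVariables]
r.Nanite 1                           ; Nanite rendering on
r.Nanite.MeshShaderRasterization 1   ; assumed 5.1 cvar: route the HW raster path through mesh shaders where supported
stat unit                            ; frame/thread timings for before-and-after comparisons

If mesh shaders are carrying the larger-than-a-pixel clusters, Turing and RDNA2 parts (which have them) should pull further ahead of RDNA1 cards like the RX 5700 (which don't).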
 
Fortnite now runs on UE5.1 with significant graphics improvements!

[UE 5.1 screenshots]

Wow. So Fortnite is the first UE5 game to actually ship, and it already benefits from UE5.1.

@Andrew Lauritzen you guys should really update the recommended GPU specifications for Nanite. Epic says it's RTX 2080 / RX 5700, but the RX 5700 is a lot slower than the 2080 and performs more like a 2060 Super. And that was before UE5.1. Since Nanite uses mesh shaders in UE5.1 for geometry bigger than a pixel, chances are the 2060 Super will pull even further ahead of the RX 5700.

That's awesome! I've never been remotely interested in Fortnite but I'm absolutely going to give this a try today. And I see it supports hardware RT as well. Presumably this is going to be a fairly standard feature in UE5 titles. Will be interesting to see what visual and performance differences it brings on different hardware, particularly in the case of those recommended specs.
 
but why does it bother you that some would think PS5 has some super saiyan secret sauce?
In the end it's plain for everyone to see that games always look better on PC, and that the XsX often has a slight edge in performance or resolution.
Facts do the correcting; if they don't want to see it, just don't bother, the same way we don't bother with PC fanboys who trash absolutely everything console-related in the console section of a forum, or with flat earthers!
Let them have their "shining" moment with that missing RT on XsX until it's fixed. Or better, put them on ignore.

Speaking for myself here, I despise any form of misinformation and want to challenge it wherever it crops up regardless of the subject. Your flat earther example is a good one. No way I'm letting people spread that rubbish in my presence unchallenged.
 
Fortnite now runs on UE5.1 with significant graphics improvements!

[UE 5.1 screenshots]

Wow. So Fortnite is the first UE5 game to actually ship, and it already benefits from UE5.1.

@Andrew Lauritzen you guys should really update the recommended GPU specifications for Nanite. Epic says it's RTX 2080 / RX 5700, but the RX 5700 is a lot slower than the 2080 and performs more like a 2060 Super. And that was before UE5.1. Since Nanite uses mesh shaders in UE5.1 for geometry bigger than a pixel, chances are the 2060 Super will pull even further ahead of the RX 5700.
@Dictator 👀
 
"Individual trees have around 300,000 polygons, and each stone, flower, and blade of grass is modeled."

This really is the next generation, isn't it... I remember it being a big freaking deal that Horizon Zero Dawn's Thunderjaw was 500k polygons, an insane number for a single enemy.

Made even more impressive because every polygon on screen in Killzone 3 (one of the most graphically advanced games of the 7th gen, and still amazing looking today) at any given time totaled about 250k. And we now have individual trees, out of however many on screen, running at a full 60fps with HW RT and all the bells and whistles?!

My mind is on fire
 
"Individual trees have around 300,000 polygons, and each stone, flower, and blade of grass is modeled."

This really is the next generation, isn't it... I remember it being a big freaking deal that Horizon Zero Dawn's Thunderjaw was 500k polygons, an insane number for a single enemy. Apparently every polygon on screen in Killzone 3 (one of the most graphically advanced games of the 7th gen) at any given time totaled about 250k. And we now have individual trees, out of however many on screen, running at a full 60fps with HW RT and all the bells and whistles?!

My mind is on fire

HW-RT is only on PC.
 
Fortnite now runs on UE5.1 with significant graphics improvements!

[UE 5.1 screenshots]

Wow. So Fortnite is the first UE5 game to actually ship, and it already benefits from UE5.1.

@Andrew Lauritzen you guys should really update the recommended GPU specifications for Nanite. Epic says it's RTX 2080 / RX 5700, but the RX 5700 is a lot slower than the 2080 and performs more like a 2060 Super. And that was before UE5.1. Since Nanite uses mesh shaders in UE5.1 for geometry bigger than a pixel, chances are the 2060 Super will pull even further ahead of the RX 5700.

Whoa, that is a nice surprise! Nanite, Lumen, and VSMs with multiple quality settings, support for HW-RT, and you can turn it all off if you want to. It’ll make for some really interesting benchmarks.

Who has the @Dictator signal.
 
For Nanite, Lumen, Virtual Shadow Maps, and Temporal Super Resolution to be available in Fortnite on your PlayStation 5 or Xbox Series X|S, make sure the "120 FPS Mode" setting (in the "Graphics" section of the Video settings) is set to off.
In Fortnite, Nanite and Lumen are now on by default in the 60fps mode on consoles. This needs to be thoroughly tested!
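For reference, here's roughly what that console toggle maps to in stock UE5 terms. This is an illustrative mapping built from real UE5 console variables, not Epic's actual Fortnite config, which we obviously can't see:

; assumed rough equivalent of "120 FPS Mode" set to off
r.DynamicGlobalIlluminationMethod 1   ; 1 = Lumen GI
r.ReflectionMethod 1                  ; 1 = Lumen reflections
r.Shadow.Virtual.Enable 1             ; virtual shadow maps
r.AntiAliasingMethod 4                ; 4 = Temporal Super Resolution

Presumably flipping 120 FPS Mode back on drops the game to its older, cheaper rendering path to hit the frame budget.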

HW-RT is only on PC.
What does a high polygon count have to do with HW-RT?
 
HW-RT requires a restart, which will make comparisons very difficult, especially since this is such a dynamic game by nature. I would still be super interested in comparisons, though; maybe Alex can cover it once he's finished with his current PC coverage. I'd especially like to see the RTX 2060 Super with HW-RT and SW-RT compared against the RX 5700 with SW-RT only, in both image quality and performance. That could be really interesting, especially because mesh shaders are now supported.
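For what it's worth, on the engine side the HW/SW switch is a single console variable, so matched captures are at least scriptable even if Fortnite itself demands a restart (and whether Fortnite exposes the console at all is another question). A sketch using real UE5 cvars:

r.Lumen.HardwareRayTracing 1    ; trace against the RT BVH (needs an RT-capable GPU)
r.Lumen.HardwareRayTracing 0    ; fall back to software Lumen's distance-field tracing
HighResShot 3840x2160           ; grab matched screenshots for the image quality side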
 
And it supports HW-RT reflections and global illumination. I hope that makes you less worried about HW-RT support in future UE5 titles.
That depends on how big the visual difference between HW and SW Lumen is, and on how it performs. If it is performant and makes a striking difference, then it would make me less worried for sure, especially considering there's a ton of foliage in Fortnite.

I will test this as soon as possible. Sadly, the servers are currently offline. Either way, very excited to try this out! This is the first UE5 game after all, and it's even running on the latest version.
 
That depends on how big the visual difference between HW and SW Lumen is, and on how it performs. If it is performant and makes a striking difference, then it would make me less worried for sure, especially considering there's a ton of foliage in Fortnite.
Agreed. Whether any visual "concessions" have to be made to reach similar performance is the interesting question on the HW vs. SW Lumen front.
 
I'm not falling into any type of persona. I'm a person who greatly appreciates how developers like ND and Santa Monica extracted great, consistent performance and gorgeous visuals out of a PS4. I'm just criticising, with a humorous tone, the people who fall into the "PS5 is super special" persona. I'm also one of the people who is highly vocal about how games keep stuttering and how NVIDIA keeps pushing low-VRAM cards; you can see all of that in my past posts. That's not the mark of a stereotypical PC persona. If it is, I don't know what to tell you.

I don't need a dev to tell me anything, especially considering they released a stutter-fest of a game on PC and then semi-fixed it in 24 HOURS. At this point I wouldn't trust the devs of Callisto Protocol even if they said water is wet. They knowingly released a broken product that could have had a fix in mere hours, and the same most likely applies to the Xbox version. So when it's claimed they need to "shift the code, adapt the code", I simply don't think so. If the game hadn't shipped on 8th-gen consoles, some people would have blamed its stutters on the lack of hardware decompression. That's the type of mentality I'm really starting to get sick of; naturally, I just joke about it, because at this point it is what it is: joke material, since it is a huge joke itself.

It's all they've got. RT and ML have taken deep root, and it doesn't help that the consoles sit at entry-level GPU performance (6600 XT/3060) with a mere 2x RAM increase either.
They have every right to have their moment now. I personally don't mind it; it's all baseless, with not much to back it up. Let them have it, sportsmanship, you know.
 