Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Just an example, not an actual run. The Epic guy wanted to emphasize that you don't need top-end hardware to run UE5.

I thought this was expected, UE5 is going to run on mobile devices and current gen consoles.
PS5 wasn't made to run just UE5. Other games may require more or less SSD bandwidth. You can't build the hardware around one software implementation.

Yes, but now I think Cerny's presentation makes a lot more sense. A lot of people were saying that Cerny focused on things which were not that important because of the TFLOPS gap; after the UE5 reveal, a lot of what he said aligns with what we have seen from Epic. If they've been working closely, I wonder whether internal Sony game engines are opting for a similar approach. It is going to be an interesting generation if MS goes for a more "standard" approach while Sony pushes in new directions, and it will be interesting to see how that affects both 1st party and 3rd party games.
 

Indeed, the engineer may have wanted to downplay SSD needs, but his comment about optimising data layout so it would run on his high-end NVMe laptop gave the game away. That was only possible on a fixed flight path, not an open-ended one. And they wouldn't have done it unless it was necessary, probably because it taxed even the NVMe drive.
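
To make the data-layout point concrete, here is a minimal sketch (hypothetical names and structure, not Epic's actual cooking pipeline) of what optimising disk layout for a fixed flight path could look like: order assets in the package by the time the camera first needs them, so streaming becomes one long, mostly sequential read instead of scattered seeks.

```cpp
#include <algorithm>
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical record: one streamable asset plus the moment on the
// scripted flight path where the camera first needs it.
struct AssetRecord {
    std::string name;
    float firstNeededAtSec;  // time along the fixed flight path
    uint64_t sizeBytes;
};

// Write assets to disk in the order the flight path consumes them,
// so the streamer issues mostly sequential reads (fast even on a
// modest drive) instead of random seeks (where NVMe shines).
void LayoutForFlightPath(std::vector<AssetRecord>& assets) {
    std::sort(assets.begin(), assets.end(),
              [](const AssetRecord& a, const AssetRecord& b) {
                  return a.firstNeededAtSec < b.firstNeededAtSec;
              });
    // A real cooker would now pack the assets in this order, so file
    // offsets increase monotonically with flight-path time.
}
```

The catch the post identifies: this only works when the camera path is known in advance. With free player movement, "first needed" is unknowable at cook time, and raw random-read performance matters again.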
 
Well, Huang did say in a slide back in late 2019 that a "next-gen console" is less than a notebook RTX 2080.


Xbox Series X has very similar performance to a desktop RTX 2080 according to DF, and that laptop part performs similarly to an RTX 2070 (or even an RTX 2060, depending on the variant), so that "laptop > next gen" claim isn't right.

https://www.notebookcheck.net/NVIDIA-GeForce-RTX-2080-Max-Q-Graphics-Card.386276.0.html
 
I guess if Sony allowed Epic to show that demo on PS5, their first-party showing for the reveal must be at the very least as impressive.

People must not forget that the Unreal Engine 5 tech demo is exactly that: a tech demo. What Sony will show will probably be actual games, which, if they get close to what Epic showed, will be more impressive considering the real constraints of an actual game compared to a tech demo.
 
Everything seen in the Epic demo can be achieved in a game, they insisted on that, BUT the hard thing would be keeping that asset quality through a whole game. And the monstrous game size.
The first "flying through the city" trailer of Spider-Man 2 will be "Epic" :D
 
What do you mean by your first question? Could you write it out with numbers?
The assumption is that if there's spare GPU power available, the resolution would bump up above 1440p. We'd expect that when performance demands are high, resolution drops below 1440p to hit a stable 30 fps, and when demands are low, resolution goes up above 1440p at that same stable 30 fps. Technically, though, there's nothing stopping the engine from scaling resolution below 1440p to maintain framerate while never scaling above it, instead letting the framerate rise above 30 fps, other than that being a weird choice. But if 1440p was picked for some other reason, it could be that some scenes can run at 1440p, 40 fps, and are being capped at 30 fps. As such, I wouldn't rule that out as nonsensical just yet, even though I can't think of a scenario where it makes sense; as I don't know how the engine works or what the different bottlenecks are, I'm not in any position to hazard a guess. ;)
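
As an illustration of the two policies being contrasted, here is a toy scaler, assuming GPU frame time grows roughly linearly with pixel count; this is certainly not Epic's actual heuristic:

```cpp
// Toy dynamic-resolution step. "scale" is the fraction of the target
// pixel count being rendered; GPU frame time is assumed to scale
// roughly linearly with pixel count. Not UE5's actual scaler.
float NextResolutionScale(float currentScale, float gpuFrameMs) {
    const float targetMs = 1000.0f / 30.0f;  // 33.3 ms budget for 30 fps
    float newScale = currentScale * (targetMs / gpuFrameMs);

    // Policy A (what the demo appears to do): clamp at the ceiling,
    // so a scene that finishes early idles with the framerate capped.
    // Policy B would drop the clamp and let the framerate rise instead.
    if (newScale > 1.0f) newScale = 1.0f;    // 1.0 == the 1440p ceiling
    if (newScale < 0.25f) newScale = 0.25f;  // sanity floor
    return newScale;
}
```

With the clamp in place, a scene that could run at 40 fps simply sits at the resolution ceiling with the framerate capped at 30, which is exactly the behaviour being puzzled over.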

Okay, I have hazarded a guess: 1440p is a fixed buffer size. It could be selected specifically to have a static geometry* buffer that can be optimised for its fixed size, and from that we can render lower framebuffers but not higher ones. Well, you could upscale, but that could be seen as pointless: if the majority of the game renders at 1440p in very good clarity, why up that fidelity a tiny amount at some points?

* Or anything else in what is likely a complex scene representation.
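
If 1440p really is a fixed allocation, the usual trick (again only a sketch of the general technique, not confirmed UE5 behaviour) is to create the render targets once at 2560x1440 and render into a scaled viewport sub-rectangle, so resolution can drop below the allocation but never rise above it:

```cpp
#include <cstdint>

// Buffers are created once at 2560x1440 and never resized, so memory
// layout and any size-dependent optimisation stay stable.
constexpr uint32_t kBufferW = 2560;
constexpr uint32_t kBufferH = 1440;

struct Viewport { uint32_t x, y, width, height; };

// Dynamic resolution then just shrinks the rendered sub-rectangle.
// Anything above scale = 1.0 is impossible: there are no pixels
// beyond the fixed allocation to render into.
Viewport ViewportForScale(float scale) {
    if (scale > 1.0f) scale = 1.0f;  // hard ceiling from the allocation
    if (scale < 0.0f) scale = 0.0f;
    return Viewport{0, 0,
                    static_cast<uint32_t>(kBufferW * scale),
                    static_cast<uint32_t>(kBufferH * scale)};
}
```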
 
We were just told dynamic resolution targeting 4K, where the demo spends most of its time at 1440p. I do not think it is anything overly complex. It is 1440p because that is what the GPU can manage with the complexity of the lighting and Nanite. Next-gen graphics, especially GI, being expensive sounds about right and aligns with expectations.

Why imagine some scenario we have never heard of before instead of how dynamic resolution works in basically any case?
The simplest explanation is often...
 
We were just told dynamic resolution targeting 4K, where the demo spends most of its time at 1440p.
If they said that, I missed it.

Why imagine some scenario we have never heard of before instead of how dynamic resolution works in basically any case?
Because it's a new rendering technique and the information is contradictory: one source says the PS5 demo is framerate-capped and could run faster. The simplest explanation may be the right one, but we shouldn't ignore other possibilities just to be able to assume we know how things are working ahead of Epic telling us in their technical presentation. ;) What's the motivation to think we know everything without questioning the various possibilities?
 

Well lol.

I also thought that; it's only true for the PS5. NV seemed to know something at least, as this exact Max-Q GPU resurfaces again. UE5 tests must have been done on various platforms, and the tech demo is more than a few days old.
 
As already said, the Epic engineer said that HIS laptop runs the demo in the editor at 40 fps (so the true performance is higher). His laptop, not the one used by the journalist to conduct the interview and show the video. Several people translated this; no need for denial.

Yeah, I know. But originally it was an mp4 file, wasn't it?
 
Also, read this post on ERA :

I think that's simply a case of crossed wires over running an mp4 of the demo vs. talking about how the tech runs on a laptop. In the technical detail, nothing Sweeney or this Q+A reportedly revealed contradicts the other. What that Q+A more explicitly confirms is that Nanite can dynamically scale triangle fidelity according to your data access. 1080p at 2 tris per pixel should indeed be in the realm of MB/s rather than GB/s. Sweeney already said this will scale down, it just won't look as good. In saying "this needs MB/s, not GB/s", I don't think this engineer is at all claiming you'll get the same fidelity on MB/s, simply that the tech can scale down to that level.

Dunno if ERA is forbidden here:

https://www.resetera.com/threads/ti...-awesome-on-both.206223/page-18#post-34243503
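
The MB/s-versus-GB/s claim in that quoted post is easy to sanity-check with back-of-envelope numbers. Only the 1080p resolution and the 2 triangles per pixel come from the engineer's remark; the per-frame change rate and compressed bytes per triangle below are my own assumptions:

```cpp
#include <cstdio>

int main() {
    // Figures from the remark: 1080p at ~2 triangles per pixel.
    const double pixels       = 1920.0 * 1080.0;  // ~2.07M pixels
    const double visibleTris  = pixels * 2.0;     // ~4.1M triangles

    // Assumptions: only a small slice of the visible geometry is newly
    // streamed each frame, at some compressed cost per triangle.
    const double newPerFrame  = 0.02;  // ~2% of geometry changes per frame
    const double bytesPerTri  = 2.0;   // assumed compressed bytes/triangle
    const double fps          = 30.0;

    const double mbPerSec =
        visibleTris * newPerFrame * bytesPerTri * fps / 1.0e6;
    std::printf("Estimated streaming need: %.1f MB/s\n", mbPerSec);
    // Prints ~5.0 MB/s; even if these assumptions are off by 10-50x,
    // the requirement stays in MB/s rather than GB/s territory.
}
```

Even granting a large margin of error on those assumptions, the result stays comfortably below SSD-class bandwidth, which is the point being made.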
 

His translation is accurate. The entire activity was quite focused on cross-platform development even though the demo ran on PS5. When he mentioned that he doesn't need a fast SSD, he was referring to the demo, and used a 2K display with 2 triangles per pixel as an example. He also acknowledged that they rely on fast I/O for large game-world streaming. Overlapped IO (since UE 4.25?), compression, and better disk layout are other tools he uses to help with streaming.
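
For reference, "overlapped IO" is the Windows term for asynchronous file reads: issue the request, keep working, collect the result later. A minimal sketch of the general technique (hypothetical pak filename; this is not UE's actual streaming code):

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    // Open with FILE_FLAG_OVERLAPPED so reads can run asynchronously.
    HANDLE file = CreateFileA("assets.pak", GENERIC_READ, FILE_SHARE_READ,
                              nullptr, OPEN_EXISTING,
                              FILE_FLAG_OVERLAPPED, nullptr);
    if (file == INVALID_HANDLE_VALUE) return 1;

    static char buffer[1 << 20];  // 1 MiB chunk
    OVERLAPPED ov{};              // read offset 0
    ov.hEvent = CreateEventA(nullptr, TRUE, FALSE, nullptr);

    // ReadFile returns immediately with ERROR_IO_PENDING; the game
    // thread is free to simulate/render while the disk works.
    if (!ReadFile(file, buffer, sizeof(buffer), nullptr, &ov) &&
        GetLastError() != ERROR_IO_PENDING) return 1;

    // ... other frame work would happen here ...

    DWORD bytesRead = 0;
    GetOverlappedResult(file, &ov, &bytesRead, TRUE);  // TRUE = wait
    std::printf("Read %lu bytes without blocking the frame\n",
                static_cast<unsigned long>(bytesRead));

    CloseHandle(ov.hEvent);
    CloseHandle(file);
    return 0;
}
```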

And as I posted several times now, he did not state any resolution, effects, or other settings when he said their laptops ran the opening scene at 40 fps in the editor (assets not fully "cooked"). He used the 40 fps laptop example in the context of reinforcing that he can/will hit 60 fps on the next-gen consoles.

People are getting too negative about the event. The engineers were all positive about their achievements and what they can do. After the event, I have assumed that they can hit 60 fps on the next-gen consoles (albeit at an unknown resolution). Also looking forward to seeing a live example of world streaming in a game.

EDIT:
And as for Epic Shanghai, the event was tedious to watch. I understand that celebration is in order, but there's too much eating, off-topic small talk, cross-talk, and unfocused remarks in the show. A geek like me would have watched the presentation back-to-back multiple times, but so far I have given up every time after about 10 minutes. It's too difficult to get into.
 

But I don't get it. Of course, I'm a layman. But if that Chinese dev just ran an .mp4 file, how could he possibly know all that from an .mp4 file? He didn't have the engine source code in his editor to do things on the fly, just the .mp4 of the PS5 demo shown a few days ago??
 
Yes, I forgot about that part when he said it. The sentiment he's conveying is almost that next-gen consoles can easily reach 60 fps after optimization when even his laptop can do 40 fps. Also, yes, he didn't specify at any point that the demo ran at 1440p on his laptop.
A deep teardown of UE5 can't come soon enough.
 

Sweeney didn't watch the show, and he doesn't understand Mandarin. He probably jumped to the first conclusion that seemed logical to him, because the question didn't make sense at all. The 40 fps remark was made to imply the engineer had made (good) progress towards 60 fps on the next-gen consoles.
 