Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

I didn't conveniently leave out anything. He wasn't asked if the same demo could run on PC. He was asked what kind of PC would be needed to run the same demo, to which he answered that it's too early to know. Perhaps you can't run it at all, or perhaps you need a 24-core Threadripper so you can dedicate 8+ cores just for software decompression. Or perhaps, with a significantly slower I/O, you'll need a PC with 64GB of RAM to keep most of the level within RAM, after a loading screen to decompress the data.

Adding a loading screen is NOT the same demo and will make benchmarking completely useless.
That makes zero sense...
 
Did all X360 games need to be runnable on an HDD-less X360?
Honest question, because I thought that after a couple of years a lot of games were demanding the HDD to even run, and that by ~2010 at least all AAAs were in that position.
Pretty much all had to be capable of running, but that didn't mean they didn't suffer in some way. There were definite improvements (for the vast majority, at least) when HDD installation was made possible (late 2008).
 
Samaritan looks dated now.

Don't confuse your opinion with facts.
The volumetric lights, the bokeh quality, the physically driven rain particles, etc. have no match on PS4. The only dated things are the low-poly models and texturing, but even those aren't touched by actual PS4/XB1 games yet, IMHO.

Do you think that even the Infiltrator UE4 demo "looks dated now"?

 
And what is my interpretation?
Last page you were accusing me of securing a narrative that only PS5's IO makes this possible (i.e. throwing a fanboy accusation, how nice); I asked for proof of that and you didn't provide it.
Mate, you need to calm down a bit. ;)

The words you wrote in your first contribution to this thread were such that I read them to be saying this was only possible on PS5. I'm not going to bother quoting them and explaining my interpretation as that's a trap where we all go off into one of those internet discussions where people start quoting dictionary entries. :mrgreen: We can go back and forth pointing out each others exact words, and the exact interpretations (does running this demo at lower assets count as running the demo or not? etc), but chasing down who exactly is right or wrong isn't at all constructive.

The main issue is you said there shouldn't be debate on what's going on. Turning up saying, "everyone stop talking now, I've sorted it out and will tell everyone exactly how it is," isn't helpful, and sets everyone up for some more of the same debate about the debate rather than the topic. There's just way too much of that in B3D's console discussions at the moment, and I'm as guilty as everyone else. If that wasn't the way you meant to present yourself, it's still the way it came across to me. It would have read better if you'd posted, "this is what I think is happening," rather than, "it's obvious what's happening; why is this still being talked about?"

Perhaps we all end up with posting habits and choice of language that sometimes get us into ruts when it comes to some discussions? Perhaps some of that is inherited from the wider internet culture and shapes our thought patterns? Whatever, it'd be nice if there was more sharing of ideas as ideas rather than trying to establish a set of ideas as the working factoids. Even if some people believe the demo ran as is on PC when it didn't, or the opposite, that doesn't really matter and isn't worth establishing over pages of tweet and rumour analysis.
 
PS4 could run the UE4 Elemental demo. Was it looking as good as the PC version? Hell no. Did actual games catch up to or surpass it? Yes, IMO.
I confess I never really understood the appeal of the Elemental demo. It seemed to overlap most of its visuals and feature exposition with Fire Strike, and it didn't seem impressive at all.


Anyway, I've yet to see a game that reaches the Samaritan UE3 DX11 demo from 2013.
I'd say the Samaritan demo visuals were already largely surpassed by the later games of this gen, especially when running on the mid-gens.
Even the Infiltrator demo seems like something I could see on e.g. Gears 5, which I've been playing on the PC.


Adding a loading screen is NOT the same demo and will make benchmarking completely useless.
That makes zero sense...
Zero sense?
Which globally used 3D rendering benchmark that you know of measures loading screen times at the moment?
I'd rather have a 30-second loading screen every 30 minutes than not be able to play games with that IQ on my PC at all.
 
Don't confuse your opinion with facts.
The volumetric lights, the bokeh quality, the physically driven rain particles, etc. have no match on PS4. The only dated things are the low-poly models and texturing, but even those aren't touched by actual PS4/XB1 games yet, IMHO.

Do you think that even the Infiltrator UE4 demo "looks dated now"?


Batman: Arkham Knight matched Samaritan, and yes, Infiltrator is also dated. Forza Horizon 4, RDR2, Detroit, etc.
Now back to the topic: do you guys think Epic will do a tech reveal come GDC in August? Tim is claiming he is under NDA.
When do these NDAs usually lift? I'm sure they can't keep silent till next year.
 
Take any open-world or racing game on an HDD-less X360?

Neither the compute power of the source hardware nor the method of IO should matter. The same rules surely apply; the difference is what's being streamed.

Because of the tech, they compute the full-detail model from disk to reach pixel-level detail, I suppose:

if the system lacks compute power, it would start from a lower-resolution model, to make the conversion to pixel level easier and lighter (and/or process fewer pixels, i.e. a lower screen resolution)
if the system lacks SSD speed, it would start from a lower-resolution model, to take less I/O bandwidth (and/or use some system memory to cache the asset streaming)

Yeah, agreed on all points. We know the system scales the assets based on the speed of the IO; that's what makes me raise the "not fully cooked" comment.
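For what it's worth, here's a minimal sketch of that kind of budget-driven scaling, assuming a hypothetical streaming system with a few discrete detail levels. All names and numbers are made up for illustration; this isn't UE5 code, just the general idea of picking the finest level that both the SSD and the GPU can keep up with:

```cpp
// Toy sketch: pick the finest mesh LOD that fits both a per-frame I/O budget
// and a per-frame triangle (compute) budget. Hypothetical types and values.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

struct LodLevel {
    uint64_t bytesToStream;   // data that must come off disk for this level
    uint64_t triangleCount;   // geometry the GPU has to process for this level
};

// Returns the index of the finest LOD that fits both budgets,
// falling back to the coarsest level if nothing fits.
std::size_t PickLod(const std::vector<LodLevel>& lods,   // ordered finest -> coarsest
                    uint64_t ioBudgetBytesPerFrame,
                    uint64_t triangleBudgetPerFrame) {
    for (std::size_t i = 0; i < lods.size(); ++i) {
        if (lods[i].bytesToStream <= ioBudgetBytesPerFrame &&
            lods[i].triangleCount <= triangleBudgetPerFrame) {
            return i;  // finest level both the I/O and the GPU can sustain
        }
    }
    return lods.size() - 1;  // coarsest level as a last resort
}

int main() {
    std::vector<LodLevel> lods = {
        {64ull << 20, 20'000'000},  // LOD0: very heavy
        {16ull << 20,  5'000'000},  // LOD1
        { 4ull << 20,  1'000'000},  // LOD2
    };
    // A fast SSD paired with a weaker GPU ends up limited by triangles, not I/O,
    // so this picks LOD2 even though the I/O budget could stream LOD0.
    std::size_t lod = PickLod(lods, /*ioBudget*/ 90ull << 20, /*triBudget*/ 4'000'000);
    std::printf("Selected LOD%zu\n", lod);
}
```

The point of the sketch is just that either budget can be the limiter, which matches the "slower compute -> lower-resolution starting model, slower SSD -> lower-resolution starting model" framing above.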
 
Because of the tech, they compute the full-detail model from disk to reach pixel-level detail, I suppose:

if the system lacks compute power, it would start from a lower-resolution model
How so? Store lower-resolution data on storage? I don't think that's realistic (you could do platform builds like console, PC, mobile), and you'd have to perform the virtualisation at a coarse level in the tree/hierarchy. But then, if the geometry is encoded in some sort of 2D structured array (like a texture), would you fetch every other 'pixel'? Surely you'd have to fetch the entire geometry detail.
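Just to make the "every other pixel" idea concrete, here's a toy, purely hypothetical sketch of what sub-sampling a geometry-image-style grid of vertex positions would amount to (none of this is claimed to be how Nanite actually stores geometry). It does yield a coarser mesh, but only by discarding exactly the fine detail, which is the objection:

```cpp
// Toy illustration: vertex positions stored as a 2D grid ("geometry image").
// Fetching every other texel is just reading a half-resolution version of it,
// i.e. a coarser mesh with the fine detail thrown away.
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

// Take every 'step'-th sample in both dimensions of a width*height grid.
std::vector<Vec3> DownsampleGeometryImage(const std::vector<Vec3>& grid,
                                          int width, int height, int step) {
    std::vector<Vec3> coarse;
    for (int y = 0; y < height; y += step)
        for (int x = 0; x < width; x += step)
            coarse.push_back(grid[y * width + x]);
    return coarse;  // roughly 1/(step*step) of the original vertex data
}

int main() {
    const int w = 8, h = 8;
    std::vector<Vec3> grid(w * h, Vec3{0, 0, 0});   // stand-in for real positions
    auto coarse = DownsampleGeometryImage(grid, w, h, /*step*/ 2);
    std::printf("full: %zu verts, coarse: %zu verts\n", grid.size(), coarse.size());
}
```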
 
The words you wrote in your first contribution to this thread were such that I read them to be saying this was only possible on PS5.
First contribution, about raytracing:

They claimed UE5 will support ray tracing, but it's not being used in this demo.
Though the demo shows just how much the old rasterization still has in it.

Though considering how Epic is seemingly maxing out SSD and shader throughput for geometry and textures, I wonder if we'll see this level of geometry detail paired up with raytracing at all.

And then second, third, fourth, fifth..
¯\_(ツ)_/¯


Batman: Arkham Knight matched Samaritan, and yes, Infiltrator is also dated. Forza Horizon 4, RDR2, Detroit, etc.
Now back to the topic: do you guys think Epic will do a tech reveal come GDC in August? Tim is claiming he is under NDA.
When do these NDAs usually lift? I'm sure they can't keep silent till next year.

Volumetric Dog?

 

Because it is, and you're dismissing that just to secure a narrative that only PS5's IO makes it possible.
Agreed, but I speculated back in this post that the 'choice' of PS5 may not have been a choice at all, and that it was really PS5 or nothing, because the console's efficient I/O may have compensated for inefficiencies in the early state the UE5 engine is in. I think we can ignore marketing, as Sony uniformly ignored the event, not acknowledging it at all in any social media channels, despite the Epic guys saying they had worked closely with Sony. I'm sure they also worked closely with Microsoft.

Of course "systems integration" could be a number of factors, maybe PS5's devkit is further along and/or the tools are better for showing it off without other OS functionality getting in the way. But it's Tim Sweeney who keeps nodding towards the SSD and I/O so you can only ignore the guy who owns the company for so long.
 
not, the PS5 currently outperforms all other platforms for IO speed right now.

Pure transfer speeds? Nope, there's faster if you really want it. By the time the PS5 releases there will be 7GB/s drives. I'd rather have Optane if it were less expensive, though; it performs blazingly fast, more akin to DDR RAM than NAND.

Perhaps you can't run it at all, or perhaps you need a 24-core Threadripper so you can dedicate 8+ cores just for software decompression.

I doubt that laptop was equipped with a Threadripper or 64GB of RAM. Most likely 16 to 32GB of RAM, an 8-core Zen 2, a blazing-fast NVMe drive, and an RTX 2080. That system is already more powerful, so it isn't so strange that it performed better; framerate is usually tied to GPU and CPU power, especially with the GI tracing and high-quality rendering going on. The NVMe in there wouldn't have to be a limiting factor either: after what I've seen such a setup can achieve in SC, where the amount of detail is beyond anything else, fast travel can still be done.
 
Pure transfer speeds? Nope, there's faster if you really want it. By the time the PS5 releases there will be 7GB/s drives. I'd rather have Optane if it were less expensive, though; it performs blazingly fast, more akin to DDR RAM than NAND.



I doubt that laptop was equipped with a Threadripper or 64GB of RAM. Most likely 16 to 32GB of RAM, an 8-core Zen 2, a blazing-fast NVMe drive, and an RTX 2080. That system is already more powerful, so it isn't so strange that it performed better; framerate is usually tied to GPU and CPU power, especially with the GI tracing and high-quality rendering going on. The NVMe in there wouldn't have to be a limiting factor either: after what I've seen such a setup can achieve in SC, where the amount of detail is beyond anything else, fast travel can still be done.

I love how you've determined the entire spec of the engineer's laptop. Strange that such a powerful machine was having issues running the demo.

Not from a GPU perspective since the framerate sounds sufficient - it's almost like there was some other bottleneck.
 
Pretty much all had to be capable of running, but that didn't mean they didn't suffer in some way. There were definite improvements (for the vast majority, at least) when HDD installation was made possible (late 2008).

I don't know if that's true or not, but a handful of X360 games state on their cases that a Hard Drive is required to play the game. Battlefield 4, Wolfenstein: The New Order, Alan Wake, and Assassin's Creed: Rogue all have this disclaimer (those were the games in my collection that I could find). I don't have a HDD-less Xbox 360 to check whether or not they will play without it.
 
I love how you've determined the entire spec of the engineer's laptop. Strange that such a powerful machine was having issues running the demo.

Not from a GPU perspective since the framerate sounds sufficient - it's almost like there was some other bottleneck.

Neither GPU was powerful. The RTX 2080 mobile is equivalent to a desktop RTX 2070, and the laptop has the 2080 Max-Q version, which is weaker than the RTX 2080 mobile and roughly equivalent to a desktop RTX 2060. The NVIDIA GeForce RTX 2080 with Max-Q design is the power-saving variant of the mobile RTX 2080, with reduced clock speeds and power consumption.
 
Neither GPU was powerful. The RTX 2080 mobile is equivalent to a desktop RTX 2070, and the laptop has the 2080 Max-Q version, which is weaker than the RTX 2080 mobile and roughly equivalent to a desktop RTX 2060. The NVIDIA GeForce RTX 2080 with Max-Q design is the power-saving variant of the mobile RTX 2080, with reduced clock speeds and power consumption.

My PC is a Surface Pro. Ha.
 