The scalability and evolution of game engines *spawn*

The engine is designed to cater for 60fps at 4K; my feeling is that performance takes priority over pushing visuals. The first engine demonstration trailer is probably indicative of the general look and feel of what's to come, more or less.
 
???
Who said that?

They have 4K, they have 60fps. They have PC as well, which can scale as high as people want.
They don't have to reduce the visual fidelity bar at all.

They have an engine that is catered towards Halo. That means they aren't designing a game to be a walking forest simulator, because that isn't the type of game it's going to be. It's going to support wide, expansive, fast gameplay with air and ground vehicles, long-distance aiming, and all sorts of large AOE-type weapons.

That is what the engine is catering towards.
 
It still has to run on the 1S, 1X and Lockhart at 60fps with all the stuff you mentioned, so one could assume the base level design, amount of enemy spawns, interaction, geometry layout etc. wouldn't change all that much between the platforms; otherwise one version could play quite differently mechanically. Take Halo 5 for example: the main differences between the 1X and 1S are the 4K resolution and improved AF. 99% of the asset quality is the same, as well as the lighting, shading, particles and level density.
Now I don't know how the Slipspace engine scales across all the versions, I'm just speculating and putting current gen into the equation. It would be interesting to see the difference between the 1S and XSX versions come release.
 
And why would an engine designed to support all those things be incapable of looking great or incapable of scaling up very high?
 
Just to play devil's advocate - the Series X is ~10x more powerful than the One S in pure flops and a LOT more in practical performance.

Games on the One are going to look pretty shoddy for the last few titles that are required to be present on both machines.

Edit: what's a 10th-15th of 4k?
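
For reference, a quick back-of-the-envelope check of that gap, using the commonly quoted peak figures of roughly 1.4 TF for the One S GPU and 12.15 TF for the Series X GPU (raw flops only, ignoring any RDNA 2 architectural gains):

Code:
# Raw peak-compute comparison; these are the widely published spec-sheet
# numbers, not measured game performance.
one_s_tflops    = 1.4     # Xbox One S GPU
series_x_tflops = 12.15   # Xbox Series X GPU

ratio = series_x_tflops / one_s_tflops
print(f"{ratio:.1f}x")    # ~8.7x in pure flops, i.e. close to an order of magnitude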
 
Engines are just engines. They are a culmination of audio, visual, inputs and systems all put together.
The Slipspace engine was designed from the ground up for next generation. It scales downwards.

Most other companies going into next gen took their existing engine and moved it upwards.

Like Decima. Like whatever Insomniac uses. Unreal 5 scales up and down. All engines scale!

What most people perceive as 'good graphics' really comes down to the art team and production values. Any engine, even Unity, is capable of making CGI-type graphics, but Unity as an engine is not often used for AAA gaming.

Way too much credit is given to code, hardware, and how much an engine can utilize said hardware, without people giving the real credit to the cost and labour to produce the effects, the details, and the art for a game. Unreal 4 is used to make The Mandalorian!

That's all I have to say. If MS wants a good-looking game, they have to pay. The engine isn't going to save them from low production assets.

It's expensive. There are hardware budgets, but restrictions can be removed now that they are headed into next gen.

The need for 1080p/900p on Xbox isn't there. Reconstruction techniques are better. The X1X doesn't need to be 4K either.

In a world where they will put Doom Eternal on the Switch, and obviously the Switch was not the target platform, I'm really surprised by some commentary on here.
 
Just to play devil's advocate - the Series X is ~10x more powerful than the One S in pure flops and a LOT more in practical performance.

Games on the One are going to look pretty shoddy for the last few titles that are required to be present on both machines.

Edit: what's a 10th-15th of 4k?
about 720pish if you are comparing native to native.
 
And I say this as a Halo fan who has played them all (except Halo 5)... but has Halo ever really been on the bleeding edge in terms of graphics? I guess the first one sorta was at the time... what always stood out to me about Halo was more the physics engine, if anything.

I'm expecting good but not mind-blowing stuff with Infinite. At the end of the day it's got a Microsoft budget behind it.
 
I dunno, I wasn't around back then, well I had the games, but I didn't compare things in this way. I think for a short period of time Halo 3 was considered to be fairly high end. Halo 4 was also fairly impressive for what they accomplished looking at the hardware they worked with.

From what I can see right now, in what they've shown, a lot of high quality DOF, high quality dynamic lighting, materials are looking very good. Huge vistas, landscapes etc.

I'm most impressed by lighting. That's just me. And dynamic lighting is one of the highest computation costs for graphics, so if you can nail amazing dynamic lighting and have amazing graphics alongside that - you've succeeded in my books.

Minecraft RTX is still the tour de force for me, hoping to see more of that.
 
about 720pish if you are comparing native to native.

Looks grimmer than that. I calculate either:

1152x648
768x432

It'll more likely be the lower end of those numbers too. Maybe I'm even being conservative. What other compromises can they make?

Goes to show the differences between the two machines though. Wowzer.
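
As a rough check of where those guesses sit relative to native 4K by pixel count (straight native-to-native division, ignoring reconstruction or settings changes):

Code:
# Fraction of native 4K that each candidate resolution represents, by pixel count only.
px_4k = 3840 * 2160                        # 8,294,400 pixels

for w, h in [(1152, 648), (768, 432)]:
    frac = px_4k / (w * h)
    print(f"{w}x{h}: about 1/{frac:.0f} of 4K")   # ~1/11 and ~1/25 respectively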
 
I dunno, I wasn't around back then, well I had the games, but I didn't compare things in this way. I think for a short period of time Halo 3 was considered to be fairly high end. Halo 4 was also fairly impressive for what they accomplished looking at the hardware they worked with.

From what I can see right now, in what they've shown, a lot of high quality DOF, high quality dynamic lighting, materials are looking very good. Huge vistas, landscapes etc.

I'm most impressed by lighting. That's just me. And dynamic lighting is one of the highest computation costs for graphics, so if you can nail amazing dynamic lighting and have amazing graphics alongside that - you've succeeded in my books.

Minecraft RTX is still the tour de force for me, hoping to see more of that.

Yeah, like I'm not saying it looks bad so much as it doesn't have the same fidelity and detail as some games... and that could be down to it having more expansive level design, at least traditionally. My memory is a bit hazy, but I think I was more blown away by Gears of War and COD 4 than Halo 3 at the time. I do remember being impressed by Halo 4, but I can't tell if that was more due to the fact it was running on a 360, lol... maybe Infinite could be the Xbox One's Halo 4, so to speak...
 
Looks grimmer than that. I calculate either:

1152x648
768x432

It'll more likely be the lower end of those numbers too. Maybe I'm even being conservative. What other compromises can they make?

Goes to show the differences between the two machines though. Wowzer.
you doing it by pixel count or by resolution?

1280 * 720 = 921600px
3840*2160 = 8294400px

= 9x pixel difference.
If you 1/2 the frame rate and reconstruct there should be enough room there to make up the power differential.

Even then, it's conservative. It's a 10x raw power differential. We didn't take into account how efficient they are at leveraging each clock cycle.
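
A quick sketch of that arithmetic (pixel counts only; reconstruction quality and per-clock efficiency differences are ignored here):

Code:
# Per-frame and per-second pixel budgets: native 4K at 60 fps vs native 720p at 30 fps.
px_4k  = 3840 * 2160        # 8,294,400 pixels
px_720 = 1280 * 720         #   921,600 pixels

print(px_4k / px_720)                    # ~9x pixel difference per frame
print((px_4k * 60) / (px_720 * 30))      # ~18x per second once the frame rate is halved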
 
Way too much credit is given to code, hardware, and how much an engine can utilize said hardware, without people giving the real credit to the cost and labour to produce the effects, the details, and the art for a game.
That's probably a relic from previous generations, where the engine empowered the artist and there was a clear difference between a 'good' engine and a not-so-good engine. Getting a good-looking game on PS2 meant very clever coders who could write an effective engine that got the most from it. These days the hardware is very accessible and any game can use PBR and baked lighting to look good, so there ain't a huge amount to differentiate on overall look, and the difference will come in terms of optimisation and how many FPS a game gets with a particular engine.
 
you doing it by pixel count or by resolution?

1280 * 720 = 921600px
3840*2160 = 8294400px

= 9x pixel difference.
If you 1/2 the frame rate and reconstruct there should be enough room there to make up the power differential.

Even then, it's conservative. It's a 10x raw power differential. We didn't take into account how efficient they are at leveraging each clock cycle.

I was counting by pixels. It's highly unlikely to be a 9x difference between them; that's only in raw tflops. Didn't Cerny state that the RDNA 2 CUs are like 60% larger?
 
If you 1/2 the frame rate and reconstruct there should be enough room there to make up the power differential.
The lower the resolution, the worse reconstruction looks, especially if the lower one is then also half the framerate.

So it could easily be 2160p reconstruction on one, with the 720p version reconstructed and then upscaled; the lower one isn't gaining much then as a way to claw back performance.

Although in this particular scenario you would be dropping settings also, not just resolution.
Graphically there will be ways around it; it's the game design and what is scoped that suffers more.
 
I was counting by pixels. It's highly unlikely to be a 9x difference between them; that's only in raw tflops. Didn't Cerny state that the RDNA 2 CUs are like 60% larger?
Yes.
But CUs are all-encompassing. Raw throughput is still raw throughput. Most code may not fully leverage every aspect of a CU (at least not simultaneously).
The real question is whether they designed an engine that will scale without needing much of the fixed function pipeline. If they can get away with doing as much as possible on compute and forgoing the fixed function pipeline, then I'm no longer sure what the XBO is capable of at its fullest. Texture quality will suffer for speed, however. Just thinking out loud. Those would be the tradeoffs.

If the game is going to go through the traditional unified shader pipeline, then I think we have a pretty solid idea of its limitations.
 
Looks grimmer than that. I calculate either:

1152x648
768x432
2160p is 3840 x 2160 == 8294400 pixels. 10% of that is 829440 pixels. That's a little under 720p, about 1216 x 684.
you doing it by pixel count or by resolution?
Direct calculation is SQRT(pixels / 144) for a 16:9 aspect screen, i.e. SQRT(pixels / (16 x 9)); that gives the multiplier k for a 16k x 9k resolution.

Even then, it's conservative. It's a 10x raw power differential. We didn't take into account how efficient they are at leveraging each clock cycle.
Realistically that far favours a larger delta as XBSX is a significant advance in hardware efficiency at those flops. If we go with the rule of thumb RDNA versus GCN flops is 3:2 or whatever, you'd be looking at maybe more like XBSX being 15x the GPU power.

Hmmm, that's actually a huge difference. :oops: What is that in GPU terms? What games run at 4K 60 fps on a 2070/2080, and how well do those games run on a...750?
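
A small sketch of that calculation (the function name is just for illustration, and the 1/10 and 1/15 fractions are the raw-power ratios being discussed, not measured figures):

Code:
import math

def res_16_9(pixels):
    # For a 16:9 frame: pixels = (16k) * (9k) = 144 * k^2, so k = sqrt(pixels / 144).
    k = math.sqrt(pixels / 144)
    return round(16 * k), round(9 * k)

px_4k = 3840 * 2160
print(res_16_9(px_4k / 10))   # ~(1214, 683), a little under 720p
print(res_16_9(px_4k / 15))   # ~(991, 558), closer to 540p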
 
I'd hazard a guess that it won't be long before Microsoft drop the mandate to run on both gens.

...which is a good thing to my mind.

Edit: we haven't even considered CPU limitations.
 
Looks grimmer than that. I calculate either:

1152x648
768x432

It'll more likely be the lower end of those numbers too. Maybe I'm even being conservative. What other compromises can they make?

The single player campaign if we take the Activision approach. :runaway::runaway:

Back to reality.

I suspect dropping RT work, pixel count and draw distances / load will lower the GPU overhead. It's unlikely to be running at the same graphical settings.

I also expect to see much more half-rate enemy animation and other CPU-saving tricks. This is where there is less wiggle room; likewise asset loading issues on the base console.
 