Digital Foundry Article Technical Discussion Archive [2015]

It presumably must work for some games, since we have quite a few that adjust resolution on the fly (id Tech 5 games, Halo 5, etc.). I guess none of those games push the CPU to 100%, then.
Surely that's for GPU power, not CPU. The CPU requirements will be the same for H5 at all resolutions, and it just drops resolution when the GPU can't draw everything at full res.
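To illustrate the kind of mechanism being discussed, here's a minimal sketch of a dynamic resolution controller: the GPU-side render scale is nudged up or down based on measured GPU frame time while the CPU-side simulation is left alone. This is only the general idea, not how id Tech 5 or Halo 5 actually implement it, and all the thresholds and step sizes are invented numbers.

```cpp
// Hypothetical dynamic resolution controller for a 60 fps GPU budget.
// Not any shipping engine's code; values are illustrative only.
#include <algorithm>

struct DynamicResolution {
    float scale    = 1.0f;   // fraction of native resolution per axis
    float minScale = 0.7f;   // assumed floor (e.g. roughly 900p when native is 1080p)
    float budgetMs = 16.6f;  // GPU time budget for a 60 fps target

    void update(float gpuFrameMs) {
        if (gpuFrameMs > budgetMs)               // GPU over budget: drop resolution
            scale -= 0.05f;
        else if (gpuFrameMs < budgetMs * 0.85f)  // comfortable headroom: raise it
            scale += 0.02f;
        scale = std::clamp(scale, minScale, 1.0f);
    }

    int scaledWidth (int nativeW) const { return static_cast<int>(nativeW * scale); }
    int scaledHeight(int nativeH) const { return static_cast<int>(nativeH * scale); }
};

int main() {
    DynamicResolution dr;
    dr.update(19.0f);                 // pretend the GPU took 19 ms: scale drops
    int w = dr.scaledWidth(1920);     // next frame renders at a reduced resolution
    int h = dr.scaledHeight(1080);
    (void)w; (void)h;
}
```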
 
Yes, what people seem to get wrong here is the assumption that when you take a 60fps game and scale it down to 30fps, it'd automatically max out the entire system. But this is not the case; most of the CPU would sit idle a lot of the time, as the underlying game systems and datasets were designed with 16ms frame times in mind.

So, such a scaled-back game would indeed look somewhat better at 30fps than at 60fps; but it would still be limited by a lot of the trade-offs made to have it run at that framerate. Thus it'd always be inferior to a game that was designed to run at 30fps from the start, while also losing its only real advantage (speed). This choice makes no sense to me, personally.
 
I guess this idea of reduced graphics settings = higher frame rate comes from the PC world, where games are usually more GPU than CPU bound. Hence graphics can often be tailored to give a desired frame rate. But on consoles it seems the opposite is usually true, and so varying graphics won't have as great an impact on overall frame rate.
 
Unless the game is originally designed with a small frame time. So scaling to a larger one would be fairly simple.

Or it is only GPU limited from the start.
 
Perhaps the Xbox One is still at least a bit GPU bound, in specific areas and/or specific games? Or resources are still not being used optimally. For instance, didn't I read that DX12 features allowed CU-based culling and this helped performance? That would suggest a surplus in the CU department but a limit in dedicated hardware elsewhere. Or was that PC only?

Of course the same holds elsewhere, even PC, but with different bottlenecks. For instance PCs have (or should have?) very different interplay between CPU and GPU.
 
Unless the game is originally designed with a small frame time. So scaling to a larger one would be fairly simple.

I'm not sure what you mean here...

A game designed to run at 60fps is built to complete all calculations and data movement within 16ms. It probably makes a lot of trade-offs to achieve this.

Now you decide that the framerate should be scaled back to 30fps as an option. Suddenly the CPU has 33ms to calculate stuff and move data around. But the game systems can't simply scale up... so what would you have the game do in the extra time? Changing AI, simulations, number of enemies etc. is not possible, as in that case you'd get a very different game; it'd have to be re-tuned and re-tested, and it'd practically be a separate product.

So the game code stays the same. Some graphics-related stuff might be scaled up, but the entire point of this hw generation is to move as much of the work to the GPU as possible, so there's less and less to do on the CPU here. So what you get is a lot of wasted CPU cycles and underutilized memory buses, and visuals that are somewhat better than they were at 60fps, but still not fundamentally different.
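A minimal sketch of that point, assuming a simple single-threaded frame loop; simulate() and render() are placeholders rather than any real engine's code. The CPU-side work was tuned against a roughly 16.6ms budget, so capping the loop at 30fps just leaves it sleeping away the surplus.

```cpp
// Hypothetical frame loop: the per-frame CPU work does not grow just because
// the frame budget doubles, so at a 30 fps cap the surplus time is simply idle.
#include <chrono>
#include <thread>

void simulate() { /* fixed amount of game-logic work, tuned against ~16.6 ms */ }
void render()   { /* build command lists / submit the frame to the GPU */ }

void runFrameLoop(double targetMs) {            // 16.6 for 60 fps, 33.3 for 30 fps
    using clock = std::chrono::steady_clock;
    using msd   = std::chrono::duration<double, std::milli>;
    while (true) {
        auto start = clock::now();
        simulate();                              // costs roughly the same at either cap
        render();
        msd elapsed = clock::now() - start;
        if (elapsed.count() < targetMs)          // at 33.3 ms this gap is wasted CPU time
            std::this_thread::sleep_for(msd(targetMs - elapsed.count()));
    }
}

int main() { runFrameLoop(33.3); }               // 30 fps cap; same simulate() as at 60
```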
 
Yes, what people seem to get wrong here is the assumption that when you take a 60fps game and scale it down to 30fps, it'd automatically max out the entire system. But this is not the case; most of the CPU would sit idle a lot of the time, as the underlying game systems and datasets were designed with 16ms frame times in mind.

So, such a scaled-back game would indeed look somewhat better at 30fps than at 60fps; but it would still be limited by a lot of the trade-offs made to have it run at that framerate. Thus it'd always be inferior to a game that was designed to run at 30fps from the start, while also losing its only real advantage (speed). This choice makes no sense to me, personally.

Could you use this to run split screen?

Calculate and render each player alternately so they get an effective 30fps each?
 
Split screen will run the universe/game physics on the one machine and share the resources. Split screen is a rendering overhead, not a CPU one (bar what the CPU needs to do to get the scene rendered): drawing the same scene twice per frame.
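Roughly, in code (illustrative names only, not any actual engine API): the world is simulated once per frame and split screen adds a second render pass; the "alternate" path is the earlier suggestion of refreshing only one view per frame, giving each player an effective 30Hz image while the simulation stays at 60Hz.

```cpp
// Sketch of a split-screen frame: one shared simulation, one render pass per view.
struct Viewport { int playerIndex; };

void simulateSharedWorld() { /* one world update, shared by both players */ }
void renderView(const Viewport&) { /* draw the scene from one player's camera */ }

void splitScreenFrame(Viewport views[2], int frameIndex, bool alternate) {
    simulateSharedWorld();                   // CPU-side cost is paid once per frame
    if (alternate) {
        renderView(views[frameIndex & 1]);   // each view refreshes every other frame
    } else {
        renderView(views[0]);                // classic split screen: the scene is
        renderView(views[1]);                // drawn twice, so the overhead is GPU-side
    }
}

int main() {
    Viewport views[2] = {{0}, {1}};
    for (int frame = 0; frame < 4; ++frame)
        splitScreenFrame(views, frame, /*alternate=*/true);
}
```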
 
But also remember: taking Halo 5 and running it at 30 fps would make it a completely different game. So it's not like they could just add splitscreen co-op by halving the framerate.
 
Now you decide that the framerate should be scaled back to 30fps as an option. Suddenly the CPU has 33ms to calculate stuff and move data around. But the game systems can't simply scale up... so what would you have the game do in the extra time? Changing AI, simulations, number of enemies etc. is not possible, as in that case you'd get a very different game; it'd have to be re-tuned and re-tested, and it'd practically be a separate product.
I’d suggest nothing needs to be done. It can remain idle. I agree that it’s not the best use of resource, but you’re not losing out on anything compared to the 60hz game, you’re getting exactly the same. And you definitely don’t want to start changing the whole simulations based on the framerate.
Some graphics-related stuff might be scaled up, but the entire point of this hw generation is to move as much of the work to the GPU as possible, so there's less and less to do on the CPU here. So what you get is a lot of wasted CPU cycles and underutilized memory buses, and visuals that are somewhat better than they were at 60fps, but still not fundamentally different.

Isn’t that essentially the same as what you’d get if the game were designed as 30fps anyway? The only loss would be the unused cycles.

Let’s be honest here, some people really do prefer resolution over framerate (I’m not one of them), so an option to play Halo 5 at 1920x1080 @ 30fps (static!) and some better AA and shading could be a bonus to some. Especially in the single player game. I don’t agree that it fundamentally changes the game.

And besides, didn’t TLOU:R actually implement a 30fps option? If I remember correctly that actually had some better shadowing.
 
But also remember: taking Halo 5 and running it at 30 fps would make it a completely different game.

Would it though? If the CPU cycles went unused it'd be the same game just at a lower framerate. I imagine 343i could have splitscreen if they wanted 30fps. It'd be fairly awful for those using it if they're playing competitively with people running theirs at 60fps. They'd get ruined.
 
Well, for a start you'd get more latency in the controls, so I'd presume that the way your character moves around in the game world and how weapons and physics work would get compromised to some degree; at the very least they'd be quite different from the 60fps online experience. 343 would have to rebalance the whole thing, probably including the levels themselves.
 

You seem to completely ignore the fact that 60fps is about a lot more than just the visuals. All the underlying game systems become different as well.

Even Quake 1's comparatively simple workings were affected to the extent that some jumps on some levels were only possible above a certain framerate. And Halo 5 is quite likely a lot more complex than that ~20-year-old game.
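A toy example of that kind of framerate dependence (not Quake's actual movement code; the jump velocity and gravity values are invented): with a plain per-frame integration of a jump, the apex height depends on the timestep, so a higher framerate can clear ledges that a lower one can't.

```cpp
#include <cstdio>

// Semi-implicit per-frame integration of a jump; the discrete apex falls short of
// the analytic one by roughly v0*dt/2, so bigger timesteps mean lower jumps.
double jumpApex(double dt) {
    double y = 0.0, v = 270.0;        // made-up jump velocity, units/s
    const double g = 800.0;           // made-up gravity, units/s^2
    double apex = 0.0;
    while (y >= 0.0) {
        v -= g * dt;                  // velocity updated first...
        y += v * dt;                  // ...then position, once per frame
        if (y > apex) apex = y;
    }
    return apex;
}

int main() {
    std::printf("apex @  30 fps: %.2f\n", jumpApex(1.0 / 30.0));   // lowest apex
    std::printf("apex @  60 fps: %.2f\n", jumpApex(1.0 / 60.0));
    std::printf("apex @ 125 fps: %.2f\n", jumpApex(1.0 / 125.0));  // highest apex
}
```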
 
Couldn't you therefore lock the physics? I'm fairly sure all games do it these days, otherwise adding a new CPU to a PC would always cause the physics to go wonky.
 
@sebbbi's game, Trials, would be really massively impacted by 30Hz. Providing the option for it would be dumb; it would be a relatively terrible experience. And for games like The Witcher, to get the game to run at 60Hz on console they'd have to make massive compromises. I don't see how they could do it without ruining the game. In the end, it's up to the developers.
 
Couldn't you therefore lock the physics? I'm fairly sure all games do it these days, otherwise adding a new CPU to a PC would always cause the physics to go wonky.
Yeah, usually engines update physics separately from the frame update code nowadays to avoid wonkiness. "Fixed update" is the term most people use for it, I think.
But having said that, if you are designing animation for a specific framerate, I'm not entirely sure fixed update resolves this issue. @Laa-Yosh might be able to comment more on animation fps.
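For reference, a minimal sketch of that fixed-update idea, along the lines of the widely used accumulator pattern rather than any specific engine's implementation: physics always advances in constant-size steps no matter how fast or slow rendering runs, which is what keeps the simulation consistent across framerates. As noted, per-frame animation sampling is a separate problem this doesn't solve.

```cpp
#include <chrono>

void fixedPhysicsStep(double dt) { /* advance the simulation by exactly dt seconds */ }
void renderFrame(double alpha)   { /* draw, optionally interpolating by alpha */ }

void gameLoop() {
    using clock = std::chrono::steady_clock;
    const double fixedDt = 1.0 / 60.0;          // constant physics timestep
    double accumulator = 0.0;
    auto previous = clock::now();

    while (true) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Run however many fixed steps the elapsed real time allows, so physics
        // behaves the same whether the renderer manages 30, 60 or 144 fps.
        while (accumulator >= fixedDt) {
            fixedPhysicsStep(fixedDt);
            accumulator -= fixedDt;
        }
        // Render at whatever rate the GPU/display allows; alpha lets the renderer
        // blend between the last two physics states to hide the step boundary.
        renderFrame(accumulator / fixedDt);
    }
}

int main() { gameLoop(); }
```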
 
You can do that technically, but it doesn't guarantee you that the gameplay stays intact.

I also find it strange that you're so bent on attempting to prove the validity of a feature that no console game dev would ever consider, but hey, whatever...
 
BTW a simple theoretical parallel would be to take a 30fps game, lock it at 15fps and boost up the graphics. I suppose it should be quite clear how much of the gameplay would suffer there and how it'd be completely different to interact with, compared to the original...
 
I also find it strange that you're so bent on attempting to prove the validity of a feature that no console game dev would ever consider, but hey, whatever...

Wow, that's a bit harsh isn't it? I'm not 'bent' on it, I was told it couldn't be done and I justified it.

If you look back to the original post I actually stated that I thought 343i had made the right choices.
 