Digital Foundry Article Technical Discussion Archive [2013]

So are they saying that even at 1080p the frames run over budget in certain circumstances? Crazy, but at least it's a more consistent 60 instead of constantly stuttering because of going higher.

So at the end of the day, the PS4 and Xbox One versions perform just about the same, but the PS4 does it at 2.2 times the pixel count.
 
So are they saying that even at 1080p the frames run over budget in certain circumstances? Crazy, but at least it's a more consistent 60 instead of constantly stuttering because of going higher.

Read a little more:

DF said:
The increase in pixel workload also means that the engine drops frames more often in demanding scenes. At worst we're looking at a drop down to 40fps when the engine is more heavily stressed, while most of the time the dips in performance stick to fluctuating between 50 and 60fps.
I guess sometimes it's over 60fps, although that's not made clear. It certainly drops lower than 60 though, sometimes down to 40. If XB1 is a more stable 60fps, the difference isn't 2.2x.
 
DF says that it goes above 60fps fairly frequently. For the most part, the FPS is above 50 and the majority of the time it's at 60. Rarely did it dip below 50 in DF's clip (I think I only saw it dip below 50 2-3 times briefly).
 
DF says that it goes above 60fps fairly frequently.
That's pre-patch. They describe the fluctuating framerate, their investigation, and their finding that the game rendering at >60fps was causing judder. They then describe the changes post-patch and how the judder is improved. It's left to the reader to guess whether the remaining judder is from >60fps or from drops to lower framerates. The first couple of minutes of their YouTube gameplay comparisons show frequent drops to fifty-something fps.
 
OIC.

Well, if they say that the judder improved after increasing the resolution, then most likely the framerate is still going above 60fps occasionally. I doubt people would complain about the frame rate dropping to the 50s... that's hardly noticeable.
 
It might be occasionally, but that'll probably be in fringe cases with little going on. The gameplay footage shows it's just about managing 60fps, so there can't be a lot of room to fly up above 60 except when there's much less to draw.
 
OIC.

Well, if they say that the judder improved after increasing the resolution, then most likely the framerate is still going above 60fps occasionally. I doubt people would complain about the frame rate dropping to the 50s... that's hardly noticeable.

Speak for yourself; a frame drop from 60 is usually very apparent because it's such a big step down. If there's no tearing, 50fps is just an average: any dropped frame will either tear or be held for two refreshes (an instantaneous 30fps).
It's usually a very obvious stutter, though I admit to being particularly sensitive to frame rate.
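
To put rough numbers on why a "low-50s average" still reads as stutter, here's a minimal sketch (assuming a 60Hz display with v-sync and double buffering; the frame times are invented for illustration):

Code:
REFRESH_MS = 1000.0 / 60.0  # one 60Hz refresh interval, ~16.7ms

def displayed_times(render_times_ms):
    """With v-sync and double buffering, each frame is shown for a whole
    number of refresh intervals: one if it was ready in time, two if not."""
    shown = []
    for t in render_times_ms:
        missed = t > REFRESH_MS          # frame wasn't ready for the next refresh
        shown.append(REFRESH_MS * (2 if missed else 1))
    return shown

# Ten frames where every fourth one runs slightly over the 16.7ms budget.
render = [15.0, 15.0, 15.0, 18.0] * 2 + [15.0, 15.0]
shown = displayed_times(render)

print([round(1000.0 / s) for s in shown])       # instantaneous: 60, 60, 60, 30, ...
print(round(1000.0 * len(shown) / sum(shown)))  # average: 50

The average comes out as a respectable 50fps, but every over-budget frame sits on screen for ~33ms, and that one-frame hold at an effective 30fps is exactly the step down I'm talking about.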
 
I admit I was speaking for myself. But I was also judging by the lack of complaints about previous CoD titles on console, where the frame rate dipped into the 50s almost all the time (with the PS3 dipping below 50fps many times).
 
So... they make a game that grosses $1 billion on its release day but somehow don't have the resources to check the final resolution and v-sync settings before shipping it?

Kind of disillusioning, to be honest.
 
Speak for yourself; a frame drop from 60 is usually very apparent because it's such a big step down. If there's no tearing, 50fps is just an average: any dropped frame will either tear or be held for two refreshes (an instantaneous 30fps).
It's usually a very obvious stutter, though I admit to being particularly sensitive to frame rate.

I thought this whole problem had been solved last-gen with the soft v-sync approach, with careful management of tearing (confined to the very top of the screen) of course. That's not even getting into the different strategies for dynamic framebuffers that I expect to become pretty commonplace. Killzone: Mercenary seemed to be very clever in that regard, and Rage as well.
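
For what it's worth, the dynamic framebuffer idea is basically a feedback loop on GPU frame time. A toy sketch of the kind of controller involved (not how any particular engine actually does it; the thresholds and step sizes are invented):

Code:
BUDGET_MS = 1000.0 / 60.0          # frame budget for 60fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0    # fraction of native resolution on one axis

def next_scale(prev_scale, gpu_time_ms):
    """Shrink the render target when we're about to miss v-sync,
    grow it back when there's comfortable headroom."""
    if gpu_time_ms > BUDGET_MS * 0.95:
        return max(MIN_SCALE, prev_scale - 0.05)
    if gpu_time_ms < BUDGET_MS * 0.80:
        return min(MAX_SCALE, prev_scale + 0.05)
    return prev_scale

# Example: a heavy scene pushes GPU time towards the budget, then calms down.
scale = 1.0
for t in [14.0, 16.5, 17.5, 17.0, 15.0, 12.0, 12.0]:
    scale = next_scale(scale, t)
    print(f"gpu {t:4.1f}ms -> render at {scale:.2f}x")

The trade is that instead of tearing or holding a frame, the engine gives up a little resolution for a frame or two whenever the scene gets heavy.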
 
What's the deal with DF's coined phrase "perceptual 60fps"? Usually CoD games are optimized to drop frames but still give the illusion of being consistently smooth at the target framerate. Usually in their tests this includes drops into the 52-55fps range in certain areas on 360. So to get that at 1080p is pretty decent, even if it's still an unoptimized piece of garbage.

I'd say BO1 and BO2 looked and performed much better at the same spec level, so I can only conclude that IW is incompetent at its job.
 
Now we have comparisons between the new consoles, and comparisons to the PC versions.

Do we have any idea, say for BF4, how much theoretical PC performance is needed to reach the PS4 result? We know that the PS4 has about 1.8 TFLOPS of GPU compute, but we also know that the PC has some major overhead... so it would be interesting to see how much overhead there is for an actual game with high-end graphics like BF4.

If we had these specs, we could afterwards also compare the initial costs of both platforms.

But I am just curious to learn more about the PC overhead... is it 10%, 50%, or more for BF4?

DF would be the right people to investigate this, imo...
 
In BF4 specifically, the benchmarks show PCs with 7850s and 7870s performing at around the same level as the PS4, so very little overhead in that case, on the GPU side at least. I suspect that to build a fully fledged PC with equal or better performance and all the functionality of the PS4, you're looking at about $750.
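
Back-of-the-envelope, using the commonly quoted peak figures (PS4 GPU ~1.84 TFLOPS, HD 7850 ~1.76, HD 7870 ~2.56; peak FLOPS are only a crude proxy for game performance, so take this loosely):

Code:
PS4_TFLOPS = 1.84   # commonly quoted peak figure

def implied_overhead(pc_tflops):
    """If a PC GPU of this peak throughput only matches the PS4,
    this is the fraction of it seemingly lost to API/driver overhead."""
    return 1.0 - PS4_TFLOPS / pc_tflops

for name, tflops in [("HD 7850", 1.76), ("HD 7870", 2.56)]:
    print(f"{name}: {implied_overhead(tflops) * 100:+.0f}% implied overhead")

So depending on which card you treat as the fair match, the implied GPU-side overhead for BF4 sits anywhere between roughly nothing and ~30%.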
 
Now we have comparisons between the new consoles, and comparisons to the PC versions.

Do we have any idea, say for BF4, how much theoretical PC performance is needed to reach the PS4 result? We know that the PS4 has about 1.8 TFLOPS of GPU compute, but we also know that the PC has some major overhead... so it would be interesting to see how much overhead there is for an actual game with high-end graphics like BF4.

But we don't know if the PS4 version is much more than a straight port of the PC version that was tailored around the limitations of the PC. Indeed, considering what DICE said about the work they expected to need to get more out of the Mantle API version, and about not knowing what new bottlenecks it would reveal, if they had already optimised for the PS4, some of that would have been a known quantity and less work?
 
I just get the feeling that IW and DICE have basically dumped a straight port of the PC version onto the PS4, dialled back features where necessary, and done very little optimisation beyond a few modifications for compatibility where required.
 

I am extremely intrigued by this game now after hearing what they have to say about the anti-aliasing, and hence the virtual lack of aliasing in the game, something that hasn't been accomplished by any game on any console prior to this.

I'll have to see for myself if it actually manages to do this; I'll be quite pleasantly surprised if it does. It's a bit of a shame, though, that the gameplay can't match the fidelity of the graphics.

Regards,
SB
 
I am extremely intrigued by this game now after hearing what they have to say about the anti-aliasing, and hence the virtual lack of aliasing in the game, something that hasn't been accomplished by any game on any console prior to this.

Under the right conditions, MLAA produces some great results as well. But yeah, Ryse seems to do really well in the visual department. So at least it's a hopeful next-gen start for CryEngine.
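
For anyone wondering what post-process AA boils down to, here's the crudest possible sketch of the idea: find strong luma edges in the finished image and blend across them. Real MLAA/SMAA implementations classify edge shapes and compute proper coverage, so this is only the family resemblance, not what Ryse actually ships:

Code:
import numpy as np

def toy_post_aa(img, threshold=0.1):
    """img: float array (H, W, 3) in [0, 1]. Blend across strong luma edges."""
    luma = img @ np.array([0.299, 0.587, 0.114])
    out = img.copy()
    ex = np.abs(np.diff(luma, axis=1)) > threshold   # horizontal colour steps
    ey = np.abs(np.diff(luma, axis=0)) > threshold   # vertical colour steps
    out[:, :-1][ex] = 0.5 * (img[:, :-1][ex] + img[:, 1:][ex])
    out[:-1][ey] = 0.5 * (img[:-1][ey] + img[1:][ey])
    return out

# A hard edge between two flat colours gets one softened pixel on its left side.
img = np.zeros((4, 4, 3))
img[:, 2:] = 1.0
print(toy_post_aa(img)[0, :, 0])   # [0.  0.5 1.  1. ]

It also shows the usual weakness: the filter only reacts to edges it can find in the final colour buffer, so sub-pixel and shader aliasing slip straight through.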
 
Under the right conditions, MLAA produces some great results as well. But yeah, Ryse seems to do really well in the visual department. So at least it's a hopeful next-gen start for CryEngine.

Sure, in the right conditions it can sometimes provide decent quality. But it still isn't terribly hard to find aliasing in any game using it.

I'm still not convinced that Ryse will offer a quality of AA that I find acceptable, but I'm certainly interested in checking it out for myself to see if it can.

Regards,
SB
 
I just get the feeling that IW and DICE have basically dumped a straight port of the PC version onto the PS4, dialled back features where necessary, and done very little optimisation beyond a few modifications for compatibility where required.

I imagine that, outside of a few like Crytek, that's what everybody did. It should have been much easier to port from your typical PC development environment to the PS4, especially when there was a lack of final hardware for most of the development cycle.

You can't really blame anyone for it. If you are going to produce an exotic piece of hardware, then devs are going to need time to acclimate. Plus, game development on these pieces of hardware isn't an academic exercise. It's a business, so low-single-digit-million userbases aren't going to encourage third parties to throw everything but the kitchen sink at these ports, especially when there are 160 million PS3 and 360 users out there.
 