Digital Foundry Article Technical Discussion Archive [2014]

Just to stir the pot more ...

Shouldn't the comparison between console performance and PC performance always include using something like FRAPS that is recording a video of your game? After all, the consoles are always recording your video. Apples to apples and all that ...
 
The PC version of PvZ at Ultra is running higher-quality shadows, LOD and HBAO.

Call of Duty: Ghosts at Ultra has MSAA/TXAA, NVIDIA HairWorks, higher-quality shadows, LOD, textures, tessellation, HBAO+, reflections and particles.

http--www.gamegpu.ru-images-stories-Test_GPU-Action-Plants_vs._Zombies_Garden_Warfare-test-PVZ_1920.jpg

http--www.gamegpu.ru-images-stories-Test_GPU-Action-Call_of_Duty_Ghosts-cod-new-1920_msaa.jpg
 
digitalfoundry COD:G analysis:

"If you can throw enough horsepower at the PC version of the game, this offers the definitive experience - but in a world where even a GTX Titan can't run this game at a locked 1080p60 at max settings"

And yes, I do know that the game can run at higher settings.
 
Call of Duty: Ghosts at Ultra has MSAA/TXAA, NVIDIA HairWorks, higher-quality shadows, LOD, textures, tessellation, HBAO+, reflections and particles.

You do realise that the PS4/Xbox One versions of COD are also using things like tessellation, reflections and particles?
 
digitalfoundry COD:G analysis:

"If you can throw enough horsepower at the PC version of the game, this offers the definitive experience - but in a world where even a GTX Titan can't run this game at a locked 1080p60 at max settings"
Yes, it's due to the massive performance hit from the fur simulation settings, MSAA and the extra levels of soft shadows/tessellation.
Do you have links to where it's documented?
The same DF article you quoted has all the details you need. You can also look here for a detailed analysis of the visual settings: the consoles deploy High or Normal settings, while the PC can deploy Extra.
http://www.geforce.com/whats-new/guides/call-of-duty-ghosts-graphics-and-performance-guide#4
 
Yes, it's due to the massive performance hit from the fur simulation settings, MSAA and the extra levels of soft shadows/tessellation.
The Titan has 4.5 TF; do those marginal setting increases really fully occupy the additional 2.6 TF it has over the PS4? I'm more likely to believe the developers of the boxes when they state that their closed nature means they perform better than equivalent PC hardware.

In the digitalfoundry video of the PS4's framerate analysis, they're constantly trying to test the game by being in the centre of explosions or with lots of action going on. The PC video that was posted here had no such equivalence.

I'm more likely to believe DF's analysis than hearsay from PC gamers. If they were to update the article stating that the analysis was wrong for some reason then I'd be happy to stand corrected.
 
The same DF article you quoted has all the details you need. You can also look here for a detailed analysis of the visual settings: the consoles deploy High or Normal settings, while the PC can deploy Extra.
http://www.geforce.com/whats-new/guides/call-of-duty-ghosts-graphics-and-performance-guide#4

Here's the text from the article:

digitalfoundry said:
Lower-resolution textures are utilised, leading to various surfaces featuring less in the way of high-frequency detail. Shadow quality also takes a noticeable dive: harsh edges and stair-step patterns align those elements closely with those found in the current-gen builds of the game, complete with visible transitions between shadow map cascades - something that we were hoping would be a thing of the past with the switch to more powerful hardware.

Moving on to visual effects, we find that particles and alpha appear slightly pared back on both consoles in certain scenes: water splashes in particular take a noticeable hit in resolution on Microsoft's system, looking chunkier and less detailed as a result. Meanwhile, on both consoles SSAO replaces the top-end horizon based alternative available on PC, leading to less impressive ambient shadow coverage across the game.



So, on the whole we are still looking at a massive upgrade over what is possible on current-generation hardware, despite some noticeable nips and tucks from the PC release.

Nowhere does it suggest that the consoles are on "normal" settings. I'd actually be more likely to assume that the consoles have some settings at max. Many surfaces are actually tessellated from what I've observed when playing the game, and I get light shafts in some areas too.
 
Just to stir the pot more ...

Shouldn't the comparison between console performance and PC performance always include using something like FRAPS that is recording a video of your game? After all, the consoles are always recording your video. Apples to apples and all that ...

I'm not sure that it should. One of the advantages (and disadvantages) of PC gaming is its flexibility and, consequently, its lack of reliance on a predictable level of performance.

Consoles have to have predictable performance, so they have to allocate some permanent hardware resources to video recording if that's going to be a permanent function. That's not required on the PC. In normal gaming circumstances you won't be recording your gameplay, and thus you don't need to account for any performance drop associated with doing so, because it doesn't exist. I suppose there's an argument to be made that you should therefore have two PC benchmarks, one with video recording and one without, but that would probably be a little overkill.

It's also worth considering that while FRAPS has a big performance hit, there are other video recording methods that don't necessarily have the same hit. ShadowPlay, for example, uses built-in hardware on the NV GPU to record video, and the performance hit is apparently only a couple of percent, probably comparable to the consoles. So in that sense I guess you could always include ShadowPlay (and its AMD equivalent, if such a thing exists), but the reason FRAPS was brought up in the current context was as a possible explanation for the unusually low performance in DF's results not seen from other sources. And that's been pretty much discounted already; however, I guess it's still possible that whatever they are using could be affecting performance in a similar way to FRAPS. The FRAPS issue with PvZ sounds more like a bug to me than expected behavior.
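
To put rough numbers on why the capture method matters, here's a quick back-of-the-envelope sketch; the per-frame overhead figures are illustrative assumptions, not measurements of FRAPS or ShadowPlay:

```python
# Rough illustration of how a fixed per-frame capture cost drags down frame rate.
# The overhead values below are illustrative assumptions, not measured figures.

def fps_with_overhead(base_fps, overhead_ms_per_frame):
    """Frame rate after adding a fixed capture cost to every frame."""
    base_frame_time_ms = 1000.0 / base_fps
    return 1000.0 / (base_frame_time_ms + overhead_ms_per_frame)

base = 60.0  # hypothetical uncaptured frame rate
for label, overhead in [("hardware encoder (~0.3 ms)", 0.3),
                        ("software capture (~4 ms)", 4.0)]:
    print(f"{label}: {fps_with_overhead(base, overhead):.1f} fps")
# A fraction of a millisecond barely moves 60 fps (~59 fps), while a few
# milliseconds per frame is enough to pull it down towards 48 fps.
```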
 
The Titan has 4.5 TF; do those marginal setting increases really fully occupy the additional 2.6 TF it has over the PS4?

It's well documented that "ultra" settings in PC games can have a disproportionate effect on performance compared with the visual benefit they bring. For example, here's a benchmark of Ghosts showing around a 50% performance increase from dropping settings down from Ultra to Extra:

http://linustechtips.com/main/topic...and-1440p-benchmarks-and-graphics-comparison/

And there are tons of extra effects going on in the PC version of Ghosts compared with the PS4, so it's entirely possible the PS4 is running at lower than "Extra" settings, although an exact comparison is impossible without knowing for sure.

Also, quoting 4.5 TFLOPS as the only measure is hardly accurate. Shader and texture performance may be off the scale compared with the PS4, but memory bandwidth and ROP throughput are only around 65% higher.
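
For what it's worth, here's a quick sketch of those ratios using the commonly quoted specs (Titan: 2688 shaders at ~837 MHz, 288 GB/s, 48 ROPs; PS4: 1152 shaders at 800 MHz, 176 GB/s, 32 ROPs); the clocks are approximate, so treat the results as ballpark figures:

```python
# Ballpark ratios between a GTX Titan and the PS4 GPU, using commonly
# quoted specs (clocks are approximate).

titan = {"shaders": 2688, "clock_ghz": 0.837, "bw_gbs": 288.0, "rops": 48}
ps4   = {"shaders": 1152, "clock_ghz": 0.800, "bw_gbs": 176.0, "rops": 32}

def tflops(gpu):
    return 2 * gpu["shaders"] * gpu["clock_ghz"] / 1000.0  # 2 FLOPs/shader/clock (FMA)

print(f"Compute:   {tflops(titan):.2f} vs {tflops(ps4):.2f} TFLOPS "
      f"({tflops(titan) / tflops(ps4):.2f}x)")
print(f"Bandwidth: {titan['bw_gbs'] / ps4['bw_gbs']:.2f}x")
print(f"Fill rate: {(titan['rops'] * titan['clock_ghz']) / (ps4['rops'] * ps4['clock_ghz']):.2f}x")
# Compute comes out around 2.4x, while bandwidth and pixel fill rate are
# only around 1.6x - the ~65% figure mentioned above.
```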

I'm more likely to believe the developers of the boxes when they state that their closed nature means they perform better than equivalent PC hardware.

I think that goes without saying, but it's a matter of to what degree. As far as I'm aware, no developer has specifically stated that the current consoles can effectively match or exceed anything the GTX 680 can achieve in a PC.

But in the end it obviously comes down to how well optimized each version of the game is. I'm sure that a very well optimized game on the PS4 could match the performance of the same game on a GTX 680 if the PC port were fairly terrible. But what if the PC port is very good? What if it makes heavy use of NV GameWorks or Mantle/DX12 features? Or even if it has specific code paths for Kepler and GCN? Would the consoles still be able to match the performance of a GTX 680/HD 7970?

In the digitalfoundry video of the PS4's framerate analysis, they're constantly trying to test the game by being in the centre of explosions or with lots of action going on. The PC video that was posted here had no such equivalence.

I'm more likely to believe DF's analysis than hearsay from PC gamers. If they were to update the article stating that the analysis was wrong for some reason then I'd be happy to stand corrected.

It's not really hearsay given that videos of in game action with frame rate counters have already been posted. Here's another for good measure:

https://www.youtube.com/watch?v=JtnHrdQFcDM

I grant the run-through from the Russian website that provided the detailed benchmarks was too basic to be comparable to DF's test, but the three YT videos I've posted all feature some heavy combat, and none of them get anywhere close to sustained drops into the mid-forties.
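
For anyone who wants to compare those videos with DF's numbers more rigorously, here's a minimal sketch of how you could summarise a frame-time log; it assumes a FRAPS-style "frametimes" CSV with cumulative milliseconds per frame, and the filename is just a placeholder:

```python
# Rough sketch: summarise a FRAPS-style frame-time log (cumulative ms per frame).
# The file name and column layout are assumptions about the capture tool's output.
import csv

def summarise(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        times = [float(row[1]) for row in reader if len(row) >= 2]
    # Convert cumulative timestamps into per-frame durations.
    frame_ms = sorted(b - a for a, b in zip(times, times[1:]))
    avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)
    # The 99th-percentile frame time gives a feel for the worst sustained drops.
    worst = frame_ms[int(0.99 * (len(frame_ms) - 1))]
    print(f"avg {avg_fps:.1f} fps, 1% worst frames ~{1000.0 / worst:.1f} fps")

# summarise("frametimes.csv")
```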

Nowhere does it suggest that the consoles are on "normal" settings. I'd actually be more likely to assume that the consoles have some settings at max. Many surfaces are actually tessellated from what I've observed when playing the game, and I get light shafts in some areas too.

I'm not sure it matters exactly how the settings compare, since it's already been established that there's lots more going on in the PC version. That straight away invalidates any performance comparison with the PC version running at maximum, and thus CoD Ghosts cannot at this stage be used as evidence that the consoles are performing at 680-or-above levels.

If the developers would tell us exactly how the settings compare (like Crytek did with Crysis 3) then we'd be able to run apples-to-apples comparisons, but all we know as of now is that the PC version does a lot more and also runs a lot slower. So nothing, really.
 
Is all of your logic here used to discount digitalfoundry's actual findings? It shouldn't have happened therefore it hasn't? That sounds a tiny bit Orwellian - "ignorance is bliss" and all that ;)
 
Is all of your logic here used to discount digitalfoundry's actual findings? It shouldn't have happened therefore it hasn't? That sounds a tiny bit Orwellian - "ignorance is bliss" and all that ;)

When one data point seems to contradict all the others, it makes sense to question why that might be. I've posted three in-game videos so far from hardware setups similar to DF's, all showing much higher performance than what DF are reporting. I'm not suggesting DF are lying; I put high stock in their articles, but something definitely seems strange. Perhaps a driver or hardware configuration bug is lowering performance? Or perhaps whatever method they're using to capture framerate is harming performance? Of course it's also possible they are just testing a particularly stressful area of the game, but it would have to be crazy stressful (as in about half the performance of the videos we've seen so far) to hit the framerates DF are reporting. And if that's the case, who's to say it's the GPU causing that performance drop and not the CPU? We don't know how fast the CPU they're using is.

PvZ supports Mantle, but we know for sure DF aren't using it in that test, so there's also every possibility that the game running on the same CPU with a GPU equal to or even weaker than the 680 could produce a consistent >60fps using the Mantle path. We simply don't have enough data to say whether or not that could be the case, and so concluding from this that the consoles must now be performing above the level of a 680 doesn't seem to make a lot of sense.
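
On the GPU-versus-CPU question, the usual quick check is to drop the render resolution and see whether the frame rate moves much; here's a trivial sketch of that heuristic (the sample numbers are placeholders, not measurements):

```python
# Quick heuristic: if frame rate barely changes when you cut the render
# resolution, the bottleneck is probably the CPU, not the GPU.
# The sample numbers are placeholders, not measurements.

def likely_cpu_bound(fps_native, fps_low_res, threshold=0.10):
    """True if dropping resolution improved fps by less than `threshold`."""
    return (fps_low_res - fps_native) / fps_native < threshold

print(likely_cpu_bound(fps_native=47.0, fps_low_res=49.0))  # True  -> CPU-limited
print(likely_cpu_bound(fps_native=47.0, fps_low_res=70.0))  # False -> GPU-limited
```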
 
I see the PS4 consistently performing better than the cheap PC GPU and the Xbox One. Is that what you're expecting to hear?

The PS4's GPU has a wider memory bus, more memory bandwidth, more ROPs and more texture units, which should give the PS4 the advantage, and the R9 270X, which is technically more powerful than the PS4's GPU, achieves considerably better results. Does it not surprise you that a "cheap PC GPU" holds up well against the PS4? Why am I not seeing the "closed box advantages" or the "2x power of equivalent PC hardware"?
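
For context, here's a rough side-by-side of the commonly quoted specs being compared; the clocks (and the PS4's effective share of its 176 GB/s) are approximations, so treat the derived numbers as ballpark figures:

```python
# Ballpark specs for the GPUs being compared (commonly quoted figures;
# clocks and the PS4's effective bandwidth share are approximations).
gpus = {
    "PS4":     {"shaders": 1152, "clock_mhz": 800,  "rops": 32, "tmus": 72, "bw_gbs": 176},
    "R7 260X": {"shaders": 896,  "clock_mhz": 1100, "rops": 16, "tmus": 56, "bw_gbs": 104},
    "R9 270X": {"shaders": 1280, "clock_mhz": 1050, "rops": 32, "tmus": 80, "bw_gbs": 179},
}

for name, g in gpus.items():
    tf = 2 * g["shaders"] * g["clock_mhz"] / 1e6
    print(f"{name}: {tf:.2f} TFLOPS, {g['rops']} ROPs, {g['tmus']} TMUs, {g['bw_gbs']} GB/s")
# PS4 ~1.84 TF; the 260X ~1.97 TF but with far less bandwidth and half the ROPs;
# the 270X ~2.69 TF with bandwidth and ROPs comparable to the PS4.
```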
 
The PS4's GPU has a wider memory bus, more memory bandwidth, more ROPs and more texture units, which should give the PS4 the advantage, and the R9 270X, which is technically more powerful than the PS4's GPU, achieves considerably better results. Does it not surprise you that a "cheap PC GPU" holds up well against the PS4? Why am I not seeing the "closed box advantages" or the "2x power of equivalent PC hardware"?
I think we already are with Tomb Raider, Plants vs Zombies and COD. I would love to see how that 260X compares this time next year.
 
I think we already are with Tomb Raider, Plants vs Zombies and COD. I would love to see how that 260X compares this time next year.

They aren't rendering the same thing so it's not an apples to apples comparison.

Maybe an example would help. Let's say the XB1 version of a game rendered at 1080p60, and the PS4 version of the same game rendered at 900p54 but with a longer draw distance and nothing more than that; everything else is identical to the XB1 version. Would you then think that the XB1 hardware must be much more efficient than the PS4 because it renders at a more stable frame rate and at a higher resolution? Or would you actually take into account that the PS4 version is doing somewhat more work per frame?

OK, now what if I told you that the PS4 version actually did even more than that and improved the visuals in more ways than just draw distance, including better LODs, better use of tessellation, better shadows and so on? Would you still think the XB1 is the more efficient hardware?

Finally, take that thinking and apply it to the PC versions of console games, and there is your answer as to why these are not apples-to-apples comparisons. I can understand why people are confused: some people read "2x more efficient" in dev interviews and think that applies to the entirety of the game, and hence that 2 TF on console is equal to 4 TF on PC. But that's not the case.
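
To put a number on the hypothetical above: raw pixel throughput is trivial to compute, but it deliberately ignores the extra per-frame work (draw distance, LODs, tessellation, shadows) that the comparison hinges on, which is exactly the point:

```python
# Raw pixel throughput for the hypothetical above. It deliberately ignores
# the extra per-frame work (draw distance, LODs, tessellation, shadows),
# which is exactly why fps + resolution alone isn't an apples-to-apples metric.

def mpixels_per_second(width, height, fps):
    return width * height * fps / 1e6

xb1_like = mpixels_per_second(1920, 1080, 60)  # 1080p60 in the hypothetical
ps4_like = mpixels_per_second(1600, 900, 54)   # 900p54 in the hypothetical

print(f"1080p60: {xb1_like:.0f} Mpixels/s, 900p54: {ps4_like:.0f} Mpixels/s")
# ~124 vs ~78 Mpixels/s - yet the "slower" version could still be doing
# more total work per frame, which raw throughput doesn't capture.
```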
 