Digital Foundry Article Technical Discussion Archive [2014]

Not that I'm doubting brute-force rendering, which is what most PC GPUs tend to do... but I don't subscribe to the PC mantra that PCs will always sustain "consistent" frame-rates.

One thing which can affect that is what else they have running on their PC. I use my gaming PC purely for games and overnight video renders, nothing more, so mine runs consistently and cleanly for gaming. I had a friend who couldn't figure out why he was getting inconsistent performance in games, and it turned out he was leaving a torrent app downloading and seeding constantly in the background, among other things.

Also, some people still use ancient operating systems which simply don't run as well. I use Windows 8.1, which runs somewhat faster and leaner than previous versions, but many stick to OSes based on decade-old code, so it's not surprising they have issues while mine runs solid and consistent. Those people running ancient versions of Windows will have worse file I/O, worse memory use and so on.

Regarding benchmarks: if they are going by lowest fps and not the average, then their numbers will always be misleading on PC. That's because sometimes on PC (depending on the game) data has to cross the PCIe bus en masse, such as on a level change or a large visibility change. That will sometimes cause a stutter. In the real world that's largely irrelevant because it represents a vanishingly small fraction of your gaming time, but it will show up in a min-fps measurement, which gives the illusion of large slowdowns even when there aren't any. It's why I personally disregard min and peak fps and look at the average instead. Min is useless because it doesn't represent reality, and peak is useless because who cares about 200fps when you're looking at a wall.
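To make that min-vs-average point concrete, here's a minimal sketch (illustrative, made-up frame times, not from any real benchmark run) of how one brief hitch sets the minimum fps while barely moving the average:

```python
# Illustrative only: ~10 seconds of steady 60fps plus a single 100 ms hitch.
frame_times_ms = [16.7] * 599 + [100.0]

def fps(ms: float) -> float:
    """Convert a single frame time in milliseconds to an instantaneous fps."""
    return 1000.0 / ms

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = fps(max(frame_times_ms))  # the worst single frame
pct99_fps = fps(sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)])

print(f"average fps: {avg_fps:.1f}")  # ~59.4 -- barely moved by the hitch
print(f"minimum fps: {min_fps:.1f}")  # 10.0  -- entirely set by one frame
print(f"99th-percentile-frame fps: {pct99_fps:.1f}")  # ~59.9
```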
 
Game for game, is there any proof of this? Meaning: graphs or articles showing a GTX 680 compared to PS4/XB1? I'm not talking about approximation graphs, nor graphs of the PC GPU variants closest to PS4/XB1... but actual titles running on both a GTX 680 and PS4/XB1 showing this "consistent" performance/frame-rate advantage that the GTX 680 supposedly holds over them.

There's plenty of evidence out there, but it's a fairly major task to compile it all for every cross-platform game available. Pretty much every cross-platform game has benchmarks available, so you should check a few of them out. Obviously you have to account for the higher detail settings of the PC versions in most cases, but generally a 7870 seems more than capable of performing at or above the PS4's level.

But anyway, back to PvZ: here's a video of a 680 + a rather old Q9650 quite comfortably staying well above 60fps throughout the entire 7-minute playthrough. And that's at "Ultra" settings, higher than the PS4's:

https://www.youtube.com/watch?v=VP1hJoGw7WU

EDIT: Further searches of YouTube seem to indicate that people see major performance drop-offs when recording with FRAPS versus not. That's one possible cause of the strange DF results (if they are using FRAPS or something similar).

EDIT 2: Another video, this time on a 770 and a different map. Same result though: Ultra settings, frame rate well over 100fps most of the time and never falling below 90fps. Sounds very different from DF's sustained drops into the mid-forties...

https://www.youtube.com/watch?v=kiQPFdQB00M
 
Under circumstances where the GPU is no longer the bottleneck, a lightning-fast PC CPU could easily make frame rates spike much higher than the PS4Bone could ever hope to achieve.

In this case, being less consistent would in no way be a bad thing; it would be a demonstration of the immense Thor's-Hammer-like power of the PC CPU. A Thor's Hammer wielded by the toned, warrior-like man-deities of the PC Gaming Master Race.
 
EDIT: Further searches of YouTube seem to indicate that people see major performance drop-offs when recording with FRAPS versus not. That's one possible cause of the strange DF results (if they are using FRAPS or something similar).

Oh, if they use FRAPS then yeah, that would do it. That's like measuring a car's 0-to-60 time after loading another 500 pounds of measuring equipment into the car (+10 car analogy bonus). The only way to get a proper fps count is if it's built into the game, or from benign tools like MSI Afterburner that have very little overhead.
 
I'm sure that's not the case. Whatever DF uses for recording isn't FRAPS. It would make no sense for them to even benchmark with FRAPS recording in the background. It's probably a capture card.
 
There's plenty of evidence out there, but it's a fairly major task to compile it all for every cross-platform game available. Pretty much every cross-platform game has benchmarks available, so you should check a few of them out. Obviously you have to account for the higher detail settings of the PC versions in most cases, but generally a 7870 seems more than capable of performing at or above the PS4's level.

But anyway, back to PvZ: here's a video of a 680 + a rather old Q9650 quite comfortably staying well above 60fps throughout the entire 7-minute playthrough. And that's at "Ultra" settings, higher than the PS4's:

https://www.youtube.com/watch?v=VP1hJoGw7WU

This video is perfectly in accordance with DF's findings. Thank you for your link. I have watched the whole video; V-sync is OFF:

At 3:23 framerate drops to 62fps
4:10 framerate drops to 61fps
5:46 framerate drops to 59fps

So in a normal playthrough, without purposefully stressing the game, and with V-sync OFF, this PC build with a GTX 680 is performing roughly like the PS4 version, which may similarly occasionally drop to 59fps or 58fps.

Depending on the CPU used and the clock frequency of the GTX 680 used, if they enabled V-sync like the PS4 version, DF may well have found drops to the mid-forties during a framerate stress test (close encounters with the most enemies plus alpha effects, etc.) on their GTX 680-equipped PC build.
 
I'm sure that's not the case. Whatever DF uses.
DF uses an uncompressed (well, losslessly compressed) video capture from the video-out port. The DF articles were actually preceded by Richard Leadbetter making this capture hardware available to buy, IIRC, for professional use. Framerate measurement is then performed with their proprietary analysis tool, which compares sequential frame content.
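As a rough illustration of that kind of analysis (DF's actual tool is proprietary; this sketch assumes a lossless 60 Hz capture so that exact frame comparison is valid, and the function name fps_per_second is my own invention):

```python
import numpy as np

def fps_per_second(frames: list, capture_hz: int = 60) -> list:
    """Count new (non-duplicate) frames in each one-second bucket of a capture.

    frames: decoded video frames as numpy arrays, captured at capture_hz.
    With a lossless capture, a frame identical to its predecessor means the
    game did not present a new image for that refresh.
    """
    new_frame = [True] + [not np.array_equal(a, b)
                          for a, b in zip(frames, frames[1:])]
    return [sum(new_frame[i:i + capture_hz])
            for i in range(0, len(new_frame), capture_hz)]
```

A real tool would also have to tolerate capture noise (e.g. compare with a small per-pixel threshold) if the capture were not perfectly lossless.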

The other PC measurements though, I've no idea. ;)
 
All right. I think the website digitalfoundry.com is also owned by one of the people who contribute there. Richard Leadbetter, the guy who does most of the articles, has his own profile: http://www.digitalfoundry.org/showcase/profile.html

They've updated it; I remember the old version of the website apparently advertised some kind of recording equipment.

EDIT: I confused digitalfoundry.com with digitalfoundry.org. The latter just redirects to Eurogamer's DigitalFoundry section.
 
This video is perfectly in accordance with DF's findings. Thank you for your link. I have watched the whole video; V-sync is OFF:

At 3:23 framerate drops to 62fps
4:10 framerate drops to 61fps
5:46 framerate drops to 59fps

So in a normal playthrough, without purposefully stressing the game, and with V-sync OFF, this PC build with a GTX 680 is performing roughly like the PS4 version, which may similarly occasionally drop to 59fps or 58fps.

Depending on the CPU used and the clock frequency of the GTX 680 used, if they enabled V-sync like the PS4 version, DF may well have found drops to the mid-forties during a framerate stress test (close encounters with the most enemies plus alpha effects, etc.) on their GTX 680-equipped PC build.

FRAPS.

And V-sync isn't going to drop you much below 59, if at all, when you're triple buffering.
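A simplified model of why (assuming a 60 Hz display; real swap-chain behaviour is messier, but the quantization effect is the point). With double buffering, a frame that misses a refresh deadline waits for the next one, so render times just over 16.7 ms collapse to 30fps; with triple buffering the GPU keeps rendering into a third buffer, so the rate degrades gradually:

```python
import math

REFRESH_MS = 1000.0 / 60  # ~16.7 ms per refresh on a 60 Hz display

def double_buffered_fps(render_ms: float) -> float:
    """Each frame is displayed on the next whole refresh after it finishes."""
    refreshes = math.ceil(render_ms / REFRESH_MS)  # 1 -> 60fps, 2 -> 30fps...
    return 1000.0 / (refreshes * REFRESH_MS)

def triple_buffered_fps(render_ms: float) -> float:
    """Simplification: rate tracks render time, capped at the refresh rate."""
    return min(60.0, 1000.0 / render_ms)

for ms in (16.0, 17.0, 20.0):
    print(ms, double_buffered_fps(ms), triple_buffered_fps(ms))
# At 17 ms: double buffering collapses to 30fps, triple buffering holds ~58.8fps.
```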

680 is way beyond the PS4 GPU both theoretically and practically.
 
It seems a stretch at best to conclude from this that consoles are now performing at or above 680 levels when they clearly haven't in any other game yet released.

Didn't COD Ghosts do something similar, or are we expected to believe that's because it hasn't been properly optimised for PC?

Also, I don't buy people stating that FRAPS negatively affects performance; it's a programme that must take several MB at most to run. What's more likely: the frame measurement software doesn't function properly, or people can't truly tell the difference between 50fps and 60fps? I know which I'm more inclined to believe.
 
Also, I don't buy people stating that FRAPS negatively affects performance; it's a programme that must take several MB at most to run.
The amount of RAM it uses isn't the issue (and RAM generally isn't, unless you're forcing applications to spill horribly into virtual memory, which will of course cause performance to fall off a cliff).

The issue is the amount of compression-related processing and/or HDD access that a high-quality capture might have to do.
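Some back-of-envelope arithmetic (illustrative, not FRAPS's actual internals) shows why: raw RGB frames at 1080p60 are far beyond what a typical 2014-era HDD could sustain (~150 MB/s), so the capture either compresses on the CPU in real time or the game pays for the I/O:

```python
def raw_capture_rate_mb_s(width: int, height: int, fps: int,
                          bytes_per_pixel: int = 3) -> float:
    """Data rate of an uncompressed RGB video capture, in MB/s."""
    return width * height * bytes_per_pixel * fps / 1e6

print(raw_capture_rate_mb_s(1920, 1080, 60))  # ~373 MB/s at 1080p60
print(raw_capture_rate_mb_s(2560, 1440, 60))  # ~664 MB/s at 1440p60
```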
 
FRAPS has significant impact when recording video, as in the video pjbliverpool linked to.

Edit: Beaten!
 
This video is perfectly in accordance with DF's findings. Thank you for your link. I have watched the whole video; V-sync is OFF:

At 3:23 framerate drops to 62fps
4:10 framerate drops to 61fps
5:46 framerate drops to 59fps

So in a normal playthrough, without purposefully stressing the game, and with V-sync OFF, this PC build with a GTX 680 is performing roughly like the PS4 version, which may similarly occasionally drop to 59fps or 58fps.

Depending on the CPU used and the clock frequency of the GTX 680 used, if they enabled V-sync like the PS4 version, DF may well have found drops to the mid-forties during a framerate stress test (close encounters with the most enemies plus alpha effects, etc.) on their GTX 680-equipped PC build.

I fail to see how a single momentary (well under a second) blip to 59fps in a 7-minute video equates to "sustained drops in performance" or "frame-rate drops down into the mid-forties". Also bear in mind that the video was captured on a very old CPU, which could have been the culprit for the performance blip.

I'm not seeing how having V-sync turned on would have changed that result at all. You'd still be talking about a rock-solid 60fps with one momentary blip, at higher graphical settings than the PS4, on a practically ancient CPU. And the 770 (only marginally faster than a 680) video I posted performs even better.
 
Also, I don't buy people stating that FRAPS negatively affects performance.

Lots of PC gamers use FRAPS or other recording software to capture their gameplay videos, and anyone with experience of using this software knows that it tanks the frame-rate whenever you record something.

In my experience, the higher the resolution the game runs at, the more performance FRAPS demands. It also depends on the settings in FRAPS itself; ticking the "Force lossless RGB capture" option is even more demanding on the hardware.

Often you'll find people on YouTube playing at lower settings than usual just to compensate for the frame-rate loss from FRAPS.
 
Didn't COD Ghosts do something similar, or are we expected to believe that's because it hasn't been properly optimised for PC?

Not that I'm aware of. I've not seen any benchmarks comparing the two at the same quality settings, bearing in mind that the top PC quality setting, "Ultra", takes significantly more performance than the next level down, which I'd assume is what the PS4 uses, since it's pretty well documented that the PC version has several graphical advantages over the consoles in this game.

The closest I've seen are the following:

http://www.tomshardware.co.uk/call-of-duty-ghosts-pc-performance,review-32840-7.html
http://www.hardwarepal.com/call-duty-ghosts-benchmark-cpu-gpu-performance/4/
 
FRAPS has significant impact when recording video, as in the video pjbliverpool linked to.

Edit: Beaten!

I am sorry, but that argument is not enough. How much does it impact performance? Does it really impact it that much when running at 1080p/~60fps? At 1440p/150fps I could understand it, but here?

Also, the PS4 is still capped at 60fps, and a cap has already been shown to significantly impact the absolute minimum framerate measured in the same areas of a game, as with Infamous capped vs. uncapped at 30fps.

This PC video is far from the kind of framerate stress test DF usually does, it's only one map, and there is still the question of different factory overclocks between 680 GPUs.

Finally, have we ever seen a single case of Digital Foundry unfairly favouring the PS4 when it's tested against any PC? If they say they noticed, even rarely, ~40fps drops on their 680 PC, why would they suddenly cheat and make those ~40fps numbers up?

We know the Frostbite engine is strongly optimized for AMD GPUs, and this game is also optimized for the PS4. Would it be so unreal if the PS4, just in this particular game, slightly outperformed an Nvidia 680 only during the minimum-framerate dips? It doesn't mean the 1.84 TFLOPS GPU in the PS4 is more powerful than a GTX 680; we know it isn't. Please don't get swept up in the "PC master race" argument over one specific game and one specific kind of test: looking only at the occasional absolute minimum fps.
 
Try FRAPS out on your PC and see for yourself; there's no need to be in any doubt about this! Whenever you see a YouTube video recorded with FRAPS, know that performance is taking a hit because of FRAPS.

FRAPS is great, but there is a cost to using it.

With your Infamous remarks you may be confusing V-sync + triple buffering with a frame cap. The longest frame time that DF found for both the capped and uncapped versions is 50 ms, equivalent to 20 fps. The way average (mean) values are calculated means that a minimum frame rate computed over a sequence of trailing frames will be lower for the capped version of the game.
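A small sketch of that trailing-frames point (illustrative numbers, not DF's actual method): give two runs an identical 50 ms worst frame, and the run capped at 30fps reports a lower trailing-window minimum, because the frames surrounding the spike are slower:

```python
def trailing_min_fps(frame_times_ms: list, window: int = 10) -> float:
    """Lowest mean fps over any window of consecutive frames."""
    return min(window * 1000.0 / sum(frame_times_ms[i:i + window])
               for i in range(len(frame_times_ms) - window + 1))

SPIKE = [50.0]  # the same worst frame in both runs
capped = [1000.0 / 30] * 20 + SPIKE + [1000.0 / 30] * 20    # 30fps-capped run
uncapped = [1000.0 / 45] * 20 + SPIKE + [1000.0 / 45] * 20  # ~45fps uncapped run

print(trailing_min_fps(capped))    # ~28.6 fps -- the lower minimum...
print(trailing_min_fps(uncapped))  # ~40.0 fps -- ...despite the same worst frame
```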
 