Digital Foundry Article Technical Discussion Archive [2014]

a shame that the only AMD card they tested was the 7770, but I don't understand this quote:

"The final PC, utilising a lower-end Radeon HD 7770 with a Core i5 3570K at stock speeds, fared much worse. At 1080p with the highest details and v-sync disabled we averaged around 17fps. Only by dropping to 720p were we able to reach a fairly steady 30fps. Unlocking the frame-rate on this machine ultimately did little more than create additional judder, with frame-rates just barely over 30fps. In general, performance here is significantly slower than that of the Xbox One version"

how can 720p at a "fairly steady 30fps" be significantly slower than the Xbox One, when they claimed last year that the XO version was 720p at 20-30fps!?
 
I even remember DF showed us a 17fps minimum in the XB1 version. But yes, the main point of this article was to conclude that XB1 >= HD7770, whether that's true or not.
 
A good article IMO, lots more detail than the last one but essentially the same conclusion.

DF really need to check their own expectations of PC hardware, though. The article reads as if the performance is surprising compared to the XBO version, but given that a Titan is nowhere near powerful enough to provide 4.5x the XBO performance - even on paper - this shouldn't come as a surprise at all.
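For context, the 4.5x figure presumably comes from straight scaling: 1080p60 versus the XBO's 720p30 is 2.25x the pixels at twice the frame-rate. A trivial check of that arithmetic:

[code]
# Pixel-throughput ratio of 1080p60 vs 720p30
pixels_1080p = 1920 * 1080  # 2,073,600 pixels
pixels_720p = 1280 * 720    #   921,600 pixels
ratio = (pixels_1080p * 60) / (pixels_720p * 30)
print(ratio)  # 4.5
[/code]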
 
The game also clearly favors AMD hardware over Nvidia. A 290X could have got that steady 60fps. I hope multi-GPU support comes to this game.
 
I even remember DF showed us a 17fps minimum in the XB1 version. But yes, the main point of this article was to conclude that XB1 >= HD7770, whether that's true or not.
I find the fact that the HD 7770 was running a stable 30fps average very encouraging, considering how poorly the XO version ran.

but once again, exiting a building to be met with an undead parade still gives us that unwanted drop to 20fps.

It's also unfortunate that the game runs at far from the promised locked 30fps - rather a chugging 20fps during the zombie action while out in the big city.

http://www.eurogamer.net/articles/digitalfoundry-vs-dead-rising-3

Our pre- and post-patch analysis video shows low frame-rates of around 22fps when alpha effects and a large number of enemies come into play both before and after we updated the game.

http://www.eurogamer.net/articles/digitalfoundry-2014-dead-rising-3-patch
 
They kind of contradict themselves. They say 720p gives a fairly steady 30fps on the 7770 (which sounds about the same as, if not better than, the XB1 version) but then go on to say that the 7770 performs significantly slower than the XB1?

edit: point already brought up.
 
From the article...

The final PC, utilising a lower-end Radeon HD 7770 with a Core i5 3570K at stock speeds, fared much worse. At 1080p with the highest details and v-sync disabled we averaged around 17fps. Only by dropping to 720p were we able to reach a fairly steady 30fps. Unlocking the frame-rate on this machine ultimately did little more than create additional judder, with frame-rates just barely over 30fps. In general, performance here is significantly slower than that of the Xbox One version - perhaps not surprising bearing in mind that the HD 7770 (aka the Radeon R7 250X) doesn't stack up favourably against the Xbox One's GPU. Our advice? An Intel quad-core processor is a must and, if you're looking for 1080p30 gameplay, a GTX 760 or Radeon R9 280/285 is recommended. Remarkably we can't recommend any single CPU/GPU combo that can sustain this game at 1080p60.

Things to note: the PC version is not v-synced (they disabled it) while the Xbox One is v-synced. That will lead to larger frame-rate drops, as it cannot display a frame until it can sync the output with the display.
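To illustrate the v-sync point: on a 60Hz display, a frame that misses a refresh has to wait for the next one, so the effective frame-rate snaps down to 60/n - 60, 30, 20, 15fps and so on. A minimal Python sketch of that quantisation, using made-up render times rather than anything measured from the game:

[code]
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.67ms per refresh at 60Hz

def vsynced_fps(render_ms):
    """Effective frame-rate when every frame must wait for the next refresh."""
    intervals = math.ceil(render_ms / INTERVAL_MS)  # refreshes consumed per frame
    return REFRESH_HZ / intervals

# Hypothetical render times in ms: unsynced vs v-synced frame-rate
for render_ms in (16.0, 20.0, 34.0, 45.0):
    print(f"{render_ms:5.1f}ms -> unsynced {1000 / render_ms:5.1f}fps, "
          f"v-synced {vsynced_fps(render_ms):5.1f}fps")
[/code]

A 34ms frame is 29.4fps unsynced but snaps to 20fps once v-synced, which is why the same workload can read as a harsher drop on the v-synced XBO than on an unsynced PC.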

Also, if you bothered to look at the original XBO article, the XBO is already a fairly steady 30fps. It's only when certain circumstances arise that the frame-rate drops as low as 20. You can watch their videos to see the frame-rate graphs while they play.

I wish they had posted a video of their 7770 gameplay for a better comparison. But I'm going to bet that, while it is still a fairly steady 30fps just like the XBO version, the drops are more frequent and larger on the 7770 than on the XBO.

Regards,
SB
 
Ah, I read past the v-sync part; I thought that was just for 1080p.

But FWIW, I did see the pre-patch vs patched video and there were occasional dips, and a few major ones (as you mentioned). To me, that sounds about the same as (if not worse than) a "fairly steady 30fps". "Significantly slower" still seems to be a bit of a stretch. But I agree, it would be nice to see a comparison with a 7770 and v-sync on.

edit: Dunno if DR is more CPU-intensive, but this rig with a faster CPU than DF's and a 7770 ran DR3 at 1080p VHQ fine.
http://www.neogaf.com/forum/showthread.php?p=128868560#post128868560
 
Yeah, that's the gamegpu benches, which are at odds with what DF found, but I'd be more inclined to trust DF. I've always been a tad suspicious of gameGPU benches. Their benches always seem to favor Intel/Nvidia more than some other sites', and another odd thing I've noticed is that their 7850 FPS, in at least the dozen or so reviews I've looked at, is always exactly 50% higher than their 7770 score. It's almost as if they cheat and just multiply the 7770 result by 1.5 to get the 7850 number, or something. I get that the 7850 is in theory ~50% faster than the 7770, but it's odd that the results could be so consistent across so many games without a flyer here or there; a more bandwidth-limited game, say, should leave the 7770 more than 50% behind. What I'm saying is, I don't think I've ever seen a bench from there where the 7850 isn't exactly 50% faster than the 7770, after allowing for rounding (for example, if the 7770 is 31 FPS, the 7850 will be either 46 or 47 FPS). Plus, whenever I see a site bench like 50 cards, I always wonder whether they really did that, and how much time they spent on each run.

Not that their benches couldn't be 100% legit after all that :p
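For what it's worth, the "always exactly 50%" pattern is easy to sanity-check: allowing for rounding to whole fps, a 7850 score that is just the 7770 score multiplied by 1.5 has to land on the floor or ceiling of 1.5x the 7770 number (31fps giving 46 or 47fps, as above). A quick Python sketch of that test, using made-up example scores rather than real gamegpu data:

[code]
import math

def looks_like_1_5x(fps_7770, fps_7850):
    """True if the 7850 score is within rounding of exactly 1.5x the 7770 score."""
    exact = 1.5 * fps_7770
    return math.floor(exact) <= fps_7850 <= math.ceil(exact)

# Hypothetical published (7770 fps, 7850 fps) pairs
samples = [(31, 46), (31, 47), (40, 60), (28, 42), (33, 51)]
for low, high in samples:
    verdict = "fits the 1.5x pattern" if looks_like_1_5x(low, high) else "doesn't fit"
    print(f"7770 {low}fps vs 7850 {high}fps (ratio {high / low:.2f}): {verdict}")
[/code]

Run over enough real games, genuine hardware should throw up the odd pair that doesn't fit (a bandwidth-limited title pushing past 1.5x, say); a list that fits every single time is exactly the pattern described above.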
 
"We timed it at a lengthy one minute 15 seconds on our system - on par with the Xbox One version of the game. What's troubling is that disk access remains at zero per cent during the first 40 seconds of this wait while CPU, memory and GPU time all drop significantly as well. It's almost as though the PC is just idling for a good 40 seconds before it even attempts to perform any operations"

This is the weirdest part of the whole analysis for me. Combined with the fact that the resolution is locked to the same as the Xbox One's, is Microsoft requesting that games be presented exactly the same as on the console? This isn't the first game doing this on PC, and I'd argue that Destiny is most likely another game employing the same type of logic.

Not good at all.
 
Surely MS has no say whatsoever over PC titles. There's no licensing, and no terms to use DirectX AFAIK. They also don't control the major storefront, so they can't exclude a game from being sold if it doesn't hit their requirements, unlike Valve.
 
Nor do they hold any licensing over PlayStation, yet we still see games that are a match on both systems regardless of the power differential.

I realise it probably won't be a popular view, and there may not be anything in it, but it's very odd indeed.
 
Last edited by a moderator:
Or that Bungie does not want an even more fragmented codebase to maintain. PC can brute-force what the consoles do through "elegance", so why spend time adding more to it if you don't have to?
If PC were a big revenue stream for them, they would of course cater to PC users. Heck, if they can get away with compile-and-run on the PC, why not?
As for PS4 vs X1: why not try to keep them equal unless it seriously hampers the game?

I am interested in seeing which generation of consoles will actually be the biggest earner for them with Destiny: PS3/X360 vs PS4/X1, and if you want to break it down even more, PS3 vs X360 vs PS4 vs X1 vs PC. And then map that against the perceived resources spent on each platform. Will most resources have been spent on the older generation of consoles or the current ones? :)
 
Nor do they hold any licensing over PlayStation, yet we still see games that are a match on both systems regardless of the power differential.

I realise it probably won't be a popular view, and there may not be anything in it, but it's very odd indeed.
You mean leveraging their console position to influence the PC and PS4 space? That is, "make your PC game have performance parity with XB1 or we won't allow you to release on XB1"? It's a logical theory, but one I seriously doubt. XB1 isn't in a strong position. Devs could just ignore it and let it die without necessarily losing massively, not in the long run anyway. It'd be more in their interests to tell MS to shove such attitudes.

If MS were doing this, and it got out, the shitstorm would be of biblical proportions. The outrage from PC gamers would dominate the web. Valve could then fly in as a saviour, move people to Linux gaming, and undermine MS's position as the PC provider. The negatives of such action far outweigh the positives (PC gamer: "Oh, my PC can't play it any better than XB1, I'll get an XB1"; Joe Gamer: "Should I get a PC or XB1? Games look the same on both, so I'll get an XB1" - the real number of XB1 converts can't be that high). I'm far more inclined to view it as a cross-platform engine not well optimised for PC and not scaling well to the hardware, or the devs taking a shortcut and deciding to create a fixed-res game and port it to devices, or even that it's just a bit rushed and they'll patch out the PC shortcomings later - just make sure there's a game out there that works on release day.
 
Things to note: the PC version is not v-synced (they disabled it) while the Xbox One is v-synced. That will lead to larger frame-rate drops, as it cannot display a frame until it can sync the output with the display.

The quote is not clear on whether the "fairly steady 30fps" was with v-sync disabled or not. It makes me think it was enabled, because they only said it was disabled at 1080p, and then they dropped the resolution to 720p for a "fairly steady 30fps"; only after that do they mention disabling v-sync again, which gave them judder and little more than 30fps.


Yeah, that's the gamegpu benches, which are at odds with what DF found, but I'd be more inclined to trust DF. [...] I don't think I've ever seen a bench from there where the 7850 isn't exactly 50% faster than the 7770, after allowing for rounding.

I've been seeing their results for a few years and the thought of them simulating performance never occurred to me. At the beginning their testing was quite weak: they didn't simply test different cards but used something like five different PCs (and made CPU comparisons with different graphics cards), probably from different people, so it was very hard to take them seriously. Now it seems to be something more controlled.
What you said is something worth investigating.

I took a game completely at random, because it's one I could find in two different sources:

http://gamegpu.ru/images/remote/htt...ein_The_New_Order_-test-WolfNewOrder_1920.jpg

shows the 50% difference you noticed, while

http://pclab.pl/zdjecia/artykuly/chaostheory/2014/05/wolfenstein/charts/wolfenstein_1920.png

shows around 45%, I think.

Not sure what to make of it.

---

But at the same time, I think DF lacks experience when it comes to PC testing.
 
Plus, whenever I see a site bench like 50 cards, I always wonder whether they really did that, and how much time they spent on each run.

Not that their benches couldn't be 100% legit after all that :p
The site is a game compatibility test lab located in Russia, the kind that developers use for generating system requirements for their games.

They also don't do extensive tests like Digital Foundry; the most they can do is bench 2-3 minutes of gameplay (just like any tech site, actually). Digital Foundry obviously tests more than that.

I took multiple random samples and found the percentage ranges from 47% to 53%, so it's not exactly always 50%.

This is the weirdest part of the whole analysis for me. Combined with the fact that the resolution is locked to the same as the Xbox One's, is Microsoft requesting that games be presented exactly the same as on the console?
It's just a bad port, that's all; they are working on a patch as we speak.

The quote is not clear on whether the "fairly steady 30fps" was with v-sync disabled or not. It makes me think it was enabled, because they only said it was disabled at 1080p, and then they dropped the resolution to 720p for a "fairly steady 30fps"; only after that do they mention disabling v-sync again, which gave them judder and little more than 30fps.
Yeah, I'm thinking the same thing. Besides, it doesn't matter whether the XO is running v-sync or not; it's irrelevant, because it most likely drops it at the first sign of trouble. It's not like v-sync is causing the drops - on the contrary, actually.

Also, if you bothered to look at the original XBO article, the XBO is already a fairly steady 30fps. It's only when certain circumstances arise that the frame-rate drops as low as 20. You can watch their videos to see the frame-rate graphs while they play.

The statement from the team is that a locked 30fps is targeted here, but in the build we saw there's a huge gap to be bridged in this regard; drops to 20fps are consistent and sustained when outdoors, with 16fps being our record low during some of the biggest explosions.
http://www.eurogamer.net/articles/digitalfoundry-vs-dead-rising-3
 
Nor do they hold any licensing over PlayStation, yet we still see games that are a match on both systems regardless of the power differential.

We saw the exact same thing last gen as well, where Sony was applying pressure to have games on their weaker PS3 hardware match the 360 versions. In the end, though, many of us voluntarily made all versions match up as much as possible. That's because we eventually realized that you have to deal with gamers as they are: they get outraged over anything, so it's safer to voluntarily get things running the same on all consoles whenever possible. There will still be some outrage, as there always is with console gamers, but it results in the least amount of negative press, so it's the smartest way to go.
 
Do you have any actual examples of 360 games getting nerfed with the purpose of looking just like the PS3 version?

Or are you referring to the extra resources needed to get the PS3 versions up to par?

Because it's funny that there are so many examples of the 360 coming out in front of the PS3 in the DF tests. So whoever was supposed to make the games look alike obviously didn't do a great job.
 
Also, 'weaker' is not what I would call the PS3... more like a lot more complicated. But yeah, the PS3 was noticeably inferior a lot of the time, especially earlier on. Not until year three or so were PS3 ports getting better and closer to par with the X360, and that was probably because the tools and experience with the hardware improved. It even squeaked out the odd win from time to time. Even late in the PS3's life, we would see the X360 with the advantage a lot of the time, whether it be better particle effects, better AA, better textures or a higher resolution, etc.
 