Digital Foundry Article Technical Discussion Archive [2014]

Frame rates have been shit this generation, I think it's fair to praise 4A for what they've done.

The Sony/ND ultra-fanboys are going apeshit over the comparison to LoU frame rates though. Can't help wondering if RLB was being a little mischievous there. :LOL:
 
Frame rates have been shit this generation, I think it's fair to praise 4A for what they've done.

The Sony/ND ultra-fanboys are going apeshit over the comparison to LoU frame rates though. Can't help wondering if RLB was being a little mischievous there. :LOL:

Calling people fanboys for pointing out it's not a fair comparison is what's wrong with forums. Don't agree with someone and you're automatically labelled a fanboy. I remember a time when using derogatory words like fanboy was a bannable offence.
 
I'm calling people fanboys for being fanboys, and nursing butthurt and spewing shit about Richard Leadbetter and DF whenever they manage to interpret something in a way that doesn't help their side of the console war. But this isn't the time or place for this, so we should leave this discussion.

Frankly, 4A have achieved something that ND didn't despite it being one of ND's stated (and publicised) aims, and despite 4A making two vastly more visually accomplished titles for two platforms in a matter of months. Is it helpful for DF to point that out? That's up for debate, but frankly I'm glad that someone is finally holding the industry to account on frame rates instead of just regurgitating P.R. containing frame rate claims, bullshots and the inevitable "1080p" checkbox hyping.

I wish that DF had more resources to dig deeper, into more games, and get more developers to open up to them. Look at this nugget:

Oles Shishkovstov said:
Counting pixel output probably isn't the best way to measure the difference between them though. There are plenty of other (and more important factors) that affect image quality besides resolution.

Golden.
 
You are completely missing the point. It's not a fair comparison to make. It took considerable effort just to get TLOU Remastered to run on the PS4.
 
The comparison to TLOU Remastered is a very stupid one. Such a good interview otherwise.
Leadbetter just seems not to be able to help himself sometimes.

And apparently he still didn't understand what ND did in five months with TLOUR: they basically reverse-engineered the most hardware-intensive PS3 game (on the most complex console out there) to be "emulated" on a very different machine, whereas 4A Games essentially just ported a PC game/engine to two straightforward PC-like configurations (apparently without using any advanced GPGPU stuff).

And he again persists in acting as if his cherry-picked framerate stress performance video of TLOUR was somehow representative of the whole game.

I am all for framerate stress performance videos, but only if all games on all hardware get the same fair tests. But for whatever reason, the only framerate stress videos (where only the worst parts are shown) we've had since the beginning of this gen are mainly of PS4 games, notably Infamous SS, Watch Dogs on PS4 and recently TLOUR.
 
You are completely missing the point. It's not a fair comparison to make. It took considerable effort just to get TLOU Remastered to run on the PS4.

It might help if you explained why it's not a fair comparison to make. By your last statement, are you implying it did not take considerable effort to get Metro Redux to run on the PS4?
 
It might help if you explained why it's not a fair comparison to make. By your last statement, are you implying it did not take considerable effort to get Metro Redux to run on the PS4?
I think Globalisateur explains it. Porting from PS3 to PS4 is probably a lot harder than it is PC->PS4/XB1. I agree that it is a pretty dumb comparison.

But that takes nothing away from what 4A has achieved. Both ND and 4A have done a very good job in such a short time, but for different reasons.
 
And apparently he still didn't understand what ND did in five months with TLOUR: they basically reverse-engineered the most hardware-intensive PS3 game (on the most complex console out there) to be "emulated" on a very different machine, whereas 4A Games essentially just ported a PC game/engine to two straightforward PC-like configurations (apparently without using any advanced GPGPU stuff).

This isn't true.

If you read the article you can see clearly that they were working with the console code base and they optimised the game significantly for GCN.

Also, I don't think "emulating" is a very accurate way to describe porting TLoU.

And he again persists in acting as if his cherry-picked framerate stress performance video of TLOUR was somehow representative of the whole game.

No, he's not doing this. At all. He is not doing this.

The DF analysis made clear that the dips were *not* representative of the whole game.

You are making this up. Please stop.

I am all for framerate stress performance videos, but only if all games on all hardware get the same fair tests. But for whatever reason, the only framerate stress videos (where only the worst parts are shown) we've had since the beginning of this gen are mainly of PS4 games, notably Infamous SS, Watch Dogs on PS4 and recently TLOUR.

You're playing the Sony-as-victim card. You're always doing this.

Fuck's sake Globalisateur, you're better than this. Seriously, you are; you have things to genuinely contribute, but you keep falling back to this.

A genuinely interesting article and the thread has been turned to mush by DF hating.
 
I think Globalisateur explains it. Porting from PS3 to PS4 is probably a lot harder than it is PC->PS4/XB1.

People should read the article.

Oles Shishkovstov said:
Well we just ported the games over and ran a lot of tests!

One little example I can give: Metro Last Light on both previous consoles has some heavily vectorised and hand-optimised texture-generation tasks. One of them takes 0.8ms on single PS3 SPU and around 1.2ms on a single Xbox 360 hyper-thread. Once we profiled it first time - already vectorised via AVX+VEX - on PS4, it took more than 2ms! This looks bad for a 16ms frame. But the thing is, that task's sole purpose was to offload a few cycles from (older) GPUs, which is counter-productive on current-next-gen consoles. That code path was just switched off.
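Purely as an illustration of what that kind of task can look like (a sketch of my own, not 4A's code; the function name, the build flag and the lerp workload are all made up), here's a hand-vectorised AVX helper whose only job is to offload a few cycles from an older GPU, with the whole path simply switched off on the new consoles:

```cpp
// Minimal sketch (not 4A code): a hand-vectorised texture-generation task whose
// only purpose is to offload cycles from an older GPU, so on the new consoles
// the whole code path is switched off.
#include <immintrin.h>   // AVX intrinsics
#include <cstddef>

// Assumption: a per-platform build flag decides whether the CPU offload is worth it.
#ifndef OFFLOAD_TEXGEN_TO_CPU
#define OFFLOAD_TEXGEN_TO_CPU 0   // off on PS4/XB1, on for the old consoles
#endif

// Blend two source layers into dst (a stand-in for the real texture-gen work).
void generate_detail_layer(float* dst, const float* a, const float* b,
                           float weight, std::size_t count)
{
#if OFFLOAD_TEXGEN_TO_CPU
    const __m256 w = _mm256_set1_ps(weight);
    std::size_t i = 0;
    for (; i + 8 <= count; i += 8) {           // 8 texels per AVX iteration
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        // dst = a + weight * (b - a), a simple lerp
        __m256 v  = _mm256_add_ps(va, _mm256_mul_ps(w, _mm256_sub_ps(vb, va)));
        _mm256_storeu_ps(dst + i, v);
    }
    for (; i < count; ++i)                     // scalar tail
        dst[i] = a[i] + weight * (b[i] - a[i]);
#else
    // On PS4/XB1 spending 2ms of a 16ms frame here is counter-productive:
    // leave the work on the GPU and let this code path fall away.
    (void)dst; (void)a; (void)b; (void)weight; (void)count;
#endif
}
```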

Oles Shishkovstov said:
There is no secret. We just adapted to the target hardware.

GCN doesn't love interpolators? OK, ditch the per-vertex tangent space, switch to per-pixel one. That CPU task becomes too fast on an out-of-order CPU? Merge those tasks. Too slow task? Parallelise it. Maybe the GPU doesn't like high sqrt count in the loop? But it is good in integer math - so we'll use old integer tricks. And so on, and so on.
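And as one possible example of the "old integer tricks" for sqrt he's alluding to (again my own illustration, not something from the Metro engine): the classic bit-hack reciprocal square root, where an integer shift plus a magic constant gives a first guess that a single Newton-Raphson step refines.

```cpp
// Illustrative only: the well-known integer bit-hack for 1/sqrt(x), the kind of
// trick you reach for when the hardware is better at integer math than at sqrt.
#include <cstdint>
#include <cstring>

inline float fast_rsqrt(float x)
{
    std::uint32_t i;
    std::memcpy(&i, &x, sizeof i);        // reinterpret the float's bit pattern
    i = 0x5f3759dfu - (i >> 1);           // magic constant gives an initial guess
    float y;
    std::memcpy(&y, &i, sizeof y);
    return y * (1.5f - 0.5f * x * y * y); // one Newton step, roughly 0.2% error
}

// e.g. normalising a vector in a hot loop without calling sqrtf():
//   float inv_len = fast_rsqrt(x*x + y*y + z*z);
```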

But people keep saying that it's just the PC version where they added the "compile_for_console_lol" option.

Just ... read the article.
 
People should read the article.

But people keep saying that it's just the PC version where they added the "compile_for_console_lol" option.

Just ... read the article.

Yes, and if you believe they didn't use any of the knowledge or experience from making an engine for the PC, you are a fool. I suppose the Redux version on PC just threw away all the previous code from their PC engine and just used the "console port".

They did a brilliant job, and it still amazes me what they can do considering their circumstances compared to other studios which are in first-world countries.
 
I agree. Metro being on multiple platforms was probably a considerable asset, versus completely re-working one of the best-looking last-gen games, tailored for the PS3, to run on the PS4.

I think that singling out ND was unnecessary, and is not a fair comparison. Saying something like, "achieving a locked 60fps on console has seemingly been a challenge for some developers", would have been more appropriate IMO. Tomb Raider DE would be the closest comparison.
 
Frame rates have been shit this generation, I think it's fair to praise 4A for what they've done.

I think frame rates have been a huge improvement over last gen, where many games struggled for a stable 30 and dipped into the 20s. Solid 30s, some 60s (or close to it) and a bunch of 30+ unlocked, which I'm cool with.

Great interview.
 
^Completely agree.

And for the record, no one is taking anything away from 4A. We're simply questioning the need to single out and throw Naughty Dog under the bus, when 1. it's not a completely fair comparison; and 2. there are worse attempts at '60fps' on console.
 
The talk about the Xbox One DX11 API is interesting. It doesn't sound like it's nearly as performant as everyone speculated it should be. DX12, or this new low-overhead API, might actually bring some real improvement on the CPU side. This must be why the Xbox One performs worse on CPU tests. It'll be interesting if we ever get to hear what games might be taking advantage of the new API. Metro is brand new and doesn't use it.
 
Yes, and if you believe they didn't use any of the knowledge or experience from making an engine for the PC, you are a fool. I suppose the Redux version on PC just threw away all the previous code from their PC engine and just used the "console port".

Of course they used their knowledge and experience, what an utterly pointless thing to say. They also have knowledge and experience of x86 and will have some of high level use of GCN.

But this is a shifting of goal posts after the previous "it's just PC->PS4 therefore easy" has been shown to be bullshit simply by reading the damn article that we *should* be discussing.

I agree. Metro being on multiple platforms was probably a considerable asset, versus completely re-working one of the best-looking last-gen games, tailored for the PS3, to run on the PS4.

Indeed, it's a big asset. ND are onto two new architectures that are very different to the ones they optimised their tools and code for. They weren't even on unified shaders with the garbage RSX, and not all of their Cell code will run as fast on Jaguar.

That said, with enough resources - meaning with it as a high enough priority - I'm sure they'd have achieved a rock solid 60 fps instead of just a very solid 60 fps.

I'm sure ND will be fine. They'll adapt quickly and continue to succeed just like they did on the PS3.

I think that singling out ND was unnecessary, and is not a fair comparison. Saying something like, "achieving a locked 60fps on console has seemingly been a challenge for some developers", would have been more appropriate IMO. Tomb Raider DE would be the closest comparison.

Perhaps TR would have been a better - and certainly more sensitive - comparison. It did add more to the remaster graphically than TLOU though, so frame rate compromises are more understandable, at least superficially.

I'm going to leave the ND discussion here though, as it's the rest of the article that's really interesting (far more so than RLB's name-drop in the frame rate question!).

I think frame rates have been a huge improvement over last gen, where many games struggled for a stable 30 and dipped into the 20s. Solid 30s, some 60s (or close to it) and a bunch of 30+ unlocked, which I'm cool with.

Great interview.

I suppose it's a matter of perspective. I like high and consistent frame rates. And if they're not going to be high then they should at least be consistent. I find unlocked frame rates distracting. And tearing too. Xbox One Titanfall would be quite unpleasant for me, I think!
 
It'll be interesting if we ever get to hear what games might be taking advantage of the new API.
Also curious is how the new low-level access fits in with the VMs and the overall original design. Was there always going to be a low-level option, with DX just a stop-gap, or have MS abandoned something like forwards compatibility in order to get more out of the XB1 for games?
 
The talk about the Xbox One DX11 API is interesting. It doesn't sound like it's nearly as performant as everyone speculated it should be. DX12, or this new low-overhead API, might actually bring some real improvement on the CPU side. This must be why the Xbox One performs worse on CPU tests. It'll be interesting if we ever get to hear what games might be taking advantage of the new API. Metro is brand new and doesn't use it.

The value of legacy code that performs "well enough" is likely to be high, particularly early on. If you could put that time into GPU optimisation instead of splitting your XB1 code off entirely from the PC DX11 path, I guess that means it might be a slow transition for some.

It sounds like the 360 had lower overhead than DX11.x.

The move to the metal makes it sound like easy backwards compat for Xbox Two via the API won't be happening. Hardware BC or bust.

The performance figures for some of the CPU code going from PS360 to Jaguar make it sound like software BC for PS360 on the 4Bone won't be happening any time soon, either.
 
The DF analysis made clear that the dips were *not* representative of the whole game.

You are making this up. Please stop.

From the TLOUR DF article:

Just over 15 minutes of gameplay from The Last of Us Remastered running in 60fps mode, giving a good indication of how well the game holds its lock.

You're playing the Sony-as-victim card
That's not enough of an argument, particularly where a DF article is concerned, where it's mainly numbers and facts that count. My point, that the framerate stress tests that get published are mainly of PS4 games, still stands.
 