Digital Foundry Article Technical Discussion Archive [2014]

MS aren't forcing parity.

That doesn't mean that there is going to be a big - or necessarily any - difference between platforms.

There doesn't have to be any inconsistency between these positions.
 
When Albert came out and said there is no way MS would concede a 20% power advantage to Sony, and that we wouldn't see a difference in games this year, most gamers assumed that the PS4's extra GPU performance would not be taken advantage of. At the time many developers and commenters here on B3D said that was ludicrous, that developers would not force parity, and cited last generation as evidence.

Now you're saying the exact opposite, hence my comments. What you're suggesting doesn't seem to line up with what we've been experiencing so far this generation, or with what we saw for the majority of last generation.

I could see a developer spending more time on the XB1 version to get results, but it's also true that almost anything they do to optimize there will lead to better results on the PS4. Other than memory, what optimization can be done that won't benefit the PS4?

Last gen was a totally different beast: you could never achieve total parity without having one machine idle. That's just the way it was, but it was a business decision that made sense with all the uproar that was going on at the time. So it wasn't forced on anyone, in the sense that there weren't people walking up to coders and making them delete code or remove features for the greater good; instead they went with parity by design, eventually realizing that building first on PS3 was the easiest way to achieve that.

This gen the machines are so similar that there shouldn't be as much idling going on. You can make the XB1 version 900p and the PS4 version 1080p, fully tax both machines, and still achieve parity, because good luck finding an average person who can tell the difference between the two. So Albert was right, they aren't conceding any visual difference (to the masses), and yet at the same time you don't have the PS4 GPU idling in large amounts, because they munched up the extra cycles with resolution. So games are still catering to the lowest common denominator, achieving parity, and yet fully taxing both machines' GPUs in the process. It's the current happy medium. You could argue that this is not the best way to exploit PS4 hardware, but that's a whole different argument.
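
To put rough numbers on the "munched up the extra cycles with resolution" point, here's a quick back-of-envelope sketch in Python (assuming shading/fill cost scales with pixel count, which ignores CPU, bandwidth and vertex-bound work):

def pixels(w, h):
    return w * h

p1080 = pixels(1920, 1080)   # PS4 target
p900  = pixels(1600, 900)    # XB1 target

print(f"{p1080:,} vs {p900:,} pixels")       # 2,073,600 vs 1,440,000
print(f"pixel ratio: {p1080 / p900:.2f}x")   # ~1.44x more pixels at 1080p
print(f"CU ratio:    {18 / 12:.2f}x")        # 1.50x raw compute-unit gap (18 vs 12 CUs)

Under that crude assumption, the resolution bump alone soaks up most of the extra GPU.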
 
I am just wondering about the following; maybe Joker can shed some light on this, as I found it so curious last gen. What you say about PS360 last gen sounds reasonable and was documented by DF... well, most of the time. What happened in the odd cases: I remember Final Fantasy 13 and Castlevania: Lords of Shadow performing better on PS3 compared to X360. How can this happen in a development scenario where PS3 has many bottlenecks and X360 idles a lot to achieve parity? Did the devs do this on purpose? It's also worth noting that those two games were very fine-looking pieces and not some trash.
 
Later on it became possible, to a point, if you just let the 360 idle more, but even then it was difficult to achieve because of the myriad limits and bottlenecks on the PS3, which you would see in the DF tests: vast online feature differences, less memory and so on. But eventually they were close enough for the masses to consider them "the same". It's not something that any dev wants to willingly do, but eventually we realized it was the smart business decision. I won't give any game examples, but it happened a *lot* last gen. Hell, I made a career of it; that's all I did for my last two years in gaming, working on "parity".

So you are saying that they actively worked on nerfing 360 games to be on par with the PS3? To be perfectly clear: you worked actively on PS3 and 360 games where you made a career of limiting 360 games that could have looked better, but that extra "shine" was dropped so that the PS3/360 versions looked on par? Or did you work actively on making sure the PS3 was pushed to the limit?

I am well aware of the crazy PS3 hardware and the "easy" 360 development, and all the limitations. But you know what, the 1st party games on the PS3 were imho easily comparable to the 1st party games on the 360. I would go so far as to say the best PS3 games actually surpassed the best 360 games. Were those 360 games idling too?
 
It's not really a good idea to try and compare exclusives between platforms.

You can't effectively judge performance and utilisation of a console - say Xbox 360 - using software that only ever ran on another - say PS3.
 
But I never expected that resolution would become such a big bullet point to forum posters and the media alike, so now I wonder whether that resolution difference will remain.

Resolution was already a big deal during the PS3/360 war, and afaik the difference back then was even smaller. When that generation started SD was still in play, 720p was the norm, and Sony got bitchslapped for raising the Real HD flag.

That you could be surprised today, when "Real HD" is the norm, is itself a surprise :)
 
The 360 was the lead platform for many developers for much of the generation, and the PS3 tools were far behind too. I think that had as much to do with the results as any difficulty adapting to Cell or the poor memory layout of the PS3.

Again I find it strange that Sony was in a position to boss developers around... Mind you, in many cases these developers had timed exclusives or DLC which came out first on 360. Yet they were all bending to accommodate the guy whose system chewed up more of the development budget, even though he wasn't paying for extra features like DLC, and who had a strained relationship with the public and the gaming media for much of the generation...

Additional time and resources allocated to the PS3 version for optimization makes sense in some situations, though TKF raises an excellent point when he says the 360 exclusives weren't visually superior to PS3 exclusives. Kinda odd that the superior hardware never delivered visuals to back that up. Or to put it another way: does anyone doubt that we'll see titles on PS4 that won't have a visual equivalent on XB1?
 
The 360 was the lead platform for many developers for much of the generation, and the PS3 tools were far behind too.
At the beginning, maybe, but after a while PS3's tools were very good going by dev reports, even better than MS's in some significant areas (eg. build times, could iterate faster on PS3).

GPU overlays, mixing D2D/XAML over D3D .... assuming the dev was brave enough to do this just for Windows devices
Why can't the PS4 do that? 1) It has hardware layers (although the second one may be reserved for the OS). 2) You don't need hardware layers and can composite on the GPU, as Graham was describing somewhere - he was even perplexed by the use of the composition layers, feeling that integrating it all in the graphics pipeline on the GPU was more efficient. So certainly the PS4 can render some parts with one renderer and other parts with another, and composite.
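
For illustration only, a minimal CPU-side sketch in Python/numpy of what that kind of composition amounts to: render the scene at a lower resolution, upscale it to the display resolution, then alpha-blend a native-resolution UI layer on top. The buffer names, sizes and the nearest-neighbour upscale are placeholder assumptions, not anyone's actual pipeline (on real hardware this would all happen on the GPU):

import numpy as np

def upscale_nearest(img, out_h, out_w):
    """Nearest-neighbour upscale of an HxWxC image."""
    h, w, _ = img.shape
    ys = np.arange(out_h) * h // out_h
    xs = np.arange(out_w) * w // out_w
    return img[ys][:, xs]

scene_900p = np.random.rand(900, 1600, 3).astype(np.float32)   # low-res 3D scene
ui_rgb     = np.random.rand(1080, 1920, 3).astype(np.float32)  # native-res UI colour
ui_alpha   = np.zeros((1080, 1920, 1), dtype=np.float32)       # per-pixel UI coverage
ui_alpha[980:1060, 40:600] = 1.0                               # pretend HUD strip

scene_1080p = upscale_nearest(scene_900p, 1080, 1920)
frame = ui_alpha * ui_rgb + (1.0 - ui_alpha) * scene_1080p     # straight-alpha blend
print(frame.shape)   # (1080, 1920, 3)

The point being that the UI stays razor-sharp at native resolution regardless of what resolution the 3D scene underneath is rendered at.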
 
1) 900p vs 1080p
Easy to show and market as one being better than the other, i.e. pushing both machines.
2) 1080p vs 1080p with slightly better effects/pixel quality.
I think they would both be pretty close visually, as I don't believe there is enough headroom to make the pixel quality a huge difference, unless you're specifically looking for it.
Harder to bullet-point and show that one is better than the other unless it's a DF breakdown.

So it will be interesting if the X1 becomes a lead platform.
Do they go for option 1, to show that they're pushing both machines, to make everyone happy and avoid the 'forced parity' claims?
Or option 2, which may be harder to prove?
 

Going forward, I'm in the camp believing it'll likely be option 2.
I think as the tools mature and evolve for the X1, and developers get better with the memory system, they'll be able to get over their resolution issues.
And it's not that I don't believe the PS4 will get better; I'm just thinking the X1 is far behind where it should be in terms of performance, and developers have had a much easier time pushing the PS4 with its simpler architecture.
 
What happened in the odd cases: I remember Final Fantasy 13 and Castlevania: Lords of Shadow performing better on PS3 compared to X360. How can this happen in a development scenario where PS3 has many bottlenecks and X360 idles a lot to achieve parity?

Maybe they built it on PS3, ported it back to 360, deemed it good enough and moved on, who knows. I finished Castlevania: Lords of Shadow and don't recall there being much real-world performance difference between the two versions; both had frame rate drops that seemed similar to me. Never touched FF13.


So you are saying that they actively worked on nerfing 360 games to be on par with the PS3? To be perfectly clear: you worked actively on PS3 and 360 games where you made a career of limiting 360 games that could have looked better, but that extra "shine" was dropped so that the PS3/360 versions looked on par?

No.


Resolution was already a big deal during the PS3/360 war, and afaik the difference back then was even smaller. When that generation started SD was still in play, 720p was the norm, and Sony got bitchslapped for raising the Real HD flag.

That you could be surprised today, when "Real HD" is the norm, is itself a surprise :)

Because the people we tested on didn't seem to care, and people on this forum didn't seem to care. Search this forum for people's comments on games like GTA, COD, Bioshock, etc. to see how little PS3 users here cared about resolution differences. It was largely considered a non-issue here; people were more outraged at BD-ROM disc space not being used.


Again I find it strange that Sony was in a position to boss developers around.

They didn't boss people around; I'd say that's pushing it. But influence was exerted by both console makers: hints dropped, suggestions made, stuff like that. Sony knew the position they were in. They knew they had the weaker hardware and said as much privately, so they understood they were at a disadvantage and weren't going to be unreasonable about it. But they didn't want that disadvantage to run amok.


1) 900p vs 1080p
Easy to show and market as one being better than the other, i.e. pushing both machines.
2) 1080p vs 1080p with slightly better effects/pixel quality.
I think they would both be pretty close visually, as I don't believe there is enough headroom to make the pixel quality a huge difference, unless you're specifically looking for it.
Harder to bullet-point and show that one is better than the other unless it's a DF breakdown.

So it will be interesting if the X1 becomes a lead platform.
Do they go for option 1, to show that they're pushing both machines, to make everyone happy and avoid the 'forced parity' claims?
Or option 2, which may be harder to prove?

Yeah, see, that's the thing I realize now: people like the tangibles. It really makes no difference whether or not someone can actually see the difference between 900p and 1080p; the fact is that it's a numeric difference on paper and hence easily understood. Hence it's a clear and concise bullet-point victory for one over the other, and why it's now rocketing up the importance chart. On the other hand, #2 is much harder to prove because, as you say, it takes a *lot* of GPU headroom at 1920x1080 to make a visual feature stand out over another platform's version.

I wonder if some company will have the balls to ship a game at 1280x720 but with the visuals dialed up compared to 1080p games, or if they would be crucified for doing so.
 
the balls to ship a game at 1280x720 but with the visuals dialed up compared to 1080p games, or if they would be crucified for doing so.

I don't know why people try to subtract resolution from raster graphics like it's some afterthought. Want to resolve more detail from "dialing up the visuals"? Then there has to be adequate resolution to resolve it, period.

Also, I have no doubt ~1280x720 will rear its ugly head soon enough this generation. I just hope it doesn't last long enough for the sub-1280, sub-30fps mess the last-gen consoles were getting near the end of their life cycle.
 
The quote is not clear on whether the "fairly steady 30fps" was with it disabled or not. It makes me think it was enabled, because they only said it was disabled for 1080p, and then they dropped the resolution to 720p for a "fairly steady 30fps"; but after that they mention disabling v-sync again, which gave them judder and little more than 30fps.

Read it again, you are misinterpreting their statements.

The final PC, utilising a lower-end Radeon HD 7770 with a Core i5 3570K at stock speeds, fared much worse. At 1080p with the highest details and v-sync disabled we averaged around 17fps. Only by dropping to 720p were we able to reach a fairly steady 30fps. Unlocking the frame-rate on this machine ultimately did little more than create additional judder, with frame-rates just barely over 30fps. In general, performance here is significantly slower than that of the Xbox One version - perhaps not surprising bearing in mind that the HD 7770 (aka the Radeon R7 250X) doesn't stack up favourably against the Xbox One's GPU. Our advice? An Intel quad-core processor is a must and, if you're looking for 1080p30 gameplay, a GTX 760 or Radeon R9 280/285 is recommended. Remarkably we can't recommend any single CPU/GPU combo that can sustain this game at 1080p60.

1. They first ran it at 1080p, highest details, with no v-sync.
2. The only change they made next was to lower the resolution to 720p.
3. After that (the part about unlocking the frame-rate) they removed the 30 FPS cap to see what would happen. The first 2 scenarios still had the game locked to a 30 FPS limit.

The first 2 scenarios were with the game's 30 FPS limit in place. In all cases it ran significantly slower on the rig with the HD 7770 (R7 250X) than it did on the XBO, unless they are just flat out lying. I'd assume they could refer back to their framerate-over-time videos from their XBO analysis. I just wish they had also provided a video of the HD 7770 so we could determine whether "significant" is just a 1-2 FPS difference in general and in intense scenes, or if it's a 3-5 FPS difference.

1080p averaged 17 FPS. The XBO was a relatively steady 30 FPS (probably average around 27-29). Dropping the PC with the 7770 to 720p isn't going to make the game jump from 17 FPS average to 30 FPS average (>75% faster) or anywhere close to it. For example with Battlefield 4 (https://www.youtube.com/watch?v=cwQSgYZlB1Q#t=79) on a 7770, dropping it from 1080p to 720p gave at most a 34% increase in FPS at a variety of performance settings.

I'm going to bet they were pretty loose with their "fairly steady 30 fps." More realistically their average FPS probably went from 17 fps to 23-25 fps. I'm guessing in the informal testing they didn't really have a lot going on in the scene for the 720p test.
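
A quick sanity check of those numbers in Python, assuming the frame rate scales with resolution the way the linked BF4 run did (a big assumption, since it's a different game with different bottlenecks):

avg_1080p = 17.0   # DF's reported 1080p average on the HD 7770
target    = 30.0

print(f"speedup needed for 30fps: {target / avg_1080p:.2f}x")          # ~1.76x
bf4_speedup = 1.34  # best case seen in the BF4 1080p -> 720p comparison above
print(f"projected 720p average:   {avg_1080p * bf4_speedup:.1f} fps")  # ~22.8 fps

Which is roughly where the 23-25 fps guess comes from.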

I'm all for showing the superiority of PCs, but not at the expense of reality.

Regards,
SB
 
I don't know why people try to subtract resolution from raster graphics like it's some afterthought. Want to resolve more detail from "dialing up the visuals"? Then there has to be adequate resolution to resolve it, period.

I'm more of a believer that the whole is the sum of its parts, and putting half as much workload on the GPU by dropping resolution leaves room to implement a whole bunch of other parts. Some visual aspects are also completely independent of resolution.
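
To put a rough number on "half as much workload" (assuming cost scales with pixel count and a fixed 30fps frame budget, which obviously isn't true for resolution-independent work):

frame_ms = 1000.0 / 30.0   # ~33.3 ms GPU budget per frame at 30fps

for name, (w, h) in {"1080p": (1920, 1080), "720p": (1280, 720)}.items():
    px = w * h
    print(f"{name}: {px:>9,} pixels, {frame_ms / px * 1e6:.1f} ns per pixel")

# 720p is 1280*720 / (1920*1080) = ~0.44x the pixels, i.e. roughly 2.25x the
# per-pixel time budget at the same frame rate.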
 
Read it again, you are misinterpreting their statements.



1. They first ran it at 1080p, highest details, with no v-sync.
2. The only change they made next was to lower the resolution to 720p.
3. After that (the part about unlocking the frame-rate) they removed the 30 FPS cap to see what would happen. The first 2 scenarios still had the game locked to a 30 FPS limit.

The first 2 scenarios were with the game's 30 FPS limit in place. In all cases it ran significantly slower on the rig with the HD 7770 (R7 250X) than it did on the XBO, unless they are just flat out lying. I'd assume they could refer back to their framerate-over-time videos from their XBO analysis. I just wish they had also provided a video of the HD 7770 so we could determine whether "significant" is just a 1-2 FPS difference in general and in intense scenes, or if it's a 3-5 FPS difference.

1080p averaged 17 FPS. The XBO was a relatively steady 30 FPS (probably average around 27-29). Dropping the PC with the 7770 to 720p isn't going to make the game jump from 17 FPS average to 30 FPS average (>75% faster) or anywhere close to it. For example with Battlefield 4 (https://www.youtube.com/watch?v=cwQSgYZlB1Q#t=79) on a 7770, dropping it from 1080p to 720p gave at most a 34% increase in FPS at a variety of performance settings.

I'm going to bet they were pretty loose with their "fairly steady 30 fps." More realistically their average FPS probably went from 17 fps to 23-25 fps. I'm guessing in the informal testing they didn't really have a lot going on in the scene for the 720p test.

I'm all for showing the superiority of PCs, but not at the expense of reality.

Regards,
SB

Well, I associated the unlocking of the frame rate with disabling v-sync.

Anyway, 720p vs 1080p is a big difference for performance...

The best comparison I can find for this game on the same graphics card is this:
http://gamegpu.ru/action-/-fps-/-tps/dead-rising-3-test-gpu.html

780 Ti, 1920x1200 = 74 FPS (SB-E @ 4.9GHz)

780 Ti, 1280x800 = 118 FPS (Haswell @ ~3.7GHz)

The 1200p run has the faster CPU but is certainly GPU-limited; the 800p run is probably somewhat CPU-limited.
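
Putting those two runs side by side (approximate, since the CPUs differ between them):

fps_1200p, fps_800p = 74.0, 118.0
px_1200p = 1920 * 1200
px_800p  = 1280 * 800

print(f"pixel ratio:     {px_1200p / px_800p:.2f}x")    # 2.25x more pixels at 1200p
print(f"frame-rate gain: {fps_800p / fps_1200p:.2f}x")  # ~1.59x faster at 800p

Far from linear with pixel count, but still a much bigger jump than the ~1.34x Battlefield 4 example quoted earlier.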
 
I wholeheartedly agree. Even in offline CG it doesn't necessarily make sense to just brute-force render everything at 1080p - dropping the resolution for some of the layers offers many advantages, from more complex shading/lighting to more iterations and so on.

Then again most of our stuff ends up viewed on youtube or gametrailers, with brutal compression, so a lot of the extra detail of 1080p would be lost anyway...
 
At the beginning, maybe, but after a while PS3's tools were very good going by dev reports, even better than MS's in some significant areas (eg. build times, could iterate faster on PS3).

Why can't the PS4 do that? 1) It has hardware layers (although the second one may be reserved for the OS). 2) You don't need hardware layers and can composite on the GPU, as Graham was describing somewhere - he was even perplexed by the use of the composition layers, feeling that integrating it all in the graphics pipeline on the GPU was more efficient. So certainly the PS4 can render some parts with one renderer and other parts with another, and composite.

Well, I guess it could ...

On the Windows/XB1 front:

1. The HW exposes overlays.
2. Dx has support for multiple swap chains and overlays.
3. The UI framework (XAML & WinRT/Cx) has special libraries that aid in targeting these APIs.

In MS's case it has all 3.

So yes, the PS4 can too, I guess, BUT if a developer does take advantage of this on the XB1 platform, or any modern Windows WinRT/Dx11.2-based platform for that matter, you are looking at possible tight coupling to the UI framework and the Dx API ...
 
There's a little difference between
xb360 / ps3 & ps4 / xbone
its the difference of
secret sauce and the cell
here I have them each in either hand
in my left hand I have the cell feel it and touch it & in my right hand I have the secret sauce, shhhhh you've gotta be quiet or you will frighten it off
... I told you to be quiet
 
The first 2 scenarios were with the game's 30 FPS limit in place. In all cases it ran significantly slower on the rig with the HD 7770 (R7 250X) than it did on the XBO, unless they are just flat out lying. I'd assume they could refer back to their framerate-over-time videos from their XBO analysis. I just wish they had also provided a video of the HD 7770 so we could determine whether "significant" is just a 1-2 FPS difference in general and in intense scenes, or if it's a 3-5 FPS difference.

1080p averaged 17 FPS. The XBO was a relatively steady 30 FPS (probably average around 27-29). Dropping the PC with the 7770 to 720p isn't going to make the game jump from 17 FPS average to 30 FPS average (>75% faster) or anywhere close to it. For example with Battlefield 4 (https://www.youtube.com/watch?v=cwQSgYZlB1Q#t=79) on a 7770, dropping it from 1080p to 720p gave at most a 34% increase in FPS at a variety of performance settings.

I'm going to bet they were pretty loose with their "fairly steady 30 fps." More realistically their average FPS probably went from 17 fps to 23-25 fps. I'm guessing in the informal testing they didn't really have a lot going on in the scene for the 720p test.

I don't know... As pjbliverpool has pointed out, DF seems to have a somewhat clouded view of what frame rates these PCs should achieve. At this point I don't think it's even unreasonable to suspect that their generally negative attitude towards their PC benchmark numbers for DR3 has led them to rate basically the same 720p performance on the 7770 vs the One as somehow a win for the One. I'd certainly like to hear them explain that bit in a way that actually makes sense. A fairly steady 30fps, and getting a bit over that when unlocked, doesn't really warrant guessing that the actual frame rate was 23-25fps like you did.

Also, the frame-rate penalty going from 720p to 1080p is often much bigger than in that single YouTube example you pointed out. I'd actually say that's quite a bad example you managed to dig up; frame rates will vary between games and scenes, but often the difference is a lot more than in that single Battlefield example.
 