When Albert came out and said there was no way MS would concede 20% power to Sony and that we wouldn't see a difference in games this year, most gamers assumed that PS4 GPU performance would not be taken advantage of. At the time, many developers and commenters here on B3D said that was ludicrous, that developers would not force parity, and cited last generation as evidence.
Now you're saying the exact opposite, hence my comments. What you're suggesting doesn't seem to line up with what we've been experiencing so far this generation, or with what we saw for the majority of last generation.
I could see a developer spending more time on the XB1 version to get results, but it's also true that almost anything they do to optimize there will lead to better results on the PS4. Other than memory, what optimization can be done that won't benefit the PS4?
Later on it became possible, to a point, if you just let the 360 idle more, but even then it was difficult to achieve because of the myriad limits and bottlenecks on the PS3, which you would see in the DF tests, vast online feature differences, less memory, and so on. But eventually they were close enough for the masses to consider them "the same". It's not something that any dev wants to willingly do, but eventually we realized it was the smart business decision. I won't give any game examples, but it happened a *lot* last gen. Hell, I made a career of it; that's all I did for my last two years in gaming, working on "parity".
But I never expected that resolution would become such a big bullet point to forum posters and the media alike, so now I wonder whether that resolution difference will remain.
Other than memory, what optimization can be done that won't benefit the PS4?
360 was lead platform for many developers for much of the generation; the PS3 tools were far behind too.
At the beginning, maybe, but after a while PS3's tools were very good going by dev reports, even better than MS's in some significant areas (e.g. build times; you could iterate faster on PS3).
GPU overlays, mixing D2D/XAML over D3D ... assuming the dev was brave enough to do this just for Windows devices.
Why can't PS4 do that? 1) It has hardware layers (although the second one may be reserved for the OS). 2) You don't need hardware layers and can composite on the GPU, as Graham was describing somewhere - he was even perplexed as to the use of the composition layers, feeling that integrating it all in the graphics pipeline on the GPU was more efficient. So certainly PS4 can render some parts with one renderer and other parts with another, and composite.
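For what it's worth, the "composite on GPU" option is conceptually just a source-over blend of one layer onto another, whether a hardware overlay plane or a fullscreen shader pass does it. A minimal sketch in Python/NumPy, purely illustrative and not any console's actual API:

```python
import numpy as np

def composite_over(scene_rgb, ui_rgba):
    """Blend a straight-alpha RGBA UI layer over an RGB scene (source-over)."""
    ui_rgb = ui_rgba[..., :3].astype(np.float32)
    alpha = ui_rgba[..., 3:4].astype(np.float32) / 255.0
    out = ui_rgb * alpha + scene_rgb.astype(np.float32) * (1.0 - alpha)
    return out.astype(np.uint8)

# Hypothetical buffers: in practice the layers could come from different
# renderers and/or different resolutions (scaled before the blend); both are
# the same size here to keep the example short.
scene = np.zeros((1080, 1920, 3), dtype=np.uint8)   # main game render
ui = np.zeros((1080, 1920, 4), dtype=np.uint8)      # HUD / second renderer's output
ui[..., 3] = 128                                    # 50% translucent overlay
frame = composite_over(scene, ui)
print(frame.shape)                                  # (1080, 1920, 3)
```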
1) 900p vs 1080p
Easy to show and market as one being better than the other, i.e. pushing both sets of hardware.
2) 1080p vs 1080p with slightly better effects/pixel quality.
I think they would both be pretty close visually, as I don't believe there is enough headroom to make the pixel quality a huge difference, unless you're specifically looking for it.
Harder to bullet-point and show that one is better than the other unless it's a DF breakdown.
So it will be interesting to see if the X1 becomes a lead platform.
Do they go for option 1, to show that they're pushing both sets of hardware, to make everyone happy and avoid the 'forced parity' claims?
Or option 2, which may be harder to prove? (Some rough pixel math below.)
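As a back-of-the-envelope illustration of what separates the two options (a rough Python sketch, purely for the arithmetic):

```python
# Pixel-count arithmetic behind option 1 vs option 2 (illustrative only).
p900 = 1600 * 900      # a common "900p" framebuffer
p1080 = 1920 * 1080

print(p900, p1080)     # 1440000 2073600
print(p1080 / p900)    # 1.44 -> 1080p pushes ~44% more pixels per frame
# Option 2 spends roughly that same budget on per-pixel quality instead,
# which is much harder to capture in a bullet point than "900p vs 1080p".
```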
What about the odd cases? I remember Final Fantasy 13 and Castlevania: Lords of Shadow performing better on PS3 than on X360. How can that happen in a development scenario where the PS3 has many bottlenecks and the X360 idles a lot to achieve parity?
So you are saying that they actively worked on nerfing 360 games to be on par with the PS3? To be perfectly clear: you worked on PS3 and 360 games and made a career of limiting 360 versions that could have looked better, dropping that extra "shine" so that the PS3 and 360 versions looked on par?
Resolution was already a big deal during the PS3/360 war, and AFAIK the difference back then was even smaller. When that generation started, SD was still in play, 720p was the norm, and Sony got bitchslapped for raising the "Real HD" flag.
How you could be surprised by this today, when "Real HD" is the norm, is itself a surprise.
Again, I find it strange that Sony was in a position to boss developers around.
I wonder whether any dev would have the balls to ship a game at 1280x720 but with the visuals dialed up compared to 1080p games, or if they would be crucified for doing so.
The quote is not clear on whether the "fairly steady 30fps" was with v-sync disabled or not. It makes me think it was enabled, because they only said it was disabled at 1080p; then they dropped the resolution to 720p for a "fairly steady 30fps", but after that they mention disabling v-sync again, which gave them judder and little more than 30fps.
The final PC, utilising a lower-end Radeon HD 7770 with a Core i5 3570K at stock speeds, fared much worse. At 1080p with the highest details and v-sync disabled we averaged around 17fps. Only by dropping to 720p were we able to reach a fairly steady 30fps. Unlocking the frame-rate on this machine ultimately did little more than create additional judder, with frame-rates just barely over 30fps. In general, performance here is significantly slower than that of the Xbox One version - perhaps not surprising bearing in mind that the HD 7770 (aka the Radeon R7 250X) doesn't stack up favourably against the Xbox One's GPU. Our advice? An Intel quad-core processor is a must and, if you're looking for 1080p30 gameplay, a GTX 760 or Radeon R9 280/285 is recommended. Remarkably we can't recommend any single CPU/GPU combo that can sustain this game at 1080p60.
I don't know why people try to subtract resolution from raster graphics like it's some afterthought. Want to resolve more detail from "dialing up the visuals"? Then there has to be adequate resolution to resolve it, period.
Read it again, you are misinterpreting their statements.
1. They first ran it at 1080p, highest details, with no v-sync.
2. The only change they made next was to lower the resolution to 720p.
3. After that (the bolded part) they removed the 30 FPS cap to see what would happen. The first 2 scenarios still had the game locked to a 30 FPS limit.
The first two scenarios were with the game's 30 FPS limit in place. In all cases it ran significantly slower on the rig with the HD 7770 (R7 250X) than it did on the XBO, unless they are just flat-out lying. I'd assume they could refer back to the frame-rate-over-time videos from their XBO analysis. I just wish they had also provided a video of the HD 7770, so we could determine whether "significantly" means a 1-2 FPS difference in general and in intense scenes, or a 3-5 FPS difference.
1080p averaged 17 FPS. The XBO was a relatively steady 30 FPS (probably averaging around 27-29). Dropping the PC with the 7770 to 720p isn't going to make the game jump from a 17 FPS average to a 30 FPS average (>75% faster), or anywhere close to it. For example, with Battlefield 4 (https://www.youtube.com/watch?v=cwQSgYZlB1Q#t=79) on a 7770, dropping from 1080p to 720p gave at most a 34% increase in FPS across a variety of performance settings.
I'm going to bet they were pretty loose with their "fairly steady 30 fps." More realistically, their average probably went from 17 FPS to 23-25 FPS. I'm guessing that in this informal testing they didn't have a lot going on in the scene for the 720p test.
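A quick back-of-the-envelope version of that argument, using only the figures quoted above (a rough Python sketch, not additional testing):

```python
# Sanity check on the 720p claim, using only the numbers in the posts above.
# (Illustrative only; real scaling depends on where the bottleneck actually is.)
fps_1080 = 17.0                               # DF's reported 1080p average on the HD 7770

# Theoretical ceiling: perfect scaling with pixel count (purely fill/shading bound).
pixel_ratio = (1920 * 1080) / (1280 * 720)    # 720p pushes 2.25x fewer pixels
print(fps_1080 * pixel_ratio)                 # ~38 fps, only if nothing else limits it

# More realistic: the ~34% uplift BF4 showed on the same GPU going 1080p -> 720p.
print(fps_1080 * 1.34)                        # ~22.8 fps, nowhere near a steady 30
```

Games are rarely purely pixel-bound, which is why the BF4-style ~34% figure looks like the better guide here.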
I'm all for showing the superiority of PC's but not at the expense of reality.
Regards,
SB