PS4: Theoretical extra power, Where is it?

Status
Not open for further replies.

Nesh

Double Agent
Legend
The PS4 is supposed to theoretically have 40% more power than the XB1.

But in all honesty I haven't seen it so far. Someone can argue that the PS4 quite often demonstrates higher resolution or better performance, but in all honesty the difference is marginal in my eyes.

There are games that operate at the same resolution with a minor framerate advantage in favor of the PS4. But in most cases it's a resolution difference that doesn't have a huge impact.

The gap seems to get smaller as devs get better tools and become accustomed to the XB1's hardware. Considering that the XB1 requires some extra work due to the ESRAM, that its memory bandwidth is a lot lower, and that its GPU has fewer CUs, you would expect a bigger difference.

But nope.

So someone might say that developers are probably not exploiting the PS4 to its full advantage (beyond the resolution difference). Since the game performs well enough on the PS4, why invest extra money and man-hours in additional optimization if the product runs great and will perform very well in the market anyway?
Perhaps this is the case. At least partly.

For example, the latest COD runs satisfactorily as it is on both. Would it make sense, business-wise, to squeeze more out of the PS4, assuming it has the juice? Not much, if at all.

But then we see amazing-looking games on the XB1 that have robust performance or show lots of promise. Sunset Overdrive may be 900p, but it looks great and has very stable performance. FH2 is 1080p with robust performance too. Quantum Break is a technical wonder to behold even at 900p; it is probably the best-looking unreleased game so far. It's jaw-dropping.

In all honesty, the performance difference doesn't manifest itself strongly when looking at both consoles. Both offer an extremely similar experience. I doubt most players will perceive the difference as significant in any way.

Unless of course the PS4 has some untapped power that will be revealed in time and the gap will increase.

But as it stands now, the difference is similar to last gen's consoles, and I see no signs of the gap becoming larger. I only see it becoming smaller, as if the extra computational power of the PS4 were insignificant.

The only game with the potential to prove the story different is Uncharted 4, if it indeed manages to perform at 1080p/60fps while retaining the constant jaw-dropping visuals you would expect from a high-end PC. So far these are just words.
 
That 40% lead is not system-wide, only on the GPU side.
The XB1 should be about as such:
5 GB of RAM
5 CPU cores
And whatever is left, more or less as is.

Yet even so, I suspect we would not see a night-and-day type of difference; actually, on a drunk Friday night... I could see people not noticing much difference.
 
I already got there, but I think that is where MSFT should have ended up, more or less: a significantly cheaper system, closer to that unborn Yukon platform.
 
1) The PS4 only has a 40% GPU advantage. On CPU, it has a disadvantage. Bandwidth is too complicated to call, thanks to CPU conflicts.

2) 900p is 70% of the resolution of 1080p. Ergo, where a 40% GPU advantage would enable 40% more resolution, we're seeing 30% more resolution and a bit of extra niceness (perhaps more, due to ESRAM design issues?), accounting for the 40%, no?

/thread ?
 
The Xbox One has the CPU advantage and a RAM advantage of about 1 GB, I believe. The PS4 may have a GPU advantage of 40-ish% and perhaps a bandwidth advantage.
 
That 40% lead is not system-wide, only on the GPU side.
The XB1 should be about as such:
5 GB of RAM
5 CPU cores
And whatever is left, more or less as is.

Yet even so, I suspect we would not see a night-and-day type of difference; actually, on a drunk Friday night... I could see people not noticing much difference.
Isn't it 6 CPU cores out of 8 for the next-gen consoles?

I think Xbox One's weaker GPU is more "wrapped up" and taken care of with some extra leveraging from additional chips.

The sound chip, should also help freeing up CPU resources.

It's more likely that the Xbox One will also take a step forward through better coding and the like; the initial line-up was filled with 720p games, and CoD, for instance, has progressed immensely compared to the prior version.
 
1) The PS4 only has a 40% GPU advantage. On CPU, it has a disadvantage. Bandwidth is too complicated to call, thanks to CPU conflicts.

2) 900p is 70% of the resolution of 1080p. Ergo, where a 40% GPU advantage would enable 40% more resolution, we're seeing 30% more resolution and a bit of extra niceness (perhaps more, due to ESRAM design issues?), accounting for the 40%, no?

/thread ?

Well, then the conclusion is that as long as the GPU advantage is spent on resolution, it won't show much of an observable difference on screen. It is observable, but not by much.

And then you've got games that perform at similar or identical resolutions with tiny differences in framerate.

Alien Isolation, Shadow of Mordor and COD being such examples.

Too bad, as the performance advantage is probably not being exploited in the most productive ways.
 
1) The PS4 only has a 40% GPU advantage. On CPU, it has a disadvantage. Bandwidth is too complicated to call, thanks to CPU conflicts.

2) 900p is 70% of the resolution of 1080p. Ergo, where a 40% GPU advantage would enable 40% more resolution, we're seeing 30% more resolution and a bit of extra niceness (perhaps more, due to ESRAM design issues?), accounting for the 40%, no?

Well, percentages depend on your perspective. 1920x1080 (2,073,600 pixels) is 44% more pixels than 1600x900 (1,440,000); equivalently, 900p is about 69% of 1080p's pixel count. Naturally that assumes no shenanigans, i.e. a native 1:1 horizontal:vertical scaling ratio at 16:9.
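For anyone who wants to check the arithmetic, here is a minimal sketch (the resolutions are the standard 16:9 ones named in the post):

```python
# Pixel counts for the two resolutions discussed in the thread.
px_1080 = 1920 * 1080  # 2,073,600 pixels
px_900 = 1600 * 900    # 1,440,000 pixels

# 1080p relative to 900p: 44% more pixels.
extra = px_1080 / px_900 - 1
print(extra)  # about 0.44

# 900p relative to 1080p: roughly 69% of the pixel count
# (the basis for the "70%" figure quoted earlier).
fraction = px_900 / px_1080
print(fraction)  # about 0.694
```

So both figures in the thread are the same ratio read from opposite directions; which one you quote just changes which console's number looks bigger.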
 
Well percentages depend on your perspective.

Darth Vader totally killed Anni

Too bad as the performance advantage is probably not exploited in the most productive ways

Well, they still have to set a baseline for development, keeping in mind that many multiplatform titles will/can also scale further down on PC.
 
PS4: Theoretical extra power, Where is it?

It is in the hands of the developers.


It's not just numbers, though. You also need to look at the software and (whatever) hardware design differences.

EDIT:
It's not so fruitful to argue over performance differences in multiplatform games in general.

It is also not that useful to use unreleased games as examples. Things may/will change when the game comes out.

It's still early (TM). In a few years time, if the developers want to show the differences, they can. If the developers want to make them look the same, they can.
 
But then we see amazing-looking games on the XB1 that have robust performance or show lots of promise. Sunset Overdrive may be 900p, but it looks great and has very stable performance.

The problem is you are talking about both performance (objective) and "looks great" (subjective).

The subjective part should be ignored; beauty is in the eye of the beholder.

The performance difference is real enough; there are plenty of games to show this. Take Sunset Overdrive: 900p and 30fps. OK, now look at a launch-window game, Second Son: 1080p, great TSAA, and running over 30fps.

But this kind of thread can't end well. Everything is a known quantity at this point; you can cherry-pick to make any point or just fall back on "looks great". What we have is 900p vs 1080p, and I doubt that will change much (ignoring parity).
 
The PS4 is supposed to theoretically have 40% more power than the XB1.

But in all honesty I haven't seen it so far. Someone can argue that the PS4 quite often demonstrates higher resolution or better performance, but in all honesty the difference is marginal in my eyes.

The answer is actually quite simple. We've gotten to the point where it takes huge increases in power to produce what amounts to a noticeable visual difference for the average person. It's why I've argued for some time now that smaller devices like tablets will be able to bridge the visual gap with consoles even with less power under the hood: the power delta has to be very large before it visually matters to the typical player. The second reason is that art has become more important than tech, and good art will usually triumph over good tech. In other words, a game with weak tech but great art can be perceived as visually nice, but great tech with weak art won't be viewed as favorably.
 
I vaguely remember Mark Cerny's story was:

If it's easier to achieve your vision, you can invest more time in making the game and explore newer stuff (or helping out other projects).
I remember that Resogun uses brand-new techniques, and that's a launch game. And it's addictive to play. :cool:


If the developers can change things up quickly, then they can experiment more. It doesn't really have to show the advantages visually.
Still... exclusive games will show the advantages the most, since they don't have to worry about performance and quality on other platforms.
 
Well, then the conclusion is that as long as the GPU advantage is spent on resolution, it won't show much of an observable difference on screen. It is observable, but not by much.
Diminishing returns. That's why you need a 10x+ improvement on a previous console generation to get a noteworthy upgrade.

And then you've got games that perform at similar or identical resolutions with tiny differences in framerate.
Just because a machine is capable of more, doesn't mean the devs should do more with it. See Developer decisions/parity threads. Software is a business.

Too bad, as the performance advantage is probably not being exploited in the most productive ways.
Subjective, and not relevant to the discussion point, which I think has been fairly answered. To save another thread from going down the parity [strike]toilet[/strike] route, I'll close it now. The answer to the question is: "There, in the GPU (and maybe a compute advantage), whether you see it or not, and whether devs use it or not."
 