Next Generation Hardware Speculation with a Technical Spin [2018]

Yes, since ninty chose esram for the cube I don't see why it couldn't be used again. And again it's not like XO's implementation was a failure.

You have to admit the thought of GDDR6 and 128 MB of 7nm ESRAM is exciting.

I think the barrier to this would be die size, not cost. They can only make an APU so large, at least without getting creative. Perhaps we could see discrete CPUs and GPUs again?

Bah. I just liked the custom hardware differences we saw in the old days; it made things more interesting.

Well, the big problem with the Xbox One's ESRAM was that it was a bit too small, so a 1080p framebuffer did not fit in it entirely, which would have made things much easier.
But on the other hand, I never expected the current gen to really target native 1080p at all, just because both machines had such low performance targets from the start. 720p is just a much better fit for both machines (at launch).
Well, then 4K came much faster than I expected.

One of the problems MS had with their solution was cost. If they hadn't invested so much into Kinect, the XBO would have been a totally different machine.
 

720p this gen!?

We had some games running 1080p last gen. I wonder what percentage of games on the base PS4 are 1080p this gen; I'd wager most of them, other than the more demanding cross-platform titles.

The GPU was more than capable of 1080p; the CPU was what held this gen back, and that mostly affected FPS.
 

The xBone's 32 MB of ESRAM is more than enough for a 1080p framebuffer. The problem is that many games use deferred rendering, so they need a 1080p G-buffer....
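For a rough sense of the sizes involved, here is a back-of-envelope sketch; the render-target formats are illustrative assumptions, not any particular game's setup:

```python
# Back-of-envelope render-target sizes vs. the XB1's 32 MB of ESRAM.
# Formats here are assumptions for illustration (RGBA8 color targets,
# 32-bit depth/stencil), not any specific engine's G-buffer layout.
WIDTH, HEIGHT = 1920, 1080
MB = 1024 * 1024

def target_mb(bytes_per_pixel):
    """Size of one full-resolution render target in MB."""
    return WIDTH * HEIGHT * bytes_per_pixel / MB

# A single 32-bit (RGBA8) 1080p color buffer fits comfortably:
color = target_mb(4)                       # ~7.9 MB

# A typical deferred G-buffer, e.g. 4x 32-bit MRTs plus depth/stencil,
# no longer fits in 32 MB:
gbuffer = 4 * target_mb(4) + target_mb(4)  # ~39.6 MB

print(f"1080p color buffer: {color:.1f} MB")
print(f"1080p G-buffer (4 MRTs + depth): {gbuffer:.1f} MB")
```

So a plain forward-rendered 1080p target fits with room to spare, while even a fairly lean deferred G-buffer overflows the 32 MB pool, which matches the point above.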
 
This was my impression of the current gen before its release (and not only mine). On the last-gen consoles it was already hard to get to 720p; the newer graphical features were just too demanding. When we then saw the specs for the PS4 & XB1, I thought 720p would have to be the target resolution, with maybe some games at 1080p here and there. At the time I already had an AMD HD 7850, which was … well, not the best card you could have, but a good match for the PS4 GPU. Yes, you could play at 1080p, but not for long, and details and resolution had to be sacrificed to keep 1080p running. And we can see in some games that detail is really where they saved performance (vs. PC).

Resolution is not everything, and those GPUs in the XB1 and PS4 can really show much more detail at lower resolutions. But then resolution-gate came and everything had to run at 1080p, or people complained.
Resolution is only the easiest way to get better image quality, not the best or most efficient. E.g. Battlefield Hardline on the XB1 was 720p and looked so good (well, the game itself wasn't so good ^^). 720p is just a better fit for the GPUs of the current consoles, while 1080-1440p is a better fit for the enhanced consoles. Otherwise you must sacrifice too much detail just for resolution.
 
I'm living a different timeline.

The one I was in had only the very latest and most demanding games struggling to hit 720p (with the X360 usually hitting the mark)... even Battlefield 4, which was pushing last gen and was a cross-gen title, was almost 720p.

The pixel count is only double and the power is significantly more, so unless I'm missing something I'm not sure what you're saying?

And comparing a PC GPU to a closed-box environment isn't the best idea... consoles will always outperform, in my experience. I certainly expected 1080p from the PS4 for most games; I think the main concern was the CPU.

I should add that I'm talking from a PS4 perspective... I do recall the concerns over the XBO and its inferior GPU holding it back. I also recall MS sending devs over to help get 'parity' for Destiny.

Edit: sorry, we're going OT.
 
Looking at the JEDEC docs, the GDDR6 I/O count, from the point of view of the SoC controller, is about 72 balls per chip versus 61 for GDDR5/X. So the total I/O count would look like this:

256-bit GDDR5: 488 (PS4/Pro)
256-bit GDDR6: 576
384-bit GDDR5: 732 (XB1X)
384-bit GDDR6: 864
512-bit GDDR5: 976
512-bit GDDR6: 1152

It's not a huge overhead, as it doubles the interface for only 18% more I/O lines, but it might become difficult to fit a 384-bit GDDR6 bus on anything less than a 400mm² SoC.
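Those totals follow directly from the per-chip ball counts. A small sketch, assuming each chip spans 32 data bits of the bus (so a 256-bit bus means 8 chips), with the per-chip figures taken from the JEDEC counts quoted above:

```python
# Per-chip ball counts as quoted from the JEDEC docs (approximate):
# ~61 balls per 32-bit GDDR5/X chip, ~72 per GDDR6 chip.
BALLS_PER_CHIP = {"gddr5": 61, "gddr6": 72}
BITS_PER_CHIP = 32  # assumption: each chip covers 32 bits of the bus

def io_lines(bus_bits, mem_type):
    """Total SoC-side I/O lines for a given bus width and memory type."""
    chips = bus_bits // BITS_PER_CHIP
    return chips * BALLS_PER_CHIP[mem_type]

for bus in (256, 384, 512):
    print(f"{bus}-bit: GDDR5 {io_lines(bus, 'gddr5')}, "
          f"GDDR6 {io_lines(bus, 'gddr6')}")
```

Note the 256-bit GDDR5 row works out to 8 × 61 = 488, and the GDDR6/GDDR5 ratio at every width is 72/61 ≈ 1.18, i.e. the ~18% overhead mentioned.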
 

These are all active toggling I/O? I.e. something that must be driven from the SoC?
 
Yeah, it's the actual physical lines that need a discrete pin on the SoC: command, address, data, bus inversion, bank selection, error detection; write clocks are per channel. Not sure about the main clock, but I counted it per chip. Differentials counted as two.
 
A 256-bit bus leading to eight chip mounting points sounds reasonable.

I still hope for a clamshell approach... 16 x 2GB GDDR6 [later switched to 8 x 4GB GDDR6].
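For anyone checking the clamshell arithmetic: in clamshell mode two chips share one 32-bit channel (16 data bits each), so the same 256-bit bus can host 16 devices instead of 8. A minimal sketch, assuming the 2 GB and 4 GB per-chip densities mentioned:

```python
# Capacity of a 256-bit bus in normal vs. clamshell mode.
# Assumption: each 32-bit channel hosts one chip normally,
# or two half-width (x16) chips in clamshell mode.
BUS_BITS = 256
CHANNEL_BITS = 32

def capacity_gb(chip_density_gb, clamshell):
    """Total memory capacity in GB for the given per-chip density."""
    chips = (BUS_BITS // CHANNEL_BITS) * (2 if clamshell else 1)
    return chips * chip_density_gb

print(capacity_gb(2, clamshell=True))    # 16 x 2GB
print(capacity_gb(4, clamshell=False))   # 8 x 4GB
```

Both configurations land on the same total capacity, which is why the launch-with-clamshell, switch-to-denser-chips-later path is attractive: the bus width and controller stay the same.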
 

What would you be willing to sacrifice from the remaining pool of hardware specs to get 32 instead of 16?
 

Everyone should be willing to give up the optical UHD drive and a supplied internal hard drive. The consoles should be BYODrive: have them use external USB 3.0/3.1 drives. But I doubt others would be that willing, as physical media is extremely popular, allegedly. Even giving up both the optical drive and the internal one, I don't think that would cover the cost of another 16 GB.
 
I’ve been reading most of these posts, and as we go on and on, the most apparent thought that comes to mind is the murder of the $399 price point... that has to go, or the machines will be going nowhere. We are pushing into 2020 and beyond, inflation etc... the new machine should be at $599; let the specs go to town. For all the tight asses who complain: you can always stay on the last-gen machines till the price finally comes down.
What serious gamer wouldn’t spend that on an 8-year gaming investment? The casual gamers will stay on last gen anyway.
Bring on something special.
 

Consoles aren’t just for serious gamers. Our discussions are predicated on consoles being mass-consumer devices. So most of the posters arguing against high prices do so on the logic that too high a retail price can negatively affect overall lifetime sales.

Regardless of your personal threshold for console pricing, we all benefit from large userbases. Game budgets are based on potential sales, which is dictated by the taste of the userbase and its overall size.

How do you propose getting a $600 console down to mainstream pricing levels? The only consoles priced above $500 that survived had to cut features (the Kinect camera) or endure massive losses to get down to mainstream pricing. It’s not like everybody is confident that near-term historical trends will evaporate and chip and DRAM pricing will drop faster than normal.

Having a bunch of gamers still stuck on last-gen consoles just means big multi-platform projects with large budgets will still use last-gen hardware as the baseline for their development. What would be the point of 16-32 GB of VRAM if most games are still designed for 8?
 

You raised some valid counterarguments, but I think the sales and marketing mentality needs to change; we need to train people on a new price point, and if it makes a difference, people will pay. A $600 intro price gets the ball rolling, and by the time manufacturing ramps up and games start coming out, the price will drop for the masses.
I see that as a win-win.

Or forget what I said and launch in 2021.
 