Predict: The Next Generation Console Tech

Aah, you're assuming there's _been_ a "first set of silicon" from the fab. Aah, youth, so idealistic. :)

I should have specified Orbis silicon. Sweetvar26 mentioned it on Gaf and I 100% believe his source, as long as what the source said doesn't leave room for double interpretation.

I was puzzled at how a system designed with unified GDDR5 in mind could be changed to include 8GB of RAM. Either Sony started with DDR3/4 in mind, or they didn't.
 
Really? Do you mean the closed-box nature of consoles will narrow the gap, or do you mean the GPU and/or CPU will have high-end PC performance?

Honestly, I think Acert's reasonable spec is about the best I expect. Maybe a few customisations like DSPs to aid performance.

I am expecting games to look like high-end PC games and beyond on release.
 
The specs in Acert93's post are freaking good.

I'll be extremely happy with 18-20 GCN2 CUs in either system.

Acert93's disappointed edition is like everyone else's reasonable edition, and Bkilian's "better than expected" edition.

EDIT: To be more specific, the above pertains only to the GPUs.

Well you cannot (a) just look at GPUs and (b) only look at shaders.

Shaders can be a good benchmark for "GPU class", i.e. 20 CUs being Pitcairn-class.

But that isn't what I wrote ;) A full Pitcairn at 212mm^2 -- smaller than the Xbox 360 total GPU budget and smaller than the PS3 GPU budget -- has 20 CUs, 80 TMUs, 32 ROPs at 1000MHz. What I wrote had 18 CUs, 50 TMUs, 8 ROPs at 750MHz.

In pure shader-limited scenarios this is a 70% Pitcairn (ouch, considering the size and the fact I am betting CPU die area is going to be less, not more, this gen), but it is more like 50% Pitcairn in texturing, and the fill rate is going to be dog slow -- but hey, it will only need to support 1080p ;)
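Here's a quick back-of-the-envelope sketch of those ratios (a rough check only, assuming GCN's 64 stream processors per CU and 2 FLOPs per SP per clock, with texel and pixel rates taken as unit count times clock):

```python
# Rough throughput comparison: full Pitcairn vs. the hypothetical cut-down part.
# Assumes 64 stream processors per CU and 2 FLOPs/SP/clock (FMA).
def gflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1000.0

pitcairn = {"cus": 20, "tmus": 80, "rops": 32, "mhz": 1000}
cut_down = {"cus": 18, "tmus": 50, "rops": 8,  "mhz": 750}

shader_ratio = gflops(cut_down["cus"], cut_down["mhz"]) / gflops(pitcairn["cus"], pitcairn["mhz"])
texel_ratio  = (cut_down["tmus"] * cut_down["mhz"]) / (pitcairn["tmus"] * pitcairn["mhz"])
fill_ratio   = (cut_down["rops"] * cut_down["mhz"]) / (pitcairn["rops"] * pitcairn["mhz"])

print(shader_ratio)  # ~0.68 -> roughly the "70% Pitcairn" shader figure
print(texel_ratio)   # ~0.47 -> roughly "50% Pitcairn" in texturing
print(fill_ratio)    # ~0.19 -> the "dog slow" fill rate
```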

And as a system, that GPU would be sitting in a design with a limited memory architecture (Xenos-style eDRAM), slow main memory (sebbbi has mentioned why this is not a good idea), and a spartan CPU. Argue IPC all you want, but the reason Xenon/Cell had some legs was that there was performance on the table if you could vectorize your code (the Metro devs have discussed this). Definitely not an ideal situation, but with 4 Jaguars you are looking at a reduction in threads (both systems) and cores (PS3), have barely raised the top end, and will be asking the CPU to do more in terms of Kinect 2/Move 2, more advanced/always-on dashboards, etc.

Which of course puts more pressure back on the GPU, because there won't be a beefy CPU to feed it, and instead of doing interesting GPGPU things on the GPU you are going to be using those shaders to offload CPU work any way you can. In that context I think said GPU looks pretty weak sauce.

Which is fine, because we all know these consoles are a stop gap until streamed gaming to any device matures in 3-5 years.
 
What I wrote had 18 CUs, 50 TMUs, 8 ROPs at 750MHz.
That's not going to work anyway, as the TMUs are an integral part of the CUs, i.e. there are always 4 TMUs per CU (so 18 CUs would imply 72 TMUs, not 50) as long as AMD doesn't change their GCN design completely. And it doesn't make sense to make such a deep-reaching change.
 
I wish I were that optimistic!
Well, "looking like" high-end PC games is a given, though I have read many times here that BF3, for example, was not good enough.
I don't expect game budgets to keep exploding, or at least not for a significant number of titles.
On pure technical merit, though, I don't expect this gen to achieve what the 360 achieved last gen and sit at the top of the pyramid even for a few months.
Haswell CPUs should provide performance the consoles won't touch. On the GPU side I would be surprised if they manage to include the latest generation of AMD GPUs (which should be GCN2 by that time) in the design, much less the higher-end products.
I would think that both MSFT and Sony would want things to be well ironed out, to start production well ahead of the launch window, etc. I would think that the RRoD and its lesser brother on the PS3 would have triggered an aversion to risk among the executives of those two companies.

Overall I think the next-gen consoles are pretty much going to be PCs in disguise, so nothing new for the devs and researchers who have had their hands on that kind of hardware for a long while, and even on far more forward-looking hardware (like Larrabee).

The thing that kind of turns me off is that midway through this gen we were fed presentations about the "paradigm shift" to come, and it won't happen. I think that, aside from the researchers who wrote those papers, nobody actually wants it to happen. The R&D cost to create the hardware would be crazy high (if we are talking about a billion for a major revision like Broadway => Expresso, imagine for something new). On the software side, early results would most likely be underwhelming.
In short, looking at the money involved, I would say aversion to risk pretty much ensures that we are mostly only going to get more of the same.
At this point I'm just hoping that "next gen" doesn't last more than 5 years. I went to Best Buy this weekend and it is depressing to see those old consoles caught in nothing short of a "time stop" while there are new tablets, new computers with touch screens, even a new Windows.
Not really a surprise that sales are down; I mean, it is depressing.
Anyway, it is more and more likely that, from my POV, a proper PC will be a better investment than either the next Xbox or the PS4, even though it could prove a tad costlier.
I mean, if I wait as planned until late 2014 to replace my laptop, I should be able to buy something that matches those "next gen" systems, is mobile, lets me do whatever I want, and so on.
 
GPU
Reasonable, "We should not ask for too much" GPU: 8xxx revision Pitcarn-class. 20 CUs (1280 stream processors), 80 TMUs, 32ROPs, in the 800-1000MHz range (TDP constrained).
Disappointed Edition: 7xxx derived, neutered edition: 18 CUs, 50 TMUs, 8 ROPs, 750MHz.

Is 18 CUs a typo? Still seems high for a disappointing GPU.

Personally, I'm thinking:

Possible: 16 CUs @ 1GHz
Probable: 12 CUs @ 750 MHz
 
GPU
Reasonable, "We should not ask for too much" GPU: 8xxx revision Pitcarn-class. 20 CUs (1280 stream processors), 80 TMUs, 32ROPs, in the 800-1000MHz range (TDP constrained).
Disappointed Edition: 7xxx derived, neutered edition: 18 CUs, 50 TMUs, 8 ROPs, 750MHz.

So a 7870 updated with GCN2 units? We're talking at least 2.2TF there, much higher than the 1-2 TF rumors floating around. I'd personally be very happy with something like that. (The 7970M hit 2.176 TF with an 850MHz core clock in a ~100W TDP. I assume the more selective binning of mobile cards will be compensated for by an improved GCN2 getting better yields at the same level of performance.)
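For what it's worth, that 7970M figure checks out with simple napkin math (assuming its 20 GCN CUs at 64 SPs each and 2 FLOPs per SP per clock):

```python
# 7970M: 20 CUs (1280 stream processors) at 850 MHz, 2 FLOPs/SP/clock (FMA).
sps = 20 * 64
tflops = sps * 2 * 850e6 / 1e12
print(tflops)  # 2.176 TFLOPS, matching the quoted figure
```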
 
The 12 CUs would be the disappointing version. I like to think that more are probable.

While CUs/FLOPS are a nice, easy-to-grasp "power" metric, they are really only meaningful if you have the bandwidth to feed them.
From what little I've seen, compute shaders are more often constrained by memory than by ALU count.
If you're bandwidth constrained, doubling the CU count isn't going to help much; you could probably take the current PC figures as what NVidia/ATI believe are useful ALU-to-bandwidth ratios, though both are likely optimized for current PC games.
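As a rough illustration of that ALU-to-bandwidth point, here is a sketch using approximate, ballpark figures for two current desktop cards (the TFLOPS and bandwidth numbers below are my own estimates, not official specs):

```python
# Approximate FLOP-per-byte ratios of current PC cards (ballpark figures).
# HD 7870: ~2.56 TFLOPS, 256-bit GDDR5 @ ~4.8 Gbps -> ~153.6 GB/s
# GTX 680: ~3.09 TFLOPS, 256-bit GDDR5 @ ~6.0 Gbps -> ~192.2 GB/s
cards = {
    "HD 7870": (2.56e12, 153.6e9),
    "GTX 680": (3.09e12, 192.2e9),
}
for name, (flops, bandwidth) in cards.items():
    print(name, round(flops / bandwidth, 1), "FLOPs per byte of bandwidth")
# Both land in the mid-teens; a console that piles on CUs without a matching
# bandwidth increase would blow well past that ratio.
```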
 
Also, folks (like Charlie) have been rumbling that the new console was originally one architecture and then changed to another one. That's simply not true; there has only been one architecture considered for the device from the start of the project. A few things did change, but not big changes. It's how I know Charlie is full of crap.

Really?
The Yukon architecture from the leaked MS roadmap seems quite different from what's in the dev kits (it had PPC cores for backwards compatibility, 2 GPUs, and a choice between ARM or x86 cores).
[attached image: yukon.jpg]
 
That wasn't made by engineers, but by the marketing division. I'm sure MS decided on the power and architecture of the next Xbox long ago and didn't change much. As I said,
 
LOL! No, he is saying he doesn't know what's in Orbis. He is using the made-up Orbis number (18) as a reference, because he DOES know Durango's number. :)
I was using the rumor of a 1.8TF GPU in Orbis, which at about 750MHz comes to 18 CUs :)
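Working that rumor backwards under the same GCN assumptions (64 SPs per CU, 2 FLOPs per SP per clock; the 1.8TF and 750MHz figures are just the rumored numbers):

```python
# Solve for CU count from the rumored 1.8 TFLOPS at ~750 MHz.
rumored_flops = 1.8e12
clock_hz = 750e6
flops_per_cu_per_clock = 64 * 2                        # 64 SPs per CU, FMA = 2 FLOPs/clock
cus = rumored_flops / (clock_hz * flops_per_cu_per_clock)
print(cus)                                             # ~18.75, i.e. roughly 18 CUs
print(18 * flops_per_cu_per_clock * clock_hz / 1e12)   # 18 CUs -> ~1.73 TFLOPS
```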

I'll also speculate that both MS and Sony will pull a Nintendo this round and not talk about the hardware specifics at all. It'll be all about the capabilities and actual games, with nary a mention of RAMs and cores and gigahertz.
 