Predict: The Next Generation Console Tech

Agreed.

I just don't know who that is.

Wii gamers? Is it still Wii-mote centric? Did they improve the motion interface to be as accurate as Move, or is it still the same issue-plagued Wii-mote?

Core gamers? I'm pretty sure most of them will be waiting for ps4/xb720 based on the specs known.

Casuals/moms/kids? Not sure if they stick around or if they're headed to kinect/facebook/tablets/phones.


I think that by coming in light on hardware, they limit the demographic interested in their wares. They will still be competing with the ps3 (w/Move) and xb360 (w/Kinect) for the duration of the WiiU's cycle.

I don't see Xbox3 and PS4 being far enough away for the PS3 and 360 to have enough time to be considered "true" competition.

As for who they are targeting, they say it's everyone. Will it be successful? Who knows? I personally expect the market to shrink and none of the next-gen consoles to sell better than their predecessors. But that's a discussion for a different thread.
 
I personally expect the market to shrink and none of the next-gen consoles to sell better than their predecessors. But that's a discussion for a different thread.
That depends on whether future game consoles will turn into something more than just a game console. Microsoft and Sony are clearly moving in that direction: they want their Xbox/PS to also function as your media hub, and I think they'll improve on that functionality with their future consoles. I'm wondering how good Nintendo will be at that, though.

I think the Wii U will be powerful enough if the difference in performance between the Wii U and the PS4/neXt-box is small enough that future iterations of CoD will be made to run on all 3 consoles. The Wii is missing out on those kinds of triple A titles simply because it's not powerful enough IMHO. If the Wii U does get such 'third-party' AAA games, just like the PS3 and neXt-box, then it'll be a great family game console. I'm still wondering whether Nintendo will be able to bring a great multimedia experience to the Wii U as well though.
 
Not forgetting anything.

This was discussed pages ago.

Yes, I remember now making an almost identical post a couple of months ago!

I am in the camp that thinks we'll get much more modest systems from Sony/MS.

4-core IBM OoO Power7, 3-3.5GHz
AMD GCN based GPU with raw performance around Cape Verde
2GB GDDR5/XDR unified RAM
HDD 320GB-500GB (maybe smaller capacity SSD)
4-6x BD

At 28/32nm, the above would come in substantially below launch PS3/360 power usage and would be able to launch at $300-$400 in 2013.

They could do 200W again but I really doubt they will.
 
I personally expect the market to shrink...

I fully expect that to be the case as well, but mostly due to what hardware is, or isn't, in the box.

Judging from what Nintendo is offering, I expect at least half the Wii market will head off to other platforms.

The hardware they have on offer seems convoluted, and the chipset is weak. Where core gamers might have the patience/time/interest to learn all the hows, whys, and buts of the Nintendo WiiU interface, the casuals/moms/kids just want it to work. The fact that the tablet interface is limited to 1 (2?) is contrary to their prior offering of bringing everyone together (4 people), and though there are technical reasons why they can't have 4 tablet controllers at once all getting an individual feed, their typical audience doesn't understand that. Also, I'm not convinced that their audience is thrilled with the tablet notion to begin with. If they are, don't they have a Kindle Fire/iPad?

So then that leaves core gamers, and the spec sheet isn't enough to sway them.


Now, whether or not Sony/MS see a dwindling market for their product will mostly come down to what is on the box's spec sheet, as the core gamer is their demographic. They're trying to expand beyond that, but the core gamer will make or break the ps4/xb720.

It's up to MS/Sony to decide if that demographic is good enough, or if it will be a race to the bottom.
 
Yes, I remember now making an almost identical post a couple of months ago!

I am in the camp that thinks we'll get much more modest systems from Sony/MS.

4-core IBM OoO Power7, 3-3.5GHz
AMD GCN based GPU with raw performance around Cape Verde
2GB GDDR5/XDR unified RAM
HDD 320GB-500GB (maybe smaller capacity SSD)
4-6x BD

At 28/32nm, the above would come in substantially below launch PS3/360 power usage and would be able to launch at $300-$400 in 2013.

They could do 200W again but I really doubt they will.

If they end up putting a 130mm2 GPU in the box, what makes you think they would put a ~230mm2 (28nm) quadcore Power7 to match it? Who in their right mind would pair such a weak GPU with a quad power7 in a console? When has a console maker ever had such a beefy CPU paired with a GPU half the size?

Are we building a server or a games console?

If they're going that light on the gpu budget, they can save a ton by sticking with a quad PPE/Xenon and call it a day...

Pretty sure the core audience will promptly ignore the wasted retail shelf space and proceed to stick to their existing gaming rig, which outclassed such an offering years ago, or switch to one, while being thankful for the new DX11 games that will finally make their way to the PC.

Or just stick to the existing console they own.
 
If they end up putting a 130mm2 GPU in the box, what makes you think they would put a ~230mm2 (28nm) quadcore Power7 to match it? Who in their right mind would pair such a weak GPU with a quad power7 in a console? When has a console maker ever had such a beefy CPU paired with a GPU half the size?

Are we building a server or a games console?

If they're going that light on the gpu budget, they can save a ton by sticking with a quad PPE/Xenon and call it a day...

Pretty sure the core audience will promptly ignore the wasted retail shelf space and proceed to stick to their existing gaming rig, which outclassed such an offering years ago, or switch to one, while being thankful for the new DX11 games that will finally make their way to the PC.

Or just stick to the existing console they own.

Maybe I should have said "Power7-based", or left the "7" off. My point is that I believe they will end up much less cutting edge compared to this generation.

What would be the minimum you would expect spec wise from next gen?
 
If they end up putting a 130mm2 GPU in the box, what makes you think they would put a ~230mm2 (28nm) quadcore Power7 to match it? Who in their right mind would pair such a weak GPU with a quad power7 in a console? When has a console maker ever had such a beefy CPU paired with a GPU half the size?

How are you getting 230mm2? 45nm 8 core Power7 is 576mm2, so just cutting the die in half to 4 is 288mm2. 45nm --> 28nm is only going to shrink 50mm2? Not to mention all the excess server/supercomputing crap and excess cache that would be removed... If it was made for a console 4 Power7-esque cores would probably end up 120mm2 or less at 28nm.
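As a rough sanity check of that scaling (assuming ideal area scaling with the square of the feature size; real shrinks recover less than this, since pads and analog don't scale):

Code:
# Die-area scaling sanity check (ideal scaling assumed).
power7_45nm = 576.0                # mm^2, 8-core Power7 at 45nm (figure above)
four_core_45nm = power7_45nm / 2   # naive halving for 4 cores -> 288 mm^2
scale = (28.0 / 45.0) ** 2         # ideal 45nm -> 28nm area factor, ~0.39
print(four_core_45nm * scale)      # ~111 mm^2, before stripping server logic/cache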
 
What would be the minimum you would expect spec wise from next gen?

The minimum will depend on the MSRP, but assuming $400 is the base target (with premium editions above this) I'd expect at least 250mm2 dedicated to graphics. I am, however, assuming that the ps4/xb720 consoles will have GPGPU functionality, so some of the CPU's burden will be lifted.

That's why I'm figuring ~300mm2 as the baseline to expect for the GPU: it will not just be a dedicated graphics engine, but a GPGPU, taking a bit of the die budget away from the CPU and shifting it toward the GPU.

With that said, ~100mm2 is what I'm expecting of the CPU. Roughly double the size of the current Cell/Xenon at 28nm.
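For what it's worth, the implied budget roughly adds up (the 45nm Cell die size and ideal scaling here are my assumptions, not confirmed figures):

Code:
# Back-of-envelope silicon budget from the figures above.
cell_45nm = 115.0                          # mm^2, rough Cell size at 45nm (assumed)
cpu_28nm = 2 * cell_45nm * (28.0/45.0)**2  # "roughly double" after an ideal shrink, ~90 mm^2
gpu_28nm = 300.0                           # mm^2, GPGPU-capable graphics part
print(cpu_28nm + gpu_28nm)                 # ~390 mm^2 of logic total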
 
So TheChefO, you think around ~400mm2 total @ 28nm. I think around ~250-300mm2 (if Ninjaprime's figures on the CPU are correct :)).

They will probably end up somewhere in the middle. Now we wait for some proper leaks.
 
So TheChefO, you think around ~400mm2 total @ 28nm. I think around ~250-300mm2 (if Ninjaprime's figures on the CPU are correct :)).

They will probably end up somewhere in the middle. Now we wait for some proper leaks.
I'm more in line with you, and we indeed want more leaks, but to me the telling thing about all the early rumors is the lack of words like "awesome", "monster", "big", "impressive", etc.
At this stage I'll be happy if the "x6" statements turn out to be true.

I'm still thinking about the possibility of a setup like AMD's "dual graphics", especially after digging more into Llano, more precisely into how power varies with GPU and CPU clock speed and with the number of SIMDs. I'm lacking knowledge (I don't know if a disabled SIMD array still leaks) and I still lack data (I should search the web for serious measurements of Llano power consumption with the CPU and GPU overclocked/downclocked), but I'm starting to build an answer to why it could make sense (not that MS did that, but in "theory") to go with an APU+GPU. The answer could simply be that "you can": on the A8-3850 and the A6-3650, running the GPU on top of the CPU at full load costs you only 16/17 watts, so ~114% of the system power consumption with the GPU idle (hardware.fr data, by the way).
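To spell out where the ~114% comes from (the absolute wattage here is illustrative; only the 16/17W delta is from the hardware.fr measurements quoted above):

Code:
# Llano (A8-3850) power reading, per the figures quoted above.
p_cpu_load_gpu_idle = 118.0   # W, system power, CPU loaded, GPU idle (illustrative)
gpu_delta = 16.5              # W, extra cost of also loading the on-die GPU
print((p_cpu_load_gpu_idle + gpu_delta) / p_cpu_load_gpu_idle)  # ~1.14 -> the "~114%"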

EDIT
Also, taking into account the HD 6670-like rumors and the time it may take to put together a proper APU/SoC, it's likely that MS would want the SoC GPU and the discrete GPU to share the same architecture (VLIW5, Southern Islands derivative, etc.).
 
Saw this debate from the GAF thread, good to see this is being discussed here! Here's a question for you guys. Do you think the nextbox and playstation will use GDDR5 or will they stay with GDDR3?
 
I'm still thinking about the possibility of a setup like AMD's "dual graphics", especially after digging more into Llano, more precisely into how power varies with GPU and CPU clock speed and with the number of SIMDs. I'm lacking knowledge (I don't know if a disabled SIMD array still leaks) and I still lack data (I should search the web for serious measurements of Llano power consumption with the CPU and GPU overclocked/downclocked), but I'm starting to build an answer to why it could make sense (not that MS did that, but in "theory") to go with an APU+GPU. The answer could simply be that "you can": on the A8-3850 and the A6-3650, running the GPU on top of the CPU at full load costs you only 16/17 watts, so ~114% of the system power consumption with the GPU idle (hardware.fr data, by the way).

I think a dual APU/GPU setup could make sense in a two-SKU system: a low-priced tier consisting of only the APU that runs only Kinect, XBLA, and apps (maybe without an optical drive as well), and a system with that SoC plus an additional GPU for the AAA games. Perhaps the second GPU is eventually merged into the SoC after a die shrink (or two).

I don't think that this would be an efficient implementation, but I guess it's a possibility.
 
Saw this debate from the GAF thread, good to see this is being discussed here! Here's a question for you guys. Do you think the nextbox and playstation will use GDDR5 or will they stay with GDDR3?

Sorry, I don't get which debate you're speaking about: die size, or the dual-graphics kind of solution?
For RAM, GDDR3 won't happen. It's either DDR3 + eDRAM, or GDDR5, with low odds for something Rambus-based.
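A quick comparison shows why (the clocks and bus widths here are illustrative picks, not leaked specs):

Code:
# Peak bandwidth in GB/s for a per-pin data rate (Gb/s) and bus width (bits).
def bandwidth_gbs(gbps_per_pin, bus_bits):
    return gbps_per_pin * bus_bits / 8

print(bandwidth_gbs(1.6, 128))  # DDR3-1600, 128-bit: 25.6 GB/s -> needs eDRAM help
print(bandwidth_gbs(4.8, 128))  # GDDR5 @ 1200MHz, 128-bit: 76.8 GB/s (HD 5770 class)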
 
I think a dual APU/GPU setup could make sense in a two-SKU system: a low-priced tier consisting of only the APU that runs only Kinect, XBLA, and apps (maybe without an optical drive as well), and a system with that SoC plus an additional GPU for the AAA games. Perhaps the second GPU is eventually merged into the SoC after a die shrink (or two).

I don't think that this would be an efficient implementation, but I guess it's a possibility.
I don't like this idea of different SKUs (to begin with) with that much disparity in characteristics.
There were rumors about MS going that route, as well as an announcement to be made by CES; so far it's been proven BS.

I think that the second GPU could indeed get merged down the road, but it has implications for the memory organization. If the second GPU has its own RAM, most likely connected to it by a 128-bit bus, then once the whole thing is put together the resulting chip will have to accommodate two 128-bit buses, which has a strong impact on the chip's minimum size.


Some pages ago I considered that the SoC and the GPU could sit together the same way as Xenos and its daughter die, connected with a high-bandwidth link. Problem is, I don't know what kind of bandwidth can be achieved at reasonable cost :???:

Case 1. If the second GPU is complete (ie the ROPs are on-chip), we might want something like 64GB/s, ie as much as the HD 6670 is provided with.

Case 2. If in the end the second GPU is incomplete, akin to Xenos, and all the ROPs are on the SoC die, I don't know how much bandwidth would be needed to make things workable. I fished for information about it earlier and so far got no answer. FYI, I thought that the bandwidth requirement would grow with the number of render targets, their resolution, and the precision used for colors.

I tried to think more about it and here is my "thinking flow":
If MS sticks to 720p rendering, in the face of enthusiasts and most likely its own marketing division, then a link of 32GB/s or a bit more, as in today's Xbox, is obviously doable.
Actually, if you need less than 32GB/s, and the bus in the 360 was oversized to take into account the possibly bursty nature of the communication between shader cores and ROPs, that would be good news (one should not forget about the communication with the main RAM either). As it is, the link in the 360 allows moving ~1GB of data per frame (at 33ms a frame); that's a lot more than a handful of render targets at any sane resolution. Basically you are held back by the bandwidth to the main RAM (22GB/s).
There are also games that rendered at 1080p on the 360, and the 32GB/s link was not raised as a concern as far as I remember.
So at this stage, without insiders giving me a clue, I'm starting to build the conviction that one may not need that much bandwidth between the shader cores and the ROPs.
In the 360 that bandwidth was also needed to move your render targets to the main RAM. If in a hypothetical system the ROPs are on the SoC and render straight into the main RAM, the bandwidth requirement would be somewhat lowered.
In our hypothetical system the link would also be the only way for the GPU to access any kind of data (textures etc.), so we have to account for that; in the 360, Xenos had only 22GB/s to do so (shared with the CPU). For reference, a PCI Express x16 link provides up to 32GB/s.
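Putting rough numbers on the per-frame budget mentioned above (the render-target size is illustrative):

Code:
# What a 360-style 32 GB/s daughter-die link moves per frame.
link_gbs = 32.0
frame_s = 1.0 / 30.0                 # ~33 ms per frame
print(link_gbs * frame_s)            # ~1.07 GB movable per frame

rt_720p = 1280 * 720 * 4 / 2.0**30   # one 32-bit 720p render target, ~0.0034 GB
print(rt_720p)                       # even a handful of RTs stays far below 1 GB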

To make a long (and iffy) story short, I believe it would be achievable to have a functional second GPU as long as all the ROPs are on the SoC.

I can't see MS shipping what is basically an x-core SMP CPU + an HD 6670. Trying to make sense of what we've heard so far, I could see a well-designed dual-graphics solution surprise by its performance and its silicon footprint.

I will give another try at what could be a really cheap system to produce that would in fact do pretty well as far as performance is concerned (obviously this gives more credibility than warranted to all this early talk, but if we learn more I'll try to make sense of it like anybody else).

SoC
6 tiny, power-efficient, in-order CPU cores, 2- or 4-way SMT; close relatives of Xenon and POWER A2
6 SIMD arrays, so 96 VLIW5 units or 480 SPs (as in the HD 6670)
64 Z/stencil ROP units & 8 color ROP units (twice the HD 6670, so close to the HD 5770)
128-bit GDDR5 memory interface
1200MHz GDDR5 => same bandwidth as the HD 5770/6770 parts
UVD3 engine
@ 32nm

GPU 2
"ROP-less" hd 6670, no UVD3 @ 28nm

I won't come with FLOPS figures or clock speeds, as it's not reasonable; we've seen that AMD lately uses pretty high voltages to make sure all their parts function properly. It could be even worse for a console manufacturer, as bad chips have no possible use. The good news is that for quite a few parts (Llano or the HD 6670/6570) the difference in GPU clock speed has a marginal impact on power consumption. On Llano the main offender for power consumption seems to be the CPU cores' clock speed, so manufacturers may have more room to play than AMD in the part that interests us the most, ie GPU perf :)

I believe that the silicon footprint for such a system would be really low: south of 200mm2 for the SoC, and around the size of today's daughter die for the second GPU.

A sum-up of the system could be: 6 cores, 960 SPs, which would sound more sane to a lot of members here. Then it's a matter of clock speed, especially the CPU clock speed, to make things fit under a single heatsink.
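Totting up the SP count from the spec sketch above (16 VLIW5 units per SIMD array is the standard layout for AMD's VLIW5 parts):

Code:
# SP count for the hypothetical SoC + second GPU above.
vliw5_per_simd = 16              # AMD VLIW5: 16 units of 5 ALUs per SIMD array
soc_sp = 6 * vliw5_per_simd * 5  # 6 SIMD arrays -> 480 SPs (HD 6670 class)
gpu2_sp = 480                    # "ROP-less" HD 6670 -> another 480 SPs
print(soc_sp + gpu2_sp)          # 960 SPs, as in the sum-up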
 
I wonder if AMD warming up to ARM-based designs is due to a console deal?

It's the most elegant solution business-wise. You reduce the design supplier to one (AMD) and the manufacturer to one (GlobalFoundries).

A six-core ARMv8 64-bit MS/AMD custom chip integrated with a high-performance AMD GPU in the HSA architecture is the sexiest console design imaginable. The low-power ARM cores would allow the power budget to go to the GPU, where it will count the most, while keeping the CPU's allocation of the TDP low.

MS could then turn around and not only run Windows 8 and Windows Phone apps on it, but could reuse the entire IP for other Windows devices.

Given how low-performance the PPC GuTS core of the XCPU is, emulating it on ARMv8 might not be that much of a stretch, while AltiVec could be emulated on the GPU.

Do it, MS.
 
It's the most elegant solution business-wise. You reduce the design supplier to one (AMD) and the manufacturer to one (GlobalFoundries).

I'm not sure about this. For one, the console maker buys the IP, so whether it comes from multiple sources or not shouldn't matter in the end (of course there could be a better deal in going with one).

I don't think a console manufacturer would want to be limited to one fab. I think you need to be able to move your design between fabs (GF, TSMC, Chartered, IBM, etc.) as prices and market demand fluctuate.

But other than that, I don't see a problem with an ARM based CPU provided it can achieve performance equal to a PPC CPU.
 