Predict: The Next Generation Console Tech

A timed exclusive of GTA will not happen. Not unless Nintendo basically buys Rockstar. That game is set to sell 20 million units in an instant, and Nintendo just cannot sell that many units of a new, expensive (supposedly?) console that fast. Some rumors even state that this time around the PC version might launch on time too, which I'd appreciate... at least if the port is a bit better than 4's was.

There are rumours every which way and then some; time will tell :) Most likely I'll pass on the 2012 consoles and wait for next gen.
 
I can't see any reason to think it wouldn't be. I think it's fair to say Nintendo have generally designed well-balanced systems without any obvious gimpage. N64 had stupid memory limits, I guess, and perhaps one could say GC was a bit short on RAM too, but they've never done anything really stupid like stick a massive GPU on a tiddly little bus and leave it bandwidth-starved. Where their hardware has been lacking is in choice of components, not design. As such, if they put in a decent CPU and GPU this time around, I expect all the extras to be up to the task. If we use such gross measures as multiples of a previous generation: if MS and Sony make a typical 10x increase from this gen to next, and Wuu only has a 2x increase over PS360, then Wuu will be 5x behind PS4*. If Nintendo go with 3x this gen (not implausible, with a next-gen GPU and more RAM) then that places them ~3x behind PS4. The difference between PS4 and Wuu really can't be like the 10x difference between PS360 and Wii.

*Note I only use PS4 and not PS4 + XB3 for convenience. I clearly mean 'next gen consoles from MS and Sony' in that comparison. ;)
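A quick sketch of that multiples argument in Python; the 2x/3x/10x figures are the hypothetical gross measures from the post above, not real specs:

```python
# Hypothetical generational multiples, all relative to PS360 = 1.0.
ps360 = 1.0
ps4 = 10.0 * ps360      # assumed 'typical' 10x generational jump
wuu_low = 2.0 * ps360   # pessimistic Wuu: 2x PS360
wuu_high = 3.0 * ps360  # optimistic Wuu: 3x PS360

print(ps4 / wuu_low)    # 5.0  -> Wuu ends up ~5x behind PS4
print(ps4 / wuu_high)   # ~3.3 -> Wuu ends up ~3x behind PS4
```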


What if Wuu is only 20% more than PS360, and XB3/PS4 are more like 20x PS360, as we get into years 8 and 9 from the PS360's release?

I think, sadly for Nintendo, the gap will be more than enough. Let's take your best case, the 3x scenario, applied to this gen: that would be a 360 with a 166MHz Xenos, a 1GHz Xenon and, I don't know, ~170MB of RAM? How pitiful would ports on that box have looked? Would anybody have bothered?
 
I think, sadly for Nintendo, the gap will be more than enough. Let's take your best case, the 3x scenario, applied to this gen: that would be a 360 with a 166MHz Xenos, a 1GHz Xenon and, I don't know, ~170MB of RAM? How pitiful would ports on that box have looked? Would anybody have bothered?

What are you talking about?

We already have a fair idea of what the Wii U will be like: it's > PS360, and I highly doubt competitors will release anything with more than twice its memory. (For the record, the Wii only has 88MiB vs 512MiB this gen, and it sold best :p, although I agree its gfx look quite dated.)
There almost certainly won't be as much difference between the Wii U and its competitors as there was between the Wii and PS360.
 
I think, sadly for Nintendo, the gap will be more than enough. Let's take your best case, the 3x scenario, applied to this gen: that would be a 360 with a 166MHz Xenos, a 1GHz Xenon and, I don't know, ~170MB of RAM?
Those specs don't describe what I mean by a 3x difference. I used the term 'gross measure' because such measurements are hard to make and very inexact. Is 2x the console achieved with just 2x the RAM, 2x the CPU clock, 2x everything? If we take this gen as 10x last gen, for PS3 that was achieved with 8x the RAM; 11x the CPU clock, but massively more execution units; 3x the GPU clock; the same working RAM bandwidth... it's such a mishmash of different variables that you can't make a simple performance extrapolation by just changing clocks and RAM amounts.

For 3x the performance, I guess I mean 3x the perceived on-screen graphics and in-game elements: 3x the vertex counts, 3x the shader complexity, 3x the amount of physics, whatever hardware that runs on. And it's probably not a linear scaling for all those factors either. Heck, one console running a game at 60Hz and the other at 30Hz would constitute a literal halving of performance, yet Joe Gamer won't care a great deal about the difference if other factors come into play (cost, popularity, yadayada). It's probably a logarithmic scale, like human visual perception: halve the amount of light and it reads as a much smaller difference in brightness. Likewise a console that's 'half' the performance of another is only going to look marginally worse, not half as good. This is something tech specs can't shed any light on: how a console's graphics will hold up, especially versus competitors. Hence it's a gross measure, just for discussion purposes.

Edit: In fact, I can explain exactly how the specs work. They are like dimensions of a shape. A square that's half the size of another square isn't half the width and half the height; the parameters multiply to give the total value. In a console, CPU clock, GPU clock and RAM could be considered dimensions of a cube: halving all three would result in 1/8th the total 'volume', or console 'power'. Only there are way more dimensions than just three, so even a 10% decrease across the board could result in a halving of total performance. And some dimensions can't be measured at all, such as shader capabilities or GPU features.
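A minimal sketch of that cube analogy in Python; the seven-dimension count and the log-style perception mapping are illustrative assumptions, just to make the multiplication concrete:

```python
import math

def total_power(ratios):
    # Treat each spec (CPU clock, GPU clock, RAM, ...) as one dimension;
    # overall console 'power' is the product of the per-dimension ratios.
    power = 1.0
    for r in ratios:
        power *= r
    return power

print(total_power([0.5] * 3))  # 0.125 -> halving 3 dimensions gives 1/8
print(total_power([0.9] * 7))  # ~0.48 -> 10% off across 7 dims ~halves it

# Assumed log-like perception: each doubling of raw power reads as one
# equal-sized visual 'step', so a ~2x raw gap looks far smaller than
# the numbers suggest.
print(math.log2(1.0 / total_power([0.9] * 7)))  # ~1.06 'steps'
```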

In real terms for graphics, a third of an XB3 would probably look very similar in games, only at lower fidelity. A lot of future performance could be spent on excellent lighting and shading, which a lesser console could fake and still look comparable. Hopefully we'll see a decent amount spent on IQ next gen, but a lesser console could get away with less AA and still be in the same ballpark. Wii wasn't in the same ballpark, or even the same league; it was several divisions down, and that was obvious in its visuals. Wuu should have less visual discrepancy. The same game could run at half the poly detail, with less AA, slightly simpler shaders and a lower resolution (720p vs 1080p), and still look broadly the same, not like some last-gen throwback. The only way this couldn't happen, AFAICS, is if MS and Sony go all out with insane, unaffordable specs, leaving Wuu some generations behind. That, or launch 3 years later!
 
But an AMD CPU & GPU combo may be cheaper than an IBM CPU & AMD GPU; wouldn't MS/Sony choose price/perf over perf/watt?

What about an AMD-designed ARMv8 64-bit chip, custom designed with Microsoft?

Is this a totally unrealistic scenario? It would seem to solve the whole price/perf vs perf/watt conflict very nicely.

This would be several low-watt ARMv8 cores integrated with the GCN architecture, all on a single Fusion chip, further realizing cost and cooling gains.

AMD are making every suggestion that they will eventually adopt ARM, and their former manufacturing division, GlobalFoundries, is already working with ARM on 28nm and 20nm tech. Could this be the secret perfect solution that not only meets all the performance, cost and thermal requirements, but gives MS the advantage of integrating the 720 with Windows on ARM and Windows phones and tablets?
 
The gap won't be the same because Wii U will be using "competent" hardware as opposed to Wii.



I see it similarly, but I think Nintendo will target either 32nm or 28nm for the GPU, and probably closer to 640 ALUs. I see Sony/MS targeting GCN CUs; no more than 20 (1280 ALUs).


A Wii U GPU with 640 ALUs at a high clock could be interesting. Similar technology has been on the market since 2009 in the Radeon HD 4770/RV740: 40nm, dissipating 80 watts, with performance similar to the HD 4850. In theory that would probably come in under 60 watts on 32/28nm, which would fit perfectly with the apparent small size of the next-gen Nintendo console (my guess: Wii U = less than 150 watts).

About 1280 ALUs for the "psx4720" GPU, I also agree, but let us not forget the miracle AMD pulled off with the Radeon HD 7970, packing 4.31 billion transistors into only 365mm², with many thanks to the 28nm process. So I do not rule out the possibility of seeing a psx4720 with 1408 ALUs placed in a 200-250 watt console (my other guess, about the wattage of the next-gen MS and Sony consoles).
 
The HD 7850 has 1408 ALUs and is a 90W part counting board power, so less for a console with lower clocks.

If the CPU is less relevant and we already have 1408 ALUs under 100W with an alleged target of 200W, then I still see headroom for a bit more.

Significantly more if a 20nm die shrink is considered for 2013-14. Perhaps big-die 28nm in 2013, in hopes of a cost-cutting 20nm respin in 2014-15.
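A rough power-budget sketch under the figures floated above; every number here is a guess from the thread, not a known spec:

```python
# Speculative next-gen console power budget (all numbers are guesses).
console_budget_w = 200  # alleged total target for an MS/Sony box
gpu_w = 90              # HD 7850 board power (1408 ALUs, 28nm)
cpu_w = 40              # assumed CPU allowance
ram_io_cooling_w = 30   # assumed RAM, I/O, drive, fans, PSU losses

headroom_w = console_budget_w - (gpu_w + cpu_w + ram_io_cooling_w)
print(headroom_w)  # 40 -> room for a somewhat bigger GPU, as argued
```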

-----------------------------------

What if Microsoft designated the Xbox 360 as the "Kinect console" with a newer, cheaper hardware revision? Since Kinect is currently bottlenecked by the USB port, perhaps a newer model with USB 3.0 and a 20nm/22nm APU.
 
^ Is that correct? I haven't seen TDPs for anything below the 7950.

A Wii U GPU with 640 ALUs at a high clock could be interesting. Similar technology has been on the market since 2009 in the Radeon HD 4770/RV740: 40nm, dissipating 80 watts, with performance similar to the HD 4850. In theory that would probably come in under 60 watts on 32/28nm, which would fit perfectly with the apparent small size of the next-gen Nintendo console (my guess: Wii U = less than 150 watts).

About 1280 ALUs for the "psx4720" GPU, I also agree, but let us not forget the miracle AMD pulled off with the Radeon HD 7970, packing 4.31 billion transistors into only 365mm², with many thanks to the 28nm process. So I do not rule out the possibility of seeing a psx4720 with 1408 ALUs placed in a 200-250 watt console (my other guess, about the wattage of the next-gen MS and Sony consoles).

I still think that amount is too high, but that's my take from what I've seen so far. And I think the Wii U will be less than 100W.
 
No chance Nintendo will use 28nm chips. If they have to drop the clocks, they'll just do that. Until proven otherwise, I agree with Digital Foundry that we are looking at a 4670 derivative: 320 SPUs @ 40nm.
 
No chance Nintendo will use 28nm chips. If they have to drop the clocks, they'll just do that. Until proven otherwise, I agree with Digital Foundry that we are looking at a 4670 derivative: 320 SPUs @ 40nm.

DF made an assessment based on underclocked dev kits, though I don't criticize them, because that info came out after their assessment.

And it's very plausible that they could use 28nm for their GPU, since NEC/Renesas started development of their 28nm fab back in 2009. They are also capable of eDRAM on a 28nm process, and Nintendo used NEC for both GC and Wii.
 
^ Is that correct? I haven't seen TDPs for anything below the 7950.

It's not official just yet, but I think it's possible.

HD 7850 (Pitcairn Pro):
The HD 7850's price and specs would put it at a sweet spot for budget gamers. Based on the 28nm Pitcairn Pro core and offering performance similar to the HD 6950/GTX 560 Ti, the card would feature 1408 ALUs clocked at 850MHz, 88 texture units, 32 ROPs, 22 SIMDs and a rated TDP of 90W. A 2GB, 256-bit wide memory interface would run at 5.2Gbps (166GB/s).
The HD 7850 would be priced at $199 US when launched.
http://wccftech.com/amd-launching-r...n-xtpro-february-2012-pricing-specs-detailed/
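As a sanity check on the quoted numbers, the bandwidth figure follows directly from the bus width and per-pin data rate:

```python
# Bandwidth (GB/s) = bus width in bytes * per-pin data rate in Gbps.
bus_bits = 256
data_rate_gbps = 5.2  # effective rate per pin, from the quote above

print(bus_bits / 8 * data_rate_gbps)  # 166.4 -> matches the quoted 166GB/s
```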


I'm going to affirm my previous assertion: 2048 ALUs, a Power7 quad, 3GB GDDR5.

My feeling on eDRAM is that we no longer need it. With the increase in bandwidth, eDRAM seems irrelevant; the transistor budget can go towards the GPU.
 
If Nintendo are re-using an AMD GPU, I would think they would stick with TSMC, as that seems the easiest path to follow. If Nintendo decide to shrink the GPU from 40nm they may as well move it to another foundry, but it looks like TSMC's 28nm fab is doing quite well right now.

east of eastside said:
What about an AMD-designed ARMv8 64-bit chip, custom designed with Microsoft?

Is this a totally unrealistic scenario? It would seem to solve the whole price/perf vs perf/watt conflict very nicely.

Yes, it is a totally unrealistic scenario. There are no noises from AMD that they are going to begin developing an ARMv8 64-bit CPU, regardless of what one may read on the internet. Don't know if you are referring to the "Project Win" e-mail, but that was just that: confusion.
 
It's not official just yet, but I think it's possible.

http://wccftech.com/amd-launching-r...n-xtpro-february-2012-pricing-specs-detailed/


I'm going to affirm my previous assertion: 2048 ALUs, a Power7 quad, 3GB GDDR5.

My feeling on eDRAM is that we no longer need it. With the increase in bandwidth, eDRAM seems irrelevant; the transistor budget can go towards the GPU.

eDRAM bandwidth has increased as well, and there are other benefits to eDRAM besides bandwidth, especially if you have it on-die.
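To make the bandwidth argument concrete, here's a rough estimate of how big a single render target gets; the resolution and per-pixel sizes are illustrative assumptions, and the 10MB figure is the 360's actual eDRAM capacity:

```python
# Rough render-target size estimate (illustrative numbers).
width, height = 1280, 720
bytes_color = 4  # RGBA8 colour per sample
bytes_depth = 4  # 24-bit depth + 8-bit stencil per sample
msaa = 4         # samples per pixel

fb_mib = width * height * (bytes_color + bytes_depth) * msaa / 2**20
print(fb_mib)  # ~28 MiB -- already bigger than the 360's 10MB eDRAM,
               # which is why that console resorted to tiled rendering
```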
 
It's not official just yet, but I think it's possible.

http://wccftech.com/amd-launching-r...n-xtpro-february-2012-pricing-specs-detailed/


I'm going to affirm my previous assertion: 2048 ALUs, a Power7 quad, 3GB GDDR5.

My feeling on eDRAM is that we no longer need it. With the increase in bandwidth, eDRAM seems irrelevant; the transistor budget can go towards the GPU.

He's basing that off a VLIW4 7950, not GCN, which we now know it to be. A Pitcairn Pro will have 20 CUs or 1280 ALUs (the Pitcairn XT will have 24 CUs/1536 ALUs, and the Pitcairn LE will likely be a downclocked Pro). Now, unless the 7850 is just a downclocked 7950, which we have zero reason to believe, that'd mean a 16 CU/1024 ALU or 12 CU/768 ALU GCN part, or maybe an 1152 ALU VLIW one. A console going with a full set of CUs on a GCN chip, without any harvesting, is highly, highly unlikely. That means your most likely console AMD GPU will have 20 CUs/1280 ALUs, and likely lower clocks (800MHz) too.
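To put those CU counts in raw-throughput terms, assuming GCN's 64 ALUs per CU and a fused multiply-add (two FLOPs) per ALU per clock; the clock speeds here are the thread's guesses:

```python
def gcn_gflops(cus, clock_ghz, alus_per_cu=64, flops_per_alu=2):
    # Theoretical single-precision throughput for a GCN-style GPU.
    return cus * alus_per_cu * flops_per_alu * clock_ghz

print(gcn_gflops(20, 0.8))   # 2048.0 -> 20 CUs/1280 ALUs at 800MHz
print(gcn_gflops(16, 0.86))  # ~1761  -> a hypothetical 16 CU part at 860MHz
```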
 
If Nintendo are re-using an AMD GPU, I would think they would stick with TSMC, as that seems the easiest path to follow. If Nintendo decide to shrink the GPU from 40nm they may as well move it to another foundry, but it looks like TSMC's 28nm fab is doing quite well right now.

I see Nintendo sticking with NEC, because they would amount to a dedicated provider; TSMC has enough customers to deal with. Nintendo did that with ArtX and ATI, so I doubt it would change just because AMD own them now. Looking at old press releases, what NEC/Renesas have designed sounds exactly like something Nintendo would use.

That means your most likely console AMD GPU will have 20 CUs/1280 ALUs, and likely lower clocks (800MHz) too.

Gives me more confidence in my max expectation, especially since I expect around that clock speed as well. :LOL:
 
Yes, it is a totally unrealistic scenario. There are no noises from AMD that they are going to begin developing an ARMv8 64-bit CPU, regardless of what one may read on the internet. Don't know if you are referring to the "Project Win" e-mail, but that was just that: confusion.

Since then there have been comments from the new CEO that have been interpreted as opening the door for ARM support:

http://www.eweek.com/c/a/Mobile-and...tions-Open-Regarding-ARM-Architecture-381295/

That would include adopting a chip architecture that isn't x86, but rather the low-power ARM platform used by such companies as Qualcomm, Texas Instruments, Samsung Electronics and Nvidia and currently dominant in the fast-growing smartphone and tablet spaces. Read apparently wasn't definitive in whether the company will move in that direction, but he did say AMD is keeping the option open.


"Mr. Read (for the first time, we believe) suggested that an ARM-based system on chip (SoC) is not out of the question if that's what customers prefer," the analysts wrote in their note. "Heresy by AMD's historical standards, but quite consistent with Mr. Read's philosophy of winning in the market: execution, innovation, and convergence."


"At the end of the day, it has to be market driven and by the customer," Read said, according to a MarketWatch report. "We have a lot of IP and a lot of capability. We're going to continue to play those cards, but as you move forward, making sure that you're able to be ambidextrous is definitely a winning hand."
 
I doubt it was more than an educated guess. The guy has good sources in game development.

Nah, they were pretty clear on that.

http://www.eurogamer.net/articles/digitalfoundry-vs-e3-nintendo?page=3

At this point we're speculating, but our guess is that Wii U's RAM is based on GDDR3 or DDR3 - far more cost efficient than the top-end GDDR5 and the hitherto non-existent DDR4. In terms of the make-up of AMD's custom Radeon GPU, we reckon it probably has more in common with the Radeon HD 4650/4670 as opposed to anything more exotic. The 320 stream processors on those chips would have more than enough power to support 360 and PS3 level visuals, especially in a closed-box system. Fabricated on AMD's current 40nm process, it would be cool enough and cheap enough, but the 2012 launch may well mean that Nintendo could move directly to 28nm, making for a more cost-efficient, cooler box.
 