Predict: The Next Generation Console Tech

How much eDRAM would be needed to avoid tiling at 1080p with 4xMSAA? Is it in any way a feasible amount? (I have no idea.)

Alstrong said:
With that in mind, let's consider that with either 8:8:8:8 (RGBA) or 10:10:10:2 (FP10) colour and a 32-bit Z-buffer, the backbuffer should take:

63.3 MB for 1080p and 4xMSAA
28.1 MB for 720p and 4xMSAA

and with 16:16:16:16 (FP16), the figures double.
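For reference, those figures are just pixels x samples x bytes per sample. A quick sanity check in Python (assuming 4 bytes of colour and 4 bytes of Z per sample, as in the quote; `backbuffer_bytes` is an illustrative name, not any real API):

```python
def backbuffer_bytes(width, height, msaa, color_bytes=4, z_bytes=4):
    """Backbuffer footprint: every pixel stores `msaa` colour + Z samples."""
    return width * height * msaa * (color_bytes + z_bytes)

for w, h in [(1920, 1080), (1280, 720)]:
    mb = backbuffer_bytes(w, h, 4) / 2**20
    print(f"{w}x{h} with 4xMSAA: {mb:.2f} MB")
```

This reproduces the 63.3 MB (1080p) and 28.1 MB (720p) figures quoted above.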

64 megabytes might be feasible cost-wise, but I'm not sure if it's worth it, although it would do wonders for picture quality; 4xAA @ 1080p is rather decent IQ.
 
64 megabytes might be feasible cost-wise, but I'm not sure if it's worth it, although it would do wonders for picture quality; 4xAA @ 1080p is rather decent IQ.

It sounds awesome to have 1080p with 4xMSAA 'for free' with eDRAM, but looking at what today's GPUs can do at 1920x1200 with AA at 8x and AF at 16x, I'd say eDRAM is a waste of space.

Bring me a 'face-lifted' GT5 engine at native 1920x1080, 60fps with 4xMSAA. A 2011 mid-range GPU could probably do that. It does not sound like much and it screams Wii-like upgrade, but hey, that is just me. :)
 
64 megabytes might be feasible cost-wise, but I'm not sure if it's worth it, although it would do wonders for picture quality; 4xAA @ 1080p is rather decent IQ.

Would on-die eDRAM or equivalent make any difference in the equation in terms of implementation or design considerations? It would certainly be far lower latency, and perhaps they could get away with using less cache/RAM, as they could implement tiling and such much more efficiently and save on the off-die bandwidth costs.
 
I'm talking about a cut-down, 16- or 12-core based version for Nintendo's console. I don't expect it to feature anything that comes close to the PC high end, but then it doesn't have to. Anything is going to be a huge leap above Wii; it's going to be replacing an over-a-decade-old GPU design by the time it hits. As such, whichever way Nintendo goes the performance increase is going to be so massive that it doesn't matter what they choose, which leaves cost as the primary deciding factor imo. If the rumours of Intel pushing LRB to the console manufacturers with lucrative contracts are true, then that's got to have Nintendo taking notice.

Is the next process shrink after 32nm (22nm iirc) viable for a console released in fall 2012/2013? Surely a 16-core LRB based on 22nm tech would be suitable for a console design, especially one that forgoes a separate CPU altogether?

Would others be happy with a 16-core LRB with a 128-bit memory bus and 1GB of GDDR5 for Nintendo's next machine? Surely that's got to be capable of some very interesting things at 720p/30Hz?

I agree with Shifty on the RAM front: no chance of us seeing more than 4GB of RAM next gen. Sadly, after the Wii has shifted the goalposts, making cutting-edge tech less of a concern, I fear we may only see 2GB of RAM in Sony's and Microsoft's boxes. :cry: Consoles are always RAM-starved; we were lucky to get 512MB this generation, and I doubt we'll be lucky enough to see anything like 6GB or 8GB.

A low-clocked single 16-core chip would give less performance than the X360 or PS3. Nintendo will concentrate on their next gimmick, whatever that will be; if they need the processing power to handle their gimmick they'll probably just use a better PowerPC processor. I doubt they'll need a 16-core Larrabee for their gimmick.

Besides, a 12-core Larrabee might still be on the large side compared to what Nintendo used in Wii.
 
64 megabytes might be feasible cost-wise, but I'm not sure if it's worth it, although it would do wonders for picture quality; 4xAA @ 1080p is rather decent IQ.

That is an awful lot of die space, though 1080p with 4xMSAA as a console standard does sound nice; I've really got used to it on the PC side and it's hard to go back. GTX 260 and 4870 1GB level hardware copes just fine with 1080p (with 4xMSAA) in my experience, so I'm not sure it's really needed. If a 256-bit bus and GDDR5 memory are all that's needed for decent 1080p performance, then you might as well dedicate that die space to more ALUs imo.



A low-clocked single 16-core chip would give less performance than the X360 or PS3. Nintendo will concentrate on their next gimmick, whatever that will be; if they need the processing power to handle their gimmick they'll probably just use a better PowerPC processor. I doubt they'll need a 16-core Larrabee for their gimmick.

Besides, a 12-core Larrabee might still be on the large side compared to what Nintendo used in Wii.

One of Nintendo's "gimmicks" next generation will be HD graphics though; they have to make that step up, and since HDTV adoption will be through the roof at that point it's a decent selling factor to have. If anyone can market HD gaming to the masses, it's Nintendo.

They've got to move away from their old fixed-function design to achieve that. Is anyone really expecting a tweaked Hollywood that makes HD rendering feasible? I severely doubt that; ATI haven't made a GPU like Flipper/Hollywood for a very long time.

If a 32-core LRB is aimed at competing with the high-end DX11 GPUs, then how does that put a 12/16-core derivative below a 7600GT/7900GT hybrid? I didn't mention anything about slashing clock speeds in half; sure, they may be tweaked, but not halved like you mentioned, still something within sight of the projected 2GHz figure.

Once you cut that thing down by removing half the TMUs, utilising a 128-bit bus (with GDDR5, for instance) and shelving half the cores, and then produce it on a 22nm process, it's got to be much smaller than the LRB of today. Yet performance at 720p/30Hz is still going to be pretty damn nice, a decent upgrade from a PS3/360 and a huge upgrade from Wii. PS4/Xbox3 titles aimed at 1080p/60fps may just be a case of halving the resolution/framerate and cutting the texture resolution, so long as you bear it in mind at the start of development and don't utilise too much processing power. I doubt the masses will care about such downgrades in their games (if they did they'd be playing today's multiplats on PC after all), especially if the console/games are cheaper and have a new and unique control method.

I admit I'm a bit hopeful; I just would really like to see a Nintendo console with a little bit of grunt. The GCN was an excellent design and it was such a shame to see them take a sideways step in technology, no matter how much I appreciate the Wiimote.
 
I assume economic reasons would be the main driver for eDRAM. It would allow the GPU to keep a 128bit memory controller.

What I expect from Microsoft is

Triple-core IBM PowerPC CPU based off Power6 (which uses in-order cores)
ATI DirectX 11 GPU with 128-bit bus & 64MB of eDRAM
2GB of GDDR5
8x Blu-ray drive
500GB standard hard drive
launch X-mas 2011 @ $399
 
But wouldn't external eDRAM require its own memory bus? So therefore wouldn't it be comparable to having a larger-than-128-bit memory bus anyway?
 
Nintendo Wii Wii

Couple of points on Nintendo's Wii:

1. Die size of the Wii CPU is 19 mm² clocked at 729 MHz (approximately 20 million transistors in the PowerPC 750CX - cannot find figures for the transistor count of the "Gekko").
2. Die size of the GPU is about 72 mm² clocked at 243 MHz with 3MB on die as a texture buffer (figures of approximately 51 million transistors)

This is all at 90 nm.

Fast forward to 2011, say Nintendo use 45 nm technology instead while keeping the die size and form factor the Nintendo Wii has... let's restart the speculation?

Is a 50% reduction feasible going from 90 nm to 45 nm, and could the resulting saving then be spent on transistors rather than die-size savings? If so... then:

40m transistor CPU and a 100m transistor GPU. Not much to play with really.

Alternatively, say Nintendo use 45nm technology while keeping the die size and form factor the Nintendo GameCube has (this, I feel, is about as aggressive as Nintendo will get for its next-gen console):

Original 180nm "Gekko" was 43 mm² = an 80 to 100 million transistor CPU
Original 180nm "Flipper" was 110 mm² = a 200(ish) million transistor GPU


Now we can increase the clock speeds linearly by the same factor as the 180 nm to 90 nm transition gave: 1.5x for both CPU and GPU (unconfirmed still, I believe).

So in the end we have a new Wii2 with the following specifications for its CPU and GPU:

PowerPC 750CXe-based CPU at 1.093 GHz - 90 million transistors (the clock speed is very slow and probably an underestimate; an AMD Athlon 64 with its 105 million transistor budget may fit the bill better, or a dual-core Xenon-based PowerPC processor).

GPU at 364.5MHz - 300 million transistors (minus the 3MB eDRAM die-space saving) - looking around, something like a slow-clocked Radeon 4550 or GeForce 8600GT.

At 32nm you may as well multiply all the figures by 1.5, and then it gets really interesting! ;)

I know CPUs and GPUs are not designed to scale in clock speed this way without serious modification, and there are lots of caveats to my wild speculation, but it seems more likely than speculating Nintendo would use Larrabee.
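The rule of thumb behind the budgets above can be sketched in a few lines: an ideal full shrink increases transistor density by the square of the linear feature ratio, so 90 nm to 45 nm is ideally ~4x the transistors in the same area, while the post conservatively assumes only about half of that. The 0.5 derate and the starting figures are assumptions taken from the post, not process data:

```python
def ideal_density_gain(old_nm, new_nm):
    """Ideal transistor-density gain for a linear feature-size shrink."""
    return (old_nm / new_nm) ** 2

def scaled_budget(transistors, old_nm, new_nm, derate=0.5):
    """Transistor budget in the same die area after the shrink.
    `derate` models the real-world shortfall vs. ideal scaling
    (0.5 matches the post's conservative ~2x for 90 nm -> 45 nm)."""
    return transistors * ideal_density_gain(old_nm, new_nm) * derate

# Wii CPU, ~20M transistors, shrunk 90 nm -> 45 nm in the same die area:
print(scaled_budget(20e6, 90, 45))  # prints 40000000.0, the post's ~40m figure
```

With `derate=1.0` the same shrink would allow ~80M transistors, which shows how sensitive these speculative budgets are to that one assumption.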
 
The market leader is never the one to usher in a new generation; it's always some competitor that wants to get an early jump on them that kicks it off. See Sega with the Dreamcast, Sony with the PSX and NEC with the PC Engine. It's one historical precedent that has remained constant despite how much the market has changed. With Nintendo rolling in cash with the Wii, I see nothing to suggest this is going to change anytime soon.

As such, that pretty much entirely rules out a 2011 launch, makes 2012 somewhat unlikely and makes 2013 seem more reasonable. I'd sure as hell hope Nintendo won't be using a 45nm process in 2013! By the end of the year most PC GPUs will be targeting 40nm; do we really expect Nintendo to use an older process with a product that may be 4 years away?

Nintendo stuck with as modern a process as both Sony and Microsoft at the start of this generation despite the pathetic hardware, so why would they do anything different next time around? So that puts us at 32nm, or 22nm if we're lucky, in which case your figures are an awful lot more interesting. 4670/4770-level hardware from ATI seems wholly possible then, which would surely put a cut-down 12/16-core LRB (especially if this means no need for a separate CPU) in the same sort of ballpark?

I still think Nintendo sticking with IBM/ATI is more likely, and really this is all just wild speculation simply for the fun of it, but I certainly think they'll be keeping an eye open for the best possible deal they can get. If that's being offered by Intel then I don't see anything stopping them from taking it.
 
One of Nintendo's "gimmicks" next generation will be HD graphics though; they have to make that step up, and since HDTV adoption will be through the roof at that point it's a decent selling factor to have. If anyone can market HD gaming to the masses, it's Nintendo.

Nintendo could just get a cheap scaler chip or something to support HD; they don't need an expensive solution for it. Besides, HD is Sony's and MS's gimmick this gen, and it didn't quite work out for them.

They've got to move away from their old fixed-function design to achieve that. Is anyone really expecting a tweaked Hollywood that makes HD rendering feasible? I severely doubt that; ATI haven't made a GPU like Flipper/Hollywood for a very long time.

A higher-clocked Hollywood with more eDRAM and an output stage to handle HD should be good for another generation. Some of AMD's low-end solutions can do HD. LRB is a kitchen-sink solution; it's very grand, and not ideal for consoles unless you're really looking for very nice graphics.

If a 32-core LRB is aimed at competing with the high-end DX11 GPUs, then how does that put a 12/16-core derivative below a 7600GT/7900GT hybrid? I didn't mention anything about slashing clock speeds in half; sure, they may be tweaked, but not halved like you mentioned, still something within sight of the projected 2GHz figure.

LRB is like Cell: its strength is in many cores. Put in only 8 cores and you won't even get to leverage its flexibility advantage, because you just don't have the horsepower.

Nintendo would want a small form factor and low power like the Wii. A 12-core LRB @ 2 GHz wouldn't work. Try 8 cores @ 500 MHz if they are on 32 nm.


You cannot cut down LRB; it is a grand design. It is pointless to use it if you are only going to put in 8 cores. I would say 32 cores is probably the minimum to justify the cost of LRB.

If you are going to use graphics as your gimmick, you need to do it right. Just look at what Intel has in mind for their HPC solution.

[Image: lrbhpca.jpg]


Those are 96 threads per LRB, i.e. 24 cores. If they launch on 22 nm they can do a 64-core LRB for 256 threads per chip.
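The thread counts there follow from Larrabee's four hardware threads per in-order core; trivially:

```python
THREADS_PER_CORE = 4  # Larrabee runs 4 hardware threads per in-order core

def lrb_threads(cores):
    """Hardware threads exposed by a single LRB chip."""
    return cores * THREADS_PER_CORE

print(lrb_threads(24))  # 96 threads, as on the HPC slide
print(lrb_threads(64))  # 256 threads for a hypothetical 22 nm 64-core part
```

`lrb_threads` is just illustrative arithmetic, not a claim about how any real LRB part would be scheduled.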

So for an 'HD graphics gimmick' console in early 2012:

4 x 64-core LRB @ > 2 GHz
4 x 4 GB GDDR5 @ > 1 GHz
A TB of mass storage
Gigabit ethernet
BR drive
1000W PSU
RRP $1500-2000

Still somewhat competitive with a high-end PC. And obviously games to show off the 'graphics gimmick'.

The wow factor is very hard to obtain with a single-GPU solution. They need to move away from a single GPU to multiple GPUs, just like in the PC world.
 
2. Just because few games use 2 gigs now, that doesn't mean they won't on a brand-new console generation 4 years from now (the extra years taking into consideration a few years to get used to the hardware and for truly epic games to show up).

Sure it "could" happen. But I sincerely doubt it "will" happen. All my opinion, of course.

With there currently being no end in sight for 32-bit computing on the PC platform, almost all games (there are rare exceptions with a 64-bit executable that doesn't really do much) are limited to 2 gigs of virtual addressing.

Windows 7 will be released in both 32-bit and 64-bit flavours. I don't see a sudden massive upsurge in 64-bit adoption rates, with OEMs and people generally playing it safe.

So if consoles were to do that, they would be a whole multi-generational gap ahead of PCs, which would make porting from console to PC rather problematic. Then again, I suppose game makers could just abandon the PC space, but I don't see that happening.

So a possibility is that even if there WERE more than 2 gigs of memory, most cross-platform games would probably not use more than 2 gigs of it.

The number is certainly attractive, but I just don't currently see any console maker making the move to higher than 2 gigs of system memory plus another 512 megs to 1 gig of graphics memory.
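To put a number on the 2-gig ceiling being discussed: a 32-bit process has 2^32 bytes of virtual addresses, and under the default Windows split only half of those are available to the application. A trivial sketch:

```python
ADDRESS_BITS = 32
total_va = 2 ** ADDRESS_BITS  # 4 GiB of virtual address space in total
user_va = total_va // 2       # default Windows 2 GiB user / 2 GiB kernel split
print(user_va / 2**30)        # prints 2.0, the GiB usable by a 32-bit game
```

(Windows does offer a large-address-aware mode that raises this somewhat, but the default split is what the 2-gig figure in the post refers to.)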

Regards,
SB
 
As such, that pretty much entirely rules out a 2011 launch, makes 2012 somewhat unlikely and makes 2013 seem more reasonable. I'd sure as hell hope Nintendo won't be using a 45nm process in 2013! By the end of the year most PC GPUs will be targeting 40nm; do we really expect Nintendo to use an older process with a product that may be 4 years away?

It is entirely feasible that Nintendo will delay their next-generation console as long as they can, for the reasons you mention but also because of the economic downturn. It makes more sense to launch a new product when the public is in a more positive mood with regard to spending and the banks have sorted themselves out! So, I will not disagree with a 2012/2013 launch.

Nintendo stuck with as modern a process as both Sony and Microsoft at the start of this generation despite the pathetic hardware, so why would they do anything different next time around? So that puts us at 32nm, or 22nm if we're lucky, in which case your figures are an awful lot more interesting. 4670/4770-level hardware from ATI seems wholly possible then, which would surely put a cut-down 12/16-core LRB (especially if this means no need for a separate CPU) in the same sort of ballpark?

Two crucial points you make here:

1. 2012 = 32 nm and 2013 = 22 nm are possibilities
2. Next-gen Nintendo console: either a traditional CPU + GPU solution, or an all-in-one Larrabee solution with the horsepower to drive HD resolutions (at least 1280x720)

1.
From the figures I gave:

32nm CPU = 120 to 150 million transistors @ 1.5 GHz
22nm CPU = 150 to 200 million transistors @ 2.0 GHz

Comparatively:

@ 32nm = AMD Athlon X2 equivalent
@ 22nm = Intel Core 2 Duo

Or... how about a dual-core PowerPC 970? Under the transistor budget...

32nm GPU = 300 million transistors @ 546.5 MHz
22nm GPU = 400 million transistors @ 729 MHz

@ 32nm = X1950 Pro
@ 22nm = unknown... the 4770's transistor count is 826 million and the 4670's is 514 million

2. At 22 nm we have given ourselves approximately 600 million transistors if there is just a Larrabee.

Larrabee has approximately 1700 million transistors!

A hobbled Larrabee would still probably need close to 1 billion transistors... even at 22 nm that would be quite a large die? Or, by my bad and rough mathematics, approximately 150 to 200 mm².

I still think Nintendo sticking with IBM/ATI is more likely, and really this is all just wild speculation simply for the fun of it, but I certainly think they'll be keeping an eye open for the best possible deal they can get. If that's being offered by Intel then I don't see anything stopping them from taking it.

For both CPU and GPU, there is nothing to stop Nintendo going with AMD/ATI exclusively this time round. Was the IBM-designed Xenon competitive with solutions from AMD at the time?
 
I expect Nintendo to do something akin to DSi: The next Wii will not be a clean "generational" break, but will rather be an upgraded experience designed to re-stimulate interest when Wii sales finally slow down. They have already in Japan, so I wouldn't be surprised if the new product drops soon. Here are the general things I expect:

1. Like DSi, there will be a substantial but not generational upgrade in the silicon. Like someone else said, perhaps a modification of the GPU and some extra RAM to support HD resolutions. But the main "traditional" upgrade I'm expecting is more storage space. The real complaint about the Wii, the one actual Wii customers voice the most, is how quickly that 512 MB gets filled up. Opening up the SD card slot is a stopgap until new hardware arrives. Like DSi, there might be "Wii 2 enhanced" titles.

2. There will be a new gimmick. I don't have any idea what it's going to be. A camera? Microphones in the controllers? Some new online tricks?

3. M+ will be integrated into the new controller (everyone knows that, though).

A new generation of processing power isn't Nintendo's modus operandi anymore. Speculating about things like Larrabee is IMO going in the wrong direction. I doubt the next machine will have anything that's not an extension of the current processors.
 
Mostly directed towards Tahir's speculation - the PPC970FX had 58 million transistors (single core). The dual-core PPC970MP should have roughly 100 million. It makes more sense to go this way, with an updated memory system, if they keep the CPU and GPU separate.

I'm more inclined to believe fearsomepirates' scenario. Still difficult to predict though, even within fairly tight budget/power/compatibility limits, there are still quite a few ways to skin that cat.
 
What I expect from Microsoft is

Triple-core IBM PowerPC CPU based off Power6 (which uses in-order cores)
ATI DirectX 11 GPU with 128-bit bus & 64MB of eDRAM
2GB of GDDR5
8x Blu-ray drive
500GB standard hard drive
launch X-mas 2011 @ $399

A bit too optimistic IMHO but matches my general expectations. A slightly more realistic scenario would be:

3-4 core PowerX (could be Power6), most likely in-order.
DX11 GPU with 32MB eDRAM (perfect for 4xMSAA 720p and 2xMSAA 1080p)
2GB GDDR5
NO OPTICAL DISC (digital distribution)
16+ GB flash/SSD built-in (for the Arcade SKU)
OPTIONAL 300+ GB HDD (for the Pro SKU)
launch X-mas 2012 with at least 2 SKUs
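Using the same per-sample arithmetic quoted earlier in the thread (4 bytes colour + 4 bytes Z per sample), a quick check that 32MB of eDRAM really does cover both cases in that spec. This is a sketch only, ignoring any real GPU's tiling or compression:

```python
def backbuffer_mb(width, height, msaa, bytes_per_sample=8):
    # 4 B colour + 4 B depth per sample, expressed in MiB
    return width * height * msaa * bytes_per_sample / 2**20

EDRAM_MB = 32
for w, h, msaa in [(1280, 720, 4), (1920, 1080, 2)]:
    size = backbuffer_mb(w, h, msaa)
    print(f"{w}x{h} {msaa}xMSAA: {size:.2f} MB, fits: {size <= EDRAM_MB}")
```

Both come in just under 32 MB (about 28.1 MB and 31.6 MB respectively), which is what makes the parenthetical claim in the spec plausible.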
 
Personally I think there are some really strange predictions in here.

1- Besides Atom, nobody has developed a non-64-bit CPU in the last 5 years, from the cheapest (<30$ to the consumer) to the priciest, and even without the RAM to exploit it, both the 360 and PS3 have 64-bit CPUs.

Is there any reason why they would put in less powerful and less advanced CPUs, going against the last 7 (by then) years of development?

2- There is no reason why anyone would go with less than low-end PC-class processors; in fact, going with overly outdated HW can be pricier (RAM is a good example, as DDR2 will soon be more expensive than DDR3), unless it is a SoC. BTW, today's low-end HW (e.g. an X3 + 4670) would beat the 360 any day in real-world situations.

3- It is getting harder and harder to move to smaller silicon, so a big/hot chip at 22nm is quite improbable.

4- There is no reason why Nintendo would want LRB (unless it is really, really cheap); they would want something as cool-running and easy to use/explore as they can get. And not going for a powerful console (as most predict) would make them less attractive to Intel. Unused HW (or HW only a few use) doesn't really make much sense for them, as their big hit games don't care about gfx.

As an observation, fewer and fewer games are using their gfx/tech as a selling point (though something will still need more processing). Also, today's gfx cards, even at 256 bits and DDR3, seem quite good at HD res.

My prediction is that they will go with customized common HW of the time (e.g. the equivalent of an X3 + 4770 + GDDR5(6?), just as a reference, from whatever company offers them a better deal) that could scale well down to lower-priced HW (e.g. a future SoC) over a long time.

That comes from several things:

1- It will be powerful enough for most people.
2- There will be little worry in the tech department (like already happened with sound this gen, and maybe even a little in the last one), so they will cut costs in the R&D of the processors (and SW, dev tools, etc...); functional and cheap will be the priority. Although some cool things like SSD/no loading or eDRAM tech may happen.
3- They will show/sell other things, from interfaces to services and upgrades (e.g. interfaces).
4- Again, the future for silicon is unknown.

So a good upgrade on the HW, but it will not put any smiles on geeks' faces (probably even some sadness) :cry:.

But I believe it will make another kind of geekness quite excited.
 
What was the Xbox 1's hardware spec compared to PCs when it launched?

what about 360 and PS3?

Personally I think you guys are lowballing the numbers; if past trends are anything to go by, these should at least match, and probably surpass, an upper-end PC when released.
 
Personally I think you guys are lowballing the numbers; if past trends are anything to go by, these should at least match, and probably surpass, an upper-end PC when released.


A PC has a very large case, special power supplies, and a lot of noisy fans to help with cooling. Unless a major form-factor change occurs, there is a limit to how much power/heat a game console can have.
 