Predict: The Next Generation Console Tech

Status
Not open for further replies.
The power consumption figures are close to the ones I found at hardware.fr, which IMHO does serious reviews :)
How they get the numbers (in French):
http://www.hardware.fr/articles/781-1/dossier-vraie-consommation-73-cartes-graphiques.html
And the power consumption for the hd 6870:
http://www.hardware.fr/articles/804-4/consommation-bruit.html

141 watts under 3DMark06 is indeed a lot. Once you add the CPU, the chipset, the RAM, and the power supply efficiency losses (most likely from a cheap one...), the results would be awful... :(
And as Function is pointing out, plenty of dead systems are to be expected.

EDIT
And power may not be the main offender (even though it's related); the real offender is thermal dissipation. The card runs at ~77°C with a hell of a cooler, in (I guess) a huge desktop with proper ventilation, and the CPU (an Intel part with impressive power characteristics considering the performance it achieves) is also fitted with a hell of a cooler.
Put this in a shell even the size of the original Xbox and horrible things are likely to happen... fast.
 
Speed will be primarily dictated by power consumption, and secondarily by yield. Power scales roughly with the cube of operating frequency (since voltage tracks frequency), so lowering the clock by about 21% halves power consumption. This gives a lot of room for hitting the right power envelope.
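The cube-law figure can be sanity-checked with a quick back-of-the-envelope calculation; this sketch assumes dynamic power ∝ f·V² with voltage scaling linearly with frequency:

```python
# Back-of-the-envelope check of the cube law: if dynamic power is
# proportional to f * V^2 and voltage scales linearly with frequency,
# then power is proportional to f^3.

def relative_power(freq_ratio):
    """Power relative to baseline for a given relative clock speed."""
    return freq_ratio ** 3

# Clock reduction needed to halve power: solve r^3 = 0.5
half_power_clock = 0.5 ** (1 / 3)
print(f"clock to halve power: {half_power_clock:.1%} of original")  # ~79.4%
print(f"power at 74% clock: {relative_power(0.74):.1%}")            # ~40.5%
```

So under this model a ~21% clock cut halves power, and a 26% cut takes it all the way down to roughly 40% of the original.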

Cheers

How does static leakage affect power use? When does it become a consideration for a console design?

So effectively we could probably expect clock speeds on the order of 600-750 MHz. I guess that follows from the TDP of many of the Juniper mobile parts being half or less that of the equivalent desktop parts. They could make a slower chip bigger and more expensive with relatively good power consumption if they use on-board RAM/cache and follow up with DDR3/4 or whatever standard-grade memory they use. It'd also probably allow them to easily and cheaply have more memory without breaking the bank on expensive exotic technologies.
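Under the same cube-law assumption, you can sketch where a console clock might land for a given power budget. The 850 MHz / 108 W baseline below is illustrative (roughly a desktop Juniper-class part), not an official figure:

```python
# Hypothetical sketch: estimate the clock needed to hit a console power
# budget starting from a known desktop part, assuming power scales with
# frequency cubed.  The baseline numbers are illustrative, not official.

def clock_for_budget(base_clock_mhz, base_power_w, target_power_w):
    """Clock (MHz) that meets target_power_w under the cube-law assumption."""
    return base_clock_mhz * (target_power_w / base_power_w) ** (1 / 3)

# Halving an ~850 MHz / 108 W desktop part's power budget:
print(round(clock_for_budget(850, 108, 54)))  # → 675
```

That lands right in the middle of the 600-750 MHz guess, which is also consistent with mobile parts running at half the desktop TDP.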
 
I'm not in microelectronics, but a few days ago I read the Intel presentation about their tri-gate technology, and it appears that static leakage affects all microelectronic designs once transistors reach a certain (tiny) size; it seems to get worse as transistors get tinier.

The effect of leakage is big enough that Intel expects a drastic 50% power consumption reduction (ceteris paribus).
In case you missed them the slides are available here.

To me, leakage acts like noise. With too much leakage the signal-to-noise ratio is bad, so they have to raise the voltage to make the signal stand out => de facto more power is used. That's how I understand it.
 
I'm not in microelectronics, but a few days ago I read the Intel presentation about their tri-gate technology, and it appears that static leakage affects all microelectronic designs once transistors reach a certain (tiny) size; it seems to get worse as transistors get tinier.

Pretty much. :) The problem is more or less that the ratio of active power to static power has been getting much worse at each smaller node. The leakage has an exponential relationship with the transistor's decrease in size.

One of the issues is that the gate oxide thickness has been getting too thin, and one of the solutions is to use high-k dielectrics to get an "equivalent oxide thickness" that's thicker. There's more to it, but it essentially comes down to some crazy material science and manufacturing just to get there.

Things get pretty complicated with the voltages involved as the supply decreases - keeping the supply high may end up inadvertently forcing transistors to always be in a high state (threshold voltage decreases with transistor size), but then leakage power increases as Vth decreases... etc., etc... something something something... dark side.

High voltage also just wreaks havoc on the transistors themselves (see: overclocking). :p
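As a toy illustration of the Vth trade-off above: subthreshold leakage grows roughly exponentially as the threshold voltage drops, I_leak ∝ exp(−Vth / (n·kT/q)). The slope factor and reference values here are assumed for illustration, not taken from any real process:

```python
import math

# Toy model of the exponential Vth/leakage relationship described above.
# VT = kT/q is the thermal voltage (~26 mV at room temperature); n is a
# process-dependent subthreshold slope factor.  Both values are assumed.

VT = 0.026   # thermal voltage at ~300 K, in volts
N = 1.5      # subthreshold slope factor (illustrative)

def relative_leakage(vth, vth_ref=0.40):
    """Leakage current relative to a reference threshold voltage."""
    return math.exp((vth_ref - vth) / (N * VT))

# Shaving 100 mV off Vth in this model:
print(f"{relative_leakage(0.30):.1f}x")  # → 13.0x
```

Even in this crude model, dropping Vth by 100 mV costs about 13x the leakage current, which is why supply and threshold voltages can't just be scaled down together for free.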


To me, leakage acts like noise. With too much leakage the signal-to-noise ratio is bad, so they have to raise the voltage to make the signal stand out => de facto more power is used. That's how I understand it.

Something like that. Increasing voltage can help to some extent whilst increasing clock speed, but that's pretty bad for power consumption as you said. It's all just some nutso balancing act.

---------

Increasing dopant levels in the transistors is another course of action, but I think you run into a number of yield issues, or basically just turn it into a conductor. It's sort of where binning comes into play, as not everything is so even across the wafer. Some chips perform better at certain voltages and speeds than others.
 
Crysis uses two threads; that i7-920 won't even be firing on half its cylinders. The FurMark test shows an increased draw of 265 W, and that's with the CPU sitting on its hands.

Build a console with that stuff in, budget for only 258 watts of power draw and heat dissipation, and then give developers 10 years to squeeze the most out of it. At least 110% of all the systems you ever make will self-destruct!

It still hardly matters, as the Intel CPU will be a much bigger power hog than any CPU they put into the console. As for the GPU, the 6790 is on 40 nm, and a drop to 32/28 nm will decrease the power usage drastically.


I still think that if MS is going with AMD they might do a Bulldozer CPU + discrete GPU, and by getting both from AMD they might get a much better deal on everything.
 
Another good article about Nintendo's move with Project Cafe for the next generation console:

http://www.gamesindustry.biz/articles/2011-05-20-war-is-over-editorial

Bit disappointed by that personally; I was expecting them to speculate a bit on hardware or at least have some kind of info on it from sources. But it just seemed to meander along rather uninterestingly with the idea that Wii2 will just be the same as Wii was (something that doesn't fit with the rumours going around), with no details of why they believe that will be the case.
 
I wonder what OS shipped on them.......Windows NT?

They shipped with early versions of the Xbox 360 OS.

It's one of the reasons it's so hard to hide release timelines; by the time developers see any hardware, things like the OS have been under development for a considerable period of time.
Once any sort of devkit goes out it's leaked almost immediately.
Developers like EA have too many people on staff.
 
They shipped with early versions of the Xbox 360 OS.

It's one of the reasons it's so hard to hide release timelines; by the time developers see any hardware, things like the OS have been under development for a considerable period of time.
Once any sort of devkit goes out it's leaked almost immediately.
Developers like EA have too many people on staff.

That's why it's said Nintendo is not giving all devs info on Project Cafe but keeping it limited. To prevent leaks.
 
That's why it's said Nintendo is not giving all devs info on Project Cafe but keeping it limited. To prevent leaks.

Didn't help, did it? It certainly leaked.
You won't get a lot of specifics, because the sources of the leaks are almost never devs, but rather it's second hand information.
The 360 leak was extremely unusual, in that it was virtually the entire contents of a meeting that MS had just conducted with 100 or so devs at a company. I often wonder if they ever identified the source.
 
Wonder if Steam will become more active next gen, in particular, becoming the online solution for either Sony or Nintendo.
 
Wonder if Steam will become more active next gen, in particular, becoming the online solution for either Sony or Nintendo.

Don't know why either would go that route. Sony's flexible approach ain't broke, and Nintendo will probably stick to a homegrown solution. Nintendo probably realizes that lagging behind in online is a serious liability for their HD console.
 
Bit disappointed by that personally; I was expecting them to speculate a bit on hardware or at least have some kind of info on it from sources. But it just seemed to meander along rather uninterestingly with the idea that Wii2 will just be the same as Wii was (something that doesn't fit with the rumours going around), with no details of why they believe that will be the case.



I also think the article was somewhat disappointing on the technical side, but it was well grounded on the business side, because Nintendo is forcing M$ and Sony to invest more in their next-gen consoles.

This link* has a good discussion of what Project Cafe could be.

In the end, on the GPU side: if Nintendo comes with a custom Radeon 4850/RV770, Microsoft with a Fusion II / Krishna / Radeon 5770/Juniper or even a 6850, and Sony with a GeForce GTX 460**, as whole packages (CPU+GPU, RAM, drive, etc.) in the range of 200 to 300+ watts under DX10/11, I have the impression the gap will only change the number of frames per second at 1080p with 3D (despite maybe a little more texturing, shader processing, tessellation, etc.).

Maybe I'm just a dreamer, but I think the only one who could provide something different is Sony... if they go with PowerVR 6 (maybe the best performance per watt on the market... even if only as IP... could ImgTec's TBDR in PowerVR compete with AMD and Nvidia?) customized for high clocks (600/800 MHz?) and 16 cores (MP16), with a "more powerful" Cell (16 SPUs) helping with light interactivity, something like "extreme deferred shading" ("full global illumination") or even ray tracing at 720p (if the results are really better than the scan-line/rasterize/shader paradigm).


* http://www.bit-tech.net/news/gaming/2011/04/25/project-cafe-system-specs-leaked/1

** http://techreport.com/articles.x/17747/12
 
Lower than MS buying Nintendo. :p

That one actually wasn't very good. No specifics on hardware.
It's just a speculation piece on things that should be considered, at least that's what I got out of it. What do you expect? Some sort of magical leak? :|
 
I am very conservative about Nintendo Café's possible CPU: I believe we are going to see a PowerPC 970-based configuration (single or dual core) as the CPU and an AMD E4690 as the GPU, with a NUMA memory configuration like the PS3's, but with 512 MB for the E4690 and 256 MB for the 970. The idea would be to push a lot of the work that is being done on the multicore configurations of the PS3 and 360 onto the GPU using OpenCL.
 
Don't know why either would go that route. Sony's flexible approach ain't broke and Nintendo will probably stick to a homegrown solution. Nintendo probably realizes how lagging behind in online is a serious liability to their HD console.


They admitted they couldn't do it all alone, so I am guessing they are partnering with another company for the online side; that's more important than uber-powerful HW IMO, especially if it's free :D.
 
They admitted they couldn't do it all alone, so I am guessing they are partnering with another company for the online side; that's more important than uber-powerful HW IMO, especially if it's free :D.

How did they admit that? Because of Sony's security breach? Or because Nintendo didn't even try?
Nintendo or Sony would have to be braindead before sharing revenues with a 3rd party who provides nothing that they couldn't do themselves.
 