Predict: The Next Generation Console Tech

Those are low performance parts.

Have you noticed that the higher you try to clock (when overclocking a CPU/GPU), the faster the power needed to hit those clocks ramps up?

With flagship GPUs, AMD/Nvidia are vying for the performance crown and pushing the chips to their limit (within reason, considering warranty etc.).

So there is a lot of power savings to be had by clocking these chips lower.

Tahiti might not be the chip, and 740MHz might not be the speed, but that's the ballpark.
________________________________

BTW, please don't ever mention a 6570 or 6670 in this thread ever again. :p
OK, low end or not, that's irrelevant to the physics... By the way, you can see that the calculation is way off for the 6850 and 6870, as shown in the link.

But it is also off for the 7950 and 7970; see here.
 
OK, low end or not, that's irrelevant to the physics... By the way, you can see that the calculation is way off for the 6850 and 6870, as shown in the link.

But it is also off for the 7950 and 7970; see here.

HD5850 (334mm², 40nm) @ 725MHz pulled 151W
HD2900GT (420mm², 80nm) @ 600MHz pulled 150W

HD7970 (352mm², 28nm) @ 925MHz pulls 250W

So what speed would the HD7970 Tahiti core need to run at to hit 125W?
 
HD5850 (334mm², 40nm) @ 725MHz pulled 151W
HD2900GT (420mm², 80nm) @ 600MHz pulled 150W

HD7970 (352mm², 28nm) @ 925MHz pulls 250W

So what speed would the HD7970 Tahiti core need to run at to hit 125W?
I don't know, and the proper answer is most likely that if you want to reach that power consumption, you should start with another part.

Intel's recent statement about power seems to apply to our GPUs too: they said an architecture can be scaled roughly an order of magnitude in power consumption. In the CPU world that's about 15W to 150W (not an exact figure, just a gross one on my part; Intel's own statement is empirical evidence).

For GPUs it looks like we're pretty much there too, from roughly 30W to 300W. Just choose the proper part.
The circuitry within those chips is designed to run in a given speed range, under a given voltage range; within that voltage range, leakage varies within yet another range, and so on. All of this affects the chip's power consumption as well as its physical implementation. How do things vary when you change one parameter in isolation? AMD knows; we might get a clue with proper testing (but that would be for one chip, not millions).

Anyway, the point is that the formula doesn't work / is incomplete for any of the latest AMD architectures.
 
HD5850 (334mm², 40nm) @ 725MHz pulled 151W
HD2900GT (420mm², 80nm) @ 600MHz pulled 150W

HD7970 (352mm², 28nm) @ 925MHz pulls 250W

So what speed would the HD7970 Tahiti core need to run at to hit 125W?

Being horribly simplistic, P = fCV^2, so 462MHz would do it. This doesn't account for leakage and other sources of constant power draw. However, you could probably lower the voltage too, so 400-450MHz shouldn't be too far off, I'd imagine, right? But as liolio said, this isn't the right part to do that with. You'd want a part with fewer shaders on this process if 125W was your goal. That also means less risk in terms of the number of transistors you need to yield properly per die.
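
For anyone who wants to redo that back-of-the-envelope estimate, here is a minimal sketch of it in Python, using only the numbers quoted in this exchange. The 50W "constant" term in the second case is a made-up figure, purely to illustrate how leakage and other fixed board power would shift the answer.

```python
# Rough sketch of the frequency-only scaling above (P ~ f*C*V^2 with C and V held
# fixed). Numbers are the ones quoted in this thread; the 50 W "constant" term in
# the second case is a hypothetical figure, not measured data.

STOCK_MHZ = 925.0
STOCK_W = 250.0      # quoted HD7970 board power
TARGET_W = 125.0

# Case 1: all 250 W assumed to scale linearly with clock (no leakage, voltage fixed).
f_linear = STOCK_MHZ * TARGET_W / STOCK_W
print(f"frequency-only estimate: {f_linear:.0f} MHz")   # ~462 MHz, as in the post

# Case 2: hypothetical split of 50 W constant (leakage, fans, RAM) + 200 W dynamic.
CONST_W = 50.0
dyn_at_stock = STOCK_W - CONST_W
f_with_const = STOCK_MHZ * (TARGET_W - CONST_W) / dyn_at_stock
print(f"with a {CONST_W:.0f} W constant term: {f_with_const:.0f} MHz")  # ~347 MHz

# Lowering the core voltage at the reduced clock would pull the required frequency
# back up, which is presumably why the post lands on a 400-450 MHz guess.
```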
 
If you're feeding the CPU from the same memory as the GPU then you might not be able to scale that GDDR5 speed back by 20%.
The memory can't consume too much energy, or else how are rigs with 16-24GB being built? Unless GDDR5 has far higher power consumption than present-day main memory solutions, I assume it shouldn't consume much.
I find it unlikely that they'd go much higher than 200W for the next-gen consoles because of price, stability and noise.

The PS3 had none of those issues and had nearly twice that in its power supply. The noise issue was mostly the fast DVD drive in the 360, IIRC; since Blu-ray reads at a constant speed, IIRC, that won't be an issue, and in any case the max speed can be modulated as needed.

Regarding the thermal issues, it seems changes to the thermal paste or soldering (going lead-free or something) were behind the problems; experience with the new compound should minimize this issue as well.


________
regarding power

A 7970 at 925MHz consumes 310W under load; overclocked by 200MHz, it consumes 417W at 1125MHz (HardOCP).

"Your 7970 should be down-throttling when required as is. I know mine, when just doing desktop work, the clock goes ... to 350MHz." (jrrandel on the 7970, hardforum)
At 350MHz it should then run at about 20W, per HardOCP's calculation for desktop/background consumption.

It would be interesting to know the shape of the curve for the correct conversion function, i.e. how quickly power consumption grows as frequency grows.
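
Just to put a rough number on that shape from the two HardOCP load figures quoted above, here is a small sketch. Note these are whole-board figures that include a voltage bump for the overclock, so the fitted exponent is only indicative, not a property of the silicon itself.

```python
import math

# Data points are the ones cited above (HardOCP load figures for the 7970).
f1, p1 = 925.0, 310.0    # MHz, W at stock
f2, p2 = 1125.0, 417.0   # MHz, W overclocked

# Fit P ~ f^n between the two load points: n = ln(P2/P1) / ln(f2/f1).
n = math.log(p2 / p1) / math.log(f2 / f1)
print(f"effective exponent between the two points: {n:.2f}")   # roughly 1.5

# Purely linear scaling from the stock point would have predicted:
p_linear = p1 * f2 / f1
print(f"linear prediction at {f2:.0f} MHz: {p_linear:.0f} W vs measured {p2:.0f} W")
```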
 
I don't know, and the proper answer is most likely that if you want to reach that power consumption, you should start with another part.

Intel's recent statement about power seems to apply to our GPUs too: they said an architecture can be scaled roughly an order of magnitude in power consumption. In the CPU world that's about 15W to 150W (not an exact figure, just a gross one on my part; Intel's own statement is empirical evidence).

For GPUs it looks like we're pretty much there too, from roughly 30W to 300W. Just choose the proper part.
The circuitry within those chips is designed to run in a given speed range, under a given voltage range; within that voltage range, leakage varies within yet another range, and so on. All of this affects the chip's power consumption as well as its physical implementation. How do things vary when you change one parameter in isolation? AMD knows; we might get a clue with proper testing (but that would be for one chip, not millions).

Anyway, the point is that the formula doesn't work / is incomplete for any of the latest AMD architectures.

Well, for S&G, a Barts core (255mm²) on 40nm @ 775MHz burns 127W.

Pitcairn (245mm²) on 28nm @ 800MHz will likely burn a similar wattage (watts per mm² stays relatively flat at a given frequency).

Then it's simply a matter of deciding bang for the buck:

Pitcairn @ 800MHz
or
Tahiti @ 500-600MHz
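
For what it's worth, here is a quick sketch of that watts-per-mm² rule of thumb using the numbers above; it is only the heuristic from this post, not a real power model, and it says nothing about how the two parts would actually clock.

```python
# The assumption here (from the post above) is that power density stays roughly
# flat at a similar clock across these two nodes. Treat the result as a rough
# heuristic, not a measurement.

barts_area_mm2 = 255.0
barts_watts = 127.0        # quoted: Barts @ 775 MHz on 40 nm

pitcairn_area_mm2 = 245.0  # quoted Pitcairn die size on 28 nm

power_density = barts_watts / barts_area_mm2          # ~0.50 W/mm^2
pitcairn_est_w = power_density * pitcairn_area_mm2
print(f"power density: {power_density:.2f} W/mm^2")
print(f"Pitcairn estimate at a similar clock: {pitcairn_est_w:.0f} W")   # ~122 W
```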
 
Tahiti might not be the chip, and 740MHz might not be the speed, but that's the ballpark.
________________________________

BTW, please don't ever mention a 6570 or 6670 in this thread ever again. :p

I don't mean to intrude here, especially since I'm not nearly as knowledgeable on tech as anyone in here, but why is that the "ballpark"? As far as I'm aware, this is a prediction thread. What's your reasoning that a company like Microsoft/Sony will go for power and thus target Tahiti, or somewhere in that "ballpark" as you put it?
 
I don't mean to intrude here, especially since I'm not nearly as knowledgeable on tech as anyone in here, but why is that the "ballpark"? As far as I'm aware, this is a prediction thread. What's your reasoning that a company like Microsoft/Sony will go for power and thus target Tahiti, or somewhere in that "ballpark" as you put it?

Welcome!

If one were to assume a linear progression, one would assume roughly a 250mm²/250mm² split between CPU and GPU.

If one believes as I do that future consoles will rely on the GPGPU for CPU assistance, then part of the CPU die budget would go toward the GPU.

How aggressive this shift is is anyone's guess, but I'd assume a smart route would be to take an existing off-the-shelf GPU and mate it with a custom CPU solution scaled from existing Cell/Xenon tech.

This would enable the relatively large GPU to be binned from desktop chips that perhaps couldn't cut it at 925MHz, or perhaps don't have all 2048 ALUs working.
 
The memory can't consume too much energy, or else how are rigs with 16-24GB being built? Unless GDDR5 has far higher power consumption than present-day main memory solutions, I assume it shouldn't consume much.


The PS3 had none of those issues and had nearly twice that in its power supply. The noise issue was mostly the fast DVD drive in the 360, IIRC; since Blu-ray reads at a constant speed, IIRC, that won't be an issue, and in any case the max speed can be modulated as needed.

Regarding the thermal issues, it seems changes to the thermal paste or soldering (going lead-free or something) were behind the problems; experience with the new compound should minimize this issue as well.


________
regarding power

A 7970 at 925MHz consumes 310W under load; overclocked by 200MHz, it consumes 417W at 1125MHz (HardOCP).


At 350MHz it should then run at about 20W, per HardOCP's calculation for desktop/background consumption.

It would be interesting to know the shape of the curve for the correct conversion function, i.e. how quickly power consumption grows as frequency grows.


Neither the launch 360 nor the launch PS3 consumed more than approximately 200W when running games. The PS3 was just about at 200W and the 360 was around 180W. The supplies may have been rated for more, but that is probably to address issues of efficiency and noise in the supply. The new slim units use significantly less, which is why their cooling solutions are much quieter.

The thermal issues in the 360 were caused by a switch to lead-free manufacturing. It was probably a manufacturing issue caused by cold joints or other poorly formed solder joints, which then cracked from thermal expansion and contraction as the console was put through its paces.

I still believe they wouldn't want their next-gen consoles to be any louder than the original 360 was with a game installed, once the install feature was added. That is not incredibly loud, but it is noticeable and a complaint many people have.
 
I don't mean to intrude here, especially since I'm not nearly as knowledgeable on tech as anyone in here, but why is that the "ballpark"? As far as I'm aware, this is a prediction thread. What's your reasoning that a company like Microsoft/Sony will go for power and thus target Tahiti, or somewhere in that "ballpark" as you put it?
It's his assumption that he will get what he wants at whatever number he comes up with; if not 740MHz, then 500/600MHz, which seems more likely. The point is we don't know exactly, and the formula he used (improperly, as I believe it only covers active/switching power) is not helping.

Now, lowering the frequency lowers power consumption, but it's not linear and it's specific to each architecture; that much is true.

NB: most of us here are not engineers of any kind; we're mostly working from what a few people were kind enough to explain to us (here or elsewhere on the web).
 
The memory can't consume too much energy, or else how are rigs with 16-24GB being built? Unless GDDR5 has far higher power consumption than present-day main memory solutions, I assume it shouldn't consume much.

A quick search of Samsung parts would tell you what you shouldn't just assume... First of all, DDR3 and GDDR5 don't operate at the same voltage, and certainly nowhere near the same signal rates. That alone should tell you that power is likely very different; otherwise you'd just use GDDR5. Bandwidth certainly is important for large-scale servers, and latency can be mitigated by increased workloads. Clearly, power is a greater concern for large-scale systems that do use >16GB.

Just going by their listed examples, an 8x1Gb GDDR5-4000 config @ 1.5V (8 chips, 1GB RAM, 256-bit bus) is 16.5W whereas a 16x2Gb DDR3-1333 server module @1.35V (16 chips, 8GB RAM, 64-bit) uses 4.0W.

The 8x2Gb DDR3 SODIMM (8 chips, 4GB RAM) consumes 1.23W. The non-linear decrease is probably due to the extra components in server-class memory i.e. ECC circuitry or fully buffered. Ultimately, for a desktop/laptop, you're looking at a fairly low power consumption compared to GDDR5.
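
To make the comparison easier to eyeball, here is a quick per-gigabyte calculation using the capacities and wattages exactly as quoted above; the point is the order-of-magnitude gap, not the precise figures.

```python
# Per-gigabyte comparison of the Samsung example configurations quoted above.
# Capacities and wattages are taken as quoted in the post, not re-derived.

configs = {
    "GDDR5-4000, 8x1Gb @ 1.5V":         (1.0, 16.5),   # (GB, watts)
    "DDR3-1333 server, 16x2Gb @ 1.35V": (8.0, 4.0),
    "DDR3 SODIMM, 8x2Gb":               (4.0, 1.23),
}

for name, (gigabytes, watts) in configs.items():
    print(f"{name}: {watts / gigabytes:.2f} W/GB")

# GDDR5 comes out around 16.5 W/GB versus roughly 0.3-0.5 W/GB for the DDR3
# parts, which is why a multi-gigabyte GDDR5 pool matters for a console's
# power budget.
```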
 
A quick search of Samsung parts would tell you what you shouldn't just assume... First of all, DDR3 and GDDR5 don't operate at the same voltage, and certainly nowhere near the same signal rates. That alone should tell you that power is likely very different; otherwise you'd just use GDDR5. Bandwidth certainly is important for large-scale servers, and latency can be mitigated by increased workloads. Clearly, power is a greater concern for large-scale systems that do use >16GB.

Just going by their listed examples, an 8x1Gb GDDR5-4000 config @ 1.5V (8 chips, 1GB RAM, 256-bit bus) is 16.5W whereas a 16x2Gb DDR3-1333 server module @1.35V (16 chips, 8GB RAM, 64-bit) uses 4.0W.

The 8x2Gb DDR3 SODIMM (8 chips, 4GB RAM) consumes 1.23W. The non-linear decrease is probably due to the extra components in server-class memory i.e. ECC circuitry or fully buffered. Ultimately, for a desktop/laptop, you're looking at a fairly low power consumption compared to GDDR5.

Intriguing. My original assumptions were based on the fact that the heftiest power supplies are usually 1000-1250W, and one can handle a single high-end card with a 750W power supply... I had assumed that since such power supplies could presumably handle 4x SLI setups with 12GB of GDDR5, the memory can't be that taxing on resources.
 
Neither the launch 360 nor the launch PS3 consumed more than approximately 200W when running games. The PS3 was just about at 200W and the 360 was around 180W. The supplies may have been rated for more, but that is probably to address issues of efficiency and noise in the supply. The new slim units use significantly less, which is why their cooling solutions are much quieter.

I presume they're voltage regulated for 12V output, so I suppose the extra current rating might be for unexpected surges? Then again, surges from motors (spinning disc at start-up) shouldn't be too bad.

I suppose there's hooking up extra devices to the USB ports, but that's still pretty small in the grand scheme (2.5W per port).

Kinect is 12W, but that'd only be relevant for Slim units.
 
That whole "Power consumption scales with frequency cubed, a 20% reduction should result in around half the power consumption" is completely wrong. Power consumption is linear with frequency and squared(not cubed) with voltage. The formula is P = CV^2f, where C is capacitance, V is voltage and f is frequency.

If you figure capacitance is roughly average on a specific die during typical work, its pretty easy to figure out. Of course, as stated above, this doesn't account for leakage.

For the sake of arguement, say you had a 500mhz GPU at one volt and its power consumption was 50 watts. The lets say you can run it at 750mhz at 1.3 volts. Power consumption would be 1 * 1.3^2 * 1.5 = 2.535 times the power, ~126 watts. Again not including leakage. This isn't to say a 50% increase in clock = 2.535 times power consumption, it just shows the formula relative to a specific voltage. If you were to use frequency cubed, in this case, you'd end up with 3.375 times power consumption for the same increase, quite a bit ahead of the actual formula.
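
For anyone who wants to check the arithmetic, here is that example as a few lines of Python, with the capacitance term folded into the 50W baseline and leakage ignored, exactly as above.

```python
# Reproducing the arithmetic above: P = C * V^2 * f, with the capacitance term
# folded into the 50 W baseline figure. Leakage is ignored, as in the post.

base_w, base_v, base_mhz = 50.0, 1.0, 500.0
new_v, new_mhz = 1.3, 750.0

scale = (new_v / base_v) ** 2 * (new_mhz / base_mhz)
print(f"P = C*V^2*f scaling: x{scale:.3f} -> {base_w * scale:.2f} W")   # x2.535 -> 126.75 W

# The (wrong) "power scales with frequency cubed" rule would instead give:
cubed = (new_mhz / base_mhz) ** 3
print(f"f^3 rule: x{cubed:.3f} -> {base_w * cubed:.2f} W")              # x3.375 -> 168.75 W
```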

So to determine, as in the example discussed above, what Tahiti uses at a lower clock, you need to find what voltage you can run a lower-clocked Tahiti at. Stock reference is 1.15V. Anyone with a 7970 want to test their shiny new card underclocked/undervolted? :p

Edit: Some quick brain math says that if a tenth of a volt on Tahiti is gaining close to ~30% overclocks, then I would suspect that dropping the core clock by a third would still require one volt. Using the formula above, if Tahiti @ 1.15V and 925MHz = 250W, then Tahiti at 1.0V and 608MHz would be around 167W. Of course, this is the full board power with RAM and all, so if that stayed the same (RAM amount, clocks, etc.) while only the chip dropped, you might be up around 180W. Still nowhere in range for a console. If you cut the clocks in half and got down to 0.9V or 0.8V you might be in range for a console chip, but at that point there are much better options than using a large chip underclocked. There's a reason no one takes a large chip and then underclocks it for lower-end chips.
 
Of course, this is the full board power with RAM and all, so if that stayed the same (RAM amount, clocks, etc.) while only the chip dropped, you might be up around 180W. Still nowhere in range for a console.

It kind of is in range for a console, IMO. A full GPU board has most of the things a console has (e.g. a circuit board, video outputs, a cooling system, 3GB of high-speed RAM). There aren't many additional things a console needs beyond that, mainly a CPU.

If you jack the next-gen console TDP up to 250W instead of 200W, which seems entirely plausible (if the manufacturers choose to, of course), you're already pretty close (this leaves ~70W for your CPU, which seems doable), and can surely get there with tweaking.
 
Edit: Some quick brain math says that if a tenth of a volt on Tahiti is gaining close to ~30% overclocks, then I would suspect that dropping the core clock by a third would still require one volt. Using the formula above, if Tahiti @ 1.15V and 925MHz = 250W, then Tahiti at 1.0V and 608MHz would be around 167W. Of course, this is the full board power with RAM and all, so if that stayed the same (RAM amount, clocks, etc.) while only the chip dropped, you might be up around 180W. Still nowhere in range for a console. If you cut the clocks in half and got down to 0.9V or 0.8V you might be in range for a console chip, but at that point there are much better options than using a large chip underclocked. There's a reason no one takes a large chip and then underclocks it for lower-end chips.

I believe it was pcgameshardware that tested their card (7950?) at a lower voltage and still had it running at the default clock, saving 30% power or something in the process. I might be misremembering how that went. I'll see if I can track it down later, or maybe CarstenS can chime in and correct me.
 
Just going by their listed examples, an 8x1Gb GDDR5-4000 config @ 1.5V (8 chips, 1GB RAM, 256-bit bus) is 16.5W whereas a 16x2Gb DDR3-1333 server module @1.35V (16 chips, 8GB RAM, 64-bit) uses 4.0W.

The 8x2Gb DDR3 SODIMM (8 chips, 4GB RAM) consumes 1.23W. The non-linear decrease is probably due to the extra components in server-class memory i.e. ECC circuitry or fully buffered. Ultimately, for a desktop/laptop, you're looking at a fairly low power consumption compared to GDDR5.

Thanks for the calculation; I never knew that GDDR5 could use that much power. MS will have to leave some power headroom if they want to pursue a console with 2+ GB of RAM.

Is there any news about multi-layered memory from Samsung, something that is "console viable"?
 
That whole "Power consumption scales with frequency cubed, a 20% reduction should result in around half the power consumption" is completely wrong. Power consumption is linear with frequency and squared(not cubed) with voltage. The formula is P = CV^2f, where C is capacitance, V is voltage and f is frequency.

If you figure capacitance is roughly average on a specific die during typical work, its pretty easy to figure out. Of course, as stated above, this doesn't account for leakage.

For the sake of arguement, say you had a 500mhz GPU at one volt and its power consumption was 50 watts. The lets say you can run it at 750mhz at 1.3 volts. Power consumption would be 1 * 1.3^2 * 1.5 = 2.535 times the power, ~126 watts. Again not including leakage. This isn't to say a 50% increase in clock = 2.535 times power consumption, it just shows the formula relative to a specific voltage. If you were to use frequency cubed, in this case, you'd end up with 3.375 times power consumption for the same increase, quite a bit ahead of the actual formula.

So to define, as the example talked about above, what Tahti uses at a lower clock, you need to find what voltages you can run a lower clocked Tahti at. Stock reference is 1.15 volts. Anyone with a 7970 want to test their shiny new card at underclocked/undervoltage? :p

Edit: Some quick brain math says that if a tenth of a volt in Tahti is gaining close to ~30% overclocks, then I would suspect dropping the core clock by a third would still require one volt. Using the formula above, if Tahti @1.15v and 925mhz = 250watts, Tahti at 1.0v at 608mhz would = 167 watts. Of couse, this is the full board power with RAM and all, so if that stayed the same(RAM amount, clocks, ect) while only the chip dropped you might be more up around 180 watts. Still nowhere in range for a console. If you cut the clocks in half and got down to 0.9v or 0.8v you might be in range for a console chip but at that point there are much better options open than using a large chip underclocked. Theres a reason no one takes a large chip and then underclocks it for lower end chips.

Using this formula: TDP * (OC MHz / stock MHz) * (OC Vcore / stock Vcore)^2

http://www.ehow.com/how_6402697_calculate-cpu-wattage.html

and this underclocked voltage of 0.93V (stock 1.18V):
http://hardforum.com/showthread.php?p=1038311992#post1038311992

Applying a 700MHz clock, I get 118.52W.
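
Here is that estimate as a quick sketch, assuming the ~250W board figure quoted earlier in the thread as the stock TDP; the exact TDP assumed shifts the result by a watt or two either way.

```python
# Sketch of the estimate above:
#   TDP * (new_MHz / stock_MHz) * (new_Vcore / stock_Vcore)^2
# Stock TDP is assumed to be the ~250 W board figure quoted earlier in the thread.

stock_tdp_w = 250.0
stock_mhz, new_mhz = 925.0, 700.0
stock_v, new_v = 1.18, 0.93        # stock vs the undervolt from the hardforum post

est_w = stock_tdp_w * (new_mhz / stock_mhz) * (new_v / stock_v) ** 2
print(f"estimated board power: {est_w:.1f} W")   # ~117.5 W, in line with the ~118.5 W above
```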
 
As I was posting in the thread opened by ERP about the "next-gen gameplay/graphics differentiator", I realized that I want MS to plainly copy the WiiUmote.
The benefits are too good to pass up.
There is the synergy for XBLA game development between WinPhone 7/8, Windows 8 and the Xbox.
It's a strong differentiator, if not from N obviously, then at least from last gen.
It's a strong argument for parents, as the TV can be freed up whenever they want.
It opens up the gameplay, as a touchscreen can emulate some KB+mouse combos far more successfully than a pad can. A touchscreen by itself also opens games up to different kinds of gameplay.
I would expect the thing to include a camera and motion sensors, which should give devs even more options.
Overall I believe that MS should make the thing standard. The rumors about a less-than-amazing system make that even more likely to me. MS and Sony lagged behind Nintendo last gen; now MS is ahead as far as motion sensing is concerned, and I'm not sure they would want to leave Nintendo any advantage.

Something I've wondered about is 3D. I find the 3DS's 3D effect nice, but I'm not sold on it either; the screen is too tiny for me to have a proper opinion. Do you think it could be worth including in a hypothetical WiiU-style controller mockup, and what would the impact on production cost be?

For reference, there are (shitty) 7" tablets that ship at $75.
EDIT
Another reference, "glassless" 3D on iPad 1 & 2:
EDIT2
I prefer the head-tracking solution by a large margin.
 
It's a strong argument for parents, as the TV can be freed up whenever they want.

You have to keep in mind the demographic.

For Nintendo, that is a big win; for Sony/MS, that's a big (fixed) cost for little benefit.

I think both PS4 and xb720 will be tablet "compatible", but they won't ship with one.
 