How much will 65nm CPU and GPU help the Xbox 360?

a lot.

But more seriously, it's hard to say, because the chips' power consumption doesn't automagically drop to 50% with the process shrink. They have to deal with leakage current and other issues that crop up when moving to smaller transistors. Power consumption for the whole unit won't drop to 50% (at best) either, because of other components like the DVD drive and hard drive (maybe 30W for both?).

It would be nice to know the individual power consumption of the GPU and CPU, but those aren't likely to be measured anytime soon... All we know is that the current 90nm 360s consume ~170-190W under load.

Dave Baumann may know more about Xenos' power consumption...
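
As a back-of-the-envelope illustration of the point above (a minimal sketch only; the 100W chip share and the 30% saving are guesses, not measured figures):

Code:
# Rough estimate of whole-unit power after a CPU/GPU process shrink.
# All inputs are illustrative assumptions, not measured Xbox 360 values.

def unit_power_after_shrink(total_power_w, chip_power_w, chip_reduction=0.30):
    """Estimate console wall power after shrinking the CPU + GPU."""
    other_power_w = total_power_w - chip_power_w      # DVD drive, HDD, fans, PSU losses...
    new_chip_power_w = chip_power_w * (1.0 - chip_reduction)
    return other_power_w + new_chip_power_w

# Example: a 180W console whose chips are guessed to draw 100W.
# A 30% chip-level saving only brings the unit to ~150W, nowhere near 90W.
print(unit_power_after_shrink(180, 100))              # -> 150.0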
 

It should be more than just a shrink; there should be some optimization going on as well.
 
I think the cooling system of the 360 is flawed, whereas the PS3 has one of the most sophisticated cooling systems in console history. They really should redesign the cooling... but that would cost way too much money and time. So the best thing they can do is maybe optimize things a bit.
 
It will bring down the cost for Microsoft and allow them to reduce the price. In some sense, it will not help the 360 so much as it will hurt the PS3 - it won't help Microsoft address some of the significant problems before them (e.g. the less-than-stellar uptake in Europe), but it will keep the 360 at half the price of the PS3, which is the most significant problem before Sony.
 
They have to deal with leakage current and other issues that crop up with moving to smaller transistors.
If TSMC's 65nm G+ process is used, then leakage isn't much of an issue - IIRC we are seeing it come in slightly better than 90nm GT in terms of leakage.
 
Couldn't find a better place to ask: is it confirmed that the motherboard design for the X360 has changed so that the GPU gets a larger heatsink (i.e. by moving the GPU's position)? They also need to get rid of the X-clamps.
 
If TSMC's 65nm G+ process is used, then leakage isn't much of an issue - IIRC we are seeing it come in slightly better than 90nm GT in terms of leakage.


Ah... hmm... What's the difference between G+ and GT in terms of engineering? (GT = "high performance"?)

The triple gate oxide should help a ton with leakage (Ileak), but their 90nm process should have that too, no?

On a side note, it seems that the IBM page on Waternoose is gone... And according to TSMC, they've been able to do eDRAM @ 65nm since Oct 2006. I wonder if they'll do a very silent 55nm shift since it's just an optical shrink.

edit: Nevermind, found that IBM page - http://www.ibm.com/developerworks/power/library/pa-fpfxbox/
(just for my reference in the future... it's too bad they don't talk about power consumption in detail)
 
And according to TSMC, they've been able to do eDRAM @ 65nm since Oct 2006. I wonder if they'll do a very silent 55nm shift since it's just an optical shrink.

NEC has only just started producing 55nm eDRAM engineering samples; they aren't in production yet. I expect that TSMC is behind NEC in that area.

<edit> I'd also heard that the 65nm parts were going to be a low-power variant; I'll see if I can find the link.
 
How much will the 65nm die shrink reduce power consumption and heat?

A process shrink normally reduces power consumption by about 30%. I know Cell will get a bigger reduction going to 65nm because they have used separate power supplies for the SRAM and logic. I don't know if they'll do this with the 360 CPU though.
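
For a rough sense of where a number like 30% can come from, here's a minimal sketch using the standard CMOS dynamic-power relation P ~ C * V^2 * f; the capacitance and voltage scaling factors are illustrative assumptions, not figures specific to these chips:

Code:
# Dynamic CMOS power scales roughly as P ~ C * V^2 * f.
# Assume the 65nm shrink cuts switched capacitance ~15% and core voltage ~10%
# at an unchanged clock (illustrative assumptions only; leakage is ignored here).
c_scale = 0.85   # relative switched capacitance after the shrink
v_scale = 0.90   # relative core voltage after the shrink
f_scale = 1.00   # clock speed kept the same
new_over_old = c_scale * v_scale**2 * f_scale
print(f"new dynamic power = {new_over_old:.2f}x the old")   # ~0.69x, i.e. roughly 30% less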
 
Does anyone know exactly what percentage of the heat and power is due to the GPU (which won't be released at 65nm anytime soon)?
 
It would be difficult to measure the power consumption of the chips on the motherboard, but it should be possible to measure and compare the heat energy being transferred to the heatsinks:

Code:
Q = m * Cp * deltaT

Q = heat energy
m = mass of the heatsink
Cp = specific heat capacity of the heatsink material, i.e. the energy required
to raise the temperature of a unit mass of the material by one degree
deltaT = final temperature - initial temperature of the heatsink*

Cp = 0.90 J/(g·K) for aluminum and 0.39 J/(g·K) for copper**

*Note that it does not matter whether we use Celsius or Kelvin in the equation,
since only the temperature difference enters.
**This is why copper heats up so quickly and stays hotter longer.
The thing about copper is that it has a higher thermal conductivity, so it
transfers heat away from the chip more quickly.

The problem for us is that the heatsinks on the CPU and GPU are vastly different and employ heat pipes, which could screw up the calculations or make them more complex.

An easier method would be to get two identical heat sinks made of only one element (i.e. either aluminum or copper) of sufficient and known mass and measure the blocks' temperatures.

Oh and you'd probably want some sort of contraption to hold down the ghetto heatsink. :p
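
For what it's worth, here's a minimal sketch of that calculation (the masses and temperature rises below are made-up placeholder numbers, not measurements):

Code:
# Heat absorbed by a bare heatsink as it warms up: Q = m * Cp * deltaT.
# Cp in J/(g*K), mass in grams; Celsius or Kelvin both work since only
# the temperature difference matters. Divide Q by the warm-up time to
# compare average power between the two chips.

CP = {"aluminum": 0.90, "copper": 0.39}

def heat_absorbed(mass_g, material, t_initial, t_final):
    """Energy in joules stored in the heatsink over the warm-up."""
    return mass_g * CP[material] * (t_final - t_initial)

# Hypothetical example: two identical 400 g aluminum blocks, one on the CPU
# rising from 25 C to 55 C and one on the GPU rising from 25 C to 60 C.
q_cpu = heat_absorbed(400, "aluminum", 25, 55)   # 10800 J
q_gpu = heat_absorbed(400, "aluminum", 25, 60)   # 12600 J
print(q_cpu, q_gpu, q_gpu / (q_cpu + q_gpu))     # GPU's share of the measured heat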
 

Wow dude, get with the times, that's an old news article.

http://www.dailytech.com/article.aspx?newsid=8586

It clearly says that the upcoming Xbox 360 Falcon revision is to have a cooler 65nm IBM CPU, but not a revised ATI GPU; it's still the same 90nm ATI GPU that many point to as the source of the RRoD problem.

You know what that means: I am not getting an Xbox 360 until next year, when both chips are confirmed and verified to be 65nm, stress testing is done, and a price drop has happened. In the meantime I am going to have to buy the current Sony PS3 and/or the Nintendo Wii.
 