AMD: Pirate Islands (R* 3** series) Speculation/Rumor Thread

That would be some marketing value like typical board power or SDP, or whatever. The board doesn't change TDPs depending on the game it's running.
 
GTX 980's TDP is 180W, not 165W. I have no clue why they use 165W in marketing when even the default reference BIOS says 180W.
It's very obvious why they chose to market 165W: people believe them. I'm not sure why they're allowed to do it, though. Even during gaming, the card draws 180W in the tests at Tom's.
 
But TDP is the heat output of the chip, not the energy the card uses. How would the fan's power consumption have anything to do with that?
TDP is not only the power consumption of the chip; it's for the whole card, which may or may not include the cooler.
 
I can't seem to find a good definition of TDP for GPUs, but for CPUs it's supposed to be the power the chip uses. I would assume the power delivery and DRAM are in a card's TDP, but I don't see how a fan would remotely make much sense.
 
I can't seem to find a good definition of TDP for GPUs, but for CPUs it's supposed to be the power the chip uses. I would assume the power delivery and DRAM are in a card's TDP, but I don't see how a fan would remotely make much sense.
That's because Intel sells the CPU silicon as an individual component. GPU silicon doesn't get sold as such: you buy the full card.
 
I would assume the power delivery and DRAM are in a card's TDP, but I don't see how a fan would remotely make much sense.
The fan gets power over the PCIe slot, too.
The fan's motor, which also dissipates heat, is in the middle of the fan or even the heatsink.
 
The fan's motor, which also dissipates heat, is in the middle of the fan or even the heatsink.
Yes, but fan motors are generally rather insignificant. High-RPM blowers can draw 10-20W at full revs (and compared to a big-gun GPU this is still insignificant), but I'll wager a blower encased in a typical graphics card shroud/heatsink never gets to draw full power. Airflow is likely too restricted, and the jet engine howling of an unrestricted 10k RPM impeller would drive users crazy as well, he he... :)
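The point above about fan power being insignificant can be put in rough numbers. A back-of-the-envelope sketch in Python; the specific wattages here (a 250W board, a 15W or 5W fan) are illustrative assumptions, not measured values:

```python
# Rough back-of-the-envelope: fan power as a share of total board power.
# All wattage figures below are illustrative assumptions, not measurements.

def fan_share(fan_watts: float, board_watts: float) -> float:
    """Return the fan's share of total board power as a percentage."""
    return 100.0 * fan_watts / board_watts

# A high-RPM blower at full revs vs. a big-gun GPU board:
print(f"{fan_share(15, 250):.1f}%")  # 15W fan on a 250W board -> 6.0%

# Shrouded and airflow-restricted, the blower likely draws far less:
print(f"{fan_share(5, 250):.1f}%")   # 5W fan on a 250W board -> 2.0%
```

Even the unrealistic full-power case is a single-digit percentage of the board's draw, which is the "insignificant" claim above.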
 
I can't seem to find a good definition of TDP for GPUs, but for CPUs it's supposed to be the power the chip uses. I would assume the power delivery and DRAM are in a card's TDP, but I don't see how a fan would remotely make much sense.
FWIW, here's the definition NVIDIA includes in their reviewer's guide. They're citing Wikipedia.

The thermal design power (TDP), sometimes called thermal design point, represents the maximum amount of power the cooling system in a computer is required to dissipate. The TDP is typically not the most power the chip could ever draw, such as by a power virus, but rather the maximum average power that it would draw when running “real applications”. This ensures the computer will be able to handle essentially all applications without exceeding its thermal envelope, or requiring a cooling system for the maximum theoretical power (which would cost more but in favor of extra headroom for processing power).

In some cases the TDP has been underestimated such that in real applications (typically strenuous, such as video encoding or games) the CPU has exceeded the TDP. In this case, the CPU will either cause a system failure (a "therm-trip") or throttle its speed down. Most modern CPUs will only cause a therm-trip on a catastrophic cooling failure such as a stuck fan or a loose heatsink.

For example, a laptop's CPU cooling system may be designed for a 20 watt TDP, which means that it can dissipate up to 20 watts of heat without exceeding the maximum junction temperature for the computer chip. It can do this using an active cooling method such as a fan or any of the three passive cooling methods, convection, thermal radiation or conduction. Typically, a combination of methods is used.

Since safety margins and the definition of what constitutes a real application vary among manufacturers, TDP values between different manufacturers cannot be accurately compared. While a processor with a TDP of 100 W will almost certainly use more power at full load than a processor with a 10 W TDP, it may or may not use more power than a processor from a different manufacturer that has a 90 W TDP. Additionally, TDPs are often specified for families of processors, with the low-end models usually using significantly less power than those at the high end of the family.
 
They could also use board power to express the power draw of a card. A lot of the marketing pages at AMD go with wattage or "typical board power" rather than TDP.
That would help skip over some of the confusion a discrete board has with managing its own components and power delivery. Since board fans plug into the card's own power supply, measuring electrical draw at the card level captures everything on the board, fan included.

The fan's power is being dissipated too, although much of it is in a form that any cooler worth the name should have little trouble dissipating. I'm not sure whether power on that side of the cooler is considered part of the TDP, but even if it isn't, all but the standard-breaking boards will still be capped by what they state as their power requirements as PCI-e devices.
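The PCI-e power cap mentioned above comes from the spec: a card may draw up to 75W from the x16 slot, plus 75W per auxiliary 6-pin connector and 150W per 8-pin. A minimal sketch of how those caps add up (the connector configurations shown are examples):

```python
# Spec-compliant power caps for a PCI-e graphics card:
# up to 75W from the x16 slot, 75W per 6-pin connector, 150W per 8-pin.
PCIE_LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_board_power(connectors):
    """Return the spec power cap: slot power plus each auxiliary connector."""
    return PCIE_LIMITS_W["slot"] + sum(PCIE_LIMITS_W[c] for c in connectors)

print(max_board_power([]))                  # slot only -> 75
print(max_board_power(["6-pin", "6-pin"]))  # two 6-pins -> 225
print(max_board_power(["8-pin", "6-pin"]))  # 8-pin + 6-pin -> 300
```

A reference GTX 980, with its two 6-pin connectors, sits under the 225W cap in that middle case, which is consistent with the 180W figure discussed earlier in the thread.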

(edit: The context I originally saw TDP used in was CPUs, where it didn't count the miscellaneous components, since the chip generally got to monopolize the cooler and powering the cooler was someone else's problem. It was all about what the chip expected the cooler to dissipate.)
 