That would be the power budget for the card, based on how many power connectors there are.

Yes, I agree!
But the thermal budget, I think, is 150 W for CPUs and 300 W for GPUs!
The GPU itself can't take up that much.
Some margin must be set aside for the RAM, on-board chips, and losses in the power delivery circuitry, and further headroom is needed so that power spikes don't push the device out of compliance with the PCIe specification.
A GPU that draws as much as the maximum would not have a board to run it on.
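To put rough numbers on the board-level budget, here is a minimal sketch assuming the commonly quoted PCIe figures of 75 W from the x16 slot, 75 W per 6-pin connector, and 150 W per 8-pin connector (the function name is just illustrative):

# Sketch of the board power ceiling set by the slot plus auxiliary connectors,
# assuming the usual 75 W / 75 W / 150 W figures.
SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

def board_power_ceiling(six_pins=0, eight_pins=0):
    """Maximum board power permitted by the slot plus auxiliary connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_power_ceiling())                          # 75 W, slot only
print(board_power_ceiling(six_pins=2))                # 225 W, two 6-pin connectors
print(board_power_ceiling(six_pins=1, eight_pins=1))  # 300 W, 6-pin + 8-pin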
Furmark does show that certain workloads can make GPUs spike well above their average consumption.
The majority of graphics boards sold are not going to be the top-end cards with two connectors.
Obviously, the dual-GPU cards with a 300 W limit can't have both GPUs drawing 150 W each.
This means that most GPU chips occupy at most the high end of the power band allowed for enthusiast CPUs.
GT200 boards are somewhere around 230 W.
This may allow the GPU itself to pull in a generous amount of wattage in the worst case, but I don't know how much other board components take away from the maximum.
It might only be something like 200 W for the chip itself.
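As a back-of-the-envelope sketch of that estimate (the ~30 W set aside here for memory, fan, and VRM losses is purely an assumed placeholder, not a measured figure):

def chip_power_estimate(board_tdp_w, other_components_w):
    """Rough worst-case budget left for the GPU die after the rest of the board."""
    return board_tdp_w - other_components_w

# ~230 W board TDP minus an assumed ~30 W for memory, fan, and VRM losses
print(chip_power_estimate(230, 30))  # roughly 200 W left for the chip itself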
It does not appear that manufacturers are eager to push much higher, and this level is not much above the highest CPU TDPs.
The specification would be the slot and then the additional specifications for the power connectors.

If you take the PCIe specification as the budget, GPUs hit the wall years ago!
GPUs have hit all three limits.
G80 was introduced in 2006. Its derivatives have been around ever since.

Within a cycle, if for example Intel has a Pentium 4 C, it can next do something like a Pentium 4 D, and then something like a Pentium 4 E (or replace the above with AMD's Phenom progression...), and all these minimal differences in performance happen within 2 years.
ATI, for example, can go from HD2000 tech to HD3000 tech to HD4000 tech, and all these IMO big differences in performance happen within 1 year and 1 quarter.
If you take as design cycles big events like moving geometry transformation on-chip, or shader-enabled GPUs, or GPGPU-enabled GPUs, you can stretch the notion of the GPU design cycle, but even then, with this logic (which is correct, nothing wrong there), the CPUs' equivalent cycles will again be longer!
G300 may be a real revamp, but there are not that many details.
R600's legacy is very strong in the succeeding chips.
Evergreen sounds like it brings some significant changes, but the family resemblance apparently remains.
There have only been a handful of real big transitions for both CPUs and GPUs in the last decade.
The behind-the-scenes articles also indicate that GPU design times run between 3 and 5 years (especially counting delays), which isn't too far from what it takes for a CPU to come to market.
Guys, this is my 3rd post and I am new to the forum; I just want to make conversation. Does the above writing style come across (English is not my native language) as though I want to argue? (Because I don't want that!)
Please advise so that I can change it!
Thanks in advance!
It would be easier for me to quote you if your responses weren't inside the quote box.
As far as perception goes, using a lot of exclamation marks makes it seem like you're very excitable or hyperactive. That punctuation carries more force than a period and implies that there is more emotion behind the sentence or that the point is very important. It can come across as forceful if used a lot.