Damn, is that an 8-pin PCI-E? I was hoping that because it's PCI-E 2.0 it wouldn't be...
So damn small, can't anyone take pictures in higher resolutions...
Got it from another forum, no link to the original thread where he got it from, sadly.
Every chip has an array of contact pads, similar to pins on a CPU. That includes the pins that connect the GPU to the memory chips: the wider the bus, the more pins. Because of that, there's a lower limit on die area for each bus width, e.g. for 256-bit it's around 190 mm2. GT200 is large enough to accommodate a 512-bit bus; its die shrink might not be. Therefore, to keep memory bandwidth in the same range, they would need faster memory = GDDR5.

I don't understand why they couldn't fit it.
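On the "keep memory bandwidth in the same range" part of the quote above, here's the rough back-of-the-envelope math. The clock figures below are made up purely for illustration, not confirmed GT200 specs:

# rough sketch of the bus-width vs memory-speed trade-off described above
# (clock figures are illustrative guesses, not confirmed specs)

def bandwidth_gb_s(bus_width_bits, effective_rate_mt_s):
    # bytes per second = (bus width in bytes) * transfers per second
    return bus_width_bits / 8 * effective_rate_mt_s / 1000

print(bandwidth_gb_s(512, 2200))  # 512-bit GDDR3 @ ~2200 MT/s -> 140.8 GB/s
print(bandwidth_gb_s(256, 2200))  # same GDDR3 on 256-bit      -> 70.4 GB/s (half)
print(bandwidth_gb_s(256, 4400))  # 256-bit needs ~2x the data rate to keep up -> 140.8 GB/s

So halving the bus width at the same memory speed halves the bandwidth, which is why a narrower bus pretty much forces faster memory.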
Oh yes it does. G71, G73, R580 were all limited by the GDDR3 of their time. G92 is limited by GDDR3 today. I imagine nVidia calculated the optimal memory throughput for GT200 and will easily reach that number even using cheaper GDDR3. But imagine they shrink the chip, lowering the TDP and making a GX2 card possible. A 512-bit bus would no longer fit, so they'd have to narrow it down to 256 bits. Now if they used GDDR3, the chips would only get about half their calculated optimal bandwidth, and that sure would limit them.

As for the GDDR5, I don't see them redoing the memory controller for a GX2 variant; they haven't before. Or, more importantly, why would GDDR3 limit them if it doesn't limit already existing cards?
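To put rough numbers on the "half their calculated optimal bandwidth" idea: the design-target figures below are completely made up, just a sketch of why the same 256-bit GDDR3 can be fine for one chip and a bottleneck for another:

# made-up "optimal throughput" targets purely to illustrate the argument above
design_target_gb_s = {"G92": 64, "GT200 shrink": 140}

def delivered_gb_s(bus_width_bits, effective_rate_mt_s):
    return bus_width_bits / 8 * effective_rate_mt_s / 1000

gddr3_256bit = delivered_gb_s(256, 2000)  # ~64 GB/s from 256-bit GDDR3
print(gddr3_256bit / design_target_gb_s["G92"])          # ~1.0  -> about what G92 was built around
print(gddr3_256bit / design_target_gb_s["GT200 shrink"]) # ~0.46 -> roughly half of what it would want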
Of course, but why would nVidia purposely screw up a product?

There are a lot more factors than memory bandwidth that can screw up a sandwich card.
But a 512-bit bus would no longer fit, so they'd have to narrow it down to 256 bits. Now if they used GDDR3, the chips would only get about half their calculated optimal bandwidth, and that sure would limit them.

Yeah, but even with GDDR5 a 256-bit bus would cripple that card, especially if it has 2 GB of memory. Couldn't they squeeze a 384-bit or a 448-bit bus on it?
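Same back-of-the-envelope math for the wider-bus options; the GDDR3/GDDR5 data rates are just guesses for illustration:

# bandwidth per bus width with guessed data rates
# (2200 MT/s GDDR3 vs 3600 MT/s GDDR5 -- illustrative numbers only)

def bandwidth_gb_s(bus_width_bits, effective_rate_mt_s):
    return bus_width_bits / 8 * effective_rate_mt_s / 1000

for width in (256, 384, 448, 512):
    print(width, bandwidth_gb_s(width, 2200), bandwidth_gb_s(width, 3600))
# 256-bit:  ~70 GB/s GDDR3 vs ~115 GB/s GDDR5
# 384-bit: ~106 GB/s GDDR3 vs ~173 GB/s GDDR5
# 448-bit: ~123 GB/s GDDR3 vs ~202 GB/s GDDR5
# 512-bit: ~141 GB/s GDDR3 vs ~230 GB/s GDDR5

So 384-bit or 448-bit GDDR5 would give plenty of bandwidth; the question from the earlier post is whether a shrunk die still has the edge area and pads for that many memory connections.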
That's just your opinion. I wonder if that will change when the next Crysis-like game comes out and not even the GT200 can produce playable framerates. The 8800 GTX was a god with like every game (except Crysis) and they still came out with the 9800 GX2; I wasn't surprised at all. They want more money, so they whip out something people will give their left arm for: GX2 $$$$ variant.
Um, the GX2's perf per watt is nothing to be taken as a benchmark...
My apologies.
With no benchmarks on GT200, I would say it's a bit too early to compare its perf/watt with other GPUs.
Secondly, the 9800 GX2 shouldn't be used as a reference for perf/watt or perf/mm^2.
But imagine they shrink the chip, lowering the TDP and making a GX2 card possible. A 512-bit bus would no longer fit, so they'd have to narrow it down to 256 bits. Now if they used GDDR3, the chips would only get about half their calculated optimal bandwidth, and that sure would limit them.

Umm, what?
I'm afraid ifs and buts don't go well in discussions.

Well, if this card is using 240 watts...
What's this "last single GPU" crap?
Can you imagine how complex the PCB would need to be to accommodate two GPUs with 12 memory chips each? That would make one hell of an expensive board.

And a 2x384-bit bus shouldn't be so bad, I guess.
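For the chip-count side of that, a quick sketch assuming the usual 32-bit-wide GDDR3/GDDR5 chips (not an actual board spec):

# memory chips needed per GPU, assuming standard 32-bit-wide GDDR chips

def chips_per_gpu(bus_width_bits, bits_per_chip=32):
    return bus_width_bits // bits_per_chip

for width in (256, 384, 512):
    per_gpu = chips_per_gpu(width)
    print(width, per_gpu, 2 * per_gpu)  # per GPU, and total on a dual-GPU board
# 256-bit ->  8 per GPU, 16 chips total
# 384-bit -> 12 per GPU, 24 chips total (the "12 memory chips each" above)
# 512-bit -> 16 per GPU, 32 chips total -- that's a lot of traces to route on one board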