NVIDIA GT200 Rumours & Speculation Thread

Got from another forum, no link to original thread where he got it from sadly
Damn, is that an 8-pin PCI-E connector? I was hoping that because it's PCI-E 2.0 it wouldn't be needed...
So damn small. Can't anyone take pictures at higher resolutions?
 
(attached image)


Got from another forum, no link to original thread where he got it from sadly

http://forum.beyond3d.com/showpost.php?p=1164215&postcount=1205
;)
 
I don't understand why they couldn't fit it.
Every chip has an array of contact pads, similar to pins on a CPU. This includes pads that connect the GPU to the memory chips. The wider the bus, the more pads are needed. Because of that, there's a lower limit on die area for each bus width, e.g. for 256-bit it's around 190 mm². GT200 is large enough to accommodate a 512-bit bus; its die shrink might not be. Therefore, to keep memory bandwidth at the same level, they would need faster memory = GDDR5.
As for GDDR5, I don't see them redoing the memory controller for a GX2 variant; they haven't before. More importantly, why would GDDR3 limit them if it doesn't limit already existing cards?
Oh yes it does. G71, G73, and R580 were all limited by the GDDR3 of their time; G92 is limited by GDDR3 today. I imagine nVidia calculated the optimal memory throughput for GT200 and will easily reach that number even using cheaper GDDR3. But imagine they shrank the chip, lowering the TDP and making a GX2 card possible. A 512-bit bus would no longer fit, so they'd have to narrow it down to 256 bits. Now if they used GDDR3, the chips would only get about half their calculated optimal bandwidth, and that sure would limit them.
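To put rough numbers on the "about half" claim (using, purely for illustration, the 2 GHz effective GDDR3 figure that comes up later in the thread, not a confirmed spec):
512 bits × 2 GHz (eff.) = 1024 Gbit/s = 128 GB/s
256 bits × 2 GHz (eff.) = 512 Gbit/s = 64 GB/s
Halving the bus width at the same memory speed halves the bandwidth.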
There are a lot more factors than memory bandwidth that can screw up a sandwich card.
Of course, but why would nVidia purposely screw up a product?
 
A 512-bit bus would no longer fit, so they'd have to narrow it down to 256 bits. Now if they used GDDR3, the chips would only get about half their calculated optimal bandwidth, and that sure would limit them.
Yeah, but even with GDDR5 a 256-bit bus would cripple that card, especially if it has 2 gigs of memory. Couldn't they squeeze a 384-bit or a 448-bit bus on that?
 
I don't understand. You're assuming GDDR5 would make up for the loss of bandwidth from a smaller bus; where's the evidence?
 
GDDR3 goes at up to 1-1.2 GHz, while GDDR5 starts at 2 GHz and there are working samples at up to 3 GHz (but of course, that kind can take years to be used - if it ever is - given the price tag and lack of volume available).

Also, tacopaco, please read your PMs - thanks.
 
But would the increase in speed make up for that? That's all I'm asking.
Also, I read the message after I made that post, sorry.
I'm new at forums, so I make a lot of mistakes.
 
512 bits × 2 GHz (eff.) = 1024 Gbit/s = 128 GB/s
256 bits × 4 GHz (eff.) = 1024 Gbit/s = 128 GB/s
The second setup would probably be a little slower if they used 4 channels of 64 bits each (GT200 has 8 channels), plus GDDR5 probably has higher latencies, but nothing an extra few MHz couldn't fix.
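For anyone who wants to play with the numbers, here's a minimal sketch of the same back-of-the-envelope arithmetic. The bus widths and effective clocks below are just the ones floated in this thread (including the 384-bit and 448-bit ideas), not confirmed GT200 specs:

# Rough memory bandwidth estimate: bus width (bits) x effective data rate (GHz) / 8 bits per byte
def bandwidth_gb_s(bus_width_bits, effective_clock_ghz):
    return bus_width_bits * effective_clock_ghz / 8.0

# Hypothetical configurations discussed in this thread
configs = [
    ("512-bit GDDR3 @ 2 GHz eff.", 512, 2.0),
    ("256-bit GDDR5 @ 4 GHz eff.", 256, 4.0),
    ("384-bit GDDR3 @ 2 GHz eff.", 384, 2.0),
    ("448-bit GDDR3 @ 2 GHz eff.", 448, 2.0),
]

for name, bits, ghz in configs:
    print(f"{name}: {bandwidth_gb_s(bits, ghz):.0f} GB/s")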
 
Oh, I didn't realize the math was that simple, thanks. There's so much I don't know (don't get me wrong, I know my fair share). I knew that GDDR5 was much faster; I knew the numbers. I'm not defending GDDR3, I'd welcome GDDR5. I'm just thinking, I guess.
 
As long as you're willing to learn and don't generate 50 posts per day with a low signal-to-noise ratio, that's not a problem at all! :) (and don't hesitate to use the 3D Beginner's Questions forum for any seemingly simple question if you want to)
 
That's just your opinion. I wonder if that will change when the next Crysis-like game comes out and not even GT200 can produce playable framerates. The 8800 GTX was a god with like every game (except Crysis), and they still came out with the 9800 GX2. I wasn't surprised at all; they want more money, so they whip out something people will give their left arm for. GX2 $$$$ variant.

That always happens in every generation of cards, no matter what, even with a GX2.

Um, the GX2's perf per watt is nothing to be taken as a benchmark ..

My apologies.



With no benchmarks on GT200, I would say that it's a bit too early to compare its perf/watt with other GPUs.

Secondly, the 9800 GX2 shouldn't be used as a reference for perf/watt or perf/mm².

Well, if this card is using 240 watts... :LOL:
 
But imagine they shrank the chip, lowering the TDP and making a GX2 card possible. A 512-bit bus would no longer fit, so they'd have to narrow it down to 256 bits. Now if they used GDDR3, the chips would only get about half their calculated optimal bandwidth, and that sure would limit them.

A 512-bit bus maybe couldn't fit, but maybe a 384-bit bus could.
And a 2×384-bit bus shouldn't be so bad, I guess :)
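Using the same back-of-the-envelope arithmetic as above (with the 4 GHz effective GDDR5 figure from earlier as an example, not a confirmed spec), that hypothetical setup would give each GPU:
384 bits × 4 GHz (eff.) = 1536 Gbit/s = 192 GB/s per GPU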
 