Some comments on the matter by industry veterans.
And DF coverage as well interviewing NVIDIA engineers.
My very first big budget GPU - Asus Annihilator Geforce 256 DDR.
My dad was like, why the f is this graphics card $400? I was like, that's what it costs. I'm so lucky he didn't take it back. What incredible moments I had with that card.
It was nothing new when it came out if you look outside PC-space. Arcade machines had had T&L for years at that point.

Yes, but these were usually a separate processor from the rasterizer. Even Naomi 2 has its T&L, Elan, on a separate processor. Model 2 and 3 used various different processors for T&L. Even professional graphics cards for the PC had options for geometry processors. But GeForce 256 is the first to have it integrated on the same chip as the rasterizer?
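For anyone rusty on what the "T&L" stage actually owns: a minimal sketch of the per-vertex math it does - a 4x4 transform plus a simple directional diffuse term. Purely illustrative (plain C on the host, identity matrix, one light), not any vendor's actual pipeline; the point is just that this is the per-vertex work that moved from the CPU or a separate geometry chip onto the same die as the rasterizer.

```c
/* Illustrative fixed-function transform & lighting for one vertex.
   Not any specific chip's implementation - just the math the stage owns. */
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

/* Row-major 4x4 * (x, y, z, 1); skips the projective divide for brevity. */
static Vec3 transform(const float m[16], Vec3 v) {
    Vec3 r;
    r.x = m[0]*v.x + m[1]*v.y + m[2]*v.z  + m[3];
    r.y = m[4]*v.x + m[5]*v.y + m[6]*v.z  + m[7];
    r.z = m[8]*v.x + m[9]*v.y + m[10]*v.z + m[11];
    return r;
}

/* Single directional light, Lambertian diffuse term, clamped at zero. */
static float diffuse(Vec3 n, Vec3 light_dir) {
    float d = n.x*light_dir.x + n.y*light_dir.y + n.z*light_dir.z;
    return d > 0.0f ? d : 0.0f;
}

int main(void) {
    /* Identity transform and a light along +z, purely for illustration. */
    const float mvp[16] = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};
    Vec3 pos    = {1.0f, 2.0f, 3.0f};
    Vec3 normal = {0.0f, 0.0f, 1.0f};
    Vec3 light  = {0.0f, 0.0f, 1.0f};

    Vec3 p = transform(mvp, pos);
    printf("transformed: (%.1f, %.1f, %.1f)  diffuse: %.2f\n",
           p.x, p.y, p.z, diffuse(normal, light));
    return 0;
}
```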
It was the first GPU with DDR by 6+ months.
The Riva 128 and TNT were extremely competitive chips for certain markets but they had significant trade-offs...
Wow, I think you're right, the Rage 128 VR did support DDR back in 1998 (but it was either 128-bit SDR or 64-bit DDR, so just a cost reduction option rather than a performance improvement - and that's assuming DDR was cheap enough, which I'm not sure it was): http://www.bitsavers.org/components/ati/Rage_128/Rage_128_Overview.pdf and https://bitsavers.computerhistory.org/components/ati/Rage_128/Rage_128_GC_Spec.pdf

I wonder if anyone tried DDR with Rage 128 VR. Or why it did not happen.
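Quick back-of-the-envelope on why 64-bit DDR vs 128-bit SDR is a cost option rather than a performance one - the 100 MHz memory clock here is just an assumed illustrative figure, not a Rage 128 spec:

```c
#include <stdio.h>

/* Peak memory bandwidth in GB/s for a given bus width, clock, and
   transfers per clock (1 = SDR, 2 = DDR). */
static double peak_gbps(int bus_bits, double clock_mhz, int transfers_per_clock) {
    return (bus_bits / 8.0) * clock_mhz * 1e6 * transfers_per_clock / 1e9;
}

int main(void) {
    /* 100 MHz is an assumed illustrative clock, not a Rage 128 spec. */
    double sdr128 = peak_gbps(128, 100.0, 1);  /* 128-bit single data rate */
    double ddr64  = peak_gbps(64,  100.0, 2);  /*  64-bit double data rate */
    printf("128-bit SDR @ 100 MHz: %.1f GB/s\n", sdr128);
    printf(" 64-bit DDR @ 100 MHz: %.1f GB/s\n", ddr64);
    /* Same peak bandwidth, half the data pins and traces: a board-cost
       reduction rather than a performance improvement. */
    return 0;
}
```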
AFAIK the main downside of the TNT was SW/driver maturity, which was greatly improved by the time the GeForce 256 came out. In terms of HW trade-offs, I think RAMDAC 2D quality was far from best-in-class and they missed their clock targets pretty badly, but both those things were already mostly fixed with the TNT2. The GeForce 256 didn't come out of nowhere, and the TNT2 was a very competitive GPU, but it wasn't so clearly ahead of everything else when it came out as the GeForce 256 DDR was.

What would a TNT without trade-offs look like?
I think RAMDAC 2D quality was far from best-in-class and they missed their clock targets pretty badly, but both those things were already mostly fixed with the TNT2.
It took them at least 'till GF3 to get it right. GF2 gen was notoriously bad compared to other cards out there.
Everyone planning for 250 nm in 1998 missed the clock targets.

Original TNT was TSMC 350nm though, which they had already used for the Riva 128ZX a few months earlier. NVIDIA claims the clock target miss was due to power consumption being too high for passive cooling, which seems plausible given how rudimentary pre-tape-out power consumption estimation must have been at the time...
Somehow I had completely forgotten/missed that TNT(2) didn't have HW triangle setup at all, which is why it was so much more CPU-dependent than Voodoo 2/3, and GeForce 256 just skipped that intermediate step to go straight from no geometry acceleration to full T&L.
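For reference, roughly what "triangle setup" means here: deriving the screen-space gradients (dz/dx, dz/dy and the like for every interpolated parameter) that the rasterizer then walks. A hedged sketch of that per-triangle work - the bit the host CPU still had to do for setup-less chips - not any actual driver's code:

```c
/* Illustrative triangle setup: derive the screen-space gradients of an
   interpolated parameter (here depth z) that the rasterizer steps across.
   This is per-triangle host work when the chip lacks HW setup. */
#include <stdio.h>

typedef struct { float x, y, z; } ScreenVertex;

static int setup_z_gradients(ScreenVertex a, ScreenVertex b, ScreenVertex c,
                             float *dzdx, float *dzdy) {
    /* Twice the signed area of the triangle in screen space. */
    float area2 = (b.x - a.x) * (c.y - a.y) - (c.x - a.x) * (b.y - a.y);
    if (area2 == 0.0f)
        return 0;  /* Degenerate triangle: nothing to rasterize. */
    *dzdx = ((b.z - a.z) * (c.y - a.y) - (c.z - a.z) * (b.y - a.y)) / area2;
    *dzdy = ((c.z - a.z) * (b.x - a.x) - (b.z - a.z) * (c.x - a.x)) / area2;
    return 1;
}

int main(void) {
    ScreenVertex a = { 0.0f,  0.0f, 0.1f};
    ScreenVertex b = {10.0f,  0.0f, 0.5f};
    ScreenVertex c = { 0.0f, 10.0f, 0.9f};
    float dzdx, dzdy;
    if (setup_z_gradients(a, b, c, &dzdx, &dzdy))
        printf("dz/dx = %f, dz/dy = %f\n", dzdx, dzdy);
    return 0;
}
```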
Yes, but these were usually a separate processor from the rasterizer. Even Naomi 2 has its T&L, Elan, on a separate processor. Model 2 and 3 used various different processors for T&L. Even professional graphics cards for the PC had options for geometry processors. But GeForce 256 is the first to have it integrated on the same chip as the rasterizer?

Not at all disputing the point generally, but in the console area - I think the RSP component of the Nintendo 64's RCP did transform and lighting calculations?