DemoCoder said:
And to answer the comments about the Sweeney demo, it was NOT the UT2k3 engine. It was stated to be a research project, and it was more than just soft shadows: it was a unified lighting model like Doom3's, with full 100% shadowing everywhere (including soft shadows) plus lots of other volumetric lighting effects. Sweeney said the main character walking around the screen (a knight in armor with a torch) had 1+ million polygons and that the NV30 could comfortably render about 20 of these at once. However, I doubt the full-resolution mesh is actually being used.
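As a rough sanity check (the per-character polygon count and the "about 20 at once" figure are from the talk; the frame rates below are my own assumption for illustration), the implied triangle throughput is enormous, which is why a reduced mesh seems likely:

```python
# Back-of-the-envelope check on the demo's polygon claims.
# Figures from the talk: ~1M polygons per character, ~20 characters on screen.
# The frame rates are assumptions, just to bracket the math.
polys_per_character = 1_000_000
characters = 20
polys_per_frame = polys_per_character * characters  # 20 million triangles/frame

for fps in (30, 60):
    tris_per_sec = polys_per_frame * fps
    print(f"{fps} fps -> {tris_per_sec / 1e9:.1f} billion triangles/sec")
```

Even at a modest 30 fps that is 0.6 billion triangles per second, well beyond what any card of this generation could plausibly set up, so some form of level-of-detail reduction on the mesh seems like a safe bet.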
Thanks for clarifying this a bit. I only asked about the UT2K3 engine because I'd assumed he'd built it for more longevity--but I guess Sweeney's relying more on the CPU-limited nature of the engine to go the distance.
And to answer your question about the benchmarks of Unreal, et al, they had them running on OEM stations outside the main conference, and you could play around with all the NV30 demos. However I was not able to get a look at benchmark numbers because the benchmark demos were in a special room for VIPs (probably for OEMs and important ISVs like EA, Epic, etc).
I must emphasize again: they had real, live, running hardware that anyone could play on, presumably running at 500MHz (I don't know about the memory), so none of the demos were canned, faked, etc.
As far as I'm concerned, that was never in doubt...
(Um, now that you mention it....were the demo boxes sealed?....*chuckle*)
I was told by several NVidia and OEM people I could buy one in January, so believe it or not for what it's worth.
Any specific OEMs? This sounds logical to me. Was there mention of what exactly prevented a pre-Xmas release? I'm getting the impression from you that you felt things were "ready to go," so I'm wondering if this came up (it would surprise me if the subject never did--but maybe it didn't.)
The beginning of the presentation almost made me faint: a GeForce + 3dfx animation was shown, the two combining into the GeForce FX. Was the NV30 a Gigapixel tiler!? HOLYCOW! What a surprise. Reality turned out to be far more mundane, but I was still happy that they managed to get it running at 500MHz with 1GHz RAM. Think about it: 4 gigapixels/s, possibly 4-8 giga-shader-ops per second (depending on the dispatch rate), and 6-8X FSAA. That's potentially 32-64 giga-ops/sec.
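Spelling out that arithmetic (the 500MHz clock and 4 gigapixels/s are from the post; the 8-pipeline count is the figure those two numbers imply, and the per-clock dispatch range is the post's assumption, not a confirmed spec):

```python
# The throughput arithmetic from the post, made explicit.
core_clock_hz = 500e6                      # 500 MHz core clock
pixel_pipes = 4e9 / core_clock_hz          # 4 Gpix/s implies 8 pixel pipelines
fill_rate = core_clock_hz * pixel_pipes    # 4.0 gigapixels/sec

# Shader throughput: 1-2 ops dispatched per pipe per clock (assumed range).
shader_ops_low = fill_rate * 1             # 4 giga-shader-ops/sec
shader_ops_high = fill_rate * 2            # 8 giga-shader-ops/sec

# With 8x FSAA, each output pixel costs roughly 8 sample-level operations.
samples_per_pixel = 8
total_ops_low = shader_ops_low * samples_per_pixel    # 32 giga-ops/sec
total_ops_high = shader_ops_high * samples_per_pixel  # 64 giga-ops/sec
print(f"{fill_rate/1e9:.0f} Gpix/s, {total_ops_low/1e9:.0f}-"
      f"{total_ops_high/1e9:.0f} giga-ops/sec with 8x FSAA")
```

The same division applied to the R300 figures quoted below (3.2 gigapixels/s at 400MHz) also gives 8 pipelines, which is why scaling ATI's design to 500MHz lands on the same 4 Gpix/s number.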
I'm not too surprised at this considering it was done on .13 microns. I mean, if air-cooled, factory .15-micron R300s can do 3.2 gigapixels/sec (400MHz), I can't see how 500MHz would elude ATI at .13 microns. I would also think that a lot of the work NVidia & TSMC did with the process was pioneering, and will let ATI get to .13 microns much more quickly when it's time.
I am waiting with caution to see how many of the other features pan out: displacement maps, gamma-correct AA, video features, dual output, how LMA3 works, etc.
Seem to be quite a few holes in the presentation thus far. I can't think of a good reason why they'd keep things like this so vague at this point.
My overall feeling is: thank gawd it's not a huge disappointment. It is performance-competitive with ATI, so we will soon have two high-performance DX9 chips on the market, and hopefully ATI and NVidia can together force everyone to upgrade over time and create a large platform for developers to take advantage of these features.
I don't know too many who felt it might be disappointing or fail to compete with the 9700 Pro...
However, I think there will be more than a few who will find that what the NV30 is doesn't match their expectations (which were probably way out there in the first place). The one great advantage I think ATI had here is that no one expected them to produce something like the R300-based 9700 Pro. As such, there was not as much fantasy built up around that product's release as there has been around the NV30. I'm anxious for things to shake out, though, so that I can see this product clearly. Right now it's difficult to separate the marketing from the technically factual.