NVIDIA HAS JUST posted a presentation it made in Hong Kong today that describes the NV30 as the "next generation cinematic GPUs", with 100+ million transistors, and says NV30 and Cg will "lead the programmable revolution". It also says there is a "platform transition" in the shape of AMD's Opteron and AGP8X. The slide PDF starts here (31 slides).
Joe DeFuria said: How about I ask you this question then. If you were going to start testing performance for a GeForce4 review today, which Detonator driver would you use?
That's a good question, and one I have spoken about on this board previously. (I'll see if I can dig up my post...)
EDIT - Found it: http://www.beyond3d.com/forum/viewt...storder=asc&highlight=leaked&start=49
But in short: in any review, I would use TWO sets of drivers (assuming they are different).
1) The set that is officially supported by the board manufacturer.
2) The latest beta drivers available to the public, either directly from the board manufacturer or from the chip vendor.
Number 1 is the most important, but unfortunately gets the least attention. Presumably, these drivers should support the complete functionality of the board, including TV-OUT, etc.
It would also be interesting, though not necessary, to see any "leaked" (non-publicly available) drivers, if there is some claim of a marked performance increase, bug fix, or new feature implementation. However, "leaked" drivers are not something I would use for overall testing and comparison: just brief and specific tests to validate any claims.
So in the case of these Dets, I would use them in a review (since they are publicly available)...along with the official drivers from the board vendor.
duncan36 said: Exactly. As I said, Nvidia has again and again said that the NV30 needs a 0.13 micron process because of its advanced functionality. So what if it turns out to be a <= 110 million transistor part? (We already have a 0.15 micron, 110+ million transistor DX9 part out on the market.)
ATi has more experience with efficient designs than Nvidia does.
ir.net/media_files/NSD/NVDA/presentations/gsasia2002.pdf
Galilee said: Maybe, but I seriously can't remember NVIDIA having any big problems with their drivers at any time. I've had a GF1-DDR, GF2-PRO, GF3, GF3-Ti200 and now a Ti4200.
You bring up some good points, Joe, although we do share a difference of opinion.
DemoCoder said: The whole point of the Det 40 beta drivers is to support Cg developers and deliver OpenGL 1.4 for testing. These drivers were announced a long time ago on the Cg site, and we were promised them for enabling Cg NV30 development and testing. They were released to the public instead of "leaked" because Cg developers are not required to be registered NVidia developers.
If you are a registered developer, you know that NVidia follows an open release pattern -- release early, release often, for the devs. Very few of the driver builds ever become WHQL cert'ed. The Det 40 beta has major new functionality which devs have been demanding, so it wouldn't make sense to release it only to the small cadre of reg'ed developers. That's why it was pushed out for wide beta.
Hmm, I'm not sure if this is the "whole point". With the 9700 available now, perhaps there is another "motivation" for releasing them publicly, IMO.
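(As an aside: OpenGL 1.4 support is something a developer can sanity-check directly against a beta driver. Below is a minimal C sketch of such a check, assuming a rendering context is already current; the function names are mine, and the version-string parsing is illustrative rather than any vendor's recommended method.)

    #include <stdio.h>   /* sscanf */
    #include <string.h>  /* strstr */
    #include <GL/gl.h>   /* glGetString; on Windows, include <windows.h> first */

    /* Returns non-zero if the driver reports OpenGL 1.4 or later.
       Must be called with a rendering context current. */
    int supports_gl_1_4(void)
    {
        const char *ver = (const char *)glGetString(GL_VERSION); /* e.g. "1.4.0 ..." */
        int major = 0, minor = 0;
        if (ver == NULL || sscanf(ver, "%d.%d", &major, &minor) != 2)
            return 0;
        return (major > 1) || (major == 1 && minor >= 4);
    }

    /* Crude extension check; note substring matching can false-positive on
       extensions whose names are prefixes of other extension names. */
    int has_extension(const char *name)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        return ext != NULL && strstr(ext, name) != NULL;
    }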
Joe DeFuria said: MikeC,
You bring up some good points, Joe, although we do share a difference of opinion.
Actually, I don't think our difference in opinion is that great.
If you are only going to use a single driver (understandable, because time = money), then I would go with the latest official drivers supported by the board manufacturer. (I assume that's what the 30.82 drivers are? The latest official drivers supported by Gainward?)
I've been employed in the IT profession for over 20 years and my wife still tells me that I need to grow up ...
Bjorn isn't assuming or inferring anything! You're reading too much into what he's saying.
If anything, he's highlighting exactly how ATI can be doing something that Nvidia essentially claimed wasn't really possible. (Though basically, they already have.) It's just that if NV30 comes in with FEWER transistors than R300, that's all the more ironic.
jjayb said: Go look for yourself and you will see I'm not making this crap up. (Be sure to remove the <bleep> blinders first.)
I'm quite sure most people on this board read other, game-related boards, too. Still, they have many different opinions on this. So I don't think you can take what you posted as "proof" of anything.
Doomtrooper said: Radeon 64 Meg Vivo
AIW
Supports all bump mapping modes
DVD features galore
GeForce 2 GTS has only one mode, Dot3
Radeon 8500
PS 1.4
Truform (really a building block for displacement mapping)
Higher Internal Precision
Supports all bump mapping modes again
DVD features galore
GeForce 3 and 4 support PS 1.1-1.3 and both bump mapping modes
Though I agree that R8500 is "more advanced" than GF3, I think you're quite selectively picking features that prove your point while leaving out others...
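(A side note for readers who haven't run into the terms above: "Dot3" bump mapping boils down to a per-pixel dot product between a normal fetched from a normal map and a light vector, with both range-compressed into texel channels. A minimal C sketch of the math involved, with function names of my own invention, purely for illustration:)

    /* Dot3 bump mapping in a nutshell: normals and light vectors are
       range-compressed from [-1,1] into [0,255] texel channels, and the
       texture stage computes a clamped per-pixel dot product (N . L). */

    /* Expand a [0,255] channel back to a [-1,1] vector component. */
    static float expand(unsigned char c)
    {
        return (c / 255.0f) * 2.0f - 1.0f;
    }

    /* Clamped N.L for one pixel; n* come from the normal map texel,
       l* from the interpolated light vector. */
    float dot3_intensity(unsigned char nx, unsigned char ny, unsigned char nz,
                         unsigned char lx, unsigned char ly, unsigned char lz)
    {
        float d = expand(nx) * expand(lx)
                + expand(ny) * expand(ly)
                + expand(nz) * expand(lz);
        return (d > 0.0f) ? d : 0.0f; /* negative lighting clamps to zero */
    }

(PS 1.4's edge over PS 1.1-1.3, by the same token, is largely about doing more of this kind of per-pixel math in a single pass: more instructions plus a dependent-read "phase".)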