Nvidia is using 3dfx technology!!

Yannis

"MR: I had an interview with Dan Vivoli exactly one year ago (the English version can be found here: http://www.hwzone.it/html/text.php?id=205 ). In that interview, he stated that the NV30 would be the first NVIDIA chip to use "something" from the acquired 3dfx technology. Can you tell me anything more, now that a year has passed?

DK: The GeForce4 series includes the VPE - video processing engine - which was designed at 3DFX. Also, some of the FSAA technology - filter on scanout - was first developed at 3DFX. This technology allows for very high speed anti-aliased rendering while using less memory bandwidth. So, you can see that the 3DFX team really hit the ground running when they arrived at NVIDIA."

What is the "video processing engine"? The DVD/MPEG stuff?
What is this FSAA filter? How can a filter on the output reduce memory bandwidth?
 
I find that old interview a pretty interesting way to look back on the whole last year (how they saw their upcoming products and situation, versus what actually happened...).

It truly looks like the GF3 Ti series was derived from the GF3 very quickly after ATI released the Radeon 8500 and Radeon 7500. (There's no indication of the new GF3 naming as of April 2001, nor is there any talk about the NV25...)

Worth reading, IMHO.
 
Like most Kirk quotes, that is nothing more than marketing fluff.

NV's marketers will try to capture market share however they possibly can, stretches included. One thing long-time 3dfx fans have been looking for is that utopian card: one with the AA of the V5, the anisotropy of the GF3, and modern-day bandwidth/fillrate.

These types of quotes are only an effort to capture those folks by loosely associating product lines with the recognized achievements of the acquired company. But anyone who has used a Ti4600 knows that MS/4xS/Quincunx are still so drastically different from anything in shipping 3dfx products - or even the rumored future 3dfx product lines prior to their demise - that it's not worth discussing. At most it was some form of "help" with their current direction, not a full steer.

Still, I'd prefer it if companies let their products sell on their own merits, as most of the next-generation rumors appear very promising without any need for fluff. Anand souring a future 20 GB/s-bandwidth product from Matrox by stating that the NV30 will take that crown sets expectations for a product that should be able to pull off some incredible AA and anisotropy... regardless of where the technology came from.
 
Actually, I heard from a vendor that the original plan was to hold the GeForce3, release a GeForce3 MX (today's GeForce3) last year, and release the GeForce4 Ti at the end of last year/beginning of this year (both Anand and Tom said it was ready in October/November, and frankly, I don't have a reason to disbelieve them). I don't have confirmation of this from Nvidia, of course, but it was interesting to compare what they told me at E3 last year with what actually happened. The Ti500/200 is obviously just a speed-binned GeForce3. Anyway, this fall should be the most interesting in 3D graphics in a long time. Also, maybe one of these days the true story of Matrox's 2000/2001 "gaming card" can come out ;) (some of you may have seen some of my and Dave Barron's posts back then on the old B3D boards)
 
Was that 3dfx AA filter about combining the subsamples on the way to the RAMDAC, rather than earlier in the pipeline?
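If that guess is right, the bandwidth saving is easy to sketch with back-of-envelope arithmetic. This is a hypothetical illustration (not vendor numbers), assuming 4x AA, a 1024x768 32-bit frame buffer at 60 Hz, and that the step being eliminated is the separate resolve/downsample pass:

```python
# Hypothetical memory-traffic comparison: resolve pass vs. filter on scanout.
# Assumed numbers: 4x AA, 1024x768, 4 bytes per stored sample, 60 frames/s.
WIDTH, HEIGHT = 1024, 768
SAMPLES = 4          # AA subsamples per pixel
BPP = 4              # bytes per stored sample/pixel
FPS = 60

pixels = WIDTH * HEIGHT

# Conventional approach: a resolve pass reads every subsample and writes
# a filtered frame buffer, then scanout reads that filtered buffer.
resolve_traffic = pixels * SAMPLES * BPP      # read all subsamples
resolve_traffic += pixels * BPP               # write filtered pixels
scanout_after_resolve = pixels * BPP          # RAMDAC reads the result
conventional = (resolve_traffic + scanout_after_resolve) * FPS

# Filter on scanout: no resolve pass at all; the scanout logic reads the
# subsamples directly and averages them on the way to the RAMDAC.
filter_on_scanout = pixels * SAMPLES * BPP * FPS

saved = conventional - filter_on_scanout
print(f"conventional:      {conventional / 1e6:.0f} MB/s")
print(f"filter on scanout: {filter_on_scanout / 1e6:.0f} MB/s")
print(f"saved:             {saved / 1e6:.0f} MB/s")
```

With these assumed numbers the scanout filter skips both the resolve write and the extra scanout read of the filtered buffer, i.e. two full frames of traffic per refresh, which is roughly a third of the total here.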

And is that Matrox story about the G800 dev and how Nvidia lured away the key engineers and 250 units of model number? ;)
 
Gunhead said:
And is that Matrox story about the G800 dev and how Nvidia lured away the key engineers and 250 units of model number? ;)

Well, if you take a closer look at nView technology, its basic working modes are exactly the same as the G400's DualHead had three years earlier. (Surprisingly, the GF4 series, which incorporates nView, was released a little over a year after this engineer exodus is estimated to have happened...)

Their earlier TwinView wasn't anywhere near that.

Also, Matrox was working on something codenamed G800/F800 Fusion. There are rumours saying it would have been something like a motherboard chipset with an integrated GFX core.
 
Okay, but didn't Nvidia hire a gob of Appian people as well? Which makes me wonder why it was ATI who ended up with Appian's old "Hydravision" term...
 