Quadro vs GeForce

I just read an article about converting a GeForce into a Quadro board, so I wondered about the differences between those two chips and tried to figure them out. I searched around for half an hour, and the only reasonable info I could find was on nVidia's homepage.

Here are the main differences between the Quadro 5500 and the GeForce 7900 according to my interpretation -- please correct them, because I'm totally unsure about it:

  • 12-bit subpixel precision (in the rasterizer?)
  • Framelock and Genlock
  • High precision dynamic range (vs. High dynamic range?)
  • Unified memory architecture (is this only true for the Quadros?)
  • Advanced color compression, early z-cull (I think that's also true for Geforce)
  • Hardware 3D window clipping (I think that's also true for Geforce)
  • Hardware-accelerated pixel read back

Common things seem to be:
  • Memory interface
  • 32-bit float precision for colors (shading, filtering, texturing, blending)
  • 128-bit float precision for the pipeline
  • FSAA
  • 400 MHz RAMDAC

Second: are the Quadro and GeForce really different chips? Or the same die, with a different BIOS on board?
 
The only difference is the software (BIOS) and a few differently wired components for identification (just two resistors AFAICR). Same chip.

EDIT: Google for modding guides; you can turn a normal GeForce into a Quadro (or vice versa) with a little rewiring.
 
Well, the driver is also a big differentiator, but you can just change your card's device ID with something like RivaTuner and install the Quadro drivers without issue.
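If you want to sanity-check which device ID the board is actually reporting (before or after a mod like that), a quick sketch along these lines works -- assuming a Linux box with lspci installed; on Windows you'd look at the hardware IDs in Device Manager or in RivaTuner itself:

```cpp
// Rough sketch: print the PCI vendor:device IDs of the graphics card(s),
// so you can see which ID the board currently reports to the driver.
// Assumes Linux with lspci available; nothing here is specific to any
// particular softmod tool.
#include <cstdio>
#include <iostream>

int main() {
    // lspci -nn prints entries like:
    //   01:00.0 VGA compatible controller [0300]: NVIDIA ... [10de:xxxx]
    // The [xxxx:xxxx] pair at the end is the vendor:device ID the
    // driver's .inf matches against.
    FILE* pipe = popen("lspci -nn | grep -Ei 'vga|3d controller'", "r");
    if (!pipe) {
        std::cerr << "could not run lspci\n";
        return 1;
    }
    char buf[512];
    while (fgets(buf, sizeof buf, pipe))
        std::cout << buf;
    pclose(pipe);
    return 0;
}
```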

However, what you really pay for when you buy a Quadro is customer service.
 
From what I've found, the last GeForce that can be converted by Rivatuner is a 6800.

The Quadro is for professional 3D apps, like CAD, engineering, offline rendering, etc. It doesn't do anything for games. The drivers take advantage of parts of the hardware that games don't use, but that can dramatically speed up those aforementioned pro apps. The slowest Quadro will beat an SLI 7800 setup in those apps, because on a GeForce those features end up software emulated.

http://features.cgsociety.org/story_custom.php?story_id=3321
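To make that concrete: a big chunk of what those apps do all day is plain OpenGL wireframe drawing in the viewports. Here's a little GLUT sketch (my own toy example, not from the article) of exactly that path -- antialiased wireframe rendering is one of the paths where the Quadro driver pulls ahead:

```cpp
// Toy sketch of a CAD-style viewport workload: drawing geometry as
// antialiased wireframe. These GL paths (glPolygonMode GL_LINE,
// GL_LINE_SMOOTH) are the kind of thing the pro drivers are tuned for.
// Build e.g.: g++ wire.cpp -lglut -lGLU -lGL
#include <GL/glut.h>

static void display() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Draw every triangle as outlines instead of filled polygons,
    // the way a modeling viewport shows a wireframe preview.
    glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);

    // Smooth (antialiased) lines with blending: a classic "pro" path.
    glEnable(GL_LINE_SMOOTH);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glColor3f(0.8f, 0.8f, 0.8f);
    glutSolidTeapot(0.5);          // stand-in for a real model

    glutSwapBuffers();
    glutPostRedisplay();           // redraw continuously, like a live viewport
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(640, 480);
    glutCreateWindow("wireframe viewport sketch");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```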
 
From what I've found, the last GeForce that can be converted by Rivatuner is a 6800.

Only 6800 GT and 6800 Ultra (NV40 or NV45).
6800 standard (the PCI-Express versions) and 6800GS cannot be converted into QuadroFX.
 
Thanks for your answers!

Looking at the performance charts at http://www.xbitlabs.com/articles/video/display/3dsmax5-quadrofx3000.html, it seems to me that the multi-viewport and wireframe rendering speed-ups in particular are the most dramatic ones. But I still wonder which parts exactly are shut down on the GeForce series... (hardware 3D window clipping? hardware-accelerated pixel readback?)
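The pixel readback one at least seems easy to test yourself: it should just show up as glReadPixels speed. Something like this is what I'd use to compare the two drivers (only a rough sketch, GLUT just to get a GL context):

```cpp
// Rough sketch: time glReadPixels, i.e. the "pixel read back" path that the
// Quadro feature list advertises as hardware accelerated. Not a rigorous
// benchmark; GLUT is only used to create a GL context.
// Build e.g.: g++ readback.cpp -lglut -lGLU -lGL
#include <GL/glut.h>
#include <chrono>
#include <cstdio>
#include <vector>

static void display() {
    const int w = 512, h = 512;
    std::vector<unsigned char> pixels(w * h * 4);   // RGBA8 destination

    glClearColor(0.2f, 0.3f, 0.4f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glFinish();                      // make sure rendering is done before timing

    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < 100; ++i)
        glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());
    auto t1 = std::chrono::steady_clock::now();

    double ms = std::chrono::duration<double, std::milli>(t1 - t0).count() / 100.0;
    std::printf("average glReadPixels (512x512 RGBA): %.3f ms\n", ms);

    glutSwapBuffers();
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(512, 512);
    glutCreateWindow("readback sketch");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```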
 

Does having a Quadro (or a "faked" Quadro, i.e. a GeForce card using hacked drivers) speed up rendering in Maya?

I guess it won't affect Maya's software renderer? What about Mental Ray?

EDIT: Also, I have an AGP Radeon X800 XT... are there driver hacks out there to convert it into its FireGL equivalent (if one exists)?
Will it decrease performance when it comes to gaming? If so, how? Lower framerates, worse picture quality, less stability?
 

**BUMP**

Hoping for an answer.
 
Yeah, you can change the X800 XT into a FireGL. Lately it's the script method, because I *think* newer drivers broke the RivaTuner method.
 