What happens to NURBS at the hardware stage?

Jason210

Newcomer
Do 3D cards work only with vertices and triangles, or can they work with NURBS data? If the former, when and where does conversion take place?

Thanks in advance,
Jason
 
AFAIK, no modern GPU is capable of handling NURBS in hardware (even though GLU has exposed them on top of OpenGL for ages) - they are tessellated into triangles in software at the driver/library level and only then passed on to the GPU.
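
For reference, this is roughly what the classic GLU path looks like - GLU samples the surface on the CPU to a screen-space tolerance and just feeds the resulting triangles down the normal GL pipeline. A minimal sketch (the control point data is a placeholder):

#include <GL/gl.h>
#include <GL/glu.h>

/* Placeholder 4x4 control net for a single bicubic NURBS patch. */
static GLfloat ctlPoints[4][4][3];

void drawNurbsPatch(void)
{
    /* Clamped knot vector for one cubic span: order 4 needs 4+4 knots. */
    static GLfloat knots[8] = { 0, 0, 0, 0, 1, 1, 1, 1 };

    GLUnurbs *nurb = gluNewNurbsRenderer();

    /* GLU tessellates on the CPU to this screen-space tolerance (pixels). */
    gluNurbsProperty(nurb, GLU_SAMPLING_TOLERANCE, 25.0f);
    gluNurbsProperty(nurb, GLU_DISPLAY_MODE, GLU_FILL);

    gluBeginSurface(nurb);
    gluNurbsSurface(nurb,
                    8, knots,            /* u knot count, u knots */
                    8, knots,            /* v knot count, v knots */
                    4 * 3, 3,            /* u stride, v stride (in floats) */
                    &ctlPoints[0][0][0],
                    4, 4,                /* u order, v order (cubic = 4) */
                    GL_MAP2_VERTEX_3);
    gluEndSurface(nurb);

    gluDeleteNurbsRenderer(nurb);
}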
 
Arjan,

The NVTeaPot demo (which I had on my hard drive for a while but deleted in a fit of tidying-up-itis) ran faster with the hardware NURBS option on my GF3 than in software mode. So some kind of support must exist.

*G*
 
It's possible that NVidia converts the NURBS into Béziers and uses their hardware to tessellate the result.
Given the amount of CPU setup required to draw a Bézier on NV2x, I wouldn't expect it to be much faster than doing the tessellation on the CPU unless the curve density was low and the polygon density high.
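
To give a feel for that per-sample CPU cost, here's a minimal de Casteljau evaluation of one point on a bicubic Bézier patch (my own sketch, nothing to do with NVidia's actual code) - multiply this by an N x N grid per patch per frame:

/* Evaluate one point on a bicubic Bezier patch via de Casteljau,
   first in u along each row, then in v across the row results. */

typedef struct { float x, y, z; } Vec3;

static Vec3 lerp3(Vec3 a, Vec3 b, float t)
{
    Vec3 r = { a.x + t * (b.x - a.x),
               a.y + t * (b.y - a.y),
               a.z + t * (b.z - a.z) };
    return r;
}

static Vec3 deCasteljau4(const Vec3 p[4], float t)
{
    Vec3 a = lerp3(p[0], p[1], t);
    Vec3 b = lerp3(p[1], p[2], t);
    Vec3 c = lerp3(p[2], p[3], t);
    Vec3 d = lerp3(a, b, t);
    Vec3 e = lerp3(b, c, t);
    return lerp3(d, e, t);
}

Vec3 evalPatch(const Vec3 cp[4][4], float u, float v)
{
    Vec3 col[4];
    for (int i = 0; i < 4; ++i)
        col[i] = deCasteljau4(cp[i], u);   /* reduce each row in u */
    return deCasteljau4(col, v);           /* then reduce across rows in v */
}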
 
You mean this one? http://glygrik.hotbox.ru/

It seems to be Bézier patches, not NURBS. NVidia provides an extension (NV_evaluators) for rational polynomial surfaces (up to order 8 IIRC). It seems to be a software implementation, though.
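
The extension aside, plain core-GL evaluators already accept Bézier patches directly - this is the path NV_evaluators generalises. A minimal sketch using glMap2f/glEvalMesh2 (the control net data is a placeholder):

#include <GL/gl.h>

/* Placeholder 4x4 control net for one bicubic Bezier patch. */
static GLfloat cp[4][4][3];

void drawBezierPatch(void)
{
    glMap2f(GL_MAP2_VERTEX_3,
            0.0f, 1.0f, 3, 4,     /* u range, u stride (floats), u order */
            0.0f, 1.0f, 12, 4,    /* v range, v stride (floats), v order */
            &cp[0][0][0]);
    glEnable(GL_MAP2_VERTEX_3);

    glMapGrid2f(16, 0.0f, 1.0f, 16, 0.0f, 1.0f); /* 16x16 sample grid */
    glEvalMesh2(GL_FILL, 0, 16, 0, 16);          /* emit tessellated mesh */
}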
 
FWIW, that description matches what the hardware can do.
But it requires significant CPU setup to get the data into a format suitable for hardware acceleration. Unless the number of rendered triangles is much larger than the patch count, I really can't see it being much of a performance win.
 