NV40 supports 3Dc

Demirug

Veteran
With a 67.02 driver (maybe earlier versions too) a NV40 announces "ATI2" as a valid FourCC format for textures. "ATI2" is the FourCC for 3Dc.

If you want, you can test it yourself: run "The Project" with a NV40 board and check the log. You will find the line "3Dc compression: Supported".
 
Demirug said:
With a 67.02 driver (maybe earlier versions too) a NV40 announces "ATI2" as a valid FourCC format for textures. "ATI2" is the FourCC for 3Dc.

If you want, you can test it yourself: run "The Project" with a NV40 board and check the log. You will find the line "3Dc compression: Supported".

Are you sure?

I have run "The Project" on a NV40 (6800 Ultra) with the 67.02 drivers, and the log clearly states:

...
Geometry instancing: Supported
3Dc compression: Not supported
Gamma control: Hardware
...

regards, alex
 
Running some additional tests.

It works well, but it looks like a driver solution. The memory impact and speed are identical to a V8U8 texture solution.
 
suicuique said:
Are you sure?

I have run "The Project" on a NV40 (6800 Ultra) with the 67.02 drivers, and the log clearly states:

...
Geometry instancing: Supported
3Dc compression: Not supported
Gamma control: Hardware
...

regards, alex

Actually, 6800GT + 67.02 log:

Apply anisotropic texture filtering (level: 8)
****** D3D9 CryRender Stats ******
Driver description: NVIDIA GeForce 6800 GT
Full stats: HAL (pure hw vp): NVIDIA GeForce 6800 GT
Hardware acceleration: Yes
Full scene AA: Enabled: 4x Samples (0 Quality)
Projective EMBM: Enabled
Detail textures: Enabled
Z Buffer Locking: Enabled
Multitexturing: Supported (8 textures)
Use bumpmapping : Enabled (DOT3)
Use paletted textures : Disabled
Current Resolution: 1024x768x32 Full Screen
Maximum Resolution: 1280x1024
Maximum Texture size: 4096x4096 (Max Aspect: 4096)
Texture filtering type: TRILINEAR
HDR Rendering: FP16
MRT Rendering: Disabled
Occlusion queries: Supported
Geometry instancing: Supported
3Dc compression: Supported
Gamma control: Hardware
Vertex Shaders version 3.0
Pixel Shaders version 3.0
Use Hardware Shaders for NV4x GPU
Pixel shaders usage: PS.3.0, PS.2.0 and PS.1.1
Vertex shaders usage: VS.3.0, VS.2.0 and VS.1.1
Shadow maps type: Mixed Depth/2D maps
Stencil shadows type: Two sided
Lighting quality: Highest
*****************************************


duh ? :rolleyes:
 
The support is controlled by the regkey "D3D_54082152". If it does not work, delete the key and restart your system. This will reset it to the default.
 
Something smells funky here...
Can't the driver be used to fool the API while converting ATI2 textures to some compatible format?
 
Hmm, I was wondering the same thing after spotting that 3Dc was supported on NV40 (according to the Crytek demo's log, of course). I thought it was a bug in the software at first. :oops:

I'm using 70.41, but I couldn't find that reg key.
 
DOGMA1138 said:
Something smells funky here...
Can't the driver be used to fool the API while converting ATI2 textures to some compatible format?

This is what I mean by "driver solution".
 
So it's not 3Dc, it's just ATI2 texture support, so it will still give you the same quality as the DXT or S3TC formats.
Am I right?
 
991060 said:
Hmm, I was wondering the same thing after spotting that 3Dc was supported on NV40 (according to the Crytek demo's log, of course). I thought it was a bug in the software at first. :oops:

I'm using 70.41, but I couldn't find that reg key.

At first I also thought it was a bug.

This key is missing by default. It is only there if you have used tools like RivaTuner before.
 
DOGMA1138 said:
So it's not 3Dc, it's just ATI2 texture support, so it will still give you the same quality as the DXT or S3TC formats.
Am I right?

No, it will give you the same quality as 3Dc, but it needs more memory (2x) than 3Dc.
 
Humus's 3Dc demo works (NV40 with 67.02), yet there's no difference in performance whether 3Dc is enabled or not. I guess the support on NV40 is through some sort of emulation.
 
This could be good news for 3Dc as long as it works without problems. This should mean that R300 hardware and others could use this "trick". Consequently, developers should then be able to use 3Dc without hesitation, knowing they get at least the same results as using DXT, on all hardware. If 3Dc is viewed as a good solution by the developers (and the industry as a whole to some extent) it will then gain support much faster than it would as an exclusive (elusive?) feature of a small portion of cards.

However, I will remain somewhat sceptical until I see it working with my own eyes. I still remember the S3TC "support" in the GeForce 256 and how that format 'acted up' on subsequent hardware. Let's hope that driver emulation does not elevate 3Dc only to have it dropped when future products lack actual hardware support.
 
wireframe said:
However, I will remain somewhat sceptical until I see it working with my own eyes. I still remember the S3TC "support" in Geforce 256 and how this format 'acted up' on subsequent hardware. Let's hope that driver emulation does not elevate 3Dc just to drop it by lacking actual hardware support in future products.
The GF256 support of S3TC was not emulated, or broken (according to the spec). It simply didn't interpolate in 32-bit color space, but in 16-bit color space. It wasn't the most optimal solution, but it wasn't wrong.
 
RussSchultz said:
wireframe said:
However, I will remain somewhat sceptical until I see it working with my own eyes. I still remember the S3TC "support" in Geforce 256 and how this format 'acted up' on subsequent hardware. Let's hope that driver emulation does not elevate 3Dc just to drop it by lacking actual hardware support in future products.
The GF256 support of S3TC was not emulated, or broken (according to the spec). It simply didn't interpolate in 32-bit color space, but in 16-bit color space. It wasn't the most optimal solution, but it wasn't wrong.

Whatever it was, it never looked quite right. I am not sure it looks right even now on NV40. Perhaps my memories of it looking perfect on Savage 4 betray me.
 