3Dc

If anybody wants to try to compress tangent-space normals with DXT5, then try this:
Put X in the R, G and B channels of an R8G8B8A8 texture, and put Y in the alpha channel.
Compress it with a standard DXT5 compressor.
In the pixel shader, swizzle G to X, and A to Y.

Precision beyond 8 bits in the Y channel will be lost, even though a specialized compressor could keep it.
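
For reference, here is a minimal C sketch of that decode step – an illustration of the recipe above, not anyone's shipping code. It assumes the sampled channels arrive as floats in [0,1] and reconstructs Z from the unit-length constraint, i.e. the sqrt(1 - x*x - y*y) trick that comes up later in the thread:

#include <math.h>

typedef struct { float x, y, z; } Normal;

/* After DXT5 decompression: X was stored in R/G/B (G being the 6-bit
   channel of the colour endpoints, hence the swizzle) and Y in the
   separately coded alpha block. */
static Normal decode_dxt5_normal(float g, float a)
{
    Normal n;
    n.x = g * 2.0f - 1.0f;  /* swizzle G to X, expand [0,1] to [-1,1] */
    n.y = a * 2.0f - 1.0f;  /* swizzle A to Y */
    float zz = 1.0f - n.x * n.x - n.y * n.y;
    n.z = sqrtf(zz > 0.0f ? zz : 0.0f);  /* clamp: compression error may push zz below 0 */
    return n;
}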
 
Basically, this argument can be summarised as:

People who will be buying an ATI card: We want 3Dc support, even if it's not a standard, and we don't want developers to fall back to DXT5 compression for other cards, as we want games to look best on ATI cards.

Everyone else: We'd like to see developers explore 3Dc, but we'd also like them to fall back to standardised compression techniques when 3Dc-capable hardware is not detected. We'd also like to see a comparison of normals compressed with DXT5 and 3Dc so we can make up our own minds about the benefit, rather than believing the marketing of one company instigating a proprietary technology.
 
DemoCoder said:
Yes, I agree 3Dc is better, but the question is, how much better? Enough to justify not supporting DXT5 *at all*? If developers are going to support 3Dc, they should at least support DXT5 as well (since all hardware can do it). I think it's unfair to compare hi-res 3Dc with uncompressed low-res. The vast majority of consumers would benefit if games that supported 3Dc also supported DXT5 hi-res normal maps as well.

You can't tell me DXT5 hi-res maps aren't a vast improvement over low-res uncompressed maps.

I'm just asking for support for both. It makes sense for developers as well, since they get 90% of the market with increased IQ.

The point about 3Dc emulation is in the scenario where developers support *only* 3Dc, which only a handful of consumers will have in the next 18 months. In that case, NVidia and ATI should support a FOURCC 3Dc interception and emulation.
WHY? Because your company won't support 3Dc? And what's up with "I think it's unfair to compare 3Dc hi-res with uncompressed low-res"? It's a fricking ATI PR pic that Croteam whipped up. What, ATI shouldn't promote their really cool 3Dc against the standard normal map? Start a petition and send it to the devs, tell them to use DXT5 because not everybody's hardware supports the better 3Dc and it's unfair... whaaa wahaaaa. Oh yeah, that's good, let's make nVidia and ATI set aside man-hours to "support a FOURCC 3Dc interception and emulation" because it's great for the consumer. Hell, let's just get rid of M$ and DX and have FREE code; wait, let's just have free computers... and you can code for us, DC, FOR FREE. I always like how some posters on this forum can do anything better than the devs, IHVs, AIBs... AND be so rude about it. Just maybe nVidia will come up with a cool item that we all want... OH hey, they do, it's in DOOM, it's about shiznizil shadows... let's have some of that sweet, sweet "interception and emulation". pffft
 
dksuiko said:
It would, if not for the fact that nobody is using 3Dc yet, either. (Though I think Far Cry will be supporting it soon?) Because if 3Dc is a not-so-significant improvement over DXT5, why would they use 3Dc when they wouldn't even use DXT5? One answer to that question would be that ATI will assist them in implementing it. In which case 3Dc, to developers, would be the better choice by virtue of having help doing it – whereas with DXT5, there wouldn't be any.

And in so doing, 3Dc gains more support over DXT5. So yeah, the fact that nobody is yet using DXT5, in comparison to 3Dc, would be an implicit answer to the question (because, of course, an unused feature is a useless feature :)). But as it stands now, nobody is using either, correct?

Nobody is using 3Dc because they had no compression tools to utilise it and no hardware to decompress it (hence test it) – it requires hardware support. 3Dc was designed to answer a request that ATI had from developers: to provide a method for higher-detail normal maps without killing performance / memory. It can't be said that DXT5 hasn't been available as an option to developers – if they had the need, and an answer to the performance / memory problem was there in the form of DXT5, then it would be logical to conclude there is another reason for not utilising it.

AFAIK, Valve are using 3Dc and a fallback compression method. I'm trying to get some comparison shots from them, but I wouldn't say it will be easy to get anything out of them.
 
DaveBaumann said:
Nobody is using 3Dc because they had no compression tools to utilise it and no hardware to decompress it (hence test it)
It can't be said that DXT5 hasn't been available as an option to developers – if they had the need, and an answer to the performance / memory problem was there in the form of DXT5, then it would be logical to conclude there is another reason for not utilising it.

False on two counts. First of all, no one had thought of the idea of using DXT5's alpha channel to store normal maps until recently. Developers had the need for a good displacement-mapping solution too, but until offset maps were shown in demos recently, no developer used them – and they were trivial to implement. Now, all of a sudden, everyone is starting to add them to their engines.

Secondly, there are no compression tools available to convert normal maps into DXT5 format. The performance is the same as 3Dc, the compression ratio is identical, and the quality, as claimed by ATI themselves, isn't bad. As soon as someone writes the first open-source DXT5 normal-map compression tool, and a quick tutorial on how to use it, you'll have a much higher probability of uptake.

Dave, if DXT5 is such a bad solution, why did ATI conclude otherwise in its own research paper?
 
DaveBaumann said:
It can't be said that DXT5 hasn't been available as an option to developers – if they had the need, and an answer to the performance / memory problem was there in the form of DXT5, then it would be logical to conclude there is another reason for not utilising it.
Perhaps no hardware company is 'offering support' to developers to use it? Y'know, the kind of support where you put the hardware vendor's logo on your box and they give you lots of free technical advice?

Personally, I'd like to judge for myself whether DXT5 really is a bad fall-back solution. Unless I can see normal maps compressed by both methods side by side and see that DXT5 looks noticeably worse, your argument is just supposition (logical supposition, perhaps, but certainly not fact). Perhaps a B3D test would be in order, as DC suggests?
 
Basic said:
If anybody wants to try to compress tangent-space normals with DXT5, then try this:
Put X in the R, G and B channels of an R8G8B8A8 texture, and put Y in the alpha channel.
Compress it with a standard DXT5 compressor.
In the pixel shader, swizzle G to X, and A to Y.

Precision beyond 8 bits in the Y channel will be lost, even though a specialized compressor could keep it.

This will work, but standard DXTC compressors are optimized for color values, and pick base values weighted in that regard. With normals, there are certain properties we want to preserve, such as favoring the general predominant direction of the normals within a block. This will reduce blocking artifacts.

Thus, you need a tweaked DXTC compressor. In fact, you may actually be able to get away with using the entire 5:6:5 color values of the color block at the cost of additional pixel shader instructions for the decode.
 
DemoCoder said:
This will work, but standard DXTC compressors are optimized for color values, and pick base values weighted in that regard.
Apart from the tendency to weight G over R and B, which you can turn off (i.e. by setting the weights to be equal), I don't think there is very much at all in the way of colour optimisation.
 
That's why I said it should be done that way.
The only thing you need to check is that the compressor doesn't do dithering.
If there are any special optimizations for compressing color values, they're likely about what to do when the 16 samples aren't located on a line in the color volume. If you make sure the samples are in line, there's little reason to change the compression algorithm. The "general predominant direction" should be favoured equally by a color compressor.

But I do agree that you need a special compressor if you want to use two DXT5 channels to get higher precision in the base values. Interestingly, if the DXT5 decompression hardware uses high enough precision, you could get higher precision in the base values with DXT5 than with 3Dc. But DXT5 does of course have the limitation of only 4 interpolation steps in one dimension.
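
To make the "samples in line" point concrete, here is a small C sketch of the packing step under the assumptions above (names and signature are mine, purely illustrative). Writing X into all three colour channels puts every texel of a block on the grey axis, so the 16 samples are automatically colinear and a stock compressor's line fit through colour space degenerates to the 1D case:

#include <stddef.h>
#include <stdint.h>

/* Pack per-texel tangent-space X/Y (each in [-1,1]) into RGBA8 ahead of
   a standard DXT5 compressor, per Basic's recipe. */
static void pack_normals_rgba(const float *nx, const float *ny,
                              uint8_t *rgba, size_t texels)
{
    for (size_t i = 0; i < texels; i++) {
        uint8_t x = (uint8_t)((nx[i] * 0.5f + 0.5f) * 255.0f + 0.5f);
        uint8_t y = (uint8_t)((ny[i] * 0.5f + 0.5f) * 255.0f + 0.5f);
        rgba[i * 4 + 0] = x;  /* R = X */
        rgba[i * 4 + 1] = x;  /* G = X: identical RGB keeps the samples colinear */
        rgba[i * 4 + 2] = x;  /* B = X */
        rgba[i * 4 + 3] = y;  /* A = Y, coded in the independent alpha block */
    }
}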
 
I think Joe's "if it makes sense" comes down to installed base... which is like 99% DXT5, 0% 3Dc in the "gaming" market currently.

btw DemoCoder, DXT1 and DXT5 installed base is very close, but not exactly identical ;)
 
Simon F said:
DemoCoder said:
This will work, but standard DXTC compressors are optimized for color values, and pick base values weighted in that regard.
Apart from the tendency to weight G over R and B, which you can turn off (i.e. by setting the weights to be equal), I don't think there is very much at all in the way of colour optimisation.

I saw one compressor a long time ago that claimed to use color-specific error diffusion/dithering, although I must admit I'm unaware of the internals of most DXTC compressors.

But you would of course want to do normal-specific processing, especially depending on assumptions about renormalization or whether the input data is well-formed (no non-normals).
 
Xmas said:
I think Joe's "if it makes sense" comes down to installed base... which is like 99% DXT5, 0% 3Dc in the "gaming" market currently.

btw DemoCoder, DXT1 and DXT5 installed base is very close, but not exactly identical ;)

Really? I'm curious, is there a card that supports DXT1 and not DXT5? (besides the obvious problems with 16-bit interpolation on GeForce1-4 :) )

I wonder if you could combine the DXT5 hack with NVidia's D3DFMT_CxV8U8 format and old DX7/8 texture stage state calls to avoid using a pixel shader slot and make it work on old DX7/PS1.1 hardware (e.g. the huge GF2MX/GF4MX installed base)? It might require driver support, but if the fixed-function hardware is there to calculate sqrt(1-u*u-v*v), then perhaps a FOURCC format could be created to support automatic expansion and swizzle for older hardware.
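
A hypothetical sketch of the per-texel expansion such a FOURCC driver path would have to perform – this is speculation matching the post, not a real driver interface. The driver decompresses each DXT5 texel, emits a signed V8U8 pair, and the CxV8U8 fixed-function lookup derives C = sqrt(1 - u*u - v*v) itself:

#include <stdint.h>

typedef struct { int8_t u, v; } CxV8U8Texel;

/* g, a: the decompressed DXT5 green and alpha bytes holding X and Y. */
static CxV8U8Texel expand_to_cxv8u8(uint8_t g, uint8_t a)
{
    CxV8U8Texel t;
    t.u = (int8_t)(g - 128);  /* unsigned [0,255] to signed [-128,127] */
    t.v = (int8_t)(a - 128);
    return t;
}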
 
Xmas said:
I think Joe's "if it makes sense" comes down to installed base... which is like 99% DXT5, 0% 3Dc in the "gaming" market currently.

Indeed. 3Dc is probably a good solution to a problem that has been overlooked for too long, but I don't like it when a new standard comes directly from an IHV (be it ATI, nVidia or 3dfx) – especially not when we're right smack in the middle of a successful API like DX9, which will serve us a long time.

This should have been part of DX, endorsed by MS and included in the DX spec, so it could have been implemented on all DX9 hardware. Normal maps are awesome in my opinion, and there are (and will be) way too many R3x0, NV3x and NV4x cards out there for game devs to rely on 3Dc, however great it is.

Sorry for the rant, but jeez, I thought the industry would have learned a thing or two by now about sorting out common, useful standards. :?
 
DemoCoder said:
Really? I'm curious, is there a card that supports DXT1 and not DXT5? (besides the obvious problems with 16-bit interpolation on GeForce1-4 :) )
Kyro I (and II?). AFAIK they only support DXT1 in hardware; I think they use a software fallback for the other DXT formats. Seems odd to me to only support half of the extension – maybe there was a hardware bug?
 
mczak said:
DemoCoder said:
Really? I'm curious, is there a card that supports DXT1 and not DXT5? (besides the obvious problems with 16-bit interpolation on GeForce1-4 :) )
Kyro I (and II?). AFAIK they only support DXT1 in hardware; I think they use a software fallback for the other DXT formats. Seems odd to me to only support half of the extension – maybe there was a hardware bug?

You want really weird? S3 didn't support all the DXT modes... they specified all 5 formats, but their hardware didn't actually support all of them...
 
mczak said:
DemoCoder said:
Really? I'm curious, is there a card that supports DXT1 and not DXT5? (besides the obvious problems with 16-bit interpolation on GeForce1-4 :) )
Kyro I (and II?). AFAIK they only support DXT1 in hardware; I think they use a software fallback for the other DXT formats. Seems odd to me to only support half of the extension – maybe there was a hardware bug?

DXT1 is 4bpp while the others are 8bpp (e.g. a 512×512 texture is 128 KB as DXT1 but 256 KB as DXT3/5). The latter formats didn't seem worthwhile to support given the lower compression ratio.

There was no bug and, unlike some chips, Kyro decoded DXT1 to full accuracy. Most annoying, however, was the tendency of some developers to use DXTn (n > 1) for fully opaque textures when DXT1 had the same accuracy (except on certain cards :rolleyes: ) for half the cost.

DeanoC said:
You want really weird? S3 didn't support all the DXT modes... they specified all 5 formats, but their hardware didn't actually support all of them...
I suspect (but I have no proof) that the latter ones may have been concocted later as a collaboration between S3 and MS.
 
Simon F said:
mczak said:
Kyro I (and II?). AFAIK they only support DXT1 in hardware; I think they use a software fallback for the other DXT formats. Seems odd to me to only support half of the extension – maybe there was a hardware bug?

DXT1 is 4bpp while the others are 8bpp. The latter formats didn't seem worthwhile to support given the lower compression ratio.
The compression ratio might be lower, but you simply need the other formats if you have textures with alpha components (unless 1-bit transparency is enough). I just think it's odd because you still have to pay the full patent licence royalties (or not?) (for OpenGL), and it shouldn't cost much to implement in hardware.
There was no bug and, unlike some chips, Kyro decoded DXT1 to full accuracy. Most annoying, however, was the tendency of some developers to use DXTn (n > 1) for fully opaque textures when DXT1 had the same accuracy (except on certain cards :rolleyes: ) for half the cost.
Yeah, that's stupid. If the developers wanted to avoid an accuracy flaw in a lot of chips from one specific vendor (with a large market share, I might add...), they should still have stored their (fully opaque) textures as DXT1. It's easy to "repackage" them as DXT3 (or DXT5) and send them that way to the graphics card (no need to recompress). And, of course, for online compression it's no problem at all to use a different format.
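
The repackaging really is just block surgery; here is a minimal C sketch under my own assumptions. A DXT3 block is an 8-byte, 4-bit-per-texel alpha block followed by an 8-byte DXT1-style colour block, so for fully opaque textures you prepend all-ones alpha and copy the colour block verbatim. One caveat: the source blocks should be in four-colour mode (color0 > color1), because DXT3/DXT5 do not honour DXT1's endpoint-order trick for the 3-colour + transparent mode:

#include <stddef.h>
#include <stdint.h>
#include <string.h>

static void dxt1_to_dxt3(const uint8_t *dxt1, uint8_t *dxt3, size_t blocks)
{
    for (size_t i = 0; i < blocks; i++) {
        memset(dxt3 + i * 16, 0xFF, 8);              /* 16 texels x 4-bit alpha, all opaque */
        memcpy(dxt3 + i * 16 + 8, dxt1 + i * 8, 8);  /* colour block carried over unchanged */
    }
}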

DeanoC said:
You want really weird? S3 didn't support all the DXT modes... they specified all 5 formats, but their hardware didn't actually support all of them...
Is this only true for the original Savage4? There are tons of Savage4-derived cores still sold today.
Interestingly, Savage4 cores seem to support quite a few compressed texture formats:
TFT_S3TC4Bit,
TFT_S3TC4A4Bit, /*like S3TC4Bit but with 4 bit alpha*/
TFT_S3TC4CA4Bit, /*like S3TC4Bit, but with 4 bit compressed alpha*/
TFT_S3TCL4,
TFT_S3TCA4L4,
(that's from the open-source DRI driver, savage_bci.h).
This seems to suggest it supports DXT1, DXT3 and DXT5, as well as some luminance-based compressed formats. In fact, the last one looks exactly like 3Dc to me (two components compressed according to the DXT5 alpha compression algorithm), though that's just a guess. I have no idea whether these formats actually work; they're not implemented in the driver (due to the stupid IP issues).
 
DeanoC said:
mczak said:
DemoCoder said:
Really? I'm curious, is there a card that supports DXT1 and not DXT5? (besides the obvious problems with 16-bit interpolation on GeForce1-4 :) )
Kyro I (and II?). AFAIK they only support DXT1 in hardware; I think they use a software fallback for the other DXT formats. Seems odd to me to only support half of the extension – maybe there was a hardware bug?
You want really weird? S3 didn't support all the DXT modes... they specified all 5 formats, but their hardware didn't actually support all of them...
Sort of. The Savage 3D and Savage MX/IX only had DXT1 support, but all following Savage products had DXT1-5 support if I recall correctly.
 
3Dc is clearly not a replacement for the DXT1/5-based methods. It is an additional method that will have higher quality than any existing compression method.

It is not in our interest to now say 'Forget everything else, just use 3Dc', because we have a large installed base that doesn't have 3Dc, and these people are very important to us.

But if 3Dc is there and it gets used the game will look better. Surely there can be no objection to that...
 