3Dc

They are officially supported by MS with tools, documentation, and a runtime implementation.

If 3Dc had the same status with respect to DXTC (called "DirectX Texture Compression" by the way), there would be no need to talk about it "becoming a standard".

It's no more a standard than D3DFMT_CxV8U8, for example, and in fact less, because at least those formats are in the SDK header files.
 
DemoCoder said:
They are officially supported by MS with tools, documentation, and a runtime implementation.

If 3Dc had the same status with respect to DXTC (called "DirectX Texture Compression" by the way), there would be no need to talk about it "becoming a standard".

It's no more a standard than D3DFMT_CxV8U8, for example, and in fact less, because at least those formats are in the SDK header files.

So, you say DXT5 shouldn't be viewed as a failure even though there are no developers using it, and you advocate giving it more time. And now you're criticising 3Dc's "status" even though it was announced just days ago?
 
Dude, DXT5 wasn't 'recommended' or considered for normal map compression until fairly recently. ATi themselves were among the early advocates of using DXT5 with normal maps. 3Dc is a logical advancement of DXT5, and as a result, developers are now aware of, and can use, two fairly good compression techniques for normal maps.
 
Before ATI's paper, most developers probably just thought they could get away with low-res uncompressed normal maps (i.e. "they were good enough"). ATI's evangelization of hi-res normal maps has now made this a public issue, so there will be more pressure.

Actually, it was something that was discussed on gdalgorithms last summer, IIRC. Plus Jakub Klarowicz had the technique published in ShaderX2...
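For context, the published DXT5 trick relies on DXT5 compressing its alpha block independently of the colour block, so putting the normal's X into alpha and Y into green gives each component its own interpolation endpoints, while Z is dropped and rebuilt in the shader. A minimal sketch of the packing side in C (the channel assignment follows the technique as commonly described; the struct and function names are mine):

#include <stdint.h>

/* Sketch: swizzle a tangent-space normal so a stock DXT5 compressor
 * treats X and Y as independently as possible. X goes to alpha (DXT5
 * compresses alpha separately, with 8-bit endpoints); Y goes to green
 * (the 6-bit colour channel). Z is discarded and recomputed at sample
 * time from x^2 + y^2 + z^2 = 1. */
typedef struct { uint8_t r, g, b, a; } Texel;

static Texel pack_normal_for_dxt5(float nx, float ny)
{
    Texel t;
    t.a = (uint8_t)((nx * 0.5f + 0.5f) * 255.0f); /* X -> alpha */
    t.g = (uint8_t)((ny * 0.5f + 0.5f) * 255.0f); /* Y -> green */
    t.r = 0;                                      /* unused     */
    t.b = 0;                                      /* unused     */
    return t;
}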
 
You guys miss my point.

gdc is free; anyone can put it in their cards. This is not something that will be charged for.

This isn't an NVIDIA extension that they charge for.

This is free.

And just like DX9, it has to start off in one card.

But after that, others will get it.
It will be in more and more cards.

Even if it's only marginally better than DXT1, it's still better.

Yes, PS 3.0 is great in an NVIDIA card. But it seems like 2.0 is much slower than ATI's version. So of course I will go with the one that can do 2.0 faster, because that is what all the games for the next few years are going to run.

No one is crazy enough to ship a game that doesn't support PS 2.0; same with 1.1.
 
jvd said:
gdc is free; anyone can put it in their cards. This is not something that will be charged for.

You mean 3Dc, don't you? If so, it's been stated before that it looks like there are IP claims over it, and it won't be free to put in their cards in OpenGL.
 
bloodbob said:
jvd said:
gdc is free; anyone can put it in their cards. This is not something that will be charged for.

You mean 3Dc, don't you? If so, it's been stated before that it looks like there are IP claims over it, and it won't be free to put in their cards in OpenGL.
Yes, but you can put it in DX. And yeah, I meant 3Dc; too much time in the console forum.

Which is the same as DXTC: you can only use it for free in DX, not in OpenGL, where you need to pay for it.
 
jvd said:
Evildeus said:
Dio said:
But if 3Dc is there and it gets used, the game will look better. Surely there can be no objection to that...
Yeah, you are right. But the question is, does it really look much better than DXT5? I'm still waiting for a direct comparison.

Does it matter? One is getting pushed, the other is not. Even if they look the same, the one that is being pushed gets my vote.

Yes, it does matter, because if it's just marginally better (visually or performance-wise) then it won't be used.
 
If 3Dc is covered by S3's patent... then what is the problem? Doesn't just about everyone have a license for using S3TC in OpenGL anyway?
 
akira888 said:
Diplo:

Glide vs. 3Dc is not a valid comparison. ATI is making 3Dc an open standard.
But, AFAICS (given S3's patent), it might require a license from S3. Maybe it'd be OK under the auspices of MS/DX, but I think I'd want to have that clarified first.
 
While we're on the subject, does 3Dc automatically compute the Z component from the stored X and Y values?
 
DemoCoder said:
Is D3DFMT_CxV8U8 covered by a patent too?
I believe that the idea of storing a normal map using only 2 dimensions and inferring the third is patented... but then, the whole 'DOT3' idea of doing fast bump mapping in a local surface coordinate system using a (3D) normal map is patented too. <shrug>
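For anyone wondering what "inferring the third" amounts to in practice: since a normal has unit length, the missing component follows directly from the other two. A minimal sketch in C (the clamp is my addition, to guard against rounding overshoot):

#include <math.h>

/* Rebuild Z from the two stored components of a unit-length,
 * tangent-space normal (x and y already remapped to [-1, 1]).
 * Tangent-space normals point away from the surface, so z >= 0. */
static float infer_z(float x, float y)
{
    float s = 1.0f - x * x - y * y;
    return (s > 0.0f) ? sqrtf(s) : 0.0f; /* clamp: quantized x/y may overshoot */
}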
 
Apologies if this has already been posted...

Could this be implemented on NVIDIA hardware just using pixel shaders?

The actual compressed data is 128 bits per block, which fits nicely into an R32G32B32A32 texture. The maths to generate the pixel values isn't too steep, and I'm sure the overheads can be reduced by careful ordering of texture fetches and maths ops.
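To give a feel for that maths: assuming each of 3Dc's two 64-bit channel blocks follows the DXT5 alpha-block layout (two 8-bit endpoints plus sixteen 3-bit palette indices), a reference decode of one channel block in C might look like the sketch below; the shader would have to mimic the same arithmetic. The 3Dc-specific layout details here are an assumption on my part:

#include <stdint.h>

/* Decode one 64-bit channel block: byte 0 and 1 are the endpoints,
 * the remaining 6 bytes hold 16 packed 3-bit indices (little-endian). */
static void decode_channel_block(const uint8_t block[8], uint8_t out[16])
{
    uint8_t palette[8];
    uint8_t c0 = block[0], c1 = block[1];

    palette[0] = c0;
    palette[1] = c1;
    if (c0 > c1) {
        /* 8-value mode: six interpolated steps between the endpoints. */
        for (int i = 2; i < 8; i++)
            palette[i] = (uint8_t)(((8 - i) * c0 + (i - 1) * c1) / 7);
    } else {
        /* 6-value mode: four interpolated steps plus explicit 0 and 255. */
        for (int i = 2; i < 6; i++)
            palette[i] = (uint8_t)(((6 - i) * c0 + (i - 1) * c1) / 5);
        palette[6] = 0;
        palette[7] = 255;
    }

    /* Gather the 48 index bits, then pull out 3 bits per texel. */
    uint64_t bits = 0;
    for (int i = 0; i < 6; i++)
        bits |= (uint64_t)block[2 + i] << (8 * i);
    for (int i = 0; i < 16; i++)
        out[i] = palette[(bits >> (3 * i)) & 7];
}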

The question is whether DX supports the unpacking of pixel data that would be required (IIRC, Cg/OpenGL has pack and unpack ops via the NV_fragment_program extension).

Rob J
 
For the 3-component format, a hardware change is required as well as the shader instruction to calculate Z. I can't remember exactly what it was (it was altering the alpha channel such that it could deal with a different component).
 
If 3Dc is so similar to DXT5, then why is it necessary to have hardware support? Why can't other cards do it? Why did ATI bother to support it in hardware?
 