3Dc

vnet

I've read several sites stating that ATI's 3Dc technology would be an "open standard" and all, and yet when searching ATI's site, I couldn't find anything about 3Dc besides one white paper.
How are developers supposed to adopt the format if no source code, SDK, or anything else is available?
I thought it was supposed to be an open standard?
 
You probably have to email ATI, and so long as your email address doesn't end in @nvidia.com they'll send it to you :)
 
thop said:
Excellent!
Well, it would be if it were actually adopted as a standard, but I reckon that, like Truform, it will be one of those things that are (sadly) of passing interest. If ATI can get it adopted into some standard, that would be great; but if it remains proprietary then I can't see it being anything to get too excited about...
 
Diplo said:
thop said:
Excellent!
Well, it would be if it were actually adopted as a standard, but I reckon that, like Truform, it will be one of those things that are (sadly) of passing interest. If ATI can get it adopted into some standard, that would be great; but if it remains proprietary then I can't see it being anything to get too excited about...

Well, ATI won't charge anyone to use it.

So that means Nvidia can support it too.

If it's in Half-Life 2, Far Cry, and maybe Doom 3, then I don't see why Nvidia wouldn't adopt it, especially if it gives more performance. If not Nvidia, then maybe some of the other players will.
 
jvd said:
Well, ATI won't charge anyone to use it.

So that means Nvidia can support it too.
Well, it would be great if Nvidia swallowed their pride and adopted it (though I can't see that happening, and ATI know that). However, it would be better still if this had been presented to the DirectX/OpenGL working groups and been adopted as a standard in the APIs, guaranteeing its success.
 
Diplo said:
jvd said:
Well, ATI won't charge anyone to use it.

So that means Nvidia can support it too.
Well, it would be great if Nvidia swallowed their pride and adopted it (though I can't see that happening, and ATI know that). However, it would be better still if this had been presented to the DirectX/OpenGL working groups and been adopted as a standard in the APIs, guaranteeing its success.

I'm pretty sure it's DX-specific, but remember we are most likely a year or more off from DX10/DX Next.

ATI probably wanted to get it into hardware, but since it was finished in between DX versions, they figured that if they get it out there and build an installed base of cards running it, it will stand a better chance of being put in.

Nvidia has many OpenGL extensions and caps, but unlike ATI they charge other companies to use them.

This worked well for them when they were the market leader, but they aren't any more, and it will be hard for them to force things into the market.

Hopefully all the parts from ATI have 3Dc, including the Xbox 2 part and the GameCube 2 part, ensuring it will become a standard.
 
jvd said:
Hopefully all the parts from ATI have 3Dc, including the Xbox 2 part and the GameCube 2 part, ensuring it will become a standard.
Ummm, good point - Xbox support would indeed help. Certainly, if nobody comes up with a better compression scheme, then it should be adopted; I'd just like to see it ensured.
 
Diplo said:
jvd said:
Hopefully all the parts from ATI have 3Dc, including the Xbox 2 part and the GameCube 2 part, ensuring it will become a standard.
Ummm, good point - Xbox support would indeed help. Certainly, if nobody comes up with a better compression scheme, then it should be adopted; I'd just like to see it ensured.

Well, from the list someone else posted, a lot of the FPS games are going to be supporting it. I would imagine all of the Far Cry engine based games will support it, all Half-Life 2 engine games will support it, and all Serious Sam 2 games will support it.

From what I also understand, it's done in the driver? So that means the R3x0 series will support it? If I understand correctly, that is already a huge userbase to ignore.
 
jvd said:
From what I also understand, it's done in the driver? So that means the R3x0 series will support it? If I understand correctly, that is already a huge userbase to ignore.
It's a new HW texture format for R420.
 
OpenGL guy said:
jvd said:
From what I also understand, it's done in the driver? So that means the R3x0 series will support it? If I understand correctly, that is already a huge userbase to ignore.
It's a new HW texture format for R420.
Ah, I thought it was able to be emulated on the R3x0 series in software :( My bad. Lack of sleep will do that to you.

Oh well, back to City of Heroes, waiting until I can order my XT.
 
You could theoretically get close to it via driver emulation.

A texture with the 3Dc FOURCC gets bound to a shader.
The driver actually binds a DXT5-encoded version per ATI's paper (if it's precompressed as 3Dc, it will have to decompress and re-encode as DXT5).

The driver replaces all TEXLD instructions targeting a sampler with a 3Dc FOURCC bound, so they do the extra Z-component calculation (if only two components are stored) and any adjustments needed afterwards.

Or, developers who are working to utilize the 3Dc format could also add DXT5 support at the same time since it's not that much extra work.
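
For reference, here is a minimal C sketch of the per-pixel math that the patched shader (or a hand-written one) would have to do after sampling a two-component normal map; the function and variable names are made up for illustration, and it assumes the sampler returns the X and Y components in the [0,1] range:

#include <math.h>

/* Rebuild a unit normal from two stored components (e.g. X in the DXT5
 * green channel and Y in the alpha channel, per ATI's swizzle trick).
 * Inputs are the raw [0,1] values read from the texture. */
static void reconstruct_normal(float x01, float y01, float n[3])
{
    float x = x01 * 2.0f - 1.0f;          /* remap [0,1] -> [-1,1] */
    float y = y01 * 2.0f - 1.0f;
    float zsq = 1.0f - x * x - y * y;     /* Z follows from unit length */

    n[0] = x;
    n[1] = y;
    n[2] = zsq > 0.0f ? sqrtf(zsq) : 0.0f;
}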
 
DemoCoder said:
You could theoretically get close to it via driver emulation.

A texture with the 3Dc FOURCC gets bound to a shader.
The driver actually binds a DXT5-encoded version per ATI's paper (if it's precompressed as 3Dc, it will have to decompress and re-encode as DXT5).
It doesn't make sense to expose 3Dc if you're just going to convert it to DXT5, as DXT5 has inferior quality for this type of data.
 
OpenGL guy said:
DemoCoder said:
You could theoretically get close to it via driver emulation.

A texture with the 3Dc FOURCC gets bound to a shader.
The driver actually binds a DXT5-encoded version per ATI's paper (if it's precompressed as 3Dc, it will have to decompress and re-encode as DXT5).
It doesn't make sense to expose 3Dc if you're just going to convert it to DXT5, as DXT5 has inferior quality for this type of data.

I agree, if you can convince developers to support DXT5 at the same time as 3Dc. But what if you're an IHV left in a situation where people are comparing 3Dc hi-res normal maps to uncompressed low-res normal maps? It might make sense for those games to enable the automatic conversion (as a driver/game profile option).

As for whether DXT5 is inferior, I'll wait for some real IQ comparisons to see if it's that distinguishable. ATI's own papers on the DXT5 method suggest that it's an enormous improvement and that artifacts aren't that visible.

But what's more distinguishable in terms of crap IQ? A low-res uncompressed normal map or a hi-res DXT5 compressed one?
 
DemoCoder said:
As for whether DXT5 is inferior, I'll wait for some real IQ comparisons to see if it's that distinguishable. ATI's own papers on the DXT5 method suggest that it's an enormous improvement and that artifacts aren't that visible.
DXT5 is an improvement because you can use a larger resolution texture; however, 3Dc is a step above this. In the DXT5 method, one component is limited to 6-bit endpoints and two interpolants. With 3Dc, both components are treated equally, so you get 8-bit endpoints and six interpolants for each. That's a pretty significant step up, and it takes the same amount of memory.
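
To make the bit-budget difference concrete, here is a rough C sketch of the per-block value palettes each scheme can produce, assuming the DXT5 trick stores X in the green channel and that each 3Dc channel block is laid out like the DXT5 alpha block (as ATI's whitepaper describes); the function names are mine, not from either spec:

/* DXT5 green channel: two 6-bit endpoints (expanded to 8 bits) plus two
 * interpolated values -> 4 levels per 4x4 block. */
static void dxt5_green_palette(unsigned char g0_6bit, unsigned char g1_6bit,
                               unsigned char pal[4])
{
    unsigned char g0 = (unsigned char)((g0_6bit << 2) | (g0_6bit >> 4));
    unsigned char g1 = (unsigned char)((g1_6bit << 2) | (g1_6bit >> 4));
    pal[0] = g0;
    pal[1] = g1;
    pal[2] = (unsigned char)((2 * g0 + g1) / 3);   /* two interpolants */
    pal[3] = (unsigned char)((g0 + 2 * g1) / 3);
}

/* One 3Dc channel: two 8-bit endpoints plus six interpolated values
 * -> 8 levels per 4x4 block, for both X and Y. */
static void threedc_channel_palette(unsigned char c0, unsigned char c1,
                                    unsigned char pal[8])
{
    int i;
    pal[0] = c0;
    pal[1] = c1;
    for (i = 1; i <= 6; i++)                       /* six interpolants */
        pal[1 + i] = (unsigned char)(((7 - i) * c0 + i * c1) / 7);
}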
 
Yes, I agree 3Dc is better, but the question is: how much better? Enough to justify not supporting DXT5 *at all*? If developers are going to support 3Dc, they should at least support DXT5 as well (since all hardware can do it). I think it's unfair to compare hi-res 3Dc with uncompressed low-res. The vast majority of consumers would benefit if games that supported 3Dc also supported hi-res DXT5 normal maps as well.

You can't tell me hi-res DXT5 maps aren't a vast improvement over low-res uncompressed maps.

I'm just asking for support for both. It makes sense for developers as well, since they get 90% of the market with increased IQ.

The point about 3Dc emulation is for the scenario where developers support *only* 3Dc, which only a handful of consumers will have in the next 18 months. In that case, Nvidia and ATI should support FOURCC 3Dc interception and emulation.
 
Some confusion here... Are you saying it was ever implied that a developer had to support only one or the other, 3Dc or DXT5? Obviously a dev will use the compression format that best fits the texture's needs: DXT1 for when alpha is not a concern, 3Dc for normal maps, and DXT5 for other general texture maps.
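
The resulting selection logic is trivial either way; here is a rough C sketch of it (the enum and function names are made up for illustration, and the 3Dc-supported flag would come from whatever caps/format check the engine does at startup), with the DXT5 swizzle trick as the fallback for hardware without 3Dc:

/* Hypothetical format picker: names are illustrative, not from any SDK. */
enum tex_role   { TEX_DIFFUSE_OPAQUE, TEX_DIFFUSE_ALPHA, TEX_NORMAL_MAP };
enum tex_format { FMT_DXT1, FMT_DXT5, FMT_3DC };

static enum tex_format pick_format(enum tex_role role, int hw_supports_3dc)
{
    switch (role) {
    case TEX_DIFFUSE_OPAQUE:
        return FMT_DXT1;                 /* no alpha needed, cheapest */
    case TEX_DIFFUSE_ALPHA:
        return FMT_DXT5;                 /* general maps that need alpha */
    case TEX_NORMAL_MAP:
        /* prefer 3Dc where the hardware has it, else the DXT5 trick */
        return hw_supports_3dc ? FMT_3DC : FMT_DXT5;
    }
    return FMT_DXT5;
}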
 