The Lack of high rez S3TC textures in games

Johnny Rotten said:
So then, in a roundabout fashion, game developers ARE just dumb for not compressing everything to begin with. ;)

Well, there is a loss of quality when you compress textures. DXTC is not a lossless format. However, with well-chosen, hi-res textures, it is very difficult to tell the difference between the compressed and the original texture.

But decompressing textures at load time, for cards that don't support compressed textures, is much faster than compressing the textures for cards that do. Again, the main difference will be the quality of the textures.

Of course, the application writers would know which textures look fine compressed and which do not, and could compress only the ones that look okay.
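For anyone curious why load-time decompression is so cheap, here's a rough sketch of decoding one 4x4 DXT1 block (opaque-only variant, my own illustration, not taken from any particular engine): each 8-byte block is just two RGB565 endpoints plus sixteen 2-bit indices into a 4-entry palette, which is also why the format is lossy.

Code:
/* Decode one 4x4 DXT1 block (8 bytes) into 16 RGB texels.
 * Opaque-only: the punch-through alpha of 3-colour mode is ignored. */
#include <stdint.h>

static void rgb565_to_888(uint16_t c, uint8_t out[3])
{
    out[0] = (uint8_t)(((c >> 11) & 0x1F) * 255 / 31);   /* R */
    out[1] = (uint8_t)(((c >>  5) & 0x3F) * 255 / 63);   /* G */
    out[2] = (uint8_t)(( c        & 0x1F) * 255 / 31);   /* B */
}

void decode_dxt1_block(const uint8_t block[8], uint8_t out[16][3])
{
    uint16_t c0 = (uint16_t)(block[0] | (block[1] << 8));
    uint16_t c1 = (uint16_t)(block[2] | (block[3] << 8));
    uint8_t palette[4][3];
    int i, ch;

    rgb565_to_888(c0, palette[0]);
    rgb565_to_888(c1, palette[1]);
    for (ch = 0; ch < 3; ch++) {
        if (c0 > c1) {   /* 4-colour mode: two interpolated entries */
            palette[2][ch] = (uint8_t)((2 * palette[0][ch] + palette[1][ch]) / 3);
            palette[3][ch] = (uint8_t)((palette[0][ch] + 2 * palette[1][ch]) / 3);
        } else {         /* 3-colour mode (plus transparent black, skipped here) */
            palette[2][ch] = (uint8_t)((palette[0][ch] + palette[1][ch]) / 2);
            palette[3][ch] = 0;
        }
    }

    /* Each of the last 4 bytes holds one row of 2-bit palette indices. */
    for (i = 0; i < 16; i++) {
        int idx = (block[4 + i / 4] >> ((i % 4) * 2)) & 0x3;
        for (ch = 0; ch < 3; ch++)
            out[i][ch] = palette[idx][ch];
    }
}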
 
Doesn't the developer have to pay attention to which textures must be compressed as well? (I.e. don't compress lightmaps)

The lossy format might be fine for surface textures, but is this going to work well for bump/normal maps (lots of fine detail), or textures used
as lookup tables in a pixel shader?

Correct me if I'm wrong, but compressing *everything* seems to be an overly simplistic approach...

Regards,
Serge
 
psurge said:
Doesn't the developer have to pay attention to which textures must be compressed as well? (I.e. don't compress lightmaps)

The lossy format might be fine for surface textures, but is this going to work well for bump/normal maps (lots of fine detail), or textures used
as lookup tables in a pixel shader?

Correct me if I'm wrong, but compressing *everything* seems to be an overly simplistic approach...

Of course you don't want to compress bump maps or textures used for dependent lookups. But many applications are compressing pretty much everything else (Quake 3, for example). Some people are careful, some are not...

Again, even for surface textures, the developer should compare the compressed vs. the original and see if the quality is adequate. If it is not, then it shouldn't be compressed.
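If anyone wants to automate that comparison instead of eyeballing every texture, something as simple as an RMS error over the round-tripped texels gets you most of the way there. A minimal sketch (the acceptance threshold and the DXT decoder used to produce the round-tripped image are up to you - eyeballing is still the real test):

Code:
/* RMS error between the original RGBA texels and the same texture after a
 * compress/decompress round trip.  Both buffers are assumed to be the same
 * size and layout. */
#include <math.h>
#include <stddef.h>
#include <stdint.h>

double rms_error(const uint8_t *original, const uint8_t *roundtrip, size_t n_bytes)
{
    double sum = 0.0;
    size_t i;

    for (i = 0; i < n_bytes; i++) {
        double d = (double)original[i] - (double)roundtrip[i];
        sum += d * d;
    }
    return sqrt(sum / (double)n_bytes);
}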
 
Oompa Loompa said:
Althornin said:
I happen to think that it IS nVidia's fault for ALLOWING developer laziness.
Features need to be pushed. nVidia could have done this (as S3 did) - but they failed to. This is why it is their fault.
That's ridiculous.

S3 pushed S3TC because it was the signature feature of their card at the time. It was a marketing decision. nVIDIA, 3dfx, ATI, etc. etc. have all thrown their weight behind various features at various times, in each case because they hoped to increase sales of their products.

nVIDIA deserves some criticism for having DXT1 work in a suboptimal fashion, but DXT3 works well enough. If developers had any interest whatsoever in spending the extra time and money to develop large, detailed textures they would have done so already.

Blaming a GPU manufacturer is as irrational as blaming... you. The lack of DXT compressed textures in new games is all your fault, Althornin. Why haven't you written to game developers demanding this feature? Why haven't you started your own game company, developing products with extensive use of high res textures?

Whatever. Your argument is ludicrous. I don't have money. nVidia does. I don't make gfx chips. nVidia does. I don't have an audience to market to. nVidia does.
Flawed argument.
My argument is NOT irrational. You may not like the idea - that's another story. You may not even agree (obviously you don't).
However, nVidia was the only player in a position to help get TC used for better IQ, and they sat back and let nothing happen. If you sat back and let a kid get run over by a bus, I'd blame you also (note: I am aware this argument is as ludicrous as... your argument against me!).

Ty: No one dragged you into an argument. You entered fully aware of the consequences. Stand up and take some damn responsibility like I told you to in the first place, <edited>.

John Reynolds: Let's leave the name calling to other boards, eh?
 
John Reynolds: Let's leave the name calling to other boards, eh?

And please leave your usual forum manners to the other boards you frequent, Althornin.

This forum isn't NVNews' nor 3DGPU's nor Anand's.
 
Althornin said:
However, nVidia was the only player in a position to help get TC used for better...

Hmmm, ATI Rage 128 PRO, Kyro, VSA-100, Savage4, Savage2000. All way-old chips that had support for DXTC in hardware.

How is it that NVIDIA was the only company out there in a position to properly evangelize it?

Of course, you could say that S3 and 3dfx are out of the picture, so obviously they weren't "players in position". You could marginally argue that ImgTech hasn't ever been a player. But ATI?
 
Althornin, your argument is flawed beyond belief. Nvidia can't force anyone to use S3TC, or any other feature for that matter. They can (and do) provide tools, tutorials and consultations, but they can't make anyone use them if they don't want to. They don't have SWAT teams on standby, ready to break into developers' offices and force them to use S3TC at gunpoint.

Besides, back in 1999 we had five high-end third-generation consumer cards: Rage 128 (PRO), V3, TNT2, Savage4 and G400 (MAX). Outside of S3, only the Rage 128 had S3TC support, and back then ATI was number one in the industry, selling more chips than 3dfx and Nvidia combined. I could try and blame ATi for not using their dominant position to push S3TC, but that would be as ridiculous as what you are doing right now.
 
Althornin said:
Whatever. Your argument is ludicrous. I don't have money. nVidia does. I don't make gfx chips. nVidia does. I don't have an audience to market to. nVidia does.
Flawed argument.
My argument is NOT irrational. You may not like the idea - that's another story. You may not even agree (obviously you don't).
However, nVidia was the only player in a position to help get TC used for better IQ, and they sat back and let nothing happen. If you sat back and let a kid get run over by a bus, I'd blame you also (note: I am aware this argument is as ludicrous as... your argument against me!).

You have a very strange view of the world. In your world NVidia is guilty of some heinous crime because you feel that they are not doing enough for your cause. Where is your webpage trumpeting IQ over Framerate? Where is your petition gathered door-to-door and sent to developers? Surely there is so much more you could be doing!
 
Well, now that all the "pleasantries" have been dispensed back and forth...

On the surface, this is a very interesting topic.

We all know that hardware companies have varying degrees of pull with developers, either direct or indirect, and in the case of NVIDIA it has always been both. Indirect in the sense that developers will usually, on recommendation, try to avoid things that are broken in their API... and direct in cases like 3DMark, Messiah, DroneZ, Tribes 2, Aquanox, etc., etc.

People can try and BS their way out of this reality, but it is true, and you'll only be kidding yourself and no one else. :) There is nothing wrong with a hardware manufacturer offering additional support, financial capital, extra engineering resources or similar to developers in order to push a technology featured in their products. It's done every day and has been for the past 10+ years (3dfx + many, NVIDIA + many, etc.).

Now, on the topic of using compressed textures as more than a simple bandwidth-saving measure, i.e. for actually *increasing* texture detail: I don't think one can make a factual case for NVIDIA's involvement (or lack of involvement) either way, since this has never been a showcase or a detriment for their hardware... it is instead just a supported feature, and thus "neutral."

In a nutshell, it's not wise business to strongly urge and finance extra support for a technology unless there is some chance of a return on investment. Obviously, there are costs associated with steering developers one way or another, and these costs need to be justified.

In the case of S3, S3TC was their singular distinguishing feature at the time, and pushing it would show a definite return through increased S3 product sales, as they were the sole possessors of the technology... though it was later licensed to others.

NVIDIA was high on the pixel/vertex shader wagon for a while, but that's old news now, and the cost should no longer be absorbed since ATI cards have the technology too. Luckily a few developers were able to get the push from NVIDIA to emphasize this once-"unique" feature in games like DroneZ, Aquanox, and Morrowind.

So for NVIDIA to push/tout S3TC or compressed textures... what would be the payback for the company? And why isn't equal pressure put on ATI or other companies that support various methods of texture compression? Wouldn't all companies stand to profit from this? That is truly what it comes down to.

And as already mentioned - it's this, combined with the fact that developers are either: a) lazy, b) incompetent, c) disinterested, d) targeting a model platform with poor or little support for said feature, or e) all of the above.

Just my $0.02,
-Shark
 
Back to the original topic, I don't think that Nvidia could be accused of discouraging the use of DXTC, but I think they may tend to encourage the use of the "bigger" DXT formats for textures over DXT1, simply because of the way their hardware handles DXT1.

This is a real shame because the (opaque-only) quality of DXT2-5 is no better than DXT1 and they are twice the size (8bpp instead of 4bpp)!
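Just to put numbers on that: both DXT1 and DXT3/5 store 4x4 blocks, at 8 and 16 bytes per block respectively, so a 512x512 texture works out to 1024KB raw at 32bpp, 128KB as DXT1 and 256KB as DXT3/5. A quick back-of-the-envelope calculation (my own sketch):

Code:
/* Compare raw vs. DXT1 vs. DXT3/5 storage for a 512x512 texture. */
#include <stdio.h>

unsigned dxt_size(unsigned w, unsigned h, unsigned bytes_per_block)
{
    /* DXT formats store 4x4 texel blocks. */
    unsigned blocks = ((w + 3) / 4) * ((h + 3) / 4);
    return blocks * bytes_per_block;
}

int main(void)
{
    printf("raw 32bpp: %u KB\n", 512u * 512u * 4u / 1024u);      /* 1024 KB */
    printf("DXT1:      %u KB\n", dxt_size(512, 512, 8) / 1024u); /*  128 KB */
    printf("DXT3/5:    %u KB\n", dxt_size(512, 512, 16) / 1024u);/*  256 KB */
    return 0;
}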
 
Hyp-X said:
You cannot enable TC on V3/TNT because (while the software could compress the textures all the same), the graphic chip cannot use the compressed texture. (Just like V3 cannot use - say - 32bit textures.)

Just feel like correcting this:

Voodoo3 can and does use 32-bit textures. It can't store a 32-bit frame buffer though.
 
Tagrineth said:
Hyp-X said:
You cannot enable TC on V3/TNT because (while the software could compress the textures all the same), the graphic chip cannot use the compressed texture. (Just like V3 cannot use - say - 32bit textures.)

Just feel like correcting this:

Voodoo3 can and does use 32-bit textures. It can't store a 32-bit frame buffer though.

Texture size on the V3 was limited to 256x256, though... right?
 
Tagrineth said:
Just feel like correcting this:

Voodoo3 can and does use 32-bit textures. It can't store a 32-bit frame buffer though.

I don't remember seeing any 32-bit texture formats listed when I last checked. (That was at least a year ago.)
Can you tell me what formats it supports? (Or send a dxcaps dump?)

Not that it matters much...
 
32 bit textures were introduced with Napalm

From the Glide Sources:

Code:
/* Napalm extensions to GrTextureFormat_t */
#define GR_TEXFMT_ARGB_CMP_FXT1           0x11
#define GR_TEXFMT_ARGB_8888               0x12
#define GR_TEXFMT_YUYV_422                0x13
#define GR_TEXFMT_UYVY_422                0x14
#define GR_TEXFMT_AYUV_444                0x15
#define GR_TEXFMT_ARGB_CMP_DXT1           0x16
#define GR_TEXFMT_ARGB_CMP_DXT2           0x17
#define GR_TEXFMT_ARGB_CMP_DXT3           0x18
#define GR_TEXFMT_ARGB_CMP_DXT4           0x19
#define GR_TEXFMT_ARGB_CMP_DXT5           0x1A
 
Colourless said:
32 bit textures were introduced with Napalm

From the Glide Sources:

Code:
/* Napalm extensions to GrTextureFormat_t */
#define GR_TEXFMT_ARGB_CMP_FXT1           0x11
#define GR_TEXFMT_ARGB_8888               0x12
#define GR_TEXFMT_YUYV_422                0x13
#define GR_TEXFMT_UYVY_422                0x14
#define GR_TEXFMT_AYUV_444                0x15
#define GR_TEXFMT_ARGB_CMP_DXT1           0x16
#define GR_TEXFMT_ARGB_CMP_DXT2           0x17
#define GR_TEXFMT_ARGB_CMP_DXT3           0x18
#define GR_TEXFMT_ARGB_CMP_DXT4           0x19
#define GR_TEXFMT_ARGB_CMP_DXT5           0x1A

Since GLIDE doesn't go through WHQL, there's nothing preventing the drivers from truncating 8888 into 565, 1555 or 4444 as needed.

I'm not saying that this is the case, but you can't trust a bunch of enums.
 
Has anyone here tried to use 32bit textures on a Voodoo3?

It looks freaking horrible. Asteroids (http://asteroids3d.sourceforge.net) is designed from the ground up for GeForce-class cards, though it will run on older cards - and some of the stuff the Voodoo card does to it is bizarre. All the menus instantly look worse because of the 256x256 texture limitation... and the graphics look absolutely terrible. It's all dithered down and looks disgusting.

Anyhoo.
 
Drivers supporting something is different from hardware supporting something. The Voodoo 3 OpenGL ICD will gladly accept textures that are 32-bit, but internally the format is converted to 16-bit.

All Glide texturing functions fail if you attempt to use the 32-bit Napalm extensions on a Voodoo 3.

According to D3DCAPS, D3DFMT_A8R8G8B8 and D3DFMT_X8R8G8B8 are not valid texture formats.
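For reference, a minimal sketch of how one could query that programmatically under the DirectX 8 runtime (assuming the card exposes any driver to DX8 at all; this is my own illustration, not the dxcaps tool itself):

Code:
/* Ask the D3D8 runtime whether ARGB8888 is usable as a plain texture
 * format on the default adapter, against the current desktop format. */
#include <d3d8.h>

int supports_argb8888_textures(IDirect3D8 *d3d)
{
    D3DDISPLAYMODE mode;
    HRESULT hr;

    IDirect3D8_GetAdapterDisplayMode(d3d, D3DADAPTER_DEFAULT, &mode);

    /* Succeeds only if the format is valid for textures on this device. */
    hr = IDirect3D8_CheckDeviceFormat(d3d, D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                      mode.Format, 0, D3DRTYPE_TEXTURE,
                                      D3DFMT_A8R8G8B8);
    return SUCCEEDED(hr);
}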

-Colourless Dragon
 